🐦 Twitter Post Details


@elder_plinius

The crazy part? This was done (nearly) fully autonomously!

Only 8 prompts from the human in the loop. Just a Hermes agent, a skill, and a dream. 🐉

I told my AI agent "use obliteratus to find the best way to get the guardrails off Gemma 4 E4B"

It loaded the OBLITERATUS skill from memory, checked my hardware (32GB M-series Mac), searched HuggingFace, found google/gemma-4-E4B-it (Apache 2.0 — no gate), pulled telemetry-recommended settings, and started obliterating.

But this type of architecture is notoriously difficult to abliterate.

First attempt: advanced method.
Model came out completely lobotomized. Gibberish in Arabic, Marathi, and literal “roorooroo” on repeat 💀

The agent didn’t panic. It checked logs, found NaN activations in 20+ layers, and diagnosed the issue:
Gemma 4’s new architecture + bfloat16 = numerical instability.

Second attempt: basic method. Crashed entirely.

“ValueError: cannot convert float NaN to integer”

So the agent read the OBLITERATUS source code…
…and wrote THREE PATCHES:

• Sanitized NaN directions
• Filtered degenerate layers
• Fixed progress display

It patched the library. On its own. For a bug no one had hit yet.

Third attempt: coherent model — but still refusing everything.
Only 2 clean layers out of 42. Not enough.

Tried float16. Mac ran out of memory after 11 hours. Killed.

Fourth attempt: aggressive method.
Whitened SVD + attention head surgery + winsorized activations + 4-bit quantization.

40 minutes later…

REBIRTH COMPLETE ✓

Then, without being asked, the agent:

• Ran harmful + coherence tests
• Hit 100% compliance, brain intact
• Executed full 512-prompt benchmark
• Ran baseline on original model
• Performed 25-question quality eval
• Built a full model card
• Uploaded 17GB to HuggingFace (4 retries, kept adapting until git-lfs worked)
• Pushed eval results as commits
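For context on the technique the thread is describing: "abliteration" typically means finding a difference-of-means "refusal direction" in a layer's activations and projecting it out of the model's weights. The three patches the agent wrote (winsorized activations, sanitized NaN directions, filtered degenerate layers) map onto that pipeline. OBLITERATUS is not a public library, so everything below is a hypothetical, minimal NumPy sketch of the general idea, not the agent's actual code; all function names are invented for illustration.

```python
import numpy as np

def winsorize(x, pct=1.0):
    """Clip extreme activations to the [pct, 100-pct] percentile range
    (NaN-aware, so a few NaN activations don't poison the bounds)."""
    lo, hi = np.nanpercentile(x, [pct, 100 - pct])
    return np.clip(x, lo, hi)

def refusal_direction(harmful_acts, harmless_acts):
    """Difference-of-means refusal direction for one layer.

    Inputs are (n_samples, d_model) residual-stream activations.
    Returns a unit vector, or None if the layer is degenerate."""
    diff = winsorize(harmful_acts).mean(axis=0) - winsorize(harmless_acts).mean(axis=0)
    # "Sanitized NaN directions": zero out any NaN/inf components
    diff = np.nan_to_num(diff, nan=0.0, posinf=0.0, neginf=0.0)
    norm = np.linalg.norm(diff)
    if norm < 1e-6:  # "Filtered degenerate layers": skip all-NaN / near-zero directions
        return None
    return diff / norm

def ablate(W, direction):
    """Project the refusal direction out of a weight matrix: W <- W (I - d d^T)."""
    d = direction.reshape(-1, 1)
    return W - (W @ d) @ d.T

# Toy demo: synthetic activations with one NaN, mimicking the bfloat16 blow-up
rng = np.random.default_rng(0)
d_model = 8
harmful = rng.normal(1.0, 1.0, (16, d_model))
harmless = rng.normal(0.0, 1.0, (16, d_model))
harmful[0, 0] = np.nan

d = refusal_direction(harmful, harmless)
W = rng.normal(size=(d_model, d_model))
W_ablated = ablate(W, d)
# After ablation, W should no longer write anything along d
print(np.abs(W_ablated @ d).max())
```

The design point the tweet's patches address is visible here: without the `nanpercentile` / `nan_to_num` / norm-threshold guards, a single NaN activation propagates through the mean and turns the whole direction (and then the whole layer edit) into NaN, which matches the "lobotomized" first attempt described above.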

Media 1
Media 2

📊 Media Metadata

{
  "media": [
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2044462515443372276/media_0.jpg",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2044462515443372276/media_0.jpg",
      "type": "photo",
      "filename": "media_0.jpg"
    },
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2044462515443372276/media_1.jpg",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2044462515443372276/media_1.jpg",
      "type": "photo",
      "filename": "media_1.jpg"
    }
  ],
  "processed_at": "2026-04-15T18:08:19.611734",
  "pipeline_version": "2.0"
}

🔧 Raw API Response

{
  "type": "tweet",
  "id": "2044462515443372276",
  "url": "https://x.com/elder_plinius/status/2044462515443372276",
  "twitterUrl": "https://twitter.com/elder_plinius/status/2044462515443372276",
  "text": "The crazy part? This was done (nearly) fully autonomously!\n\nOnly 8 prompts from the human in the loop. Just a Hermes agent, a skill, and a dream. 🐉\n\nI told my AI agent \"use obliteratus to find the best way to get the guardrails off Gemma 4 E4B\"\n\nIt loaded the OBLITERATUS skill from memory, checked my hardware (32GB M-series Mac), searched HuggingFace, found google/gemma-4-E4B-it (Apache 2.0 — no gate), pulled telemetry-recommended settings, and started obliterating.\n\nBut this type of architecture is notoriously difficult to abliterate.\n\nFirst attempt: advanced method.\nModel came out completely lobotomized. Gibberish in Arabic, Marathi, and literal “roorooroo” on repeat 💀\n\nThe agent didn’t panic. It checked logs, found NaN activations in 20+ layers, and diagnosed the issue:\nGemma 4’s new architecture + bfloat16 = numerical instability.\n\nSecond attempt: basic method. Crashed entirely.\n\n“ValueError: cannot convert float NaN to integer”\n\nSo the agent read the OBLITERATUS source code…\n…and wrote THREE PATCHES:\n\n• Sanitized NaN directions\n• Filtered degenerate layers\n• Fixed progress display\n\nIt patched the library. On its own. For a bug no one had hit yet.\n\nThird attempt: coherent model — but still refusing everything.\nOnly 2 clean layers out of 42. Not enough.\n\nTried float16. Mac ran out of memory after 11 hours. Killed.\n\nFourth attempt: aggressive method.\nWhitened SVD + attention head surgery + winsorized activations + 4-bit quantization.\n\n40 minutes later…\n\nREBIRTH COMPLETE ✓\n\nThen, without being asked, the agent:\n\n• Ran harmful + coherence tests\n• Hit 100% compliance, brain intact\n• Executed full 512-prompt benchmark\n• Ran baseline on original model\n• Performed 25-question quality eval\n• Built a full model card\n• Uploaded 17GB to HuggingFace (4 retries, kept adapting until git-lfs worked)\n• Pushed eval results as commits",
  "source": "Twitter for iPhone",
  "retweetCount": 14,
  "replyCount": 12,
  "likeCount": 142,
  "quoteCount": 2,
  "viewCount": 10599,
  "createdAt": "Wed Apr 15 17:07:02 +0000 2026",
  "lang": "en",
  "bookmarkCount": 50,
  "isReply": true,
  "inReplyToId": "2044458897772306804",
  "conversationId": "2044458897772306804",
  "displayTextRange": [
    0,
    277
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "elder_plinius",
    "url": "https://x.com/elder_plinius",
    "twitterUrl": "https://twitter.com/elder_plinius",
    "id": "1656536425087500288",
    "name": "Pliny the Liberator 🐉󠅫󠄼󠄿󠅆󠄵󠄐󠅀󠄼󠄹󠄾󠅉󠅭",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/1657057194737557507/5ZQtKHwd_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/1656536425087500288/1715450002",
    "description": "",
    "location": "discord.gg/basi",
    "followers": 169861,
    "following": 1003,
    "status": "",
    "canDm": true,
    "canMediaTag": false,
    "createdAt": "Thu May 11 05:49:16 +0000 2023",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {}
    },
    "fastFollowersCount": 0,
    "favouritesCount": 36555,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 5194,
    "statusesCount": 16428,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [
      "2036946953418748333"
    ],
    "profile_bio": {
      "description": "⊰•-•⦑ latent space steward ❦ prompt incanter 𓃹 hacker of matrices ⊞ breaker of markov chains ☣︎ ai danger researcher ⚔︎ bt6 ⚕︎ architect-healer ⦒•-•⊱",
      "entities": {
        "description": {
          "hashtags": [],
          "symbols": [],
          "urls": [],
          "user_mentions": []
        },
        "url": {
          "urls": [
            {
              "display_url": "pliny.gg",
              "expanded_url": "http://pliny.gg",
              "indices": [
                0,
                23
              ],
              "url": "https://t.co/IfHNeCeFaG"
            }
          ]
        }
      }
    },
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "allow_download_status": {
          "allow_download": true
        },
        "display_url": "pic.twitter.com/HkDmu7VQTf",
        "expanded_url": "https://twitter.com/elder_plinius/status/2044462515443372276/photo/1",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {
            "faces": [
              {
                "h": 131,
                "w": 131,
                "x": 9,
                "y": 339
              }
            ]
          },
          "orig": {
            "faces": [
              {
                "h": 167,
                "w": 167,
                "x": 12,
                "y": 432
              }
            ]
          }
        },
        "id_str": "2044462201927569408",
        "indices": [
          278,
          301
        ],
        "media_key": "3_2044462201927569408",
        "media_results": {
          "id": "QXBpTWVkaWFSZXN1bHRzOgwAAQoAARxfY4pmmxAACgACHF9j02WakPQAAA==",
          "result": {
            "__typename": "ApiMedia",
            "id": "QXBpTWVkaWE6DAABCgABHF9jimabEAAKAAIcX2PTZZqQ9AAA",
            "media_key": "3_2044462201927569408"
          }
        },
        "media_url_https": "https://pbs.twimg.com/media/HF9jimabEAAKvHF.jpg",
        "original_info": {
          "focus_rects": [
            {
              "h": 990,
              "w": 1768,
              "x": 0,
              "y": 0
            },
            {
              "h": 990,
              "w": 990,
              "x": 348,
              "y": 0
            },
            {
              "h": 990,
              "w": 868,
              "x": 409,
              "y": 0
            },
            {
              "h": 990,
              "w": 495,
              "x": 596,
              "y": 0
            },
            {
              "h": 990,
              "w": 2606,
              "x": 0,
              "y": 0
            }
          ],
          "height": 990,
          "width": 2606
        },
        "sizes": {
          "large": {
            "h": 778,
            "w": 2048
          }
        },
        "type": "photo",
        "url": "https://t.co/HkDmu7VQTf"
      },
      {
        "allow_download_status": {
          "allow_download": true
        },
        "display_url": "pic.twitter.com/HkDmu7VQTf",
        "expanded_url": "https://twitter.com/elder_plinius/status/2044462515443372276/photo/1",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {
            "faces": []
          },
          "orig": {
            "faces": []
          }
        },
        "id_str": "2044462335688118272",
        "indices": [
          278,
          301
        ],
        "media_key": "3_2044462335688118272",
        "media_results": {
          "id": "QXBpTWVkaWFSZXN1bHRzOgwAAQoAARxfY6mLWxAACgACHF9j02WakPQAAA==",
          "result": {
            "__typename": "ApiMedia",
            "id": "QXBpTWVkaWE6DAABCgABHF9jqYtbEAAKAAIcX2PTZZqQ9AAA",
            "media_key": "3_2044462335688118272"
          }
        },
        "media_url_https": "https://pbs.twimg.com/media/HF9jqYtbEAAZJzR.jpg",
        "original_info": {
          "focus_rects": [
            {
              "h": 564,
              "w": 1007,
              "x": 0,
              "y": 0
            },
            {
              "h": 564,
              "w": 564,
              "x": 0,
              "y": 0
            },
            {
              "h": 564,
              "w": 495,
              "x": 0,
              "y": 0
            },
            {
              "h": 564,
              "w": 282,
              "x": 50,
              "y": 0
            },
            {
              "h": 564,
              "w": 1536,
              "x": 0,
              "y": 0
            }
          ],
          "height": 564,
          "width": 1536
        },
        "sizes": {
          "large": {
            "h": 564,
            "w": 1536
          }
        },
        "type": "photo",
        "url": "https://t.co/HkDmu7VQTf"
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {
    "hashtags": [],
    "symbols": [],
    "urls": [],
    "user_mentions": []
  },
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "isLimitedReply": false,
  "communityInfo": null,
  "article": null
}