🐦 Twitter Post Details


@iScienceLuvr

Copy Suppression: Comprehensively Understanding an Attention Head

website: https://t.co/szSr7L6KZ1
abs: https://t.co/yD459BioOs

"To the best of our knowledge, this is the most comprehensive description of the complete role of a component in a language model to date." https://t.co/fQUUvc2ypp

Media 1

copy-suppression.streamlit.app

The content requires JavaScript to function, indicating it's an interactive application.

• JavaScript is necessary to run the app.

• The content is not accessible without enabling JavaScript.

arXiv

Copy Suppression: Comprehensively Understanding an Attention Head

This article explores the role of an attention head in GPT-2 Small that suppresses naive copying behavior, enhancing model calibration and self-repair mechanisms.

• Introduces copy suppression in GPT-2 Small.

• Explains self-repair mechanisms in language models.

• Demonstrates impact of attention head on model calibration.
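
The paper's subject is a single attention head in GPT-2 Small (head 10.7). As a rough illustration of what probing such a head can look like, the sketch below is not taken from the paper's own code; it assumes the transformer_lens library and simply zero-ablates that head's output, then compares the model's next-token loss on a prompt that invites naive copying of an earlier phrase.

# Illustrative sketch (not the paper's code): zero-ablate head L10H7 in
# GPT-2 Small using transformer_lens and compare next-token loss on a
# prompt that tempts the model to naively copy an earlier token.
from transformer_lens import HookedTransformer

model = HookedTransformer.from_pretrained("gpt2")  # GPT-2 Small

LAYER, HEAD = 10, 7  # the copy-suppression head discussed in the paper

def zero_head(z, hook):
    # z has shape [batch, seq, n_heads, d_head]; zero out one head's output
    z[:, :, HEAD, :] = 0.0
    return z

prompt = "All's fair in love and war. All's fair in love and"
clean_loss = model(prompt, return_type="loss").item()
ablated_loss = model.run_with_hooks(
    prompt,
    return_type="loss",
    fwd_hooks=[(f"blocks.{LAYER}.attn.hook_z", zero_head)],
).item()
print(f"loss with L10H7 intact:  {clean_loss:.3f}")
print(f"loss with L10H7 ablated: {ablated_loss:.3f}")

A noticeably different loss with the head ablated would be consistent with the paper's claim that this head shapes the model's predictions by suppressing naive copying; the specific prompt and ablation style here are only assumptions for illustration.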

📊 Media Metadata

{
  "media": [
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/1711656632940371968/media_0.jpg?",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/1711656632940371968/media_0.jpg?",
      "type": "photo",
      "filename": "media_0.jpg"
    }
  ],
  "nlp": {
    "sentiment": "positive",
    "processed_at": "2025-08-06T12:44:32.564339"
  },
  "score": 1.0,
  "scored_at": "2025-08-09T13:46:07.542454",
  "import_source": "manual_curation_2023",
  "score_components": {
    "author": 0.09,
    "engagement": 0.1125210001154447,
    "quality": 0.16000000000000003,
    "source": 0.15,
    "nlp": 0.1,
    "recency": 0.010000000000000002
  },
  "source_tagged_at": "2025-08-09T13:42:53.305009",
  "enriched": true,
  "enriched_at": "2025-08-09T13:42:53.305012",
  "enriched_links": [
    {
      "url": "https://t.co/szSr7L6KZ1",
      "title": "",
      "description": "The content requires JavaScript to function, indicating it's an interactive application.",
      "content_type": "article",
      "author": null,
      "site_name": "copy-suppression.streamlit.app",
      "image_url": null,
      "key_points": [
        "JavaScript is necessary to run the app.",
        "The content is not accessible without enabling JavaScript.",
        "No additional information is provided in the visible content."
      ],
      "enriched_at": "2025-08-10T10:28:59.093850"
    },
    {
      "url": "https://t.co/yD459BioOs",
      "title": "Copy Suppression: Comprehensively Understanding an Attention Head",
      "description": "This article explores the role of an attention head in GPT-2 Small that suppresses naive copying behavior, enhancing model calibration and self-repair mechanisms.",
      "content_type": "article",
      "author": "Callum McDougall, Arthur Conmy, Cody Rushing, Thomas McGrath, Neel Nanda",
      "site_name": "arXiv",
      "image_url": null,
      "key_points": [
        "Introduces copy suppression in GPT-2 Small.",
        "Explains self-repair mechanisms in language models.",
        "Demonstrates impact of attention head on model calibration."
      ],
      "enriched_at": "2025-08-10T10:29:03.217650"
    }
  ],
  "llm_enriched": true,
  "llm_enriched_at": "2025-08-10T10:29:03.697027",
  "original_structure": "had_media_only",
  "enhanced_from_raw_response": true,
  "enhanced_at": "2025-08-14T03:18:40.269041"
}

🔧 Raw API Response

{
  "user": {
    "created_at": "2011-12-20T03:45:50.000Z",
    "default_profile_image": false,
    "description": "PhD at 19 |\nFounder and CEO at @MedARC_AI |\nResearch Director at @StabilityAI | \n@kaggle Notebooks GM |\nBiomed. engineer @ 14 |\nTEDx talk➡https://t.co/DwMkst4bnG",
    "fast_followers_count": 0,
    "favourites_count": 60004,
    "followers_count": 45437,
    "friends_count": 995,
    "has_custom_timelines": true,
    "is_translator": false,
    "listed_count": 703,
    "location": "",
    "media_count": 1203,
    "name": "Tanishq Mathew Abraham, PhD",
    "normal_followers_count": 45437,
    "possibly_sensitive": false,
    "profile_banner_url": "https://pbs.twimg.com/profile_banners/441465751/1675968078",
    "profile_image_url_https": "https://pbs.twimg.com/profile_images/1553508977735962624/nnlSwBmu_normal.jpg",
    "screen_name": "iScienceLuvr",
    "statuses_count": 12087,
    "translator_type": "none",
    "url": "https://t.co/nNzCz2VVd1",
    "verified": false,
    "withheld_in_countries": [],
    "id_str": "441465751"
  },
  "id": "1711656632940371968",
  "conversation_id": "1711656632940371968",
  "full_text": "Copy Suppression: Comprehensively Understanding an Attention Head\n\nwebsite: https://t.co/szSr7L6KZ1\nabs: https://t.co/yD459BioOs\n\n\"To the best of our knowledge, this is the most comprehensive description of the complete role of a component in a language model to date.\" https://t.co/fQUUvc2ypp",
  "reply_count": 0,
  "retweet_count": 20,
  "favorite_count": 137,
  "hashtags": [],
  "symbols": [],
  "user_mentions": [],
  "urls": [
    {
      "url": "https://t.co/szSr7L6KZ1",
      "expanded_url": "https://copy-suppression.streamlit.app/",
      "display_url": "copy-suppression.streamlit.app"
    },
    {
      "url": "https://t.co/yD459BioOs",
      "expanded_url": "https://arxiv.org/abs/2310.04625",
      "display_url": "arxiv.org/abs/2310.04625"
    }
  ],
  "media": [
    {
      "media_url": "https://pbs.twimg.com/media/F8EGjbmaAAAQL6K.jpg",
      "type": "photo"
    }
  ],
  "url": "https://twitter.com/iScienceLuvr/status/1711656632940371968",
  "created_at": "2023-10-10T08:15:16.000Z",
  "#sort_index": "1711656632940371968",
  "view_count": 17225,
  "quote_count": 0,
  "is_quote_tweet": false,
  "is_retweet": false,
  "is_pinned": false,
  "is_truncated": false,
  "startUrl": "https://twitter.com/iscienceluvr/status/1711656632940371968"
}