🐦 Twitter Post Details


@AlphaSignalAI

Transformers just got a serious rival. Allen AI just open-sourced a 7B model that beats its own transformer.

OLMo Hybrid mixes standard attention with linear RNN layers into one architecture.

> Same accuracy, half the training data
> Long-context jumps from 70.9% to 85.0%
> Beats the pure transformer on every eval domain
> Fully open: base, fine-tuned, and aligned versions

The trick is a 3:1 pattern. Three recurrent layers handle most of the sequence processing cheaply. One attention layer then catches what the recurrent state missed.

This cuts 75% of the expensive attention operations while keeping precision where it matters.

Building long-context apps used to mean paying the full cost of attention across every layer. Now you can get better long-context performance with a leaner architecture, and the theory proving why it scales better is released alongside the weights.

https://t.co/bxZ7ckAOq4
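The 3:1 interleaving and the 75% figure can be sketched in a few lines. This is a minimal illustration of the layer schedule the post describes, not the actual OLMo Hybrid implementation; the function names (`build_layer_schedule`, `attention_savings`) and the 32-layer depth are illustrative assumptions.

```python
def build_layer_schedule(n_layers: int, ratio: int = 3) -> list[str]:
    """Interleave cheap linear-RNN ("recurrent") layers with full attention.

    For every `ratio` recurrent layers there is one attention layer,
    so attention makes up 1 / (ratio + 1) of the stack.
    (Illustrative sketch; not the OLMo Hybrid source.)
    """
    schedule = []
    for i in range(n_layers):
        # Every (ratio + 1)-th layer is full attention; the rest are recurrent.
        if (i + 1) % (ratio + 1) == 0:
            schedule.append("attention")
        else:
            schedule.append("recurrent")
    return schedule


def attention_savings(schedule: list[str]) -> float:
    """Fraction of attention layers removed vs. an all-attention stack."""
    n_attn = schedule.count("attention")
    return 1 - n_attn / len(schedule)


if __name__ == "__main__":
    sched = build_layer_schedule(32)  # hypothetical 32-layer stack
    print(sched[:8])                  # first block: 3 recurrent, then 1 attention
    print(attention_savings(sched))   # 0.75
```

With a 3:1 ratio only every fourth layer pays the quadratic attention cost, which is exactly the "cuts 75% of the expensive attention operations" claim: 8 of 32 layers are attention, versus 32 of 32 in a pure transformer.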

Media 1
Media 2

πŸ“Š Media Metadata

{
  "media": [
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2029724012088009006/media_0.jpg?",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2029724012088009006/media_0.jpg?",
      "type": "photo",
      "filename": "media_0.jpg"
    },
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2029724012088009006/media_1.jpg?",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2029724012088009006/media_1.jpg?",
      "type": "photo",
      "filename": "media_1.jpg"
    }
  ],
  "processed_at": "2026-03-06T14:09:23.307554",
  "pipeline_version": "2.0"
}

πŸ”§ Raw API Response

{
  "type": "tweet",
  "id": "2029724012088009006",
  "url": "https://x.com/AlphaSignalAI/status/2029724012088009006",
  "twitterUrl": "https://twitter.com/AlphaSignalAI/status/2029724012088009006",
  "text": "Transformers just got a serious rival. Allen AI just open-sourced a 7B model that beats its own transformer.\n\nOLMo Hybrid mixes standard attention with linear RNN layers into one architecture.\n\n> Same accuracy, half the training data\n> Long-context jumps from 70.9% to 85.0%\n> Beats the pure transformer on every eval domain\n> Fully open: base, fine-tuned, and aligned versions\n\nThe trick is a 3:1 pattern. Three recurrent layers handle most of the sequence processing cheaply. One attention layer then catches what the recurrent state missed.\n\nThis cuts 75% of the expensive attention operations while keeping precision where it matters.\n\nBuilding long-context apps used to mean paying the full cost of attention across every layer. Now you can get better long-context performance with a leaner architecture, and the theory proving why it scales better is released alongside the weights.\n\nhttps://t.co/bxZ7ckAOq4",
  "source": "Twitter for iPhone",
  "retweetCount": 6,
  "replyCount": 0,
  "likeCount": 47,
  "quoteCount": 2,
  "viewCount": 3950,
  "createdAt": "Fri Mar 06 01:01:29 +0000 2026",
  "lang": "en",
  "bookmarkCount": 26,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "2029724012088009006",
  "displayTextRange": [
    0,
    285
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "AlphaSignalAI",
    "url": "https://x.com/AlphaSignalAI",
    "twitterUrl": "https://twitter.com/AlphaSignalAI",
    "id": "114783808",
    "name": "AlphaSignal AI",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": "Business",
    "profilePicture": "https://pbs.twimg.com/profile_images/2014100845189529600/Ff1Xc28-_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/114783808/1769023589",
    "description": "The latest news from the top 100 companies in AI. Over 280,000 devs read our newsletter.",
    "location": "Signup β†’",
    "followers": 11621,
    "following": 277,
    "status": "",
    "canDm": true,
    "canMediaTag": true,
    "createdAt": "Tue Feb 16 16:04:01 +0000 2010",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {
        "urls": [
          {
            "display_url": "AlphaSignal.ai",
            "expanded_url": "http://AlphaSignal.ai",
            "indices": [
              0,
              23
            ],
            "url": "https://t.co/Fnj7bpl3qS"
          }
        ]
      }
    },
    "fastFollowersCount": 0,
    "favouritesCount": 2493,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 37,
    "statusesCount": 168,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [
      "1888966579125153939"
    ],
    "profile_bio": {},
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "display_url": "pic.x.com/fUd5gMeA1J",
        "expanded_url": "https://x.com/AlphaSignalAI/status/2029724012088009006/photo/1",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {
            "faces": []
          },
          "medium": {
            "faces": []
          },
          "orig": {
            "faces": []
          },
          "small": {
            "faces": []
          }
        },
        "id_str": "2029724009965785089",
        "indices": [
          286,
          309
        ],
        "media_key": "3_2029724009965785089",
        "media_results": {
          "result": {
            "media_key": "3_2029724009965785089"
          }
        },
        "media_url_https": "https://pbs.twimg.com/media/HCsHO64XkAEc91m.jpg",
        "original_info": {
          "focus_rects": [
            {
              "h": 661,
              "w": 1180,
              "x": 20,
              "y": 0
            },
            {
              "h": 661,
              "w": 661,
              "x": 360,
              "y": 0
            },
            {
              "h": 661,
              "w": 580,
              "x": 400,
              "y": 0
            },
            {
              "h": 661,
              "w": 331,
              "x": 525,
              "y": 0
            },
            {
              "h": 661,
              "w": 1200,
              "x": 0,
              "y": 0
            }
          ],
          "height": 661,
          "width": 1200
        },
        "sizes": {
          "large": {
            "h": 661,
            "resize": "fit",
            "w": 1200
          },
          "medium": {
            "h": 661,
            "resize": "fit",
            "w": 1200
          },
          "small": {
            "h": 375,
            "resize": "fit",
            "w": 680
          },
          "thumb": {
            "h": 150,
            "resize": "crop",
            "w": 150
          }
        },
        "type": "photo",
        "url": "https://t.co/fUd5gMeA1J"
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {
    "hashtags": [],
    "symbols": [],
    "urls": [
      {
        "display_url": "huggingface.co/collections/al…",
        "expanded_url": "https://huggingface.co/collections/allenai/olmo-hybrid",
        "indices": [
          890,
          913
        ],
        "url": "https://t.co/bxZ7ckAOq4"
      }
    ],
    "user_mentions": []
  },
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "article": null
}