🐦 Twitter Post Details

@dair_ai

NEW paper from Apple.

Interesting idea: "Attention to Mamba".

The paper introduces a two-stage recipe for cross-architecture distillation from Transformers into Mamba.

Naive distillation collapses teacher performance. Their trick: first distill the transformer into a linearized-attention student using a kernel adaptation, then transfer that student into a pure Mamba with no attention blocks.

On a 1B model trained on 10B tokens, the Mamba student hits 14.11 perplexity against a 13.86 Pythia-1B teacher, nearly matching quality at linear-time inference cost.

If you can reliably convert trained transformers into state-space models without retraining from scratch, the entire open-weights ecosystem becomes cheaper to serve at long context. This is the kind of quiet infrastructure work that decides which architectures actually get deployed in agent stacks.

Paper: https://t.co/h7k7OrG8Qj

Learn to build effective AI agents in our academy: https://t.co/LRnpZN7L4c
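The two-stage recipe, as summarized above: stage 1 replaces softmax attention with a kernelized linear attention and distills the teacher into that student; stage 2 distills the linear-attention student into an attention-free state-space model. What follows is a minimal, hypothetical PyTorch sketch of that pipeline, not the paper's implementation: the elu+1 feature map, the toy gated diagonal SSM, and every name in it (phi, LinearAttention, SSMBlock, kd_loss) are illustrative stand-ins.

import torch
import torch.nn as nn
import torch.nn.functional as F

def phi(x):
    # elu(x)+1: a standard positive feature map used to kernelize attention
    return F.elu(x) + 1.0

class LinearAttention(nn.Module):
    """Kernelized linear-time attention (causal masking omitted for brevity)."""
    def __init__(self, d):
        super().__init__()
        self.qkv = nn.Linear(d, 3 * d)
    def forward(self, x):                                   # x: (B, T, d)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q, k = phi(q), phi(k)
        kv = torch.einsum("btd,bte->bde", k, v)             # sum_t phi(k_t) v_t^T
        z = 1.0 / (torch.einsum("btd,bd->bt", q, k.sum(1)) + 1e-6)
        return torch.einsum("btd,bde,bt->bte", q, kv, z)

class SSMBlock(nn.Module):
    """Toy gated diagonal state-space recurrence standing in for a Mamba block."""
    def __init__(self, d):
        super().__init__()
        self.in_proj = nn.Linear(d, 2 * d)
        self.log_a = nn.Parameter(torch.zeros(d))           # per-channel decay
        self.out_proj = nn.Linear(d, d)
    def forward(self, x):                                   # x: (B, T, d)
        u, gate = self.in_proj(x).chunk(2, dim=-1)
        a = torch.sigmoid(self.log_a)                       # decay in (0, 1)
        h, outs = torch.zeros_like(u[:, 0]), []
        for t in range(u.size(1)):                          # h_t = a*h_{t-1} + u_t
            h = a * h + u[:, t]
            outs.append(h)
        return self.out_proj(torch.stack(outs, 1) * F.silu(gate))

def kd_loss(student_logits, teacher_logits, tau=2.0):
    # KL between temperature-softened teacher and student distributions
    log_p = F.log_softmax(student_logits / tau, dim=-1)
    q = F.softmax(teacher_logits / tau, dim=-1)
    return F.kl_div(log_p, q, reduction="batchmean") * tau * tau

# Smoke test: random hidden states stand in for token embeddings, and a
# frozen linear head produces "teacher logits" for both stages.
B, T, d, vocab = 2, 16, 32, 100
x = torch.randn(B, T, d)
head = nn.Linear(d, vocab)
with torch.no_grad():
    teacher_logits = head(torch.randn(B, T, d))             # frozen teacher output

stage1 = LinearAttention(d)                                 # stage-1 student
loss1 = kd_loss(head(stage1(x)).flatten(0, 1), teacher_logits.flatten(0, 1))

stage2 = SSMBlock(d)                                        # stage-2 student
with torch.no_grad():
    mid_logits = head(stage1(x))                            # stage-1 student as teacher
loss2 = kd_loss(head(stage2(x)).flatten(0, 1), mid_logits.flatten(0, 1))
print(f"stage-1 KD loss {loss1.item():.3f}, stage-2 KD loss {loss2.item():.3f}")

On the quoted numbers: perplexity is the exponential of the mean per-token negative log-likelihood, so 14.11 vs 13.86 is a gap of roughly ln(14.11) - ln(13.86) ≈ 0.018 nats per token, which is what makes the "nearly matching" framing credible.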

📊 Media Metadata

{
  "media": [
    {
      "type": "photo",
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2045600012860801113/media_0.jpg",
      "filename": "media_0.jpg"
    }
  ],
  "processed_at": "2026-04-18T20:31:17.555981",
  "pipeline_version": "2.0"
}

🔧 Raw API Response

{
  "type": "tweet",
  "id": "2045600012860801113",
  "url": "https://x.com/dair_ai/status/2045600012860801113",
  "twitterUrl": "https://twitter.com/dair_ai/status/2045600012860801113",
  "text": "NEW paper from Apple.\n\nInteresting idea: \"Attention to Mamba\".\n\nThe paper introduces a two-stage recipe for cross-architecture distillation from Transformers into Mamba.\n\nNaive distillation collapses teacher performance. Their trick: first distill the transformer into a linearized-attention student using a kernel adaptation, then transfer that student into a pure Mamba with no attention blocks.\n\nOn a 1B model trained on 10B tokens, the Mamba student hits 14.11 perplexity against a 13.86 Pythia-1B teacher, nearly matching quality at linear-time inference cost.\n\nIf you can reliably convert trained transformers into state-space models without retraining from scratch, the entire open-weights ecosystem becomes cheaper to serve at long context. This is the kind of quiet infrastructure work that decides which architectures actually get deployed in agent stacks.\n\nPaper: https://t.co/h7k7OrG8Qj\n\nLearn to build effective AI agents in our academy: https://t.co/LRnpZN7L4c",
  "source": "Twitter for iPhone",
  "retweetCount": 1,
  "replyCount": 0,
  "likeCount": 2,
  "quoteCount": 0,
  "viewCount": 157,
  "createdAt": "Sat Apr 18 20:27:03 +0000 2026",
  "lang": "en",
  "bookmarkCount": 2,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "2045600012860801113",
  "displayTextRange": [
    0,
    270
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "dair_ai",
    "url": "https://x.com/dair_ai",
    "twitterUrl": "https://twitter.com/dair_ai",
    "id": "889050642903293953",
    "name": "DAIR.AI",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/1643277398522187778/31dedbLo_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/889050642903293953/1773242460",
    "description": "",
    "location": "",
    "followers": 108263,
    "following": 1,
    "status": "",
    "canDm": true,
    "canMediaTag": true,
    "createdAt": "Sun Jul 23 09:12:45 +0000 2017",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {}
    },
    "fastFollowersCount": 0,
    "favouritesCount": 4354,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 198,
    "statusesCount": 3104,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [
      "2045139481892880892"
    ],
    "profile_bio": {
      "description": "Democratizing AI research, education, and technologies. New AI learning portal: https://t.co/LRnpZN7L4c",
      "entities": {
        "description": {
          "urls": [
            {
              "display_url": "academy.dair.ai",
              "expanded_url": "https://academy.dair.ai/",
              "indices": [
                80,
                103
              ],
              "url": "https://t.co/LRnpZN7L4c"
            }
          ]
        },
        "url": {
          "urls": [
            {
              "display_url": "dair.ai",
              "expanded_url": "https://www.dair.ai/",
              "indices": [
                0,
                23
              ],
              "url": "https://t.co/lkqPZtMU5s"
            }
          ]
        }
      }
    },
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "display_url": "pic.twitter.com/tt1P4XWOJK",
        "expanded_url": "https://twitter.com/dair_ai/status/2045600012860801113/photo/1",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {
            "faces": [
              {
                "h": 84,
                "w": 84,
                "x": 196,
                "y": 794
              }
            ]
          },
          "orig": {
            "faces": [
              {
                "h": 84,
                "w": 84,
                "x": 196,
                "y": 794
              }
            ]
          }
        },
        "id_str": "2045600009824198656",
        "indices": [
          271,
          294
        ],
        "media_key": "3_2045600009824198656",
        "media_results": {
          "id": "QXBpTWVkaWFSZXN1bHRzOgwAAQoAARxjbl7yG2AACgACHGNuX6caQFkAAA==",
          "result": {
            "__typename": "ApiMedia",
            "id": "QXBpTWVkaWE6DAABCgABHGNuXvIbYAAKAAIcY25fpxpAWQAA",
            "media_key": "3_2045600009824198656"
          }
        },
        "media_url_https": "https://pbs.twimg.com/media/HGNuXvIbYAAA3fp.jpg",
        "original_info": {
          "focus_rects": [
            {
              "h": 870,
              "w": 1554,
              "x": 0,
              "y": 0
            },
            {
              "h": 1554,
              "w": 1554,
              "x": 0,
              "y": 0
            },
            {
              "h": 1772,
              "w": 1554,
              "x": 0,
              "y": 0
            },
            {
              "h": 1796,
              "w": 898,
              "x": 0,
              "y": 0
            },
            {
              "h": 1796,
              "w": 1554,
              "x": 0,
              "y": 0
            }
          ],
          "height": 1796,
          "width": 1554
        },
        "sizes": {
          "large": {
            "h": 1796,
            "w": 1554
          }
        },
        "type": "photo",
        "url": "https://t.co/tt1P4XWOJK"
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {
    "hashtags": [],
    "symbols": [],
    "urls": [
      {
        "display_url": "arxiv.org/abs/2604.14191",
        "expanded_url": "https://arxiv.org/abs/2604.14191",
        "indices": [
          875,
          898
        ],
        "url": "https://t.co/h7k7OrG8Qj"
      },
      {
        "display_url": "academy.dair.ai",
        "expanded_url": "https://academy.dair.ai/",
        "indices": [
          951,
          974
        ],
        "url": "https://t.co/LRnpZN7L4c"
      }
    ],
    "user_mentions": []
  },
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "isLimitedReply": false,
  "communityInfo": null,
  "article": null
}
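
For downstream use of this payload, the entities.urls array maps each t.co shortlink in the text field to its expanded target. A minimal sketch, assuming the response above has been saved locally as tweet.json (a hypothetical filename, not something the pipeline produces):

import json

# Load the raw API response shown above.
with open("tweet.json") as f:
    tweet = json.load(f)

# entities.urls entries carry {url: t.co shortlink, expanded_url: real target}.
short_to_full = {u["url"]: u["expanded_url"] for u in tweet["entities"]["urls"]}

# Rewrite the tweet text with shortlinks resolved, e.g.
# https://t.co/h7k7OrG8Qj -> https://arxiv.org/abs/2604.14191
text = tweet["text"]
for short, full in short_to_full.items():
    text = text.replace(short, full)
print(text)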