🐦 Twitter Post Details

Viewing enriched Twitter post

@LiorOnAI

Every foundation model you've ever used has the same bug. It just got fixed.

Since 2015, every deep network has been built the same way: each layer does some computation, adds its result to a running total, and passes it forward.

Simple. But there's a problem: by layer 100, the signal from any single layer is buried under the sum of everything else.

Each new layer matters less and less.

Nobody fixed this because it worked well enough.

Moonshot AI just changed that. Their new method, Attention Residuals, lets each layer look back at all previous layers and choose which ones actually matter right now.

Instead of a blind running total, you get selective retrieval.

The analogy: imagine writing an essay where every draft gets merged into one document automatically. By draft 50, your latest edits are invisible.

AttnRes lets you keep every draft separate and pull from whichever ones you need.

What this fixes:

1. Deeper layers no longer get drowned out
2. Training becomes more stable across the whole network
3. The model uses its own depth more efficiently

To make it practical at scale, they group layers into blocks and attend over block summaries instead of every single layer.

Overhead at inference: less than 2%.

The result: 25% less compute to reach the same performance. Tested on a 48B-parameter model (3B activated, on the Kimi Linear architecture). Holds across sizes.

Residual connections have been invisible plumbing for a decade. Now they're becoming dynamic.

The next generation of models won't just pass through their own layers; they'll search them.
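The contrast above can be sketched in a few lines of NumPy. This is a conceptual toy, not Moonshot's implementation: the function names, the single-vector state, and the shared `w_q`/`w_k` projections are all simplifying assumptions. It only illustrates the core idea of replacing the uniform running sum with input-dependent attention over every previous layer's output.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def standard_residual_stack(x, layers):
    """Classic residual stream: each layer's output is added to a
    blind running total, so early contributions get diluted."""
    for f in layers:
        x = x + f(x)
    return x

def attention_residual_stack(x, layers, w_q, w_k):
    """Toy 'attention residual': keep every layer's output separate
    (like keeping every draft of an essay) and let each new layer
    attend over all of them with learned, input-dependent weights."""
    history = [x]                                   # one entry per past layer
    for f in layers:
        h = history[-1]
        q = h @ w_q                                 # query from current state
        keys = np.stack([s @ w_k for s in history]) # one key per past state
        weights = softmax(keys @ q)                 # one weight per past layer
        mixed = sum(w * s for w, s in zip(weights, history))  # selective retrieval
        history.append(mixed + f(mixed))
    return history[-1]
```

The block-summary trick from the post would correspond to attending over pooled groups of `history` entries instead of every entry, which is what keeps the overhead small at depth.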

πŸ“Š Media Metadata

{
  "score": 0.42,
  "score_components": {
    "author": 0.09,
    "engagement": 0.0,
    "quality": 0.12,
    "source": 0.135,
    "nlp": 0.05,
    "recency": 0.025
  },
  "scored_at": "2026-03-16T17:46:44.000418",
  "import_source": "api_import",
  "source_tagged_at": "2026-03-16T17:46:44.000430",
  "enriched": true,
  "enriched_at": "2026-03-16T17:46:44.000433"
}

πŸ”§ Raw API Response

{
  "type": "tweet",
  "id": "2033597954024702434",
  "url": "https://x.com/LiorOnAI/status/2033597954024702434",
  "twitterUrl": "https://twitter.com/LiorOnAI/status/2033597954024702434",
  "text": "Every foundation model you've ever used has the same bug. It just got fixed.\n\nSince 2015, every deep network has been built the same way: each layer does some computation, adds its result to a running total, and passes it forward. \n\nSimple. But there's a problem, by layer 100, the signal from any single layer is buried under the sum of everything else. \n\nEach new layer matters less and less.\n\nNobody fixed this because it worked well enough.\n\nMoonshot AI just changed that. Their new method, Attention Residuals, lets each layer look back at all previous layers and choose which ones actually matter right now. \n\nInstead of a blind running total, you get selective retrieval.\n\nThe analogy: imagine writing an essay where every draft gets merged into one document automatically. By draft 50, your latest edits are invisible. \n\nAttnRes lets you keep every draft separate and pull from whichever ones you need.\n\nWhat this fixes:\n\n1. Deeper layers no longer get drowned out\n2. Training becomes more stable across the whole network\n3. The model uses its own depth more efficiently\n\nTo make it practical at scale, they group layers into blocks and attend over block summaries instead of every single layer. \n\nOverhead at inference: less than 2%.\n\nThe result: \n25% less compute to reach the same performance. Tested on a 48B-parameter model. Holds across sizes.\n\nResidual connections have been invisible plumbing for a decade. Now they're becoming dynamic. \n\nThe next generation of models won't just pass through their own layers, they'll search them.",
  "source": "Twitter for iPhone",
  "retweetCount": 2,
  "replyCount": 0,
  "likeCount": 6,
  "quoteCount": 0,
  "viewCount": 499,
  "createdAt": "Mon Mar 16 17:35:09 +0000 2026",
  "lang": "en",
  "bookmarkCount": 4,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "2033597954024702434",
  "displayTextRange": [
    0,
    277
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "LiorOnAI",
    "url": "https://x.com/LiorOnAI",
    "twitterUrl": "https://twitter.com/LiorOnAI",
    "id": "931470139",
    "name": "Lior Alexander",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/2032256308196564993/ozddLZ2O_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/931470139/1761077189",
    "description": "",
    "location": "",
    "followers": 114041,
    "following": 2191,
    "status": "",
    "canDm": true,
    "canMediaTag": false,
    "createdAt": "Wed Nov 07 07:19:36 +0000 2012",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {}
    },
    "fastFollowersCount": 0,
    "favouritesCount": 6817,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 662,
    "statusesCount": 3779,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [],
    "profile_bio": {
      "description": "Covering the latest news for AI devs β€’ Building the 1st Agentic Media Company @AlphaSignalAI (280K users) β€’ 9 yrs in ML β€’ Ex-MILA β€’ SF 🌁",
      "entities": {
        "description": {
          "hashtags": [],
          "symbols": [],
          "urls": [],
          "user_mentions": [
            {
              "id_str": "0",
              "indices": [
                78,
                92
              ],
              "name": "",
              "screen_name": "AlphaSignalAI"
            }
          ]
        },
        "url": {
          "urls": [
            {
              "display_url": "alphasignal.ai",
              "expanded_url": "https://alphasignal.ai",
              "indices": [
                0,
                23
              ],
              "url": "https://t.co/AyubevaLcb"
            }
          ]
        }
      }
    },
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {},
  "card": null,
  "place": {},
  "entities": {
    "hashtags": [],
    "symbols": [],
    "urls": [],
    "user_mentions": []
  },
  "quoted_tweet": {
    "type": "tweet",
    "id": "2033378587878072424",
    "url": "https://x.com/Kimi_Moonshot/status/2033378587878072424",
    "twitterUrl": "https://twitter.com/Kimi_Moonshot/status/2033378587878072424",
    "text": "Introducing π‘¨π’•π’•π’†π’π’•π’Šπ’π’ π‘Ήπ’†π’”π’Šπ’…π’–π’‚π’π’”: Rethinking depth-wise aggregation.\n\nResidual connections have long relied on fixed, uniform accumulation. Inspired by the duality of time and depth, we introduce Attention Residuals, replacing standard depth-wise recurrence with learned, input-dependent attention over preceding layers.\n\nπŸ”Ή Enables networks to selectively retrieve past representations, naturally mitigating dilution and hidden-state growth.\nπŸ”Ή Introduces Block AttnRes, partitioning layers into compressed blocks to make cross-layer attention practical at scale.\nπŸ”Ή Serves as an efficient drop-in replacement, demonstrating a 1.25x compute advantage with negligible (<2%) inference latency overhead.\nπŸ”Ή Validated on the Kimi Linear architecture (48B total, 3B activated parameters), delivering consistent downstream performance gains.\n\nπŸ”—Full report:\nhttps://t.co/u3EHICG05h",
    "source": "Twitter for iPhone",
    "retweetCount": 1332,
    "replyCount": 216,
    "likeCount": 9067,
    "quoteCount": 339,
    "viewCount": 2520118,
    "createdAt": "Mon Mar 16 03:03:28 +0000 2026",
    "lang": "en",
    "bookmarkCount": 6620,
    "isReply": false,
    "inReplyToId": null,
    "conversationId": "2033378587878072424",
    "displayTextRange": [
      0,
      261
    ],
    "inReplyToUserId": null,
    "inReplyToUsername": null,
    "author": {
      "type": "user",
      "userName": "Kimi_Moonshot",
      "url": "https://x.com/Kimi_Moonshot",
      "twitterUrl": "https://twitter.com/Kimi_Moonshot",
      "id": "1863959670169501696",
      "name": "Kimi.ai",
      "isVerified": false,
      "isBlueVerified": false,
      "verifiedType": "Business",
      "profilePicture": "https://pbs.twimg.com/profile_images/1910294000927645696/QseOV0uF_normal.png",
      "coverPicture": "https://pbs.twimg.com/profile_banners/1863959670169501696/1733238156",
      "description": "",
      "location": "",
      "followers": 128141,
      "following": 132,
      "status": "",
      "canDm": false,
      "canMediaTag": true,
      "createdAt": "Tue Dec 03 14:54:14 +0000 2024",
      "entities": {
        "description": {
          "urls": []
        },
        "url": {}
      },
      "fastFollowersCount": 0,
      "favouritesCount": 255,
      "hasCustomTimelines": true,
      "isTranslator": false,
      "mediaCount": 111,
      "statusesCount": 298,
      "withheldInCountries": [],
      "affiliatesHighlightedLabel": {},
      "possiblySensitive": false,
      "pinnedTweetIds": [
        "2016024049869324599"
      ],
      "profile_bio": {
        "description": "Built by Moonshot AI to empower everyone to be superhuman. ⚑️API: https://t.co/ggYlFf809H\n@KimiProduct where we share cool use cases and prompts.",
        "entities": {
          "description": {
            "hashtags": [],
            "symbols": [],
            "urls": [
              {
                "display_url": "platform.moonshot.ai",
                "expanded_url": "https://platform.moonshot.ai/",
                "indices": [
                  66,
                  89
                ],
                "url": "https://t.co/ggYlFf809H"
              }
            ],
            "user_mentions": [
              {
                "id_str": "0",
                "indices": [
                  90,
                  102
                ],
                "name": "",
                "screen_name": "KimiProduct"
              }
            ]
          },
          "url": {
            "urls": [
              {
                "display_url": "kimi.com",
                "expanded_url": "https://www.kimi.com/",
                "indices": [
                  0,
                  23
                ],
                "url": "https://t.co/mlnKFmsdLe"
              }
            ]
          }
        }
      },
      "isAutomated": false,
      "automatedBy": null
    },
    "extendedEntities": {
      "media": [
        {
          "allow_download_status": {
            "allow_download": true
          },
          "display_url": "pic.twitter.com/gcWyzhZVc0",
          "expanded_url": "https://twitter.com/Kimi_Moonshot/status/2033378587878072424/photo/1",
          "ext_media_availability": {
            "status": "Available"
          },
          "features": {
            "large": {
              "faces": []
            },
            "orig": {
              "faces": []
            }
          },
          "id_str": "2033378144850530304",
          "indices": [
            262,
            285
          ],
          "media_key": "3_2033378144850530304",
          "media_results": {
            "id": "QXBpTWVkaWFSZXN1bHRzOgwAAQoAARw4AqZB29AACgACHDgDDWhboGgAAA==",
            "result": {
              "__typename": "ApiMedia",
              "id": "QXBpTWVkaWE6DAABCgABHDgCpkHb0AAKAAIcOAMNaFugaAAA",
              "media_key": "3_2033378144850530304"
            }
          },
          "media_url_https": "https://pbs.twimg.com/media/HDgCpkHb0AA0a7_.jpg",
          "original_info": {
            "focus_rects": [
              {
                "h": 553,
                "w": 987,
                "x": 0,
                "y": 0
              },
              {
                "h": 987,
                "w": 987,
                "x": 0,
                "y": 0
              },
              {
                "h": 1125,
                "w": 987,
                "x": 0,
                "y": 0
              },
              {
                "h": 1280,
                "w": 640,
                "x": 159,
                "y": 0
              },
              {
                "h": 1280,
                "w": 987,
                "x": 0,
                "y": 0
              }
            ],
            "height": 1280,
            "width": 987
          },
          "sizes": {
            "large": {
              "h": 1280,
              "w": 987
            }
          },
          "type": "photo",
          "url": "https://t.co/gcWyzhZVc0"
        }
      ]
    },
    "card": null,
    "place": {},
    "entities": {
      "hashtags": [],
      "symbols": [],
      "urls": [
        {
          "display_url": "github.com/MoonshotAI/Att…",
          "expanded_url": "https://github.com/MoonshotAI/Attention-Residuals/blob/master/Attention_Residuals.pdf",
          "indices": [
            847,
            870
          ],
          "url": "https://t.co/u3EHICG05h"
        }
      ],
      "user_mentions": []
    },
    "quoted_tweet": null,
    "retweeted_tweet": null,
    "isLimitedReply": false,
    "article": null
  },
  "retweeted_tweet": null,
  "isLimitedReply": false,
  "article": null
}