🐦 Twitter Post Details


@dair_ai

// Think Harder or Know More //

Chain-of-thought prompting enables reasoning in LLMs but requires explicit verbalization of intermediate steps.

Looped transformers offer an alternative by iteratively refining representations within hidden states, but they sacrifice storage capacity in the process.

This paper investigates combining both: adaptive per-layer looping with gated memory banks. Each transformer block learns when to iterate its hidden state and when to access stored knowledge.

The key finding: looping primarily benefits mathematical reasoning, while memory banks recover performance on commonsense tasks. Combining both yields a model that outperforms an iso-FLOP baseline with three times the number of layers on math benchmarks.

Analysis of model internals reveals layer specialization: early layers learn to loop minimally and access memory sparingly, while later layers do both more heavily. The model learns to choose between thinking harder and knowing more, and where to do each.

Paper: https://t.co/0Gl77zMwOY

Learn to build effective AI agents in our academy: https://t.co/LRnpZN7L4c
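The mechanism described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the parameter names (`W_block`, `w_halt`, `w_gate`), the sigmoid halting threshold, and the softmax memory read are all assumptions standing in for whatever the paper actually learns. It shows the two gates the tweet describes: a halting gate deciding how many refinement loops to run ("think harder") and a memory gate deciding how much retrieved knowledge to blend in ("know more").

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

D, M = 8, 4        # hidden size, number of memory slots (arbitrary toy values)
MAX_LOOPS = 3      # upper bound on per-layer iterations

# Hypothetical learned parameters, random here for illustration only.
W_block = rng.normal(scale=0.1, size=(D, D))  # stand-in for the block's attention/MLP
w_halt  = rng.normal(scale=0.1, size=D)       # halting gate: when to stop looping
w_gate  = rng.normal(scale=0.1, size=D)       # memory gate: how much stored knowledge to use
memory  = rng.normal(size=(M, D))             # gated memory bank (stored knowledge)

def read_memory(h):
    """Attention-style read: softmax over slot similarities, weighted sum of slots."""
    scores = memory @ h
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ memory

def adaptive_block(h):
    """Iteratively refine h; a halting gate decides when to stop looping,
    a memory gate decides how much retrieved knowledge to mix back in."""
    for step in range(MAX_LOOPS):
        h = h + np.tanh(W_block @ h)           # one refinement loop ("think harder")
        g = sigmoid(w_gate @ h)                # scalar memory gate ("know more")
        h = (1 - g) * h + g * read_memory(h)   # gated blend of state and memory
        if sigmoid(w_halt @ h) > 0.5:          # halting gate: confident enough to stop
            break
    return h, step + 1

h_out, n_loops = adaptive_block(rng.normal(size=D))
print(n_loops, h_out.shape)
```

Layer specialization, as the tweet describes it, would then show up as early layers learning halting gates that fire immediately and memory gates near zero, while later layers loop more and read memory more heavily.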

Media 1
Media 2

📊 Media Metadata

{
  "media": [
    {
      "type": "photo",
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2032107624007876781/media_0.png",
      "filename": "media_0.png"
    },
    {
      "type": "photo",
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2032107624007876781/media_1.png",
      "filename": "media_1.png"
    }
  ],
  "processed_at": "2026-03-12T15:02:21.888016",
  "pipeline_version": "2.0"
}

🔧 Raw API Response

{
  "type": "tweet",
  "id": "2032107624007876781",
  "url": "https://x.com/dair_ai/status/2032107624007876781",
  "twitterUrl": "https://twitter.com/dair_ai/status/2032107624007876781",
  "text": "// Think Harder or Know More //\n\nChain-of-thought prompting enables reasoning in LLMs but requires explicit verbalization of intermediate steps.\n\nLooped transformers offer an alternative by iteratively refining representations within hidden states, but they sacrifice storage capacity in the process.\n\nThis paper investigates combining both: adaptive per-layer looping with gated memory banks.\n\nEach transformer block learns when to iterate its hidden state and when to access stored knowledge.\n\nThe key finding:\n\nLooping primarily benefits mathematical reasoning, while memory banks recover performance on commonsense tasks. Combining both yields a model that outperforms an iso-FLOP baseline with three times the number of layers on math benchmarks.\n\nAnalysis of model internals reveals layer specialization.\n\nEarly layers learn to loop minimally and access memory sparingly, while later layers do both more heavily. The model learns to choose between thinking harder and knowing more, and where to do each.\n\nPaper: https://t.co/0Gl77zMwOY\n\nLearn to build effective AI agents in our academy: https://t.co/LRnpZN7L4c",
  "source": "Twitter for iPhone",
  "retweetCount": 1,
  "replyCount": 1,
  "likeCount": 7,
  "quoteCount": 0,
  "viewCount": 217,
  "createdAt": "Thu Mar 12 14:53:06 +0000 2026",
  "lang": "en",
  "bookmarkCount": 1,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "2032107624007876781",
  "displayTextRange": [
    0,
    275
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "dair_ai",
    "url": "https://x.com/dair_ai",
    "twitterUrl": "https://twitter.com/dair_ai",
    "id": "889050642903293953",
    "name": "DAIR.AI",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/1643277398522187778/31dedbLo_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/889050642903293953/1773242460",
    "description": "",
    "location": "",
    "followers": 91666,
    "following": 1,
    "status": "",
    "canDm": true,
    "canMediaTag": true,
    "createdAt": "Sun Jul 23 09:12:45 +0000 2017",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {}
    },
    "fastFollowersCount": 0,
    "favouritesCount": 4250,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 173,
    "statusesCount": 3009,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [
      "2031751809908297818"
    ],
    "profile_bio": {
      "description": "Democratizing AI research, education, and technologies. New AI learning portal: https://t.co/LRnpZN7L4c",
      "entities": {
        "description": {
          "hashtags": [],
          "symbols": [],
          "urls": [
            {
              "display_url": "academy.dair.ai",
              "expanded_url": "https://academy.dair.ai/",
              "indices": [
                80,
                103
              ],
              "url": "https://t.co/LRnpZN7L4c"
            }
          ],
          "user_mentions": []
        },
        "url": {
          "urls": [
            {
              "display_url": "dair.ai",
              "expanded_url": "https://www.dair.ai/",
              "indices": [
                0,
                23
              ],
              "url": "https://t.co/lkqPZtMU5s"
            }
          ]
        }
      }
    },
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "display_url": "pic.twitter.com/W0XpAVDGH6",
        "expanded_url": "https://twitter.com/dair_ai/status/2032107624007876781/photo/1",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {
            "faces": []
          },
          "orig": {
            "faces": []
          }
        },
        "id_str": "2032107620530831360",
        "indices": [
          276,
          299
        ],
        "media_key": "3_2032107620530831360",
        "media_results": {
          "id": "QXBpTWVkaWFSZXN1bHRzOgwAAQoAARwzfx1DGsAACgACHDN/HhJaQK0AAA==",
          "result": {
            "__typename": "ApiMedia",
            "id": "QXBpTWVkaWE6DAABCgABHDN/HUMawAAKAAIcM38eElpArQAA",
            "media_key": "3_2032107620530831360"
          }
        },
        "media_url_https": "https://pbs.twimg.com/media/HDN_HUMawAAKMrZ.png",
        "original_info": {
          "focus_rects": [
            {
              "h": 818,
              "w": 1460,
              "x": 0,
              "y": 0
            },
            {
              "h": 1460,
              "w": 1460,
              "x": 0,
              "y": 0
            },
            {
              "h": 1654,
              "w": 1451,
              "x": 0,
              "y": 0
            },
            {
              "h": 1654,
              "w": 827,
              "x": 289,
              "y": 0
            },
            {
              "h": 1654,
              "w": 1460,
              "x": 0,
              "y": 0
            }
          ],
          "height": 1654,
          "width": 1460
        },
        "sizes": {
          "large": {
            "h": 1654,
            "w": 1460
          }
        },
        "type": "photo",
        "url": "https://t.co/W0XpAVDGH6"
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {
    "hashtags": [],
    "symbols": [],
    "urls": [
      {
        "display_url": "arxiv.org/abs/2603.08391",
        "expanded_url": "https://arxiv.org/abs/2603.08391",
        "indices": [
          1018,
          1041
        ],
        "url": "https://t.co/0Gl77zMwOY"
      },
      {
        "display_url": "academy.dair.ai",
        "expanded_url": "https://academy.dair.ai/",
        "indices": [
          1094,
          1117
        ],
        "url": "https://t.co/LRnpZN7L4c"
      }
    ],
    "user_mentions": []
  },
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "isLimitedReply": false,
  "article": null
}