🐦 Twitter Post Details


@dair_ai

Interesting research from Google.

Research has shown that neural networks don't just memorize facts; they build internal maps of how those facts relate to each other.

The standard view of how transformers store knowledge is associative: co-occurring entities get stored in a weight matrix, like a lookup table, and the embeddings themselves are arbitrary.

But this view can't explain something these researchers found: transformers learn implicit multi-hop reasoning when graph edges are stored in weights, even on adversarially designed tasks where associative memory should fail. On path-star graphs with 50,000 nodes and 10-hop paths, models achieve 100% accuracy on unseen paths.

This geometric view of memory challenges foundational assumptions about knowledge acquisition, capacity, editing, and unlearning. If models encode global relationships implicitly, that could enable combinatorial creativity but also impose limits on precise knowledge manipulation.

Paper: https://t.co/Rk68BdRRcG

Learn to build effective AI agents in our academy: https://t.co/zQXQt0Pem8
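To make the "associative" view concrete, here is a minimal sketch (my illustration, not code from the paper): entities get arbitrary random embeddings, and each graph edge is stored in a weight matrix as an outer product, so the matrix behaves like a soft lookup table. The embedding dimension, node count, and the toy 3-hop chain below are all assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_entities = 256, 50          # embedding dim, number of graph nodes
E = rng.standard_normal((n_entities, d)) / np.sqrt(d)  # arbitrary embeddings

edges = [(0, 1), (1, 2), (2, 3)]  # a toy 3-hop chain of graph edges
W = sum(np.outer(E[v], E[u]) for u, v in edges)  # store edge u -> v in weights

def next_node(u):
    """Retrieve the stored successor of node u with one matrix lookup."""
    return int(np.argmax(E @ (W @ E[u])))

# Single-hop retrieval works: each lookup recovers one stored edge...
path = [0]
for _ in range(3):
    path.append(next_node(path[-1]))
print(path)  # -> [0, 1, 2, 3]
```

Note that nothing in `W` encodes the global path 0 → 3 directly; every hop needs its own lookup. That is why implicit multi-hop generalization is surprising under this purely associative picture, and why the paper's geometric account is a challenge to it.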

Media 1
Media 2

📊 Media Metadata

{
  "media": [
    {
      "type": "photo",
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2005480659209400789/media_0.jpg?",
      "filename": "media_0.jpg"
    },
    {
      "type": "photo",
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2005480659209400789/media_1.png?",
      "filename": "media_1.png"
    }
  ],
  "processed_at": "2025-12-31T02:48:09.473636",
  "pipeline_version": "2.0"
}

🔧 Raw API Response

{
  "type": "tweet",
  "id": "2005480659209400789",
  "url": "https://x.com/dair_ai/status/2005480659209400789",
  "twitterUrl": "https://twitter.com/dair_ai/status/2005480659209400789",
  "text": "Interesting research from Google.\n\nResearch has shown that neural networks don't just memorize facts. They build internal maps of how those facts relate to each other.\n\nThe view of how transformers store knowledge is associative: co-occurring entities get stored in a weight matrix, like a lookup table. The embeddings themselves are arbitrary.\n\nBut this view can't explain something these researchers found.\n\nThis new research demonstrates that transformers learn implicit multi-hop reasoning when graph edges are stored in weights, even on adversarially-designed tasks where associative memory should fail. On path-star graphs with 50,000 nodes and 10-hop paths, models achieve 100% accuracy on unseen paths.\n\nThis geometric view of memory challenges foundational assumptions in knowledge acquisition, capacity, editing, and unlearning. If models encode global relationships implicitly, it could enable combinational creativity but also impose limits on precise knowledge manipulation.\n\nPaper: https://t.co/Rk68BdRRcG\n\nLearn to build effective AI agents in our academy: https://t.co/zQXQt0Pem8",
  "source": "Twitter for iPhone",
  "retweetCount": 207,
  "replyCount": 38,
  "likeCount": 1115,
  "quoteCount": 23,
  "viewCount": 76644,
  "createdAt": "Mon Dec 29 03:27:03 +0000 2025",
  "lang": "en",
  "bookmarkCount": 1031,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "2005480659209400789",
  "displayTextRange": [
    0,
    299
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "dair_ai",
    "url": "https://x.com/dair_ai",
    "twitterUrl": "https://twitter.com/dair_ai",
    "id": "889050642903293953",
    "name": "DAIR.AI",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/1643277398522187778/31dedbLo_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/889050642903293953/1742055232",
    "description": "",
    "location": "",
    "followers": 84575,
    "following": 1,
    "status": "",
    "canDm": true,
    "canMediaTag": true,
    "createdAt": "Sun Jul 23 09:12:45 +0000 2017",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {}
    },
    "fastFollowersCount": 0,
    "favouritesCount": 3959,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 104,
    "statusesCount": 2740,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [
      "2006002628250243125"
    ],
    "profile_bio": {
      "description": "Democratizing AI research, education, and technologies.",
      "entities": {
        "description": {},
        "url": {
          "urls": [
            {
              "display_url": "dair.ai",
              "expanded_url": "https://www.dair.ai/",
              "indices": [
                0,
                23
              ],
              "url": "https://t.co/lkqPZtMmfU"
            }
          ]
        }
      }
    },
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "display_url": "pic.twitter.com/W8qVbsZvRY",
        "expanded_url": "https://twitter.com/dair_ai/status/2005480659209400789/photo/1",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {},
          "orig": {}
        },
        "id_str": "2005480654599847936",
        "indices": [
          300,
          323
        ],
        "media_key": "3_2005480654599847936",
        "media_results": {
          "id": "QXBpTWVkaWFSZXN1bHRzOgwAAQoAARvU5gelV4AACgACG9TmCLgXsdUAAA==",
          "result": {
            "__typename": "ApiMedia",
            "id": "QXBpTWVkaWE6DAABCgABG9TmB6VXgAAKAAIb1OYIuBex1QAA",
            "media_key": "3_2005480654599847936"
          }
        },
        "media_url_https": "https://pbs.twimg.com/media/G9TmB6VXgAAvZNk.jpg",
        "original_info": {
          "focus_rects": [
            {
              "h": 874,
              "w": 1560,
              "x": 0,
              "y": 0
            },
            {
              "h": 1560,
              "w": 1560,
              "x": 0,
              "y": 0
            },
            {
              "h": 1736,
              "w": 1523,
              "x": 0,
              "y": 0
            },
            {
              "h": 1736,
              "w": 868,
              "x": 0,
              "y": 0
            },
            {
              "h": 1736,
              "w": 1560,
              "x": 0,
              "y": 0
            }
          ],
          "height": 1736,
          "width": 1560
        },
        "sizes": {
          "large": {
            "h": 1736,
            "w": 1560
          }
        },
        "type": "photo",
        "url": "https://t.co/W8qVbsZvRY"
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {
    "urls": [
      {
        "display_url": "arxiv.org/abs/2510.26745",
        "expanded_url": "https://arxiv.org/abs/2510.26745",
        "indices": [
          996,
          1019
        ],
        "url": "https://t.co/Rk68BdRRcG"
      },
      {
        "display_url": "dair-ai.thinkific.com",
        "expanded_url": "https://dair-ai.thinkific.com/",
        "indices": [
          1072,
          1095
        ],
        "url": "https://t.co/zQXQt0Pem8"
      }
    ]
  },
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "isLimitedReply": false,
  "article": null
}