🐦 Twitter Post Details

@rohanpaul_ai

Yann LeCun's new interview - explains why LLMs are so limited in terms of real-world intelligence.

Says the biggest LLM is trained on about 30 trillion words, which is roughly 10 to the power 14 bytes of text. That sounds huge, but a 4 year old who has been awake about 16,000 hours has also taken in about 10 to the power 14 bytes through the eyes alone. So a small child has already seen as much raw data as the largest LLM has read.

But the child’s data is visual, continuous, noisy, and tied to actions: gravity, objects falling, hands grabbing, people moving, cause and effect. From this, the child builds an internal “world model” and intuitive physics, and can learn new tasks like loading a dishwasher from a handful of demonstrations.

LLMs only see disconnected text and are trained just to predict the next token. So they get very good at symbol patterns, exams, and code, but they lack grounded physical understanding, real common sense, and efficient learning from a few messy real-world experiences.

---

From 'Pioneer Works' YT channel (link in comment)
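The two 10^14 figures in the post can be sanity-checked with quick arithmetic. This is a minimal sketch, not from the interview itself: the bytes-per-word factor (~3.3) and the combined optic-nerve data rate (~2 MB/s) are assumptions chosen to illustrate how the numbers line up.

```python
# Back-of-envelope check of the data-volume comparison in the post.
# Assumed (not stated in the post): ~3.3 bytes per word of text, and
# ~2 MB/s of raw visual input across both optic nerves.

WORDS = 30e12            # 30 trillion words of LLM training text
BYTES_PER_WORD = 3.3     # assumed average, incl. spaces/punctuation
llm_bytes = WORDS * BYTES_PER_WORD           # ~1e14 bytes of text

AWAKE_HOURS = 16_000     # ~4 years of waking life, per the post
OPTIC_RATE = 2e6         # assumed bytes/s through the eyes
child_bytes = AWAKE_HOURS * 3600 * OPTIC_RATE  # ~1.15e14 bytes seen

print(f"LLM text:    {llm_bytes:.2e} bytes")
print(f"Child sight: {child_bytes:.2e} bytes")
```

Under these assumptions both totals land within a factor of ~1.2 of each other, which is the point of the comparison: the child matches the corpus in raw volume, in a fraction of the time.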

📊 Media Metadata

{
  "media": [
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2001322361514336541/media_0.mp4?",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2001322361514336541/media_0.mp4?",
      "type": "video",
      "filename": "media_0.mp4"
    }
  ],
  "processed_at": "2025-12-18T00:41:37.818094",
  "pipeline_version": "2.0"
}

🔧 Raw API Response

{
  "type": "tweet",
  "id": "2001322361514336541",
  "url": "https://x.com/rohanpaul_ai/status/2001322361514336541",
  "twitterUrl": "https://twitter.com/rohanpaul_ai/status/2001322361514336541",
  "text": "Yann LeCun's new interview - explains why LLMs are so limited in terms of real-world intelligence.\n\nSays the biggest LLM is trained on about 30 trillion words, which is roughly 10 to the power 14 bytes of text. \nThat sounds huge, but a 4 year old who has been awake about 16,000 hours has also taken in about 10 to the power 14 bytes through the eyes alone. So a small child has already seen as much raw data as the largest LLM has read.\n\nBut the child’s data is visual, continuous, noisy, and tied to actions: gravity, objects falling, hands grabbing, people moving, cause and effect. From this, the child builds an internal “world model” and intuitive physics, and can learn new tasks like loading a dishwasher from a handful of demonstrations.\n\nLLMs only see disconnected text and are trained just to predict the next token. So they get very good at symbol patterns, exams, and code, but they lack grounded physical understanding, real common sense, and efficient learning from a few messy real-world experiences.\n\n---\n\nFrom 'Pioneer Works' YT channel (link in comment)",
  "source": "Twitter for iPhone",
  "retweetCount": 33,
  "replyCount": 31,
  "likeCount": 184,
  "quoteCount": 8,
  "viewCount": 22553,
  "createdAt": "Wed Dec 17 16:03:28 +0000 2025",
  "lang": "en",
  "bookmarkCount": 104,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "2001322361514336541",
  "displayTextRange": [
    0,
    279
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "rohanpaul_ai",
    "url": "https://x.com/rohanpaul_ai",
    "twitterUrl": "https://twitter.com/rohanpaul_ai",
    "id": "2588345408",
    "name": "Rohan Paul",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/1816185267037859840/Fd18CH0v_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/2588345408/1729559315",
    "description": "",
    "location": "Ex Inv Banking (Deutsche)",
    "followers": 115910,
    "following": 8355,
    "status": "",
    "canDm": true,
    "canMediaTag": false,
    "createdAt": "Wed Jun 25 22:38:54 +0000 2014",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {}
    },
    "fastFollowersCount": 0,
    "favouritesCount": 54871,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 25324,
    "statusesCount": 62461,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [
      "1965551636082032917"
    ],
    "profile_bio": {
      "description": "Compiling in real-time, the race towards AGI.\n\nThe Largest Show on X for AI.\n\n🗞️ Get my daily AI analysis newsletter to your email  👉 https://t.co/6LBxO8215l",
      "entities": {
        "description": {
          "urls": [
            {
              "display_url": "rohan-paul.com",
              "expanded_url": "https://www.rohan-paul.com",
              "indices": [
                134,
                157
              ],
              "url": "https://t.co/6LBxO8215l"
            }
          ]
        },
        "url": {
          "urls": [
            {
              "display_url": "rohan-paul.com",
              "expanded_url": "http://www.rohan-paul.com",
              "indices": [
                0,
                23
              ],
              "url": "https://t.co/2NKnK0wIil"
            }
          ]
        }
      }
    },
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "additional_media_info": {
          "monetizable": false
        },
        "allow_download_status": {
          "allow_download": true
        },
        "display_url": "pic.twitter.com/ip5gZVmx4E",
        "expanded_url": "https://twitter.com/rohanpaul_ai/status/2001322361514336541/video/1",
        "ext_media_availability": {
          "status": "Available"
        },
        "id_str": "2001321049603100672",
        "indices": [
          280,
          303
        ],
        "media_key": "13_2001321049603100672",
        "media_results": {
          "id": "QXBpTWVkaWFSZXN1bHRzOgwABAoAARvGHuQjGtAAAAA=",
          "result": {
            "__typename": "ApiMedia",
            "id": "QXBpTWVkaWE6DAAECgABG8Ye5CMa0AAAAA==",
            "media_key": "13_2001321049603100672"
          }
        },
        "media_url_https": "https://pbs.twimg.com/amplify_video_thumb/2001321049603100672/img/N4zu3ZFcQZVJQXD5.jpg",
        "original_info": {
          "focus_rects": [],
          "height": 1080,
          "width": 1920
        },
        "sizes": {
          "large": {
            "h": 1080,
            "w": 1920
          }
        },
        "type": "video",
        "url": "https://t.co/ip5gZVmx4E",
        "video_info": {
          "aspect_ratio": [
            16,
            9
          ],
          "duration_millis": 53994,
          "variants": [
            {
              "content_type": "application/x-mpegURL",
              "url": "https://video.twimg.com/amplify_video/2001321049603100672/pl/0GuOyKZevFs_FW6V.m3u8?tag=21&v=581"
            },
            {
              "bitrate": 256000,
              "content_type": "video/mp4",
              "url": "https://video.twimg.com/amplify_video/2001321049603100672/vid/avc1/480x270/GqJgDDoRQ33Gcwlj.mp4?tag=21"
            },
            {
              "bitrate": 832000,
              "content_type": "video/mp4",
              "url": "https://video.twimg.com/amplify_video/2001321049603100672/vid/avc1/640x360/FTHjvEaEWhnRnVPH.mp4?tag=21"
            },
            {
              "bitrate": 2176000,
              "content_type": "video/mp4",
              "url": "https://video.twimg.com/amplify_video/2001321049603100672/vid/avc1/1280x720/KR6_NIGIJY_SK6aR.mp4?tag=21"
            },
            {
              "bitrate": 10368000,
              "content_type": "video/mp4",
              "url": "https://video.twimg.com/amplify_video/2001321049603100672/vid/avc1/1920x1080/ePMECpnS5Yr6UHvJ.mp4?tag=21"
            }
          ]
        }
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {},
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "isLimitedReply": false,
  "article": null
}
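The `video_info.variants` array in the raw response above mixes one HLS playlist (`application/x-mpegURL`, no `bitrate` key) with several progressive MP4s at different bitrates. A minimal sketch of picking the highest-quality MP4 from such a payload — `best_mp4` is a hypothetical helper, not part of any API:

```python
# Pick the highest-bitrate progressive MP4 from a Twitter/X "video_info"
# block. HLS (.m3u8) variants carry no "bitrate" key and are filtered out.
# URLs below are copied from the raw API response above.
video_info = {
    "variants": [
        {"content_type": "application/x-mpegURL",
         "url": "https://video.twimg.com/amplify_video/2001321049603100672/pl/0GuOyKZevFs_FW6V.m3u8?tag=21&v=581"},
        {"bitrate": 256000, "content_type": "video/mp4",
         "url": "https://video.twimg.com/amplify_video/2001321049603100672/vid/avc1/480x270/GqJgDDoRQ33Gcwlj.mp4?tag=21"},
        {"bitrate": 10368000, "content_type": "video/mp4",
         "url": "https://video.twimg.com/amplify_video/2001321049603100672/vid/avc1/1920x1080/ePMECpnS5Yr6UHvJ.mp4?tag=21"},
    ]
}

def best_mp4(video_info: dict) -> str:
    """Return the URL of the MP4 variant with the highest bitrate."""
    mp4s = [v for v in video_info["variants"]
            if v["content_type"] == "video/mp4"]
    return max(mp4s, key=lambda v: v.get("bitrate", 0))["url"]

print(best_mp4(video_info))  # the 1920x1080 variant in this payload
```

This is how the pipeline's `media` entry above (the single `media_0.mp4`) would typically be derived from the multi-variant raw response.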