🐦 Twitter Post Details


@karpathy

Transforming human knowledge, sensors and actuators from human-first and human-legible to LLM-first and LLM-legible is a beautiful space with so much potential and so much can be done...

One example I'm obsessed with recently - for every textbook pdf/epub, there is a perfect "LLMification" of it intended not for human but for an LLM (though it is a non-trivial transformation that would need human in the loop involvement).

- All of the exposition is extracted into a markdown document, including all latex, styling (bold/italic), tables, lists, etc. All of the figures are extracted as images.
- All worked problems get extracted into SFT examples. Any references made to previous figures/tables/etc. are parsed and included.
- All practice problems are extracted into environment examples for RL. The correct answers are located in the answer key and attached. Any additional information is added as "answer key" for a potential LLM judge.
- Synthetic data expansion. For every specific problem, you can create an infinite problem generator, which emits problems of that type. For example, if a problem is "What is the angle between the hour and minute hands at 9am?", you can imagine generalizing that to any arbitrary time and calculating answers using Python code, and possibly generating synthetic variations of the prompt text.
- All of the data above could be nicely indexed and embedded into a RAG database for later reference, or maybe MCP servers that make it available.

Then just as a (human) student could take a high school physics course, an LLM could take it in the exact same way. This would be a significantly richer source of legible, workable information for an LLM than just something like pdf-to-text (current prevailing practice), which simply asks the LLM to predict the textbook content top to bottom token by token (umm - lame).
As just a quick and crappy example of synthetic variations of the above example, GPT-5 gave me this problem generator (see image), which can now generalize that problem template to many variations:

- When the time is 11:07 a.m., what is the degree measure of the angle between the hands? (Answer: 68)
- Determine the angle in degrees between the clock's hands at 4:14 a.m. (Answer: 43)
- What angle do the clock hands form when the time reads 11:47 a.m.? (Answer: 71)
- At 7:02 a.m., what angle separates the hour hand and the minute hand? (Answer: 161)
- At 4:14 a.m., calculate the angle made between the two hands. (Answer: 43)
- What angle is formed by the hands of a clock at 4:45 p.m.? (Answer: 127)
- What is the angle between the hour and minute hands at 8:37 p.m.? (Answer: 36)

(infinite practice problems can be created...)
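The generator GPT-5 produced appears only in the attached image, so the code below is a minimal sketch of what such a clock-angle problem generator might look like, not a reproduction of it; the function names and phrasing template are assumptions. The angle formula itself is standard: the hour hand moves 30° per hour plus 0.5° per minute, the minute hand 6° per minute, and the answer is the smaller of the two angles between them.

```python
import random

def clock_angle(hour: int, minute: int) -> float:
    """Smaller angle, in degrees, between the hour and minute hands."""
    hour_hand = (hour % 12) * 30 + minute * 0.5  # 30 deg/hour plus 0.5 deg/min drift
    minute_hand = minute * 6                     # 6 deg per minute
    diff = abs(hour_hand - minute_hand)
    return min(diff, 360 - diff)                 # report the smaller of the two angles

def generate_problem(rng: random.Random) -> tuple[str, float]:
    """Emit one (prompt, answer) pair from the clock-angle template."""
    hour = rng.randint(1, 12)
    minute = rng.randint(0, 59)
    meridiem = rng.choice(["a.m.", "p.m."])
    prompt = (f"What is the angle between the hour and minute hands "
              f"at {hour}:{minute:02d} {meridiem}?")
    return prompt, clock_angle(hour, minute)

# The original example from the tweet: at 9:00 the hands form a right angle.
print(clock_angle(9, 0))   # → 90.0
```

Note that odd-minute times yield half-degree answers (e.g. 11:07 gives 68.5°), which the answers listed above appear to report as whole degrees; a real generator would need to pick and document one rounding convention so the RL answer key stays consistent.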

Media 1

📊 Media Metadata

{
  "media": [
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/1961128638725923119/media_0.jpg?",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/1961128638725923119/media_0.jpg?",
      "type": "photo",
      "filename": "media_0.jpg"
    }
  ],
  "processed_at": "2025-08-29T00:14:35.408201",
  "pipeline_version": "2.0"
}

🔧 Raw API Response

{
  "type": "tweet",
  "id": "1961128638725923119",
  "url": "https://x.com/karpathy/status/1961128638725923119",
  "twitterUrl": "https://twitter.com/karpathy/status/1961128638725923119",
  "text": "Transforming human knowledge, sensors and actuators from human-first and human-legible to LLM-first and LLM-legible is a beautiful space with so much potential and so much can be done...\n\nOne example I'm obsessed with recently - for every textbook pdf/epub, there is a perfect \"LLMification\" of it intended not for human but for an LLM (though it is a non-trivial transformation that would need human in the loop involvement).\n\n- All of the exposition is extracted into a markdown document, including all latex, styling (bold/italic), tables, lists, etc. All of the figures are extracted as images.\n- All worked problems get extracted into SFT examples. Any referenced made to previous figures/tables/etc. are parsed and included.\n- All practice problems are extracted into environment examples for RL. The correct answers are located in the answer key and attached. Any additional information is added as \"answer key\" for a potential LLM judge.\n- Synthetic data expansion. For every specific problem, you can create an infinite problem generator, which emits problems of that type. For example, if a problem is \"What is the angle between the hour and minute hands at 9am?\" , you can imagine generalizing that to any arbitrary time and calculating answers using Python code, and possibly generating synthetic variations of the prompt text.\n- All of the data above could be nicely indexed and embedded into a RAG database for later reference, or maybe MCP servers that make it available.\n\nThen just as a (human) student could take a high school physics course, an LLM could take it in the exact same way.\n\nThis would be a significantly richer source of legible, workable information for an LLM than just something like pdf-to-text (current prevailing practice), which simply asks the LLM to predict the textbook content top to bottom token by token (umm - lame).\n\nAs just a quick and crappy example of synthetic variations of the above example, GPT-5 gave me this problem generator (see image), which can now generalize that problem template to many variations:\n\n- When the time is 11:07 a.m., what is the degree measure of the angle between the hands? (Answer: 68)\n- Determine the angle in degrees between the clock’s hands at 4:14 a.m.. (Answer: 43)\n- What angle do the clock hands form when the time reads 11:47 a.m.? (Answer: 71)\n- At 7:02 a.m., what angle separates the hour hand and the minute hand? (Answer: 161)\n- At 4:14 a.m., calculate the angle made between the two hands. (Answer: 43)\n- What angle is formed by the hands of a clock at 4:45 p.m.? (Answer: 127)\n- What is the angle between the hour and minute hands at 8:37 p.m.? (Answer: 36)\n(infinite practice problems can be created...)",
  "source": "Twitter for iPhone",
  "retweetCount": 283,
  "replyCount": 174,
  "likeCount": 2716,
  "quoteCount": 48,
  "viewCount": 224711,
  "createdAt": "Thu Aug 28 18:07:58 +0000 2025",
  "lang": "en",
  "bookmarkCount": 2304,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "1961128638725923119",
  "displayTextRange": [
    0,
    277
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "karpathy",
    "url": "https://x.com/karpathy",
    "twitterUrl": "https://twitter.com/karpathy",
    "id": "33836629",
    "name": "Andrej Karpathy",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/1296667294148382721/9Pr6XrPB_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/33836629/1407117611",
    "description": "",
    "location": "Stanford",
    "followers": 1380401,
    "following": 1008,
    "status": "",
    "canDm": true,
    "canMediaTag": true,
    "createdAt": "Tue Apr 21 06:49:15 +0000 2009",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {}
    },
    "fastFollowersCount": 0,
    "favouritesCount": 19293,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 809,
    "statusesCount": 9691,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [
      "1617979122625712128"
    ],
    "profile_bio": {
      "description": "Building @EurekaLabsAI. Previously Director of AI @ Tesla, founding team @ OpenAI, CS231n/PhD @ Stanford. I like to train large deep neural nets.",
      "entities": {
        "description": {
          "user_mentions": [
            {
              "id_str": "0",
              "indices": [
                9,
                22
              ],
              "name": "",
              "screen_name": "EurekaLabsAI"
            }
          ]
        },
        "url": {
          "urls": [
            {
              "display_url": "karpathy.ai",
              "expanded_url": "https://karpathy.ai",
              "indices": [
                0,
                23
              ],
              "url": "https://t.co/0EcFthjJXM"
            }
          ]
        }
      }
    },
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "allow_download_status": {
          "allow_download": true
        },
        "display_url": "pic.twitter.com/KmIdMI96ws",
        "expanded_url": "https://twitter.com/karpathy/status/1961128638725923119/photo/1",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {
            "faces": [
              {
                "h": 150,
                "w": 150,
                "x": 21,
                "y": 188
              }
            ]
          },
          "orig": {
            "faces": [
              {
                "h": 150,
                "w": 150,
                "x": 21,
                "y": 188
              }
            ]
          }
        },
        "id_str": "1961127346574168064",
        "indices": [
          278,
          301
        ],
        "media_key": "3_1961127346574168064",
        "media_results": {
          "id": "QXBpTWVkaWFSZXN1bHRzOgwAAQoAARs3Uu5j25AACgACGzdUGz4aoS8AAA==",
          "result": {
            "__typename": "ApiMedia",
            "id": "QXBpTWVkaWE6DAABCgABGzdS7mPbkAAKAAIbN1QbPhqhLwAA",
            "media_key": "3_1961127346574168064"
          }
        },
        "media_url_https": "https://pbs.twimg.com/media/GzdS7mPbkAABx6i.jpg",
        "original_info": {
          "focus_rects": [
            {
              "h": 1037,
              "w": 1852,
              "x": 0,
              "y": 0
            },
            {
              "h": 1602,
              "w": 1602,
              "x": 0,
              "y": 0
            },
            {
              "h": 1602,
              "w": 1405,
              "x": 0,
              "y": 0
            },
            {
              "h": 1602,
              "w": 801,
              "x": 201,
              "y": 0
            },
            {
              "h": 1602,
              "w": 1852,
              "x": 0,
              "y": 0
            }
          ],
          "height": 1602,
          "width": 1852
        },
        "sizes": {
          "large": {
            "h": 1602,
            "w": 1852
          }
        },
        "type": "photo",
        "url": "https://t.co/KmIdMI96ws"
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {},
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "isLimitedReply": false,
  "article": null
}