🐦 Twitter Post Details

@gerardsans

Bold Lacanian read on AI hallucination, but the analogy leans on heavy anthropomorphic baggage.

All LLM outputs start the same: every token is just next token prediction. A continuation becomes a hallucination only when a human adds real world context the model never had. There is no psyche trying to fill a lack.

Personality in LLMs is RLHF rewarding fluency, not truth. Apparent traits are prompt shaped data artefacts as in Han et al 2025 arXiv:2509.03730. Self reported Big Five maps to behaviour in about 24 percent of cases. This is a stochastic funnel, not a barred subject.

The confidence in hallucinations is not Lacanian jouissance. It is the Eliza effect. We project coherence and intention, then blame the model for a mismatch created by our own projection.

Great paper, but it needs a reminder to flag every anthropomorphic move with the actual technical context. Call out when you are interpreting output after the fact, not describing how it was produced, and avoid projecting human traits that do not exist.

Follow for more insights or subscribe to receive updates in your inbox: https://t.co/DybOvoBDEw
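The "every token is just next token prediction" point can be made concrete with a toy sampler. This is a minimal sketch, not any real model's code: the vocabulary, logits, and scores below are invented for illustration. It shows that generation only ranks continuations by likelihood; "truth" never enters the loop, which is why a fluent continuation can be a hallucination.

```python
import math
import random

def softmax(logits):
    """Convert raw scores into a probability distribution."""
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def next_token(vocab, logits, temperature=1.0, rng=random):
    """Sample one continuation token. The 'model' only weighs relative
    likelihood under its training distribution; it has no notion of
    whether the chosen token is factually correct."""
    scaled = [l / temperature for l in logits]
    probs = softmax(scaled)
    return rng.choices(vocab, weights=probs, k=1)[0]

# Hypothetical scores a model might assign to candidate continuations
vocab = ["Paris", "London", "Mars"]
logits = [4.0, 2.5, 0.5]

# Greedy decoding (temperature -> 0) just takes the top-scoring token
greedy = vocab[max(range(len(logits)), key=lambda i: logits[i])]
```

Under sampling, even the low-probability continuation is occasionally emitted; only a reader with outside knowledge can label that output a "hallucination" after the fact, which is exactly the tweet's point.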

Media 1

📊 Media Metadata

{
  "media": [
    {
      "type": "photo",
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/1998353731650257093/media_0.jpg?",
      "filename": "media_0.jpg"
    }
  ],
  "processed_at": "2025-12-09T11:35:44.562192",
  "pipeline_version": "2.0"
}

🔧 Raw API Response

{
  "type": "tweet",
  "id": "1998353731650257093",
  "url": "https://x.com/gerardsans/status/1998353731650257093",
  "twitterUrl": "https://twitter.com/gerardsans/status/1998353731650257093",
  "text": "Bold Lacanian read on AI hallucination, but the analogy leans on heavy anthropomorphic baggage.\n\nAll LLM outputs start the same: every token is just next token prediction. A continuation becomes a hallucination only when a human adds real world context the model never had. There is no psyche trying to fill a lack.\n\nPersonality in LLMs is RLHF rewarding fluency, not truth.\nApparent traits are prompt shaped data artefacts as in Han et al 2025 arXiv:2509.03730.\nSelf reported Big Five maps to behaviour in about 24 percent of cases. This is a stochastic funnel, not a barred subject.\n\nThe confidence in hallucinations is not Lacanian jouissance. It is the Eliza effect. We project coherence and intention, then blame the model for a mismatch created by our own projection.\n\nGreat paper, but it needs a reminder to flag every anthropomorphic move with the actual technical context. Call out when you are interpreting output after the fact, not describing how it was produced, and avoid projecting human traits that do not exist.\n\nFollow for more insights or subscribe to receive updates in your inbox: \n\nhttps://t.co/DybOvoBDEw",
  "source": "Twitter for iPhone",
  "retweetCount": 0,
  "replyCount": 0,
  "likeCount": 0,
  "quoteCount": 0,
  "viewCount": 8,
  "createdAt": "Tue Dec 09 11:27:11 +0000 2025",
  "lang": "en",
  "bookmarkCount": 0,
  "isReply": true,
  "inReplyToId": "1992231719047467021",
  "conversationId": "1992231719047467021",
  "displayTextRange": [
    16,
    296
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "gerardsans",
    "url": "https://x.com/gerardsans",
    "twitterUrl": "https://twitter.com/gerardsans",
    "id": "9284062",
    "name": "Gerard Sans | Axiom 🇬🇧",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/1938955632763105280/aBJaOCJJ_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/9284062/1751119206",
    "description": "Founder Axiom // Forging skills for the new era of AI. GDE in AI, Cloud & Angular. Building London's tech & art nexus @nextai_london. Speaker | MC | Trainer.",
    "location": "London ☔",
    "followers": 35909,
    "following": 6921,
    "status": "",
    "canDm": true,
    "canMediaTag": false,
    "createdAt": "Sat Oct 06 20:04:48 +0000 2007",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {
        "urls": [
          {
            "display_url": "aws.amazon.com/amplify",
            "expanded_url": "http://aws.amazon.com/amplify",
            "url": "https://t.co/ufepRUvlgW",
            "indices": [
              0,
              23
            ]
          }
        ]
      }
    },
    "fastFollowersCount": 0,
    "favouritesCount": 26538,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 4989,
    "statusesCount": 36303,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [
      "1741153588959654329"
    ],
    "profile_bio": {},
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "display_url": "pic.x.com/IKqzCFH7sG",
        "expanded_url": "https://x.com/gerardsans/status/1998353731650257093/photo/1",
        "id_str": "1998353723442032640",
        "indices": [
          297,
          320
        ],
        "media_key": "3_1998353723442032640",
        "media_url_https": "https://pbs.twimg.com/media/G7uUH61XcAAaOuT.jpg",
        "type": "photo",
        "url": "https://t.co/IKqzCFH7sG",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {
            "faces": [
              {
                "x": 1862,
                "y": 384,
                "h": 80,
                "w": 80
              },
              {
                "x": 278,
                "y": 382,
                "h": 106,
                "w": 106
              }
            ]
          },
          "medium": {
            "faces": [
              {
                "x": 1091,
                "y": 225,
                "h": 46,
                "w": 46
              },
              {
                "x": 162,
                "y": 223,
                "h": 62,
                "w": 62
              }
            ]
          },
          "small": {
            "faces": [
              {
                "x": 618,
                "y": 127,
                "h": 26,
                "w": 26
              },
              {
                "x": 92,
                "y": 126,
                "h": 35,
                "w": 35
              }
            ]
          },
          "orig": {
            "faces": [
              {
                "x": 1862,
                "y": 384,
                "h": 80,
                "w": 80
              },
              {
                "x": 278,
                "y": 382,
                "h": 106,
                "w": 106
              }
            ]
          }
        },
        "sizes": {
          "large": {
            "h": 1144,
            "w": 2048,
            "resize": "fit"
          },
          "medium": {
            "h": 670,
            "w": 1200,
            "resize": "fit"
          },
          "small": {
            "h": 380,
            "w": 680,
            "resize": "fit"
          },
          "thumb": {
            "h": 150,
            "w": 150,
            "resize": "crop"
          }
        },
        "original_info": {
          "height": 1144,
          "width": 2048,
          "focus_rects": [
            {
              "x": 0,
              "y": 0,
              "w": 2043,
              "h": 1144
            },
            {
              "x": 0,
              "y": 0,
              "w": 1144,
              "h": 1144
            },
            {
              "x": 0,
              "y": 0,
              "w": 1004,
              "h": 1144
            },
            {
              "x": 174,
              "y": 0,
              "w": 572,
              "h": 1144
            },
            {
              "x": 0,
              "y": 0,
              "w": 2048,
              "h": 1144
            }
          ]
        },
        "allow_download_status": {
          "allow_download": true
        },
        "media_results": {
          "result": {
            "media_key": "3_1998353723442032640"
          }
        }
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {
    "hashtags": [],
    "symbols": [],
    "timestamps": [],
    "urls": [
      {
        "display_url": "ai-cosmos.hashnode.dev/the-attention-…",
        "expanded_url": "https://ai-cosmos.hashnode.dev/the-attention-bottleneck-ai-failure-modes-explained",
        "url": "https://t.co/DybOvoBDEw",
        "indices": [
          1104,
          1127
        ]
      }
    ],
    "user_mentions": []
  },
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "article": null
}