🐦 Twitter Post Details

Viewing enriched Twitter post

@AlphaSignalAI

A trillion-parameter model just made half its brain disappear. It got smarter.

Yuan3.0 Ultra is a new open-source multimodal MoE model from Yuan Lab. 1010B total parameters, only 68.8B active at inference.

It beat GPT-5.2, Gemini 3.1 Pro, and Claude Opus 4.6 on RAG benchmarks by wide margins.

67.4% on Docmatix vs GPT-4o's 56.8%.

Here's what it unlocks:
> Enterprise RAG with 68.2% avg accuracy across 10 retrieval tasks
> Complex table understanding at 62.3% on MMTab
> Text-to-SQL generation scoring 83.9% on Spider 1.0
> Multimodal doc analysis with a 64K context window

The key innovation: Layer-Adaptive Expert Pruning (LAEP).

During pretraining, expert token loads become wildly imbalanced. Some experts get 500x more tokens than others.

LAEP prunes the underused ones layer by layer, cutting 33% of parameters while boosting training efficiency by 49%.

They also refined "fast-thinking" RL. Correct answers with fewer reasoning steps get rewarded more. This cut output tokens by 14.38% while improving accuracy by 16.33%.

The bigger signal here: MoE models are learning to self-compress during training, not after. If pruning becomes part of pretraining, the cost curve for trillion-scale models shifts dramatically.
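The pruning rule the post describes — per layer, drop experts whose routed-token load falls far below their peers — can be sketched minimally. The threshold rule (keep experts above a fraction of the layer-mean load), the function name `laep_prune`, and the toy routing statistics below are all illustrative assumptions; the post does not specify LAEP's actual criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

def laep_prune(token_loads, keep_fraction_of_mean=0.1):
    """Layer-adaptive expert pruning sketch (assumed criterion):
    per layer, keep only experts whose routed-token count is at
    least keep_fraction_of_mean times the layer's mean load."""
    kept_per_layer = []
    for layer_loads in token_loads:
        threshold = keep_fraction_of_mean * layer_loads.mean()
        kept_per_layer.append(np.flatnonzero(layer_loads >= threshold))
    return kept_per_layer

# Toy routing stats: 4 layers x 16 experts. Log-normal draws mimic the
# heavy skew the post mentions (some experts get ~500x more tokens).
loads = rng.lognormal(mean=0.0, sigma=2.5, size=(4, 16))

kept_per_layer = laep_prune(loads)
for i, kept in enumerate(kept_per_layer):
    print(f"layer {i}: keep {len(kept)}/16 experts")
```

Because the decision is made independently per layer, different layers can end up with different expert counts — consistent with the "layer-adaptive" framing.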
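The "fast-thinking" RL shaping (correct answers with fewer reasoning steps earn more reward) also admits a minimal sketch. The base reward of 1.0, the bonus weight, the token cap, and the name `fast_thinking_reward` are illustrative assumptions, not values from the post.

```python
def fast_thinking_reward(correct: bool, num_tokens: int,
                         max_tokens: int = 4096,
                         length_bonus: float = 0.5) -> float:
    """Length-shaped reward sketch: a correct answer earns a base
    reward of 1.0 plus a bonus that grows as the reasoning trace
    shortens. Wrong answers get 0 regardless of length, so the
    policy is never paid merely for being brief."""
    if not correct:
        return 0.0
    brevity = 1.0 - min(num_tokens, max_tokens) / max_tokens
    return 1.0 + length_bonus * brevity

# A correct short answer outscores a correct long one:
print(fast_thinking_reward(True, 512))   # → 1.4375
print(fast_thinking_reward(True, 4096))  # → 1.0 (base reward only)
print(fast_thinking_reward(False, 10))   # → 0.0
```

Gating the length bonus on correctness is the key design choice: it pushes the model toward shorter traces only when they stay right, which matches the post's claim of fewer output tokens alongside higher accuracy.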

Media 1

📊 Media Metadata

{
  "media": [
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2029263515421405344/media_0.jpg?",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2029263515421405344/media_0.jpg?",
      "type": "photo",
      "filename": "media_0.jpg"
    }
  ],
  "processed_at": "2026-03-06T14:09:29.949984",
  "pipeline_version": "2.0"
}

🔧 Raw API Response

{
  "type": "tweet",
  "id": "2029263515421405344",
  "url": "https://x.com/AlphaSignalAI/status/2029263515421405344",
  "twitterUrl": "https://twitter.com/AlphaSignalAI/status/2029263515421405344",
  "text": "A trillion-parameter model just made half its brain disappear. It got smarter.\n\nYuan3.0 Ultra is a new open-source multimodal MoE model from Yuan Lab. 1010B total parameters, only 68.8B active at inference.\n\nIt beat GPT-5.2, Gemini 3.1 Pro, and Claude Opus 4.6 on RAG benchmarks by wide margins. \n\n67.4% on Docmatix vs GPT-4o's 56.8%.\n\nHere's what it unlocks:\n> Enterprise RAG with 68.2% avg accuracy across 10 retrieval tasks\n> Complex table understanding at 62.3% on MMTab\n> Text-to-SQL generation scoring 83.9% on Spider 1.0\n> Multimodal doc analysis with a 64K context window\n\nThe key innovation: Layer-Adaptive Expert Pruning (LAEP).\n\nDuring pretraining, expert token loads become wildly imbalanced. Some experts get 500x more tokens than others. \n\nLAEP prunes the underused ones layer by layer, cutting 33% of parameters while boosting training efficiency by 49%.\n\nThey also refined \"fast-thinking\" RL. Correct answers with fewer reasoning steps get rewarded more. This cut output tokens by 14.38% while improving accuracy by 16.33%.\n\nThe bigger signal here: MoE models are learning to self-compress during training, not after. If pruning becomes part of pretraining, the cost curve for trillion-scale models shifts dramatically.",
  "source": "Twitter for iPhone",
  "retweetCount": 12,
  "replyCount": 9,
  "likeCount": 111,
  "quoteCount": 0,
  "viewCount": 9726,
  "createdAt": "Wed Mar 04 18:31:38 +0000 2026",
  "lang": "en",
  "bookmarkCount": 61,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "2029263515421405344",
  "displayTextRange": [
    0,
    278
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "AlphaSignalAI",
    "url": "https://x.com/AlphaSignalAI",
    "twitterUrl": "https://twitter.com/AlphaSignalAI",
    "id": "114783808",
    "name": "AlphaSignal AI",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": "Business",
    "profilePicture": "https://pbs.twimg.com/profile_images/2014100845189529600/Ff1Xc28-_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/114783808/1769023589",
    "description": "The latest news from the top 100 companies in AI. Over 280,000 devs read our newsletter.",
    "location": "Signup →",
    "followers": 11621,
    "following": 277,
    "status": "",
    "canDm": true,
    "canMediaTag": true,
    "createdAt": "Tue Feb 16 16:04:01 +0000 2010",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {
        "urls": [
          {
            "display_url": "AlphaSignal.ai",
            "expanded_url": "http://AlphaSignal.ai",
            "indices": [
              0,
              23
            ],
            "url": "https://t.co/Fnj7bpl3qS"
          }
        ]
      }
    },
    "fastFollowersCount": 0,
    "favouritesCount": 2493,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 37,
    "statusesCount": 168,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [
      "1888966579125153939"
    ],
    "profile_bio": {},
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "display_url": "pic.x.com/nuzvOTknKU",
        "expanded_url": "https://x.com/AlphaSignalAI/status/2029263515421405344/photo/1",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {
            "faces": [
              {
                "h": 78,
                "w": 78,
                "x": 822,
                "y": 752
              },
              {
                "h": 112,
                "w": 112,
                "x": 340,
                "y": 538
              },
              {
                "h": 132,
                "w": 132,
                "x": 440,
                "y": 532
              }
            ]
          },
          "medium": {
            "faces": [
              {
                "h": 45,
                "w": 45,
                "x": 481,
                "y": 440
              },
              {
                "h": 65,
                "w": 65,
                "x": 199,
                "y": 315
              },
              {
                "h": 77,
                "w": 77,
                "x": 257,
                "y": 311
              }
            ]
          },
          "orig": {
            "faces": [
              {
                "h": 156,
                "w": 156,
                "x": 1644,
                "y": 1504
              },
              {
                "h": 224,
                "w": 224,
                "x": 680,
                "y": 1076
              },
              {
                "h": 264,
                "w": 264,
                "x": 880,
                "y": 1064
              }
            ]
          },
          "small": {
            "faces": [
              {
                "h": 25,
                "w": 25,
                "x": 272,
                "y": 249
              },
              {
                "h": 37,
                "w": 37,
                "x": 112,
                "y": 178
              },
              {
                "h": 43,
                "w": 43,
                "x": 146,
                "y": 176
              }
            ]
          }
        },
        "id_str": "2029263234466033664",
        "indices": [
          279,
          302
        ],
        "media_key": "3_2029263234466033664",
        "media_results": {
          "result": {
            "media_key": "3_2029263234466033664"
          }
        },
        "media_url_https": "https://pbs.twimg.com/media/HClkKQYbsAAzluB.jpg",
        "original_info": {
          "focus_rects": [
            {
              "h": 2294,
              "w": 4096,
              "x": 0,
              "y": 0
            },
            {
              "h": 2393,
              "w": 2393,
              "x": 953,
              "y": 0
            },
            {
              "h": 2393,
              "w": 2099,
              "x": 1100,
              "y": 0
            },
            {
              "h": 2393,
              "w": 1197,
              "x": 1551,
              "y": 0
            },
            {
              "h": 2393,
              "w": 4096,
              "x": 0,
              "y": 0
            }
          ],
          "height": 2393,
          "width": 4096
        },
        "sizes": {
          "large": {
            "h": 1197,
            "resize": "fit",
            "w": 2048
          },
          "medium": {
            "h": 701,
            "resize": "fit",
            "w": 1200
          },
          "small": {
            "h": 397,
            "resize": "fit",
            "w": 680
          },
          "thumb": {
            "h": 150,
            "resize": "crop",
            "w": 150
          }
        },
        "type": "photo",
        "url": "https://t.co/nuzvOTknKU"
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {
    "hashtags": [],
    "symbols": [],
    "urls": [],
    "user_mentions": []
  },
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "article": null
}