🐦 Twitter Post Details

@_weiping

🚀 Introducing Nemotron-Cascade! 🚀

We’re thrilled to release Nemotron-Cascade, a family of general-purpose reasoning models trained with cascaded, domain-wise reinforcement learning (Cascade RL), delivering best-in-class performance across a wide range of benchmarks.

💻 Coding powerhouse
After RL, our 14B model:
• Surpasses DeepSeek-R1-0528 (671B) on LiveCodeBench v5/v6/Pro.
• Achieves silver-medal performance at IOI 2025 🥈.
• Reaches 43.1% pass@1 on SWE-Bench Verified, and 53.8% with test-time scaling.

🧠 What is Cascade RL?
Instead of mixing heterogeneous prompts across domains, Cascade RL trains sequentially, domain by domain, which reduces engineering complexity, mitigates heterogeneous verification latencies, and enables domain-specific curricula and tailored hyperparameter tuning.

✨ Key insight
Using RLHF for alignment as a pre-step dramatically boosts complex reasoning—far beyond preference optimization. Subsequent domain-wise RLVR stages rarely hurt the benchmark performance attained in earlier domains and may even improve it, as illustrated in the following figure.

🤗 Models & training data 🔥
👉 https://t.co/wfVcAaMocA

📄 Technical report with detailed training and data recipes
👉 https://t.co/FdMINvB4yM
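The post describes Cascade RL as training sequentially, domain by domain, with an RLHF alignment pre-step before the domain-wise RLVR stages. A minimal sketch of that control flow, assuming nothing beyond the post itself — the stage names, `train_rl_stage`, and hyperparameter values below are illustrative placeholders, not the released recipe:

```python
# Cascaded, domain-wise RL: each stage runs to completion in sequence,
# with its own prompt curriculum and hyperparameters, instead of mixing
# heterogeneous prompts (and verifiers) into a single RL run.

def train_rl_stage(model, domain, prompts, hparams):
    """Placeholder for one RL stage (e.g. RLVR with a domain-specific verifier)."""
    # ... run RL on `prompts` with domain-tuned `hparams` ...
    model["completed_stages"].append(domain)  # recorded here for illustration
    return model  # updated policy

def cascade_rl(model, stages):
    """Run the stages strictly in order; later stages start from the
    checkpoint produced by the previous stage."""
    for domain, prompts, hparams in stages:
        model = train_rl_stage(model, domain, prompts, hparams)
    return model

# Per the post: RLHF alignment first, then domain-wise RLVR stages.
stages = [
    ("rlhf_alignment", ["..."], {"lr": 1e-6}),  # alignment pre-step
    ("math",           ["..."], {"lr": 5e-7}),
    ("code",           ["..."], {"lr": 5e-7}),
]
policy = cascade_rl({"completed_stages": []}, stages)
```

Because each stage has its own prompt set and hyperparameters, a slow verifier in one domain (e.g. software-engineering test harnesses) never stalls rollouts in another — which is the "heterogeneous verification latencies" point in the post.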

Media 1
Media 2

📊 Media Metadata

{
  "media": [
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2000947255088701628/media_0.jpg?",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2000947255088701628/media_0.jpg?",
      "type": "photo",
      "filename": "media_0.jpg"
    },
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2000947255088701628/media_1.jpg?",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2000947255088701628/media_1.jpg?",
      "type": "photo",
      "filename": "media_1.jpg"
    }
  ],
  "processed_at": "2025-12-16T18:10:36.738730",
  "pipeline_version": "2.0"
}
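The metadata blob above is plain JSON with a `media` list of `{url, media_url, type, filename}` entries. A small sketch of pulling the photo URLs out of it — the inline blob here is a trimmed stand-in with placeholder `example.com` URLs, matching only the key structure shown above:

```python
import json

# Trimmed stand-in for the media-metadata blob above (same keys, placeholder URLs).
raw = """
{
  "media": [
    {"url": "https://example.com/media_0.jpg", "type": "photo", "filename": "media_0.jpg"},
    {"url": "https://example.com/media_1.jpg", "type": "photo", "filename": "media_1.jpg"}
  ],
  "processed_at": "2025-12-16T18:10:36.738730",
  "pipeline_version": "2.0"
}
"""

metadata = json.loads(raw)
# Keep only photo entries; the pipeline may also emit video/animated_gif types.
photo_urls = [m["url"] for m in metadata["media"] if m["type"] == "photo"]
```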

🔧 Raw API Response

{
  "type": "tweet",
  "id": "2000947255088701628",
  "url": "https://x.com/_weiping/status/2000947255088701628",
  "twitterUrl": "https://twitter.com/_weiping/status/2000947255088701628",
  "text": "🚀 Introducing Nemotron-Cascade! 🚀\n\nWe’re thrilled to release Nemotron-Cascade, a family of general-purpose reasoning models trained with cascaded, domain-wise reinforcement learning (Cascade RL), delivering best-in-class performance across a wide range of benchmarks.\n\n💻 Coding powerhouse\nAfter RL, our 14B model:\n• Surpasses DeepSeek-R1-0528 (671B) on LiveCodeBench v5/v6/Pro.\n• Achieves silver-medal performance at IOI 2025 🥈.\n• Reaches a 43.1% pass@1 on SWE-Bench Verified, and 53.8% with test-time scaling.\n\n🧠 What is Cascade RL?\nInstead of mixing heterogeneous prompts across domains, Cascade RL trains sequentially, domain by domain, which reduces engineering complexity, mitigates heterogeneous verification latencies, and enables domain-specific curricula and tailored hyperparameter tuning.\n\n✨ Key insight\nUsing RLHF for alignment as a pre-step dramatically boosts complex reasoning—far beyond preference optimization. Subsequent domain-wise RLVR stages rarely hurt the benchmark performance attained in earlier domains and may even improve it, as illustrated in the following figure.\n\n🤗 Models & training data 🔥\n👉 https://t.co/wfVcAaMocA\n\n📄 Technical report with detailed training and data  recipes \n👉 https://t.co/FdMINvB4yM",
  "source": "Twitter for iPhone",
  "retweetCount": 26,
  "replyCount": 3,
  "likeCount": 169,
  "quoteCount": 5,
  "viewCount": 10357,
  "createdAt": "Tue Dec 16 15:12:56 +0000 2025",
  "lang": "en",
  "bookmarkCount": 76,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "2000947255088701628",
  "displayTextRange": [
    0,
    278
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "_weiping",
    "url": "https://x.com/_weiping",
    "twitterUrl": "https://twitter.com/_weiping",
    "id": "1273059094325153793",
    "name": "Wei Ping",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/1931247368193896448/vdfbAXh-_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/1273059094325153793/1730273308",
    "description": "",
    "location": "San Francisco, CA",
    "followers": 2449,
    "following": 339,
    "status": "",
    "canDm": true,
    "canMediaTag": true,
    "createdAt": "Wed Jun 17 01:05:33 +0000 2020",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {}
    },
    "fastFollowersCount": 0,
    "favouritesCount": 605,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 15,
    "statusesCount": 301,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [
      "2000947255088701628"
    ],
    "profile_bio": {
      "description": "distinguished research scientist @nvidia | post-training, RL, multimodal | generative models for audio.\nViews are my own.",
      "entities": {
        "description": {
          "user_mentions": [
            {
              "id_str": "0",
              "indices": [
                33,
                40
              ],
              "name": "",
              "screen_name": "nvidia"
            }
          ]
        },
        "url": {
          "urls": [
            {
              "display_url": "wpingnet.github.io",
              "expanded_url": "https://wpingnet.github.io/",
              "indices": [
                0,
                23
              ],
              "url": "https://t.co/ZO9fS3Un4J"
            }
          ]
        }
      }
    },
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "allow_download_status": {
          "allow_download": true
        },
        "display_url": "pic.twitter.com/ZPbGPJNwwW",
        "expanded_url": "https://twitter.com/_weiping/status/2000947255088701628/photo/1",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {},
          "orig": {}
        },
        "id_str": "2000944809301958660",
        "indices": [
          279,
          302
        ],
        "media_key": "3_2000944809301958660",
        "media_results": {
          "id": "QXBpTWVkaWFSZXN1bHRzOgwAAQoAARvEyLPdmuAECgACG8TK7VHa4LwAAA==",
          "result": {
            "__typename": "ApiMedia",
            "id": "QXBpTWVkaWE6DAABCgABG8TIs92a4AQKAAIbxMrtUdrgvAAA",
            "media_key": "3_2000944809301958660"
          }
        },
        "media_url_https": "https://pbs.twimg.com/media/G8TIs92a4AQQ4Zq.jpg",
        "original_info": {
          "focus_rects": [
            {
              "h": 2294,
              "w": 4096,
              "x": 0,
              "y": 0
            },
            {
              "h": 2440,
              "w": 2440,
              "x": 0,
              "y": 0
            },
            {
              "h": 2440,
              "w": 2140,
              "x": 0,
              "y": 0
            },
            {
              "h": 2440,
              "w": 1220,
              "x": 0,
              "y": 0
            },
            {
              "h": 2440,
              "w": 4096,
              "x": 0,
              "y": 0
            }
          ],
          "height": 2440,
          "width": 4096
        },
        "sizes": {
          "large": {
            "h": 1220,
            "w": 2048
          }
        },
        "type": "photo",
        "url": "https://t.co/ZPbGPJNwwW"
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {
    "urls": [
      {
        "display_url": "huggingface.co/collections/nv…",
        "expanded_url": "https://huggingface.co/collections/nvidia/nemotron-cascade",
        "indices": [
          1124,
          1147
        ],
        "url": "https://t.co/wfVcAaMocA"
      },
      {
        "display_url": "arxiv.org/pdf/2512.13607",
        "expanded_url": "https://arxiv.org/pdf/2512.13607",
        "indices": [
          1212,
          1235
        ],
        "url": "https://t.co/FdMINvB4yM"
      }
    ]
  },
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "isLimitedReply": false,
  "article": null
}
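In the raw response, `entities.urls` maps each shortened `t.co` link in the tweet text to its expanded target. A small helper, sketched against the structure shown above, that rewrites the text with the expanded URLs (the entity list below copies the two mappings from the response):

```python
def expand_urls(text, url_entities):
    """Replace each t.co short link with its expanded URL, using the
    url/expanded_url pairs from the tweet's `entities.urls` array."""
    for entity in url_entities:
        text = text.replace(entity["url"], entity["expanded_url"])
    return text

# The two url entities from the response above (other fields omitted).
url_entities = [
    {"url": "https://t.co/wfVcAaMocA",
     "expanded_url": "https://huggingface.co/collections/nvidia/nemotron-cascade"},
    {"url": "https://t.co/FdMINvB4yM",
     "expanded_url": "https://arxiv.org/pdf/2512.13607"},
]

expanded = expand_urls("🤗 Models & training data 🔥\n👉 https://t.co/wfVcAaMocA", url_entities)
```

For display-faithful rewriting one would use the `indices` fields instead of string replacement, since indices refer to positions in the original text; plain `str.replace` is enough when each short URL appears once, as here.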