🐦 Twitter Post Details

@Prince_Canuma

mlx-vlm v0.4.0 is here 🚀

New models:
• Moondream3 by @vikhyatk
• Phi-4-reasoning-vision by @MSFTResearch
• Phi4-multimodal-instruct by @MSFTResearch
• Minicpm-o-2.5 (except tts) by @OpenBMB

What's new:
→ Full weight finetuning + ORPO h/t @ActuallyIsaak
→ Tool calling in server
→ Thinking budget support
→ KV cache quantization for server
→ Fused SDPA attention optimization
→ Streaming & OpenAI-compatible endpoint improvements

Fixes:
• Gemma3n
• Qwen3-VL
• Qwen3.5-MoE
• Qwen3-Omni h/t @ronaldseoh
• Batch inference, and more.

Big shoutout to 7 new contributors this release! 🙌

Get started today:

> uv pip install -U mlx-vlm

Leave us a star ⭐️

https://t.co/un61O8fEZd
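
The post highlights an OpenAI-compatible server endpoint with streaming. As a rough sketch of what a multimodal chat request to such a server might look like: the payload shape follows the common OpenAI chat-completions convention, but the host, port, path, model id, and image URL below are all illustrative assumptions, not details from the post.

```python
import json

# Hypothetical local endpoint for an OpenAI-compatible server
# (host, port, and path are assumptions for illustration).
BASE_URL = "http://localhost:8080/v1/chat/completions"

# OpenAI-style chat-completions payload: a text part plus an
# image_url part, per the widely used multimodal message convention.
payload = {
    "model": "mlx-community/example-vlm",  # placeholder model id
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image."},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/cat.jpg"},
                },
            ],
        }
    ],
    "stream": True,  # the post calls out streaming improvements
}

# Sending it would be a plain JSON POST, e.g. with urllib.request:
#   req = urllib.request.Request(
#       BASE_URL,
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"},
#   )
print(json.dumps(payload, indent=2))
```

The request body itself carries no mlx-vlm-specific fields; that is the point of an OpenAI-compatible endpoint, since existing OpenAI client code can be pointed at the local server unchanged.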

Media 1: photo (media_0.jpg)
Media 2: photo (media_1.jpg)

📊 Media Metadata

{
  "media": [
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2030360667727745165/media_0.jpg",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2030360667727745165/media_0.jpg",
      "type": "photo",
      "filename": "media_0.jpg"
    },
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2030360667727745165/media_1.jpg",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2030360667727745165/media_1.jpg",
      "type": "photo",
      "filename": "media_1.jpg"
    }
  ],
  "processed_at": "2026-03-09T01:32:21.462252",
  "pipeline_version": "2.0"
}
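
A downstream consumer can parse the metadata block above directly. A minimal sketch in Python, with the JSON and field names copied verbatim from the block (this is the pipeline's own output shape, not any official API):

```python
import json

# Media metadata exactly as emitted by the enrichment pipeline (v2.0) above.
raw = """
{
  "media": [
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2030360667727745165/media_0.jpg",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2030360667727745165/media_0.jpg",
      "type": "photo",
      "filename": "media_0.jpg"
    },
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2030360667727745165/media_1.jpg",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2030360667727745165/media_1.jpg",
      "type": "photo",
      "filename": "media_1.jpg"
    }
  ],
  "processed_at": "2026-03-09T01:32:21.462252",
  "pipeline_version": "2.0"
}
"""
metadata = json.loads(raw)

# Collect the photo filenames, e.g. for download or display.
photos = [m["filename"] for m in metadata["media"] if m["type"] == "photo"]
print(photos)  # → ['media_0.jpg', 'media_1.jpg']
```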

🔧 Raw API Response

{
  "type": "tweet",
  "id": "2030360667727745165",
  "url": "https://x.com/Prince_Canuma/status/2030360667727745165",
  "twitterUrl": "https://twitter.com/Prince_Canuma/status/2030360667727745165",
  "text": "mlx-vlm v0.4.0 is here 🚀\n\nNew models: \n• Moondream3 by @vikhyatk \n• Phi-4-reasoning-vision by @MSFTResearch \n• Phi4-multimodal-instruct by @MSFTResearch \n• Minicpm-o-2.5 (except tts) by @OpenBMB \n\nWhat's new: \n→ Full weight finetuning + ORPO h/t @ActuallyIsaak \n→ Tool calling in server \n→ Thinking budget support \n→ KV cache quantization for server \n→ Fused SDPA attention optimization \n→ Streaming & OpenAI-compatible endpoint improvements\n\nFixes:\n• Gemma3n\n• Qwen3-VL \n• Qwen3.5-MoE\n• Qwen3-Omni h/t @ronaldseoh\n• Batch inference, and more.\n\nBig shoutout to 7 new contributors this release! 🙌\n\nGet started today: \n\n> uv pip install -U mlx-vlm\n\nLeave us a star ⭐️ \n\nhttps://t.co/un61O8fEZd",
  "source": "Twitter for iPhone",
  "retweetCount": 20,
  "replyCount": 4,
  "likeCount": 115,
  "quoteCount": 4,
  "viewCount": 12610,
  "createdAt": "Sat Mar 07 19:11:20 +0000 2026",
  "lang": "en",
  "bookmarkCount": 49,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "2030360667727745165",
  "displayTextRange": [
    0,
    268
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "Prince_Canuma",
    "url": "https://x.com/Prince_Canuma",
    "twitterUrl": "https://twitter.com/Prince_Canuma",
    "id": "726009698",
    "name": "Prince Canuma",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/2018690435636510720/Aq3-a3Mv_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/726009698/1770128291",
    "description": "",
    "location": "Krakow, Poland",
    "followers": 10153,
    "following": 1188,
    "status": "",
    "canDm": true,
    "canMediaTag": true,
    "createdAt": "Mon Jul 30 12:54:47 +0000 2012",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {}
    },
    "fastFollowersCount": 0,
    "favouritesCount": 20468,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 2545,
    "statusesCount": 20652,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [
      "2026022469010599954"
    ],
    "profile_bio": {
      "description": "Apple MLX King 🤴🏽• Creator of (mlx-audio & mlx-vlm) • Ex-@arcee_ai • @neptune_ai • https://t.co/iZnxoefJBU",
      "entities": {
        "description": {
          "hashtags": [],
          "symbols": [],
          "urls": [
            {
              "display_url": "linktr.ee/prince.canuma",
              "expanded_url": "https://linktr.ee/prince.canuma",
              "indices": [
                83,
                106
              ],
              "url": "https://t.co/iZnxoefJBU"
            }
          ],
          "user_mentions": [
            {
              "id_str": "0",
              "indices": [
                57,
                66
              ],
              "name": "",
              "screen_name": "arcee_ai"
            },
            {
              "id_str": "0",
              "indices": [
                69,
                80
              ],
              "name": "",
              "screen_name": "neptune_ai"
            }
          ]
        },
        "url": {
          "urls": [
            {
              "display_url": "github.com/Blaizzy",
              "expanded_url": "https://github.com/Blaizzy",
              "indices": [
                0,
                23
              ],
              "url": "https://t.co/ELxUGGEMjU"
            }
          ]
        }
      }
    },
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "display_url": "pic.twitter.com/kOoEWXzF4v",
        "expanded_url": "https://twitter.com/Prince_Canuma/status/2030360667727745165/photo/1",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {
            "faces": []
          },
          "orig": {
            "faces": []
          }
        },
        "id_str": "2030360659406589952",
        "indices": [
          269,
          292
        ],
        "media_key": "3_2030360659406589952",
        "media_results": {
          "id": "QXBpTWVkaWFSZXN1bHRzOgwAAQoAARwtSkMt25AACgACHC1KRR3WUI0AAA==",
          "result": {
            "__typename": "ApiMedia",
            "id": "QXBpTWVkaWE6DAABCgABHC1KQy3bkAAKAAIcLUpFHdZQjQAA",
            "media_key": "3_2030360659406589952"
          }
        },
        "media_url_https": "https://pbs.twimg.com/media/HC1KQy3bkAA2Xwb.jpg",
        "original_info": {
          "focus_rects": [
            {
              "h": 457,
              "w": 816,
              "x": 0,
              "y": 0
            },
            {
              "h": 816,
              "w": 816,
              "x": 0,
              "y": 0
            },
            {
              "h": 930,
              "w": 816,
              "x": 0,
              "y": 0
            },
            {
              "h": 1554,
              "w": 777,
              "x": 0,
              "y": 0
            },
            {
              "h": 1554,
              "w": 816,
              "x": 0,
              "y": 0
            }
          ],
          "height": 1554,
          "width": 816
        },
        "sizes": {
          "large": {
            "h": 1554,
            "w": 816
          }
        },
        "type": "photo",
        "url": "https://t.co/kOoEWXzF4v"
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {
    "hashtags": [],
    "symbols": [],
    "timestamps": [],
    "urls": [
      {
        "display_url": "github.com/Blaizzy/mlx-vl…",
        "expanded_url": "http://github.com/Blaizzy/mlx-vlm/releases/tag/v0.4.0",
        "indices": [
          668,
          691
        ],
        "url": "https://t.co/un61O8fEZd"
      }
    ],
    "user_mentions": [
      {
        "id_str": "17406365",
        "indices": [
          55,
          64
        ],
        "name": "vik",
        "screen_name": "vikhyatk"
      },
      {
        "id_str": "21457289",
        "indices": [
          94,
          107
        ],
        "name": "Microsoft Research",
        "screen_name": "MSFTResearch"
      },
      {
        "id_str": "21457289",
        "indices": [
          139,
          152
        ],
        "name": "Microsoft Research",
        "screen_name": "MSFTResearch"
      },
      {
        "id_str": "1496119294844825600",
        "indices": [
          186,
          194
        ],
        "name": "OpenBMB",
        "screen_name": "OpenBMB"
      },
      {
        "id_str": "1497868184464105475",
        "indices": [
          246,
          260
        ],
        "name": "Gökdeniz Gülmez",
        "screen_name": "ActuallyIsaak"
      },
      {
        "id_str": "1050961850169344000",
        "indices": [
          503,
          514
        ],
        "name": "Ronald Seoh",
        "screen_name": "ronaldseoh"
      }
    ]
  },
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "isLimitedReply": false,
  "article": null
}
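
The raw response carries the post's engagement counters. A quick sketch of deriving an engagement rate from them; the counter values are taken from the response above, while "interactions per view" is just one common convention, not a metric the API itself defines:

```python
# Engagement counters from the raw API response above.
metrics = {
    "retweetCount": 20,
    "replyCount": 4,
    "likeCount": 115,
    "quoteCount": 4,
    "bookmarkCount": 49,
    "viewCount": 12610,
}

# One common convention: total interactions divided by views.
interactions = sum(v for k, v in metrics.items() if k != "viewCount")
engagement_rate = interactions / metrics["viewCount"]
print(f"{interactions} interactions, {engagement_rate:.2%} engagement rate")
```

Note that `viewCount` is excluded from the numerator, since a view is an impression rather than an interaction.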