@XiaomiMiMo
⚡ Faster than Fast. Designed for Agentic AI.

Introducing Xiaomi MiMo-V2-Flash, our new open-source MoE model: 309B total params, 15B active. Blazing speed meets frontier performance.

🔥 Highlights:
🏗️ Hybrid Attention: 5:1 interleaved 128-window SWA + Global | 256K context
📊 Performance:
⚖️ Matches DeepSeek-V3.2 on general benchmarks, at a fraction of the latency
📈 SWE-Bench Verified: 73.4% | SWE-Bench Multilingual: 71.7%, a new SOTA for open-source models
🚀 Speed: 150 output tokens/s, with Day-0 support from @lmsysorg 🤝

🤗 Model: https://t.co/4Etm0yZKTL
📝 Blog Post: https://t.co/5zxmcDuB6o
📄 Technical Report: https://t.co/crac1YTLYl
🎨 AI Studio: https://t.co/nSReUs6QgW
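For readers curious what a 5:1 interleave means in practice, here is a minimal sketch (not MiMo's actual implementation) of the per-layer mask pattern: five sliding-window attention (SWA) layers for every one global-attention layer. The 128-token window and the 5:1 ratio come from the announcement; the 12-layer stack, the `attention_mask` helper, and the numpy masking are illustrative assumptions.

```python
import numpy as np

WINDOW = 128        # SWA window size stated in the announcement
GLOBAL_EVERY = 6    # 5 SWA layers for every 1 global layer (the 5:1 ratio)

def attention_mask(layer_idx: int, seq_len: int, window: int = WINDOW) -> np.ndarray:
    """Boolean causal mask for one layer; True = key position is visible."""
    q = np.arange(seq_len)[:, None]   # query positions (rows)
    k = np.arange(seq_len)[None, :]   # key positions (columns)
    causal = k <= q
    if (layer_idx + 1) % GLOBAL_EVERY == 0:
        return causal                        # global-attention layer: full causal
    return causal & (q - k < window)         # SWA layer: banded causal

# Layer layout for a hypothetical 12-layer stack: five SWA, one global, repeated.
layout = ["global" if (i + 1) % GLOBAL_EVERY == 0 else "swa" for i in range(12)]
print(layout)

# Tiny SWA mask (window=3, 6 tokens) so the banded structure is visible.
print(attention_mask(0, 6, window=3).astype(int))
```

The appeal of this layout is that most layers only attend within a fixed window, so their KV-cache and attention cost stay constant as context grows, while the occasional global layer preserves long-range information flow across the full 256K context.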