@MiniMax_AI
Introducing MMX-CLI: our first piece of infrastructure built not for humans, but for Agents.

Your Agent can read, think, and write. But ask it to sing, paint, or show you a world it's never seen, and it falls silent. Not because it doesn't understand, but because it has no mouth, no hands, no camera.

Today, that changes. MMX-CLI gives every Agent seven new senses (image, video, voice, music, vision, search, conversation), powered by MiniMax's full-modal stack, today's SOTA across mainstream omni-modal models.

One command: mmx. Agent-native I/O. Zero MCP glue. Runs on your existing Token Plan.

Two lines to give your Agent a voice:

npx skills add MiniMax-AI/cli -y -g
npm install -g mmx-cli

Then tell it: "you have mmx commands available." It'll learn the rest.

GitHub: https://t.co/fSRc5Lo30j
Token Plan: https://t.co/BDCycxepZw