@Scobleizer
RT @Vassivasss: What if AI could see the world the way we do? That's the idea we bet our weekend on at the Mistral Worldwide Hackathon. With @haaspierre_ and Arman Artola-Zanganeh, we built Port:Worlds, an open-source framework that lets anyone connect their Meta glasses to any AI system.

Let me take you back to Saturday morning. Before we even knew it could work, we needed the hardware, so I ran to Rue de Rivoli and bought €500 Meta glasses on the spot. If that's not commitment, I don't know what is (a true bet).

We then built non-stop for 36 hours to make it usable. End-to-end. The glasses stream what you see → the AI makes sense of it → it answers back through the glasses' speaker.

And the moment we understood it was going to work, the question changed. It was no longer "Is this doable?" It became "What can people build with this?"

- A plumber getting live assistance during a repair.
- A technician servicing industrial machinery.
- A traveler exploring a new country.
- A visually impaired person navigating a space.

At first, we were looking for the "right" use case. Then we realized something more interesting: if AI can share your perspective, continuously, the use cases are not ours to decide. That's why Port:Worlds is fully open source. If you want to connect your Meta glasses, plug in your own models, customize with your own prompts, your own MCP, your Openclaw… you can.

Link to the open-source repo (you can contribute and give it a little star ❤️): https://t.co/UueLnkMZpM

Link to the demo video: https://t.co/qcTDqKGvax

Huge thanks to the organizing team of the hackathon, it was truly great. @Jthmas404
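
To make the glasses → model → speaker loop concrete, here is a minimal sketch of that pipeline. It is not the Port:Worlds implementation: the camera index, the `RELAY_URL` endpoint, and the JSON fields (`prompt`, `image_b64`, `answer`) are all hypothetical stand-ins for whatever model backend you plug in.

```python
# Hypothetical sketch of the "glasses stream -> AI -> spoken answer" loop.
# RELAY_URL and its JSON schema are illustrative, not the Port:Worlds API.
import base64

import cv2  # pip install opencv-python
import requests

RELAY_URL = "http://localhost:8000/describe"  # hypothetical model relay


def capture_frame(camera_index: int = 0) -> bytes:
    """Grab a single JPEG frame from the (glasses') camera stream."""
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("could not read from camera")
    ok, jpeg = cv2.imencode(".jpg", frame)
    if not ok:
        raise RuntimeError("could not encode frame as JPEG")
    return jpeg.tobytes()


def ask_model(jpeg: bytes, prompt: str) -> str:
    """Send the frame plus a prompt to whichever vision model is plugged in."""
    payload = {
        "prompt": prompt,
        "image_b64": base64.b64encode(jpeg).decode(),
    }
    resp = requests.post(RELAY_URL, json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()["answer"]


if __name__ == "__main__":
    answer = ask_model(capture_frame(), "What am I looking at?")
    print(answer)  # in the real loop, this is spoken through the glasses' speaker
```

In the same spirit as the post, the model behind `RELAY_URL` is deliberately swappable: the framework's pitch is that you bring your own models and prompts rather than being tied to one backend.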