@llama_index
Do you want to run coding agents safely, without damaging your filesystem? Last week, we published a blog post and a demo showing exactly how to do this with @claudeai and AgentFS by @tursodatabase. After strong community interest, we've now shipped support for @OpenAI Codex as well.

How it works:
- Launch the filesystem MCP server
- Open a new demo session
- Start coding with Codex

Supporting Codex unlocks a big advantage: developers can use any OpenAI-compatible provider, including @ollama and the @huggingface Inference API. This means more flexibility and safer experimentation, all without compromising your local environment.

Let us know what you build with it!

Find the code on GitHub: https://t.co/lCeaFCYHve
Read the blog: https://t.co/IiCW8Bo0NZ
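As a sketch of the provider flexibility mentioned above: Codex CLI can be pointed at an OpenAI-compatible endpoint through its `~/.codex/config.toml`. This is an illustrative fragment, assuming a local Ollama instance on its default port; the model name is a placeholder, not from the original post:

```toml
# Hypothetical ~/.codex/config.toml fragment (adjust to your setup)

# Define an OpenAI-compatible provider backed by a local Ollama server
[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"

# Route Codex to that provider; the model name is illustrative
model_provider = "ollama"
model = "llama3"
```

With a config like this, the same safe-sandbox workflow applies regardless of which backend serves the model.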