@omarsar0
NEW paper: Memory Intelligence Agent (MIA)

MIA boosts GPT-5.4 by up to 9% on LiveVQA.

Quick summary:

Most memory-augmented agents treat memory as a static retrieval problem: they store trajectories, retrieve similar ones, and hope for the best. But memory that doesn't evolve becomes stale, and storage costs grow without bound.

This new framework combines a non-parametric Memory Manager for compressed trajectory storage, a parametric Planner trained via RL to produce search strategies, and an Executor that carries them out. The key innovation is bidirectional conversion between parametric and non-parametric memory, plus test-time learning that updates the Planner on the fly during inference.

With a lightweight 7B Executor, MIA achieves a 31% average improvement across eleven benchmarks, outperforming the much larger 32B model by 18%.

Memory systems for agents need to evolve, not just accumulate. MIA's alternating RL training and bidirectional memory conversion show that treating memory as a living system, not a database, produces substantially better deep research agents.

Paper: https://t.co/eSQa0URWCk

Learn to build effective AI agents in our academy: https://t.co/1e8RZKs4uX
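To make the retrieve → plan → execute → store → update loop concrete, here is a minimal toy sketch. All class names, the compression and retrieval heuristics, and the "test-time update" are illustrative assumptions of mine, not the paper's actual implementation:

```python
# Hypothetical sketch of a Manager/Planner/Executor loop like the one
# described above. Everything here is a toy stand-in, not the MIA code.
from dataclasses import dataclass, field


@dataclass
class MemoryManager:
    """Non-parametric memory: stores compressed trajectory summaries."""
    store: list = field(default_factory=list)

    def compress_and_add(self, trajectory):
        # Toy "compression": keep only the first and last steps.
        self.store.append((trajectory[0], trajectory[-1]))

    def retrieve(self, query):
        # Toy retrieval: summaries whose first step shares a word with the query.
        words = set(query.lower().split())
        return [s for s in self.store if words & set(s[0].lower().split())]


@dataclass
class Planner:
    """Parametric planner stand-in: one 'weight' nudged by test-time updates."""
    depth: int = 1

    def plan(self, query, memories):
        steps = [f"search: {query}"]
        # Reuse retrieved memory endpoints as extra search strategies.
        for _, last_step in memories[: self.depth]:
            steps.append(f"refine with: {last_step}")
        return steps

    def test_time_update(self, reward):
        # On-the-fly "learning": widen the search depth when plans pay off.
        if reward > 0:
            self.depth += 1


def executor(step):
    # Stub Executor: pretends to carry out one search step.
    return f"result of [{step}]"


# One agent episode: retrieve -> plan -> execute -> store -> update.
mm, planner = MemoryManager(), Planner()
mm.compress_and_add(["search: capital of France", "lookup", "answer: Paris"])
memories = mm.retrieve("capital of France population")
trajectory = [executor(s) for s in planner.plan("capital of France population", memories)]
mm.compress_and_add(trajectory)          # memory accumulates compressed episodes
planner.test_time_update(reward=1.0)     # planner evolves during inference
```

The point of the sketch is only the shape of the loop: memory is written back after every episode and the planner's parameters change at inference time, rather than memory being a read-only database.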