memU: 24/7 Proactive Memory for AI Agents
Summary
This article presents memU, a 24/7 proactive memory framework for AI agents designed to understand and persist user intent while reducing LLM token costs. It details a file-system-like memory model, a three-layer architecture, practical use cases, deployment options, and a dual retrieval strategy (RAG vs. LLM) that supports both proactive context loading and reactive queries.
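To make the dual retrieval idea concrete, here is a minimal, hypothetical sketch of how a router might choose between a cheap RAG lookup (for reactive queries) and loading whole memory files into the LLM context (for proactive use). All names, the file-style memory records, and the lexical scoring are illustrative assumptions, not memU's actual API.

```python
# Hypothetical sketch of a RAG-vs-LLM dual retrieval router.
# Nothing here is memU's real API; names and heuristics are assumptions.
import re
from dataclasses import dataclass


@dataclass
class MemoryFile:
    """A file-system-like memory record (illustrative)."""
    path: str
    content: str


MEMORY = [
    MemoryFile("profile.md", "User prefers concise answers and works in UTC+1."),
    MemoryFile("projects.md", "Current project: migrating the billing service to Go."),
]


def rag_retrieve(query: str, top_k: int = 1) -> list[MemoryFile]:
    """Reactive path: cheap lexical overlap stands in for embedding search."""
    words = set(re.findall(r"\w+", query.lower()))
    scored = sorted(
        MEMORY,
        key=lambda f: len(words & set(re.findall(r"\w+", f.content.lower()))),
        reverse=True,
    )
    return scored[:top_k]


def llm_read(files: list[MemoryFile]) -> str:
    """Proactive path: concatenate whole memory files for the LLM context.
    A real system would hand this string to the model."""
    return "\n\n".join(f"# {f.path}\n{f.content}" for f in files)


def retrieve(query: str, proactive: bool) -> str:
    """Route: proactive loading reads full files; reactive queries use
    targeted RAG retrieval to keep token costs down."""
    if proactive:
        return llm_read(MEMORY)
    return llm_read(rag_retrieve(query))


print(retrieve("what project is the user working on?", proactive=False))
```

A reactive query like the one above pulls in only the best-matching memory file, while `proactive=True` loads everything; the trade-off between token cost and recall is the point of having both paths.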