The star of "Resident Evil" has built an AI memory system with Claude, achieving a perfect score on the LongMemEval benchmark.


According to 1M AI News monitoring, Hollywood actress Milla Jovovich (best known for her roles in The Fifth Element and the Resident Evil series) and Ben Sigman, a Bitcoin entrepreneur and the founder of decentralized lending platform Libre, jointly developed the open-source AI memory system MemPalace. Released on GitHub under the MIT license, it received 5,500 stars within three days. Sigman said the two spent months developing the project with Anthropic’s Claude, and in the commit history, Claude Opus 4.6 is listed as a co-author of the code.

MemPalace’s core selling point is its benchmark performance. On LongMemEval, the industry-standard memory retrieval benchmark, pure local retrieval (without calling any external APIs) achieved 96.6% Recall@5; with the optional Haiku-model re-ranking enabled, it answered all 500 questions correctly for a perfect score. The project team says this is the highest score the benchmark has seen from either free or paid products. On two other benchmarks, it scored 92.9% on ConvoMem, which the team claims is more than double the score of the AI memory product Mem0, and full marks across all multi-hop reasoning categories on LoCoMo. The benchmark code ships with the repository so the results can be reproduced.
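Recall@5 simply measures the fraction of queries whose gold item appears among the top five retrieved results. A minimal sketch of how such a score is computed (function and variable names are illustrative, not MemPalace's actual evaluation harness):

```python
def recall_at_k(retrieved, relevant, k=5):
    """Fraction of queries whose gold item appears in the top-k retrieved list."""
    hits = sum(1 for docs, gold in zip(retrieved, relevant) if gold in docs[:k])
    return hits / len(relevant)

# Toy example: the gold document is found in the top 5 for 2 of 3 queries.
retrieved = [["d1", "d2", "d3", "d4", "d5"],
             ["d9", "d8", "d7", "d6", "d5"],
             ["d2", "d4", "d6", "d8", "d10"]]
relevant = ["d3", "d1", "d4"]
print(round(recall_at_k(retrieved, relevant), 3))  # → 0.667
```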

Unlike common vector-database approaches, MemPalace organizes information by imitating the "memory palace" (method of loci) technique of the ancient Greek orators. The system mines a user's conversation logs into a four-layer structure: Wing (grouped by person or project) → Room (a specific topic) → Closet (compressed summaries) → Drawer (verbatim dialogue records). Related rooms within the same Wing are connected laterally through a "Hall," while different Wings cross-reference each other via "Tunnels." The project's testing shows this structure alone improves retrieval accuracy by 34%.
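The Wing → Room → Closet → Drawer hierarchy can be pictured as nested containers with lateral links. A minimal Python sketch of that shape (class and field names are illustrative assumptions, not the project's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class Drawer:
    """Verbatim dialogue records (the bottom layer)."""
    transcript: list = field(default_factory=list)

@dataclass
class Closet:
    """A compressed summary pointing down to its raw drawers."""
    summary: str = ""
    drawers: list = field(default_factory=list)

@dataclass
class Room:
    """A specific topic; 'hall' links connect related rooms in the same wing."""
    topic: str = ""
    closets: list = field(default_factory=list)
    hall_links: list = field(default_factory=list)

@dataclass
class Wing:
    """Grouped by person or project; 'tunnels' cross-reference other wings."""
    name: str = ""
    rooms: list = field(default_factory=list)
    tunnels: list = field(default_factory=list)

# Build a tiny palace: one wing for a project, one room per topic.
drawer = Drawer(transcript=["User: prefers dark mode", "Agent: noted"])
closet = Closet(summary="UI preferences: dark mode", drawers=[drawer])
room = Room(topic="preferences", closets=[closet])
wing = Wing(name="Project Alpha", rooms=[room])
print(wing.rooms[0].closets[0].summary)  # → UI preferences: dark mode
```

Retrieval would descend from Wing to Drawer only when verbatim detail is needed, reading cheap summaries at the Closet layer otherwise.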

The project also defines its own lossless compression dialect, AAAK, designed specifically for AI agents. It compresses user context of over a thousand tokens into about 120 tokens, a compression ratio of roughly 30x. AAAK is purely structured text: it needs no special decoder or fine-tuning, and any large language model that can read text can parse it directly. The system also includes built-in contradiction detection, catching inconsistencies in facts such as names, pronouns, and ages before output.
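The AAAK format itself is not spelled out beyond "purely structured text." As a hedged illustration of the general idea only (a terse key-value dialect any LLM can read, plus a simple conflict check), not the real AAAK specification:

```python
def compress_profile(facts: dict) -> str:
    """Pack user facts into one terse structured-text line a model can read.
    Illustrative stand-in for AAAK, not the project's actual dialect."""
    return ";".join(f"{k}={v}" for k, v in facts.items())

def detect_contradictions(old: dict, new: dict) -> list:
    """Flag keys (e.g. name, age) whose stored value conflicts with a new claim.
    Toy version of the built-in contradiction detection described above."""
    return [k for k in old if k in new and old[k] != new[k]]

facts = {"name": "Ada", "role": "engineer", "tz": "UTC+2", "pref.ui": "dark"}
packed = compress_profile(facts)
print(packed)  # → name=Ada;role=engineer;tz=UTC+2;pref.ui=dark

# A new claim that the age changed is caught before output.
print(detect_contradictions({"name": "Ada", "age": "36"},
                            {"age": "37", "tz": "UTC+2"}))  # → ['age']
```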

The entire system runs fully on-device: no cloud services, no API key, and no charge. It connects to tools such as Claude, ChatGPT, and Cursor via the MCP protocol (exposing 19 MCP tools), and it can also generate context summaries from the command line using local models such as Llama and Mistral.

Jovovich’s crossover into the tech world came as something of a surprise. The project repository is registered under her GitHub account, and 4 of its 7 commits were made by her, including the initial commit containing all the core code. She also posted a project introduction video on Instagram.
