Tencent open-sources Agent memory system, OpenClaw saves up to 61% of tokens

AIMPACT News, May 14 (UTC+8). According to monitoring by Dongcha Beating, the Tencent Cloud Database team spent six months tackling the long-conversation forgetfulness problem and has recently open-sourced TencentDB Agent Memory, a local-first memory engine designed for AI Agents. By default it uses SQLite + sqlite-vec as the local backend; it can be installed as an OpenClaw plugin and also integrates with Hermes Gateway.

Its core idea is not to dump historical conversations directly into a vector database, but to split memory into two structures. Long-term memory is layered into L0 raw conversations, L1 atomic facts, L2 scene chunks, and L3 user profiles. For short-term task memory, lengthy tool logs are externalized into refs files, step summaries are written to a JSONL file, and a Mermaid canvas preserves the task structure and node indices. In complex workflows of more than 30 steps, the Agent usually reads only the lightweight Mermaid structure diagram; when it needs to verify details, it returns to the original logs via node_id.
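The short-term side of this layout can be sketched in a few lines of Python. This is a minimal illustration of the pattern described above, not the project's actual API: all class, file, and method names here are hypothetical, assuming a per-task directory with a `refs/` folder for raw tool logs, a `steps.jsonl` file for step summaries, and a generated Mermaid diagram keyed by node_id.

```python
import json
from pathlib import Path

class TaskMemory:
    """Hypothetical sketch of short-term task memory: lengthy tool logs are
    externalized into refs files, step summaries are appended to a JSONL file,
    and a Mermaid canvas keeps the task structure with node indices."""

    def __init__(self, root: Path):
        self.root = root
        (root / "refs").mkdir(parents=True, exist_ok=True)
        self.steps_file = root / "steps.jsonl"
        self.nodes: list[tuple[str, str]] = []  # (node_id, summary) pairs

    def record_step(self, node_id: str, summary: str, tool_log: str) -> None:
        # Externalize the lengthy tool log; only the ref path stays in context.
        ref = self.root / "refs" / f"{node_id}.log"
        ref.write_text(tool_log)
        with self.steps_file.open("a") as f:
            f.write(json.dumps({"node_id": node_id, "summary": summary,
                                "ref": str(ref)}) + "\n")
        self.nodes.append((node_id, summary))

    def mermaid_canvas(self) -> str:
        # The lightweight structure diagram the Agent reads instead of raw logs.
        lines = ["graph TD"]
        for node_id, summary in self.nodes:
            lines.append(f"    {node_id}[{summary}]")
        for (a, _), (b, _) in zip(self.nodes, self.nodes[1:]):
            lines.append(f"    {a} --> {b}")
        return "\n".join(lines)

    def expand(self, node_id: str) -> str:
        # Drill back down to the original log via node_id to verify details.
        return (self.root / "refs" / f"{node_id}.log").read_text()
```

In a 30-plus-step workflow, the Agent would keep only `mermaid_canvas()` output in its working context and call `expand(node_id)` on demand, which is how this design avoids both a lossy one-shot summary and a full log dump.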

Official benchmarks show that after integration with OpenClaw, token consumption on WideSearch tasks dropped from 221.31M to 85.64M (a 61.38% reduction), while the success rate rose 51.52% relative to the baseline. On the long-term memory benchmark PersonaMem, accuracy improved from 48% to 76%. The value of this design is that it does not swallow historical details in a one-shot summary; instead, it preserves a complete trail, from high-level personas and task canvases all the way down to the original text.

(Source: BlockBeats)
