Memory for AI agents.
Your AI agent forgets everything between sessions. Engram gives it persistent, intelligent memory — a temporal knowledge graph that extracts facts, detects conflicts, and consolidates over time. Open source. Works with any framework through MCP.
Every conversation starts from zero. Your agent re-asks the same questions, forgets preferences, loses project context.
Embedding raw messages is not understanding. There is no fact extraction, no conflict resolution, no temporal awareness.
Mem0 needs Qdrant. Zep needs Neo4j. Letta needs PostgreSQL. By the time you have configured the stack, you have forgotten what you were building.
Most memory solutions are tied to one framework. Switch from LangChain to CrewAI and your memory layer breaks.
Not a vector store.
A knowledge graph that learns.
Raw conversation messages go in. Structured facts come out. Entities, relationships, preferences, events — all extracted automatically via LLM.
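Conceptually, one raw message fans out into several typed records. The record shape below is illustrative only, not Engram's actual schema:

```python
from dataclasses import dataclass

@dataclass
class Extracted:
    kind: str      # e.g. "entity" | "relationship" | "preference" | "event"
    subject: str
    detail: str

# What an LLM extraction pass might pull out of one raw message.
message = "I'm Sam, I prefer dark mode, and I moved to Austin last month."
facts = [
    Extracted("entity", "Sam", "user's name"),
    Extracted("preference", "Sam", "prefers dark mode"),
    Extracted("event", "Sam", "moved to Austin last month"),
]
```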
When new information contradicts existing facts, Engram detects the conflict and supersedes the old fact. "I moved to Austin" invalidates "I live in Seattle."
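The supersede behavior can be sketched like this. This is a conceptual sketch, not Engram's internals; the `Fact` type and its field names are assumptions:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Fact:
    subject: str
    predicate: str
    value: str
    valid_from: datetime
    superseded_by: "Fact | None" = None

def add_with_supersede(store: list[Fact], new: Fact) -> None:
    """Mark any active fact with the same subject+predicate as superseded."""
    for old in store:
        if (old.subject == new.subject
                and old.predicate == new.predicate
                and old.superseded_by is None):
            old.superseded_by = new   # invalidated, not deleted: history survives
    store.append(new)

store: list[Fact] = []
now = datetime.now(timezone.utc)
add_with_supersede(store, Fact("user", "lives_in", "Seattle", now))
add_with_supersede(store, Fact("user", "lives_in", "Austin", now))

# Only the Austin fact is still active; Seattle is kept but marked superseded.
active = [f for f in store if f.superseded_by is None]
```

Keeping the superseded fact around (rather than deleting it) is what makes temporal queries like "where did I live before Austin?" answerable.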
Vector search + FTS5 keyword search + graph spreading activation. Six retrieval signals combined: semantic similarity, temporal proximity, node importance, calibrated confidence, keyword match, and graph connections.
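One way to picture combining six signals into a single ranking score is a weighted sum over normalized per-signal scores. The weights and signal names below are illustrative; Engram's actual scoring may differ:

```python
def rank_score(signals: dict[str, float],
               weights: dict[str, float]) -> float:
    """Weighted sum of normalized retrieval signals (each in [0, 1])."""
    return sum(weights[name] * signals.get(name, 0.0) for name in weights)

# Hypothetical weights -- the real engine tunes these differently.
WEIGHTS = {
    "semantic":   0.35,  # vector similarity
    "keyword":    0.20,  # FTS5 keyword match
    "temporal":   0.15,  # recency / proximity to query time
    "importance": 0.10,  # node importance in the graph
    "confidence": 0.10,  # calibrated confidence of the fact
    "graph":      0.10,  # spreading-activation connection strength
}

candidate = {"semantic": 0.9, "keyword": 0.4, "temporal": 0.7,
             "importance": 0.5, "confidence": 0.8, "graph": 0.3}
score = rank_score(candidate, WEIGHTS)  # ~0.66 with these weights
```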
Background engine that decays stale facts, promotes important ones, deduplicates, summarizes entity histories, and builds preference rollups. Memory stays clean over time.
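The decay step, for instance, can be modeled as exponential decay of a fact's importance with a half-life. A sketch; the half-life constant and field names are assumptions, not Engram's tuning:

```python
from datetime import datetime, timedelta, timezone

HALF_LIFE_DAYS = 30.0  # assumed tuning constant

def decayed_importance(importance: float, last_accessed: datetime,
                       now: datetime) -> float:
    """Halve a fact's importance for every HALF_LIFE_DAYS of inactivity."""
    age_days = (now - last_accessed).total_seconds() / 86400.0
    return importance * 0.5 ** (age_days / HALF_LIFE_DAYS)

now = datetime.now(timezone.utc)
fresh = decayed_importance(1.0, now, now)                       # untouched
stale = decayed_importance(1.0, now - timedelta(days=60), now)  # two half-lives
```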
Facts carry timestamps. Queries understand "last week" and "recently." Context output is date-annotated so your agent knows when things happened.
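Resolving a phrase like "last week" means mapping it to a concrete time window before filtering facts by timestamp. A simplified rolling-window sketch (Engram's actual parser is internal and may use calendar weeks instead):

```python
from datetime import datetime, timedelta, timezone

def resolve_range(phrase: str, now: datetime) -> tuple[datetime, datetime]:
    """Map a relative time phrase to a (start, end) window."""
    if phrase == "last week":
        return now - timedelta(days=14), now - timedelta(days=7)
    if phrase == "recently":
        return now - timedelta(days=7), now
    raise ValueError(f"unknown phrase: {phrase!r}")

now = datetime.now(timezone.utc)
start, end = resolve_range("last week", now)
```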
One binary speaks Ollama (local, free), OpenAI-compatible endpoints, Anthropic Claude, Google Gemini, or a shell-out command. Set one env var.
One memory layer. Any framework.
Engram exposes 11 MCP tools. Any client that speaks MCP can use it — no adapter code, no integration library, no lock-in.
Also available as: Rust library (embed in your app) · REST API · Python client · Java client · Spring Boot starter
Running in 60 seconds.
Docker (recommended)
```shell
docker run --rm -i \
  -v engram-data:/data \
  ghcr.io/jamjet-labs/engram-server:0.5.0
```

Uses local Ollama by default. Zero config.
Cargo install
```shell
cargo install jamjet-engram-server
engram serve --db memory.db
```

Native binary. ~3 MB. Starts instantly.
MCP config (Claude Desktop)
```json
{
  "mcpServers": {
    "memory": {
      "command": "engram",
      "args": ["serve", "--db", "memory.db"]
    }
  }
}
```

Add to your MCP config and restart. Done.
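Once registered, a client invokes any of the tools through a standard MCP `tools/call` JSON-RPC request over stdio. Building one in Python (the `arguments` payload is illustrative; check Engram's tool schemas for the real parameters):

```python
import json

def tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request (JSON-RPC 2.0)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical arguments -- memory_add's real schema may differ.
req = tool_call(1, "memory_add", {"content": "User moved to Austin"})
```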
`memory_add` · `memory_recall` · `memory_context` · `memory_search` · `memory_forget` · `memory_stats` · `memory_consolidate` · `messages_save` · `messages_get` · `messages_list` · `messages_delete`

Honest comparison.
| Capability | Engram | Mem0 | Zep | Letta |
|---|---|---|---|---|
| Fact extraction | Yes (LLM-based) | Yes | Basic | No |
| Conflict detection | Yes (auto-supersede) | Partial | No | No |
| Knowledge graph | Yes (entities + relationships) | v2 (new) | Yes | No |
| Hybrid retrieval | Vector + FTS5 + graph (6 signals) | Vector + graph | Hybrid | Vector only |
| Consolidation engine | 5 ops (decay, promote, dedup, summarize, reflect) | No | No | No |
| MCP server | Yes (11 tools) | No | No | No |
| Zero-infra quickstart | SQLite (single file) | Needs Qdrant | Needs Neo4j + Docker | Needs PostgreSQL |
| LLM provider choice | Ollama, OpenAI, Anthropic, Google, shell-out | OpenAI default | OpenAI default | OpenAI default |
| Message store | Built-in (save/get/list/delete) | No | Yes | Yes |
| Spring Boot starter | Maven Central | No | No | No |
| Maturity | New (v0.5.0, small community) | Established (large community) | Established | Established (ex-MemGPT) |
All frameworks evolve. Check their docs for the latest. Engram's advantage is MCP-native distribution + zero-infra + consolidation. Its disadvantage is a smaller community and fewer production deployments.
5 registries. One install.
Need more than memory?
Engram is part of JamJet — the open-source runtime for production-safe AI agents. When your agents need policy controls, audit trails, human approval, crash recovery, and cost governance, JamJet has it built in.