
███████╗██╗  ██╗ ██████╗ ██████╗ ██╗  ██╗
██╔════╝██║  ██║██╔═══██╗██╔══██╗██║  ██║
███████╗███████║██║   ██║██║  ██║███████║
╚════██║██╔══██║██║   ██║██║  ██║██╔══██║
███████║██║  ██║╚██████╔╝██████╔╝██║  ██║
╚══════╝╚═╝  ╚═╝ ╚═════╝ ╚═════╝ ╚═╝  ╚═╝
A cognitive brain for AI agents. Three-tier memory architecture based on Cowan's model. Connections strengthen with use and decay naturally over time, just like biological synapses.
Features
What makes shodh-memory different
Hebbian Learning
Connections that fire together wire together. Frequently accessed associations become effectively permanent through Long-Term Potentiation.
Runs Offline
Single ~30MB binary with no cloud dependency. Works on Raspberry Pi, Jetson, industrial PCs, air-gapped systems.
Sub-Millisecond
Graph lookups in <1μs. Semantic search in 34-58ms. Fast enough for real-time agent decision making.
Neuroscience-Grounded
3-tier architecture based on Cowan's working memory model. Hybrid decay (exponential + power-law) from cognitive research.
Knowledge Graph
Not just vector search—includes spreading activation, interference detection, and memory replay during consolidation.
MCP Integration
First-class Model Context Protocol support. Works with Claude Code, Cursor, and any MCP-compatible agent.
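Spreading activation, mentioned in the Knowledge Graph feature above, can be sketched as a weighted graph traversal: activation starts at a source memory and fans out along edges, fading with each hop. The adjacency-list layout, per-hop decay factor, and cutoff threshold below are illustrative assumptions, not shodh-memory's internals.

```rust
use std::collections::HashMap;

/// Spread activation from `source` through a weighted graph.
/// Returns the activation level reached at each touched node.
fn spread(
    graph: &HashMap<u32, Vec<(u32, f64)>>, // node -> [(neighbor, edge weight)]
    source: u32,
    decay: f64,     // assumed: activation lost per hop
    threshold: f64, // assumed: stop spreading below this level
) -> HashMap<u32, f64> {
    let mut activation: HashMap<u32, f64> = HashMap::new();
    let mut frontier = vec![(source, 1.0)];
    while let Some((node, level)) = frontier.pop() {
        let entry = activation.entry(node).or_insert(0.0);
        if level <= *entry {
            continue; // already reached via a stronger path
        }
        *entry = level;
        for &(neighbor, weight) in graph.get(&node).into_iter().flatten() {
            let next = level * weight * decay;
            if next > threshold {
                frontier.push((neighbor, next));
            }
        }
    }
    activation
}

fn main() {
    let mut graph = HashMap::new();
    graph.insert(1, vec![(2, 0.9), (3, 0.4)]);
    graph.insert(2, vec![(4, 0.8)]);
    // Activation fades with distance from the source memory.
    let activation = spread(&graph, 1, 0.5, 0.1);
    println!("{:?}", activation);
}
```

The cutoff is what keeps recall focused: distant, weakly connected memories never activate at all.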
Why shodh-memory?
What makes this different from mem0, zep, cognee, and other memory solutions
Not another vector database
Most "memory" solutions are just vector search with a wrapper. Shodh-memory has a knowledge graph, temporal indices, and hybrid ranking. Connections between memories strengthen when accessed together—like biological synapses.
No cloud required
Mem0, Zep, and others are cloud-first. Shodh-memory is a single ~30MB binary. No API keys, no Docker, no external dependencies. Your agent's memory runs on your hardware.
Memory that gets smarter
Static storage forgets nothing and learns nothing. Shodh-memory uses Hebbian learning—frequently co-accessed memories form stronger bonds. Rarely used knowledge decays naturally. Just like your brain.
Edge-first architecture
Designed for robots, IoT, air-gapped systems. Runs on Raspberry Pi Zero. Sub-microsecond graph lookups. Your drone doesn't need WiFi to remember.
| Feature | shodh | mem0 | zep | cognee |
|---|---|---|---|---|
| Runs fully offline | ✓ | — | — | — |
| Single binary, no Docker | ✓ | — | — | — |
| Hebbian learning | ✓ | — | — | — |
| Knowledge graph | ✓ | — | ✓ | ✓ |
| Memory decay model | ✓ | — | — | — |
| Runs on Raspberry Pi | ✓ | — | — | — |
| Sub-millisecond lookup | ✓ | — | — | — |
| Open source | ✓ | ✓ | ✓ | ✓ |
Others give you storage. We give you cognition.
Context Durability
How long do memories actually last? Here's the science.
| Time | Normal strength | Potentiated strength |
|---|---|---|
| Day 1 | 50% | 70% |
| Day 7 | 35% | 55% |
| Day 30 | 18% | 40% |
| Day 90 | 10% | 28% |
| Day 365 | >1% | >5% |
Strength
100% |*
     | *
 70% |  *  <- Potentiated (10+ accesses)
     |   *____
 50% |*       \____
     | *           \________
 30% |  *                   \___________
     |   *____                          \____
 10% |        \__________                    \__
     |  Normal decay     \________________
  1% |                                    \_________*
     +----+----+----+----+----+----+----+----+----+->
        3d   7d  14d  30d  60d  90d 180d 365d
                         Time

[====] Exponential   [----] Power-law (heavy tail)
       (0-3 days)           (3+ days)
Hybrid Decay Model
Exponential decay for the first 3 days (consolidation phase), then power-law for long-term retention. Memories never truly hit zero.
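The hybrid curve can be sketched in a few lines of Rust. The half-life and the power-law exponent here are illustrative assumptions, not shodh-memory's actual parameters:

```rust
// Hybrid decay sketch: exponential for the first 3 days
// (consolidation phase), power-law afterwards. The constants are
// assumed for illustration only.
const CONSOLIDATION_DAYS: f64 = 3.0;
const HALF_LIFE_DAYS: f64 = 3.0;     // assumed exponential half-life
const POWER_LAW_EXPONENT: f64 = 0.8; // assumed heavy-tail exponent

/// Retained strength (0.0..=1.0) after `days` since the last access.
fn retention(days: f64) -> f64 {
    if days <= CONSOLIDATION_DAYS {
        // Exponential phase: S(t) = 2^(-t / half_life)
        (2.0_f64).powf(-days / HALF_LIFE_DAYS)
    } else {
        // Power-law phase, matched to the exponential value at day 3
        // so the curve is continuous: S(t) = S(3) * (3 / t)^k
        let s3 = (2.0_f64).powf(-CONSOLIDATION_DAYS / HALF_LIFE_DAYS);
        s3 * (CONSOLIDATION_DAYS / days).powf(POWER_LAW_EXPONENT)
    }
}

fn main() {
    for days in [1.0, 3.0, 7.0, 30.0, 90.0, 365.0] {
        println!("day {:>5}: {:>5.1}%", days, retention(days) * 100.0);
    }
}
```

Matching the power-law branch to the exponential value at day 3 keeps the curve continuous, and the heavy tail is why strength never truly reaches zero.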
Long-Term Potentiation
Memories accessed 10+ times become "potentiated" and decay 10x slower. Effective half-life jumps from 14 days to ~140 days.
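One way to model potentiation is as a decay-rate multiplier. The 10-access threshold and the 14-day vs. ~140-day half-lives come from the text; the formula itself is an illustrative sketch:

```rust
// Long-term potentiation as a decay-rate multiplier: a memory
// accessed 10+ times decays at one tenth of the normal rate, so its
// effective exponential half-life grows from 14 to 140 days.
const POTENTIATION_THRESHOLD: u32 = 10;
const BASE_HALF_LIFE_DAYS: f64 = 14.0;

fn effective_half_life(access_count: u32) -> f64 {
    if access_count >= POTENTIATION_THRESHOLD {
        BASE_HALF_LIFE_DAYS * 10.0 // potentiated: decays 10x slower
    } else {
        BASE_HALF_LIFE_DAYS
    }
}

/// Remaining strength after `days`, under simple exponential decay.
fn strength_after(days: f64, access_count: u32) -> f64 {
    (2.0_f64).powf(-days / effective_half_life(access_count))
}

fn main() {
    println!("normal, day 30:      {:.2}", strength_after(30.0, 3));
    println!("potentiated, day 30: {:.2}", strength_after(30.0, 12));
}
```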
Hebbian Strengthening
Co-accessed memories form stronger bonds. Fire together, wire together. Associations strengthen with use, weaken with neglect.
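The strengthening rule (edge.weight += η·Δw) can be sketched like this; the learning rate η, the definition of Δw, and the per-cycle decay factor are illustrative assumptions:

```rust
// Hebbian edge strengthening between co-accessed memories:
// weight += η·Δw on co-activation, multiplicative decay for neglect.
// η, Δw, and the decay factor are assumed for illustration.
struct Edge {
    weight: f64,
}

const ETA: f64 = 0.1;           // assumed learning rate
const DECAY_FACTOR: f64 = 0.99; // assumed per-cycle decay for unused edges

impl Edge {
    /// Fire together, wire together: strengthen toward a weight of 1.0.
    fn co_activate(&mut self) {
        let delta = 1.0 - self.weight; // Δw: room left to grow
        self.weight += ETA * delta;    // edge.weight += η·Δw
    }

    /// Associations weaken with neglect.
    fn decay(&mut self) {
        self.weight *= DECAY_FACTOR;
    }
}

fn main() {
    let mut edge = Edge { weight: 0.2 };
    for _ in 0..10 {
        edge.co_activate(); // frequently co-accessed: bond strengthens
    }
    println!("after 10 co-activations: {:.3}", edge.weight);
    for _ in 0..100 {
        edge.decay(); // then neglected: bond weakens
    }
    println!("after 100 idle cycles:   {:.3}", edge.weight);
}
```

Saturating toward 1.0 keeps frequently co-accessed pairs strongly bonded without letting any weight grow without bound.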
Memory Replay
During maintenance cycles, important memories are replayed and strengthened—mimicking hippocampal replay during sleep.
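A replay pass might look like the following sketch; the importance score, the number of memories replayed per cycle, and the size of the boost are all illustrative assumptions:

```rust
// Maintenance-cycle replay sketch: the most important memories are
// "replayed" and receive a small strength boost, mimicking
// hippocampal replay during sleep. Scores and constants are assumed.
#[derive(Debug)]
struct Memory {
    id: u32,
    importance: f64, // e.g. combined access frequency + recency (assumed)
    strength: f64,
}

/// Boost the top-`replay_count` memories by importance, capped at 1.0.
fn replay_cycle(memories: &mut [Memory], replay_count: usize, boost: f64) {
    memories.sort_by(|a, b| b.importance.partial_cmp(&a.importance).unwrap());
    for mem in memories.iter_mut().take(replay_count) {
        mem.strength = (mem.strength + boost).min(1.0);
    }
}

fn main() {
    let mut memories = vec![
        Memory { id: 1, importance: 0.9, strength: 0.5 },
        Memory { id: 2, importance: 0.2, strength: 0.5 },
        Memory { id: 3, importance: 0.7, strength: 0.5 },
    ];
    replay_cycle(&mut memories, 2, 0.1);
    for m in &memories {
        println!("memory {}: strength {:.2}", m.id, m.strength);
    }
}
```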
Memories accessed 10+ times become effectively permanent.
Even rarely-accessed memories retain >1% strength after a year due to power-law decay.
See the research behind our memory model →

Architecture
Cowan's working memory model, implemented
Sensory Buffer
Immediate context window. Raw input before processing.
Working Memory
Active manipulation space. Current task context and associations.
Long-Term Memory
Persistent storage. Episodic + semantic with Hebbian strengthening.
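The three tiers above can be summarized as data. The capacities and decay horizons follow the figures in this document (~7 items, ~4 chunks, unlimited); the Rust shape itself is an illustrative sketch, not shodh-memory's actual types:

```rust
// Sketch of the three memory tiers and their promotion path.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Tier {
    SensoryBuffer,  // immediate context, raw input
    WorkingMemory,  // active manipulation space
    LongTermMemory, // persistent episodic + semantic store
}

impl Tier {
    /// Approximate item capacity; None means unbounded.
    fn capacity(self) -> Option<usize> {
        match self {
            Tier::SensoryBuffer => Some(7),
            Tier::WorkingMemory => Some(4),
            Tier::LongTermMemory => None,
        }
    }

    /// Rough decay horizon for unattended items.
    fn decay_profile(self) -> &'static str {
        match self {
            Tier::SensoryBuffer => "sub-second",
            Tier::WorkingMemory => "minutes",
            Tier::LongTermMemory => "power-law (never reaches zero)",
        }
    }

    /// Promotion path: attention and consolidation move items up.
    fn next(self) -> Option<Tier> {
        match self {
            Tier::SensoryBuffer => Some(Tier::WorkingMemory),
            Tier::WorkingMemory => Some(Tier::LongTermMemory),
            Tier::LongTermMemory => None,
        }
    }
}

fn main() {
    for tier in [Tier::SensoryBuffer, Tier::WorkingMemory, Tier::LongTermMemory] {
        println!("{:?}: capacity {:?}, decay {}",
                 tier, tier.capacity(), tier.decay_profile());
    }
}
```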
┌─────────────────────────────────┐
│         MCP / API Layer         │
│  remember  recall  forget ...   │
└───────────────┬─────────────────┘
                │
                ▼
┌───────────────────────────────────────────────────────────────────┐
│                        MEMORY CORE (Rust)                         │
│                                                                   │
│  ┌─────────────┐   ┌─────────────┐   ┌─────────────────────────┐  │
│  │   SENSORY   │   │   WORKING   │   │        LONG-TERM        │  │
│  │   BUFFER    │──▶│   MEMORY    │──▶│         MEMORY          │  │
│  │  ~7 items   │   │  ~4 chunks  │   │        unlimited        │  │
│  │  decay:<1s  │   │ decay:mins  │   │     decay:power-law     │  │
│  └─────────────┘   └─────────────┘   └─────────────────────────┘  │
│         │                 │                       │               │
│         └────attention────┴────consolidation──────┘               │
│                           │                                       │
│                           ▼                                       │
│  ┌────────────────────────────────────────────────────────────┐   │
│  │                    RETRIEVAL SUBSYSTEM                     │   │
│  │                                                            │   │
│  │    VECTOR INDEX      KNOWLEDGE GRAPH      TEMPORAL INDEX   │   │
│  │       (HNSW)            (Hebbian)            (decay)       │   │
│  │                                                            │   │
│  │         │                   │                   │          │   │
│  │         └───────────────────┼───────────────────┘          │   │
│  │                             ▼                              │   │
│  │                       HYBRID RANKING                       │   │
│  │                   vector + graph + time                    │   │
│  └────────────────────────────────────────────────────────────┘   │
│                                │                                  │
│                                ▼                                  │
│  ┌────────────────────────────────────────────────────────────┐   │
│  │                          RocksDB                           │   │
│  │           memories | graph | vectors | episodes            │   │
│  └────────────────────────────────────────────────────────────┘   │
│                                                                   │
└───────────────────────────────────────────────────────────────────┘
                                  │
             ┌────────────────────┴───────────────────────┐
             ▼                                            ▼
┌──────────────────────────┐                ┌───────────────────────────┐
│  HEBBIAN CONSOLIDATION   │◀──────────────▶│    INTERFERENCE ENGINE    │
│                          │                │                           │
│ co-activation strengthens│                │ similar memories compete  │
│   edge.weight += η·Δw    │                │ old decays when new fits  │
└──────────────────────────┘                └───────────────────────────┘
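The interference engine's rule (similar memories compete; old decays when new fits) can be sketched as a similarity check. Cosine similarity over embeddings, the 0.9 threshold, and the 0.8 strength penalty are illustrative assumptions:

```rust
// Interference sketch: when a new memory is very similar to an
// existing one, the older memory's strength is reduced so the newer
// trace wins. Threshold and penalty are assumed for illustration.
fn cosine_similarity(a: &[f64], b: &[f64]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f64 = a.iter().map(|x| x * x).sum::<f64>().sqrt();
    let nb: f64 = b.iter().map(|x| x * x).sum::<f64>().sqrt();
    dot / (na * nb)
}

struct Memory {
    embedding: Vec<f64>,
    strength: f64,
}

/// Similar memories compete: old decays when new fits.
fn apply_interference(existing: &mut [Memory], new_embedding: &[f64]) {
    const SIMILARITY_THRESHOLD: f64 = 0.9; // assumed
    const INTERFERENCE_PENALTY: f64 = 0.8; // assumed strength multiplier
    for mem in existing.iter_mut() {
        if cosine_similarity(&mem.embedding, new_embedding) > SIMILARITY_THRESHOLD {
            mem.strength *= INTERFERENCE_PENALTY;
        }
    }
}

fn main() {
    let mut memories = vec![
        Memory { embedding: vec![1.0, 0.0], strength: 1.0 },
        Memory { embedding: vec![0.0, 1.0], strength: 1.0 },
    ];
    // New memory almost identical to the first one:
    apply_interference(&mut memories, &[0.99, 0.05]);
    println!("old similar: {:.2}, unrelated: {:.2}",
             memories[0].strength, memories[1].strength);
}
```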
Installation
Get started in seconds