███████╗██╗  ██╗ ██████╗ ██████╗ ██╗  ██╗
██╔════╝██║  ██║██╔═══██╗██╔══██╗██║  ██║
███████╗███████║██║   ██║██║  ██║███████║
╚════██║██╔══██║██║   ██║██║  ██║██╔══██║
███████║██║  ██║╚██████╔╝██████╔╝██║  ██║
╚══════╝╚═╝  ╚═╝ ╚═════╝ ╚═════╝ ╚═╝  ╚═╝
M E M O R Y

A cognitive brain for AI agents. Three-tier memory architecture based on Cowan's model. Connections strengthen with use, decay naturally over time—just like biological neurons.

[01]

Features

What makes shodh-memory different

hebbian-learning.rs
🧠

Hebbian Learning

Connections that fire together wire together. Frequently accessed associations become permanent through Long-Term Potentiation.

runs-offline.rs
🔌

Runs Offline

Single ~30MB binary with no cloud dependency. Works on Raspberry Pi, Jetson, industrial PCs, air-gapped systems.

sub-millisecond.rs

Sub-Millisecond

Graph lookups in <1μs. Semantic search in 34-58ms. Fast enough for real-time agent decision making.

neuroscience-grounded.rs
🔬

Neuroscience-Grounded

3-tier architecture based on Cowan's working memory model. Hybrid decay (exponential + power-law) from cognitive research.

knowledge-graph.rs
🌐

Knowledge Graph

Not just vector search—includes spreading activation, interference detection, and memory replay during consolidation.
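To make "spreading activation" concrete: activation fans out from a recalled node along weighted edges, attenuating with each hop, so related memories surface even when they don't match the query text. The function and parameter names below are illustrative assumptions, not shodh-memory's actual API.

```rust
use std::collections::HashMap;

/// Illustrative spreading activation over a weighted adjacency list
/// (node -> [(neighbor, edge_weight)]). Energy starts at 1.0 on the
/// seed and attenuates by edge weight times a global decay factor.
fn spread(
    graph: &HashMap<&str, Vec<(&str, f64)>>,
    seed: &str,
    decay: f64,
    hops: usize,
) -> HashMap<String, f64> {
    let mut activation: HashMap<String, f64> = HashMap::new();
    activation.insert(seed.to_string(), 1.0);
    let mut frontier = vec![(seed.to_string(), 1.0)];
    for _ in 0..hops {
        let mut next = Vec::new();
        for (node, energy) in frontier {
            for (nbr, w) in graph.get(node.as_str()).into_iter().flatten() {
                let e = energy * w * decay;
                let slot = activation.entry(nbr.to_string()).or_insert(0.0);
                if e > *slot {
                    *slot = e; // keep the strongest activation path
                    next.push((nbr.to_string(), e));
                }
            }
        }
        frontier = next;
    }
    activation
}

fn main() {
    let mut g = HashMap::new();
    g.insert("rust", vec![("memory", 0.9), ("systems", 0.6)]);
    g.insert("memory", vec![("decay", 0.8)]);
    // "decay" is reached transitively, two hops from the seed.
    println!("{:?}", spread(&g, "rust", 0.5, 2));
}
```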

mcp-integration.rs
🔧

MCP Integration

First-class Model Context Protocol support. Works with Claude Code, Cursor, and any MCP-compatible agent.

[02]

Why shodh-memory?

What makes this different from mem0, zep, cognee, and other memory solutions

Not another vector database

Most "memory" solutions are just vector search with a wrapper. Shodh-memory has a knowledge graph, temporal indices, and hybrid ranking. Connections between memories strengthen when accessed together—like biological synapses.

🔒

No cloud required

Mem0, Zep, and others are cloud-first. Shodh-memory is a single ~30MB binary. No API keys, no Docker, no external dependencies. Your agent's memory runs on your hardware.

🧠

Memory that gets smarter

Static storage forgets nothing and learns nothing. Shodh-memory uses Hebbian learning—frequently co-accessed memories form stronger bonds. Rarely used knowledge decays naturally. Just like your brain.

📡

Edge-first architecture

Designed for robots, IoT, air-gapped systems. Runs on Raspberry Pi Zero. Sub-microsecond graph lookups. Your drone doesn't need WiFi to remember.

comparison.md
Feature comparison: shodh vs. mem0, zep, cognee

- Runs fully offline
- Single binary, no Docker
- Hebbian learning
- Knowledge graph
- Memory decay model
- Runs on Raspberry Pi
- Sub-millisecond lookup
- Open source

Others give you storage. We give you cognition.

[03]

Context Durability

How long do memories actually last? Here's the science.

retention_curve.rs
// Power-law decay: memories never truly hit zero
Time      Normal   Potentiated
Day 1     50%      70%
Day 7     35%      55%
Day 30    18%      40%
Day 90    10%      28%
Day 365   >1%      >5%
Potentiated = accessed 10+ times
decay_model.txt

Strength
  100% |*
       | *
   70% |  *  <- Potentiated (10+ accesses)
       |   *____
   50% |    *   \____
       |         *   \________
   30% |              *       \___________
       |                                  \____
   10% |----------------------------------------\__
       |  Normal decay                            *
    1% |--------------------------------------------*-
       +----+----+----+----+----+----+----+----+----+->
           3d   7d   14d  30d  60d  90d  180d 365d
                        Time

  [====] Exponential   [----] Power-law (heavy tail)
         (0-3 days)           (3+ days)

Hybrid Decay Model

Exponential decay for the first 3 days (consolidation phase), then power-law for long-term retention. Memories never truly hit zero.
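The hybrid curve can be sketched as a piecewise function: exponential through the 3-day consolidation window, then a power law matched at the boundary so the curve stays continuous. The rate and exponent below are placeholder values for illustration, not shodh-memory's tuned parameters.

```rust
/// Illustrative hybrid decay curve. Constants are assumptions
/// made for this sketch; the real parameters may differ.
const CONSOLIDATION_DAYS: f64 = 3.0;
const LAMBDA: f64 = 0.15; // exponential rate (assumed)
const ALPHA: f64 = 0.5;   // power-law exponent (assumed)

fn retention(days: f64) -> f64 {
    if days <= CONSOLIDATION_DAYS {
        // Consolidation phase: plain exponential decay.
        (-LAMBDA * days).exp()
    } else {
        // Long-term phase: power law, anchored to the exponential's
        // value at day 3 so there is no jump at the boundary.
        let boundary = (-LAMBDA * CONSOLIDATION_DAYS).exp();
        boundary * (days / CONSOLIDATION_DAYS).powf(-ALPHA)
    }
}

fn main() {
    for d in [1.0, 3.0, 7.0, 30.0, 365.0] {
        println!("day {d:>5}: {:.1}%", 100.0 * retention(d));
    }
}
```

The heavy tail is the point: under a power law, retention at day 365 is still above 1%, whereas a pure exponential at the same early rate would be indistinguishable from zero.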

Long-Term Potentiation

Memories accessed 10+ times become "potentiated" and decay 10x slower. Effective half-life jumps from 14 days to ~140 days.

Based on: Bi & Poo (1998)
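The half-life claim above (14 days stretching to ~140 days) can be checked with a one-line model. This sketch uses a pure exponential for clarity, ignoring the power-law tail; the threshold and multiplier come from the text above.

```rust
/// Sketch of LTP's effect on decay: memories accessed 10+ times
/// decay 10x slower, so the effective half-life goes 14 -> 140 days.
/// (Pure exponential shown for clarity; the real model is hybrid.)
fn strength(days: f64, access_count: u32) -> f64 {
    let half_life = if access_count >= 10 { 140.0 } else { 14.0 };
    0.5_f64.powf(days / half_life)
}

fn main() {
    println!("day 30, normal:      {:.1}%", 100.0 * strength(30.0, 3));
    println!("day 30, potentiated: {:.1}%", 100.0 * strength(30.0, 12));
}
```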

Hebbian Strengthening

Co-accessed memories form stronger bonds. Fire together, wire together. Associations strengthen with use, weaken with neglect.
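A minimal sketch of such an update, in the spirit of the `edge.weight += η·Δw` rule from the architecture diagram. Here Δw is taken proportional to the weight's remaining headroom, an assumption made so that weights saturate at 1.0 instead of growing without bound.

```rust
/// Hypothetical Hebbian edge update: each co-activation nudges the
/// connection weight toward 1.0 by learning rate `eta`.
/// (Illustrative only; names and saturation rule are assumptions.)
fn hebbian_update(weight: f64, eta: f64) -> f64 {
    let delta = eta * (1.0 - weight); // shrinks as the edge saturates
    (weight + delta).min(1.0)
}

fn main() {
    let mut w = 0.10;
    for _ in 0..10 {
        w = hebbian_update(w, 0.2); // ten co-activations
    }
    println!("weight after 10 co-activations: {w:.3}");
}
```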

Memory Replay

During maintenance cycles, important memories are replayed and strengthened—mimicking hippocampal replay during sleep.

Memories accessed 10+ times become potentiated and decay 10x slower.

Even rarely-accessed memories retain >1% strength after a year due to power-law decay.

See the research behind our memory model →
[04]

Architecture

Cowan's working memory model, implemented

👁

Sensory Buffer

Capacity: ~7 items
Decay: < 1 second

Immediate context window. Raw input before processing.

💭

Working Memory

Capacity: ~4 chunks
Decay: Minutes

Active manipulation space. Current task context and associations.

🧠

Long-Term Memory

Capacity: Unlimited
Decay: Power-law

Persistent storage. Episodic + semantic with Hebbian strengthening.
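The three tiers and their limits can be summed up in a few lines of Rust. This enum is purely illustrative (not shodh-memory's actual types); the capacities and decay horizons are the ones listed above.

```rust
/// The three memory tiers from Cowan's model, as described above.
/// (Sketch only; names and shape are assumptions.)
#[derive(Debug, Clone, Copy)]
enum Tier {
    SensoryBuffer, // ~7 items, decays in under a second
    WorkingMemory, // ~4 chunks, decays over minutes
    LongTerm,      // unlimited, power-law decay
}

impl Tier {
    /// Item capacity, or None for unlimited.
    fn capacity(self) -> Option<usize> {
        match self {
            Tier::SensoryBuffer => Some(7),
            Tier::WorkingMemory => Some(4),
            Tier::LongTerm => None,
        }
    }
}

fn main() {
    for t in [Tier::SensoryBuffer, Tier::WorkingMemory, Tier::LongTerm] {
        println!("{t:?}: capacity {:?}", t.capacity());
    }
}
```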

architecture.rs

                    ┌─────────────────────────────────┐
                    │         MCP / API Layer         │
                    │  remember  recall  forget  ...  │
                    └───────────────┬─────────────────┘
                                    │
                                    ▼
┌───────────────────────────────────────────────────────────────────┐
│                        MEMORY CORE (Rust)                         │
│                                                                   │
│  ┌─────────────┐   ┌─────────────┐   ┌─────────────────────────┐  │
│  │   SENSORY   │   │   WORKING   │   │       LONG-TERM         │  │
│  │   BUFFER    │──▶│   MEMORY    │──▶│        MEMORY           │  │
│  │   ~7 items  │   │  ~4 chunks  │   │      unlimited          │  │
│  │  decay:<1s  │   │  decay:mins │   │    decay:power-law      │  │
│  └─────────────┘   └─────────────┘   └─────────────────────────┘  │ 
│         │                 │                      │                │
│         └────attention────┴────consolidation─────┘                │
│                                  │                                │
│                                  ▼                                │
│  ┌────────────────────────────────────────────────────────────┐   │
│  │                    RETRIEVAL SUBSYSTEM                     │   │
│  │                                                            │   │
│  │   VECTOR INDEX      KNOWLEDGE GRAPH      TEMPORAL INDEX    │   │
│  │      (HNSW)          (Hebbian)            (decay)          │   │
│  │                                                            │   │
│  │        │                  │                   │            │   │
│  │        └──────────────────┼───────────────────┘            │   │
│  │                           ▼                                │   │
│  │                    HYBRID RANKING                          │   │
│  │               vector + graph + time                        │   │
│  └────────────────────────────────────────────────────────────┘   │
│                                  │                                │
│                                  ▼                                │
│  ┌────────────────────────────────────────────────────────────┐   │
│  │                         RocksDB                            │   │
│  │         memories | graph | vectors | episodes              │   │
│  └────────────────────────────────────────────────────────────┘   │
│                                                                   │
└───────────────────────────────────────────────────────────────────┘
                                    │
              ┌─────────────────────┴─────────────────────┐
              ▼                                           ▼
┌──────────────────────────┐            ┌───────────────────────────┐
│  HEBBIAN CONSOLIDATION   │◀────────▶ │   INTERFERENCE ENGINE     │
│                          │            │                           │
│co-activation strengthens │            │  similar memories compete │
│  edge.weight += η·Δw     │            │  old decays when new fits │
└──────────────────────────┘            └───────────────────────────┘
Vector Index
HNSW for semantic similarity
Knowledge Graph
Entities + relationships + spreading activation
Temporal Index
Time-based retrieval and decay
Episode Manager
Conversation threading and context
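Hybrid ranking (the "vector + graph + time" stage in the diagram) can be sketched as a weighted blend of the three retrieval signals. The weights below are assumptions chosen for illustration, not shodh-memory's real coefficients.

```rust
/// Illustrative hybrid ranking: blend vector similarity, graph
/// activation, and temporal recency (each in [0, 1]) into one score.
/// Weights are placeholder assumptions.
fn hybrid_score(vector_sim: f64, graph_activation: f64, recency: f64) -> f64 {
    0.5 * vector_sim + 0.3 * graph_activation + 0.2 * recency
}

fn main() {
    // A semantically weaker hit can still win on graph + recency:
    let a = hybrid_score(0.9, 0.1, 0.2); // strong vector match only
    let b = hybrid_score(0.7, 0.8, 0.9); // well-connected and recent
    println!("a = {a:.2}, b = {b:.2}");
}
```

This is why the retrieval subsystem is more than vector search: a memory that is strongly connected in the graph and recently touched can outrank a closer embedding match.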
[05]

Installation

Get started in seconds

terminal
# Run with npx (recommended)
$ npx -y @shodh/memory-mcp

# Or add to Claude Code:
$ claude mcp add shodh-memory npx -y @shodh/memory-mcp
[06]

Try It

Type commands to interact with a simulated memory system

shodh-memory --interactive
shodh-memory v0.1.80 - Cognitive Memory System
Type "help" for available commands
$

Ready to add memory to your AI agents?

[07]

FAQ

Frequently asked questions about shodh-memory