How to Set Up Persistent Memory for Claude Code and Cursor
Claude Code and Cursor are two of the most powerful AI coding assistants in 2026. But they share the same fundamental limitation: they forget everything between sessions.
Start a new conversation with Claude Code, and it doesn't remember your codebase preferences, the architectural decisions you made last week, or the debugging patterns that worked for your specific stack. Cursor resets the same way — every session is a cold start.
shodh-memory fixes this using the Model Context Protocol (MCP). Setup takes under 5 minutes. Here's how.
How It Works
MCP (Model Context Protocol) is a standard that lets AI assistants connect to external tools and data sources. shodh-memory implements an MCP server that exposes 45 memory tools — remember, recall, forget, proactive context, todos, and more.
When connected, your AI assistant can:

- Store facts, decisions, and preferences as memories (`remember`)
- Retrieve relevant memories by semantic search (`recall`)
- Delete outdated or incorrect entries (`forget`)
- Surface proactive context and manage todos

Everything runs locally. No cloud. No API keys. No data leaving your machine.
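Under the hood, each tool invocation is a JSON-RPC message following the MCP `tools/call` convention. A hypothetical `remember` call might look like this (the argument names here are illustrative, not taken from shodh-memory's actual schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "remember",
    "arguments": {
      "content": "We use JWT with 15-minute access tokens",
      "tags": ["auth", "architecture"]
    }
  }
}
```

Your assistant constructs these messages for you; you never write them by hand.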
Setup for Claude Code
Step 1: Install
```bash
npm install -g @shodh/memory-mcp
```
Step 2: Add to Claude Code MCP config
Open your Claude Code settings and add shodh-memory as an MCP server:
```json
{
  "mcpServers": {
    "shodh-memory": {
      "command": "npx",
      "args": ["-y", "@shodh/memory-mcp"]
    }
  }
}
```
That's it. Claude Code now has access to 45 memory tools.
Step 3: Enable Hooks (Optional, Recommended)
For automatic memory capture, add hooks to your project's `.claude/settings.json`:
```json
{
  "hooks": {
    "on_tool_use": "npx @shodh/memory-mcp hook-tool",
    "on_message": "npx @shodh/memory-mcp hook-message"
  }
}
```
With hooks enabled, shodh-memory automatically captures context from tool usage and conversations. You don't need to explicitly tell Claude to remember things — it happens in the background.
Setup for Cursor
Step 1: Install
```bash
npm install -g @shodh/memory-mcp
```
Step 2: Add to Cursor MCP config
In Cursor, go to Settings → MCP Servers → Add Server:
```json
{
  "shodh-memory": {
    "command": "npx",
    "args": ["-y", "@shodh/memory-mcp"]
  }
}
```
Cursor will now show shodh-memory's tools in the MCP panel.
What Happens After Setup
First Session
Use your AI assistant normally. As you work, shodh-memory captures:

- Architectural decisions and the reasoning behind them
- Codebase preferences and conventions
- Debugging patterns that worked for your specific stack

These are stored locally in RocksDB with vector embeddings, entity extraction, and knowledge graph connections, all computed on your machine using MiniLM-L6-v2.
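The retrieval side of this can be sketched conceptually: embed the query, then rank stored memories by cosine similarity against their embeddings. A minimal pure-Python sketch (toy 3-dimensional vectors stand in for real MiniLM-L6-v2 embeddings, which are 384-dimensional; this is not shodh-memory's actual code):

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "embeddings" keyed by memory text.
memories = {
    "JWT access tokens expire after 15 minutes": [0.9, 0.1, 0.0],
    "We prefer tabs over spaces": [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # pretend embedding of "how does auth work?"

best = max(memories, key=lambda m: cosine(memories[m], query))
print(best)  # the auth-related memory ranks highest
```

The real system layers entity extraction and graph edges on top of this, but nearest-neighbor search over embeddings is the core retrieval primitive.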
Second Session
When you start a new session, shodh-memory's proactive context surfaces relevant memories based on what you're working on. If you open a file in the auth module, memories about your JWT decisions, session handling, and that CORS fix automatically appear in context.
You don't need to re-explain your stack, your preferences, or your project structure. The assistant already knows.
After a Week
Frequently accessed knowledge has been promoted from working memory to long-term storage. The knowledge graph has dense clusters around your most-discussed topics. Memories you never revisited have begun to decay, naturally reducing noise.
The assistant surfaces context with increasing accuracy. It knows not just what you said, but which things you keep coming back to — and those are the things it remembers best.
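The decay-and-promotion behavior can be modeled as a recency-weighted score. The sketch below is purely conceptual, assuming a simple exponential decay with a hypothetical 7-day half-life; shodh-memory's real scoring function is not documented here:

```python
import math

def memory_score(access_count, days_since_access, half_life_days=7.0):
    # Exponential decay: a memory loses half its recency weight
    # every `half_life_days` without being accessed.
    recency = math.exp(-math.log(2) * days_since_access / half_life_days)
    # Frequently accessed memories earn a durable boost.
    return access_count * recency

# A memory touched yesterday outscores one untouched for two weeks,
# even with identical access counts.
hot = memory_score(access_count=10, days_since_access=1)
cold = memory_score(access_count=10, days_since_access=14)
print(hot > cold)  # True
```

Under any scoring of this shape, "the things you keep coming back to" dominate retrieval simply because both factors reward them.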
Memory Tools Available
Once connected, your AI assistant has access to these core tools:

- `remember` — store a fact, decision, or preference
- `recall` — retrieve memories by semantic search
- `forget` — delete outdated or incorrect memories
- proactive context — automatically surface relevant memories

The full MCP server exposes 45 tools covering memory, todos, projects, reminders, and system health.
Performance Impact
shodh-memory runs as a separate process, so memory capture and retrieval happen outside your IDE and your AI assistant rather than blocking them.
On a typical development machine, you won't notice it's running.
Troubleshooting
**"MCP server not found"** — Make sure the npm package is installed globally: `npm install -g @shodh/memory-mcp`
**"Connection refused"** — shodh-memory starts on port 3030 by default. Check if another process is using that port.
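To find out whether port 3030 is already taken, you can probe it directly. A quick check (the 3030 default comes from the note above; this assumes a local TCP listener):

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    # Try to connect; a successful connection means something is listening.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

if port_in_use(3030):
    print("Port 3030 is busy: stop the other process or reconfigure shodh-memory.")
```

On Unix-like systems, `lsof -i :3030` will additionally tell you which process holds the port.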
**"No memories found"** — Memory needs to accumulate. Use the assistant for a few sessions before expecting rich context retrieval.
**"High memory usage"** — The embedding model (MiniLM-L6-v2, ~22MB) loads on first use. This is a one-time cost and stays resident for fast inference.
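This load-once-then-stay-resident behavior is the standard lazy-initialization pattern, sketched here in Python (illustrative only, not shodh-memory's code; `load_minilm` is a hypothetical stand-in for the real model load):

```python
_model = None

def load_minilm():
    # Placeholder for the actual ~22 MB model load.
    return object()

def get_model():
    # Load the embedding model on first use, then keep it resident
    # so later calls skip the load cost entirely.
    global _model
    if _model is None:
        _model = load_minilm()
    return _model

assert get_model() is get_model()  # same instance: loaded exactly once
```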
Privacy Note
Everything runs on your machine. shodh-memory makes zero network requests. Your memories, embeddings, knowledge graph, and todos are stored in a local RocksDB database in your home directory. No telemetry. No cloud sync. No API keys required.
Your AI assistant's memory is your data, on your hardware, under your control.