Open Source — MIT License

You don't manage memory.
Cortex does.

Persistent memory for Claude Code — built on neuroscience research, not guesswork. Memory that learns, consolidates, forgets intelligently, and surfaces the right context at the right time. 41 scientific citations. 97.8% recall. No GPU.

> claude plugin marketplace add cdeust/Cortex

Requires Python 3.10+ and PostgreSQL 15+ with pgvector + pg_trgm.

Neural Graph Visualization

Three views: Graph, Board, and Pipeline. Launch with /cortex-visualize. Filter by domain, emotion, consolidation stage.

State-of-the-art recall. Proven.

Tested against published academic benchmarks with retrieval-only metrics — no LLM reader in the evaluation loop.

97.8% Recall@10 LongMemEval
92.6% Recall@10 LoCoMo
41 Scientific Citations
91/100 Security Audit
Benchmark | Score | Notes
LongMemEval (ICLR 2025) | 97.8% R@10, 0.882 MRR | 500 questions, 115K tokens — +19.6pp vs paper best
LoCoMo (ACL 2024) | 92.6% R@10, 0.794 MRR | 1,986 questions, 10 conversations
BEAM (ICLR 2026) | 0.546 MRR | Multi-session, 355 questions — +91% vs LIGHT baseline

All benchmarks use retrieval-only metrics. Retrieval sets the ceiling on QA accuracy: if the right memory isn't surfaced, no downstream model can answer correctly.

Memory is invisible.

You don't manage memory. Cortex does. Every lifecycle stage is automatic.

Session Start

Hot memories, anchored decisions, and team context inject automatically. No manual recall needed.

During Work

PostToolUse hooks capture significant actions. Decisions auto-detect and protect from forgetting. File edits prime related memories via spreading activation.
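
The priming idea can be sketched as breadth-first spreading activation over a weighted association graph (Collins & Loftus style). This `spread_activation` helper is an illustration only, not Cortex's implementation; the `decay` and `floor` values are assumptions:

```python
def spread_activation(
    graph: dict[str, list[tuple[str, float]]],
    source: str,
    initial: float = 1.0,
    decay: float = 0.5,
    floor: float = 0.1,
) -> dict[str, float]:
    """BFS spreading activation: each hop passes decay * edge_weight
    of the incoming activation. Assumes decay * edge_weight < 1 so
    activation shrinks every hop and cycles terminate."""
    activation = {source: initial}
    frontier = [source]
    while frontier:
        nxt = []
        for node in frontier:
            for neighbor, weight in graph.get(node, []):
                a = activation[node] * decay * weight
                # Only propagate if above the floor and stronger than before.
                if a >= floor and a > activation.get(neighbor, 0.0):
                    activation[neighbor] = a
                    nxt.append(neighbor)
        frontier = nxt
    return activation
```

Editing a file would activate its memory node at full strength, and associated memories receive geometrically smaller boosts the further out they sit.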

Session End

A “dream” cycle runs automatically: decay old memories, compress verbose ones, consolidate episodic into semantic knowledge (CLS).

Between Sessions

Memories cool naturally (Ebbinghaus forgetting curve). Important ones stay hot. Protected decisions never decay.
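
Session-based cooling is just exponential decay. A minimal sketch, using the documented defaults (decay factor 0.95, hot threshold 0.7); the helper name is hypothetical:

```python
def decayed_heat(heat: float, sessions_elapsed: int, decay_factor: float = 0.95) -> float:
    """Ebbinghaus-style forgetting curve: heat * factor^n after n idle sessions."""
    return heat * decay_factor ** sessions_elapsed
```

With these defaults, a memory at heat 1.0 stays above the 0.7 "hot" threshold for six idle sessions and cools below it on the seventh.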

Retrieval Pipeline

Five signals fused server-side in PostgreSQL, then reranked client-side via FlashRank.

Signal | Source | Paper
Vector similarity | pgvector HNSW (384-dim) | Bruch et al. 2023
Full-text search | tsvector + ts_rank_cd | Bruch et al. 2023
Trigram similarity | pg_trgm | Bruch et al. 2023
Thermodynamic heat | Ebbinghaus decay model | Ebbinghaus 1885
Recency | Exponential time decay
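
The fusion step can be sketched in Python as weighted reciprocal rank fusion. This is an illustration of the idea, not the PL/pgSQL implementation: the signal weights come from the configuration defaults (vector 1.0, fts 0.5, heat 0.3), while the RRF constant k=60 is an assumption:

```python
from collections import defaultdict

def weighted_rrf(
    rankings: dict[str, list[str]],
    weights: dict[str, float],
    k: int = 60,
) -> list[str]:
    """Weighted reciprocal rank fusion:
    score(d) = sum over signals s of w_s / (k + rank_s(d))."""
    scores: dict[str, float] = defaultdict(float)
    for signal, ranked_ids in rankings.items():
        w = weights.get(signal, 1.0)
        for rank, mem_id in enumerate(ranked_ids, start=1):
            scores[mem_id] += w / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)
```

A memory ranked highly by several signals beats one that tops a single list, which is the point of fusing before the FlashRank rerank pass.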

18 agents. Dynamic synthesis. Transactive memory.

Based on Wegner 1987: teams store more knowledge than individuals because each member specializes, and a shared directory tells everyone who knows what. Decisions auto-propagate. The orchestrator synthesizes ephemeral agents on demand.

Engineering (9)

Engineer
Architect
Code Reviewer
Test Engineer
DBA
Frontend
DevOps
Security
UX

Research & Academic (8)

Research Scientist
Paper Writer
Experiment Runner
Data Scientist
MLOps
Academic Reviewer
LaTeX Engineer
Professor

Coordination

Orchestrator
+ Dynamic Synthesis

Specialization

Each agent writes to its own topic. The engineer's debugging notes don't clutter the tester's recall.

Coordination

Decisions auto-protect and propagate. When the engineer decides “use Redis over Memcached,” every agent sees it.

Directory

Entity-based queries span all topics. “What do we know about the reranker?” returns results from engineer, tester, and researcher.

Briefing

SubagentStart hook extracts task keywords, queries prior work, fetches team decisions, and injects as context prefix.

23 plasticity mechanisms. Zero GPU.

Every mechanism from computational neuroscience, implemented as pure server-side inference on PostgreSQL.

WRRF Retrieval Fusion

Vector similarity, full-text search, trigram matching, heat decay, temporal proximity, entity density, emotional resonance, access frequency, and consolidation state — fused server-side in PL/pgSQL.

Surprise Momentum

Test-time learning from Titans (NeurIPS 2025). Retrieval surprise modulates memory heat via EMA — agents learn during retrieval, not just at training time.
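
The EMA update can be sketched in one line; `update_heat_ema` and its `alpha` smoothing factor are hypothetical illustrations of the mechanism, not Cortex's actual code:

```python
def update_heat_ema(heat: float, surprise: float, alpha: float = 0.2) -> float:
    """Exponential moving average: surprising retrievals pull heat
    toward the surprise signal, boring ones leave it mostly unchanged."""
    return (1 - alpha) * heat + alpha * surprise
```

Repeated surprising hits warm a memory quickly; a single fluke barely moves it.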

LTP / LTD / STDP

Long-term potentiation strengthens accessed memories. Long-term depression weakens neglected ones. Spike-timing-dependent plasticity adjusts Hebbian connection weights.
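
A minimal sketch of the potentiation/depression pair; the function name and the `ltp_rate`/`ltd_rate` constants are assumptions for illustration, not Cortex's tuned values:

```python
def hebbian_update(
    weight: float,
    accessed: bool,
    ltp_rate: float = 0.1,
    ltd_rate: float = 0.02,
) -> float:
    """LTP strengthens an accessed connection (saturating at 1.0);
    LTD decays a neglected one toward 0. Weights stay in [0, 1]."""
    if accessed:
        weight += ltp_rate * (1.0 - weight)  # potentiation
    else:
        weight -= ltd_rate * weight          # depression
    return min(max(weight, 0.0), 1.0)
```

Asymmetric rates mean connections build faster than they fade, so a briefly neglected association survives.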

Coupled Neuromodulation

Dopamine, norepinephrine, acetylcholine, serotonin — with cross-channel coupling (Doya 2002, Schultz 1997). Modulates encoding strength and retrieval priority.

Microglial Pruning

Stale memories pruned during consolidation. Homeostatic plasticity and adaptive decay preserve important facts while cleaning noise.

Knowledge Graph

Causal discovery builds a directed graph with Hebbian weights, facilitation/depression, and release probability. Navigate via Successor Representation BFS.

Cognitive Profiling

JARVIS extracts your 12D reasoning signature — thinking style, entry patterns, blind spots, cross-domain bridges — and pre-loads it every session via EMA updates.

Sleep Compute

Dream replay, interference resolution, CLS (episodic-to-semantic transfer), and engram competition. Background consolidation runs at session end.

Neural Visualization

Interactive force-directed graph with 6-level hierarchy. Node size encodes importance, glow encodes heat, quality arcs show reliability. Real-time memory heatmap dashboard.

33 MCP tools. Organized by function.

From simple remember/recall to autonomous pipeline execution. All via natural language in Claude Code.

Memory

Store & Retrieve

  • remember
  • recall
  • recall_hierarchical
  • consolidate
  • checkpoint
  • forget
  • anchor
  • rate_memory
  • validate_memory
Navigation

Graph & Profiling

  • navigate_memory
  • get_causal_chain
  • detect_gaps
  • drill_down
  • detect_domain
  • explore_features
  • query_methodology
  • memory_stats
Advanced

Pipeline & Automation

  • run_pipeline
  • narrative
  • get_project_story
  • assess_coverage
  • create_trigger
  • add_rule
  • sync_instructions
  • seed_project
  • open_visualization

Five layers. Zero I/O in business logic.

Clean Architecture with strict inward-pointing dependencies. All retrieval runs server-side in PL/pgSQL stored procedures.

Layer | Responsibility | Key Detail
Core | Pure business logic | 118 modules, zero I/O, imports only shared/
Infrastructure | All I/O | 33 modules — PostgreSQL, embeddings, file system
Handlers | Composition roots | 62 handlers wiring core + infrastructure
Hooks | Lifecycle automation | 7 hooks — SessionStart/End, PostToolUse, SubagentStart, etc.
Shared | Pure utilities | 12 modules, Python stdlib only

41 citations. The zetetic standard.

Every algorithm, constant, and threshold traces to a published paper, a measured ablation, or documented engineering source. Nothing is guessed.

Retrieval

Information Retrieval

  • Bruch et al. “Fusion Functions” (2023)
  • Nogueira & Cho “Passage Re-ranking” (2019)
  • Collins & Loftus “Spreading Activation” (1975)
  • Joren et al. “Sufficient Context” (2025)
Encoding

Neuroscience — Encoding

  • Friston “Cortical Responses” (2005)
  • Bastos et al. “Predictive Coding” (2012)
  • Wang & Bhatt “Emotional Modulation” (2024)
  • Doya “Metalearning” (2002)
  • Schultz “Prediction & Reward” (1997)
Plasticity

Plasticity & Maintenance

  • Hebb (1949), Bi & Poo (1998)
  • Turrigiano “Self-Tuning Neuron” (2008)
  • Tse et al. “Schemas & Consolidation” (2007)
  • Wang et al. “Microglial Pruning” (2020)
  • Ebbinghaus Memory (1885)
  • … and 9 more papers
Consolidation

Consolidation

  • Kandel “Molecular Biology of Memory” (2001)
  • McClelland et al. “CLS” (1995)
  • Frey & Morris “Synaptic Tagging” (1997)
  • Josselyn & Tonegawa “Engrams” (2020)
  • Borbely “Two-Process Sleep” (1982)
Navigation

Retrieval & Navigation

  • Behrouz et al. “Titans” (NeurIPS 2025)
  • Stachenfeld et al. “Predictive Map” (2017)
  • Ramsauer et al. “Hopfield Networks” (2021)
  • Kanerva “Hyperdimensional Computing” (2009)
Team

Team & Preemptive

  • Wegner “Transactive Memory” (1987)
  • Zhang et al. “LLM Collaboration” (2024)
  • Bar “The Proactive Brain” (2007)
  • Smith & Vela “Context-Dependent” (2001)
  • McGaugh “Amygdala Modulates” (2004)
  • Adcock et al. “Reward-Motivated” (2006)

Five ways to install.

Marketplace plugin, standalone MCP, setup script, Docker, or manual. Each gives you persistent memory for Claude Code.

Option A — Claude Code Marketplace (recommended)

> claude plugin marketplace add cdeust/Cortex
> claude plugin install cortex

Restart your Claude Code session, then run /cortex-setup-project. This handles everything: PostgreSQL + pgvector installation, database creation, embedding model download, cognitive profile building from session history, codebase seeding, conversation import, and hook registration. Zero manual steps.

Using Claude Cowork? Install Cortex-cowork instead — uses SQLite, no PostgreSQL required.
> claude plugin marketplace add cdeust/Cortex-cowork

Option B — Standalone MCP (no plugin)

> claude mcp add cortex -- uvx --from "neuro-cortex-memory[postgresql]" neuro-cortex-memory

Adds Cortex as a standalone MCP server via uvx. No hooks, no skills — just the 33 MCP tools. Requires uv installed.

Option C — Clone + Setup Script

$ git clone https://github.com/cdeust/Cortex.git
$ cd Cortex
$ bash scripts/setup.sh        # macOS / Linux
$ python3 scripts/setup.py    # Windows / cross-platform

Installs PostgreSQL + pgvector (Homebrew on macOS, apt/dnf on Linux), creates the database, downloads the embedding model (~100 MB). On Windows, install PostgreSQL manually first, then run setup.py. Restart Claude Code after setup.

Option D — Docker

$ git clone https://github.com/cdeust/Cortex.git
$ cd Cortex
$ docker build -t cortex-runtime -f docker/Dockerfile .
$ docker run -it \
  -v $(pwd):/workspace \
  -v cortex-pgdata:/var/lib/postgresql/17/data \
  -v ~/.claude:/home/cortex/.claude-host:ro \
  -v ~/.claude.json:/home/cortex/.claude-host-json/.claude.json:ro \
  cortex-runtime

Container includes PostgreSQL 17, pgvector, embedding model, and Claude Code. Data persists via the cortex-pgdata volume.

Option E — Manual Setup

# 1. Install PostgreSQL + pgvector
$ brew install postgresql@17 pgvector
$ brew services start postgresql@17

# 2. Create database
$ createdb cortex
$ psql cortex -c "CREATE EXTENSION IF NOT EXISTS vector;"
$ psql cortex -c "CREATE EXTENSION IF NOT EXISTS pg_trgm;"

# 3. Install Python dependencies
$ pip install -e ".[postgresql]"
$ pip install sentence-transformers flashrank

# 4. Pre-cache the embedding model
$ python3 -c "from sentence_transformers import SentenceTransformer; SentenceTransformer('all-MiniLM-L6-v2')"

# 5. Register MCP server
$ claude mcp add cortex -- uvx --from "neuro-cortex-memory[postgresql]" neuro-cortex-memory

# 6. Set database URL
$ export DATABASE_URL=postgresql://localhost:5432/cortex

Zetetic Agent Team (optional)

# Global install (all projects)
$ git clone https://github.com/cdeust/zetetic-team-subagents.git
$ cp zetetic-team-subagents/agents/*.md ~/.claude/agents/

# Per-project install
$ mkdir -p .claude/agents
$ cp zetetic-team-subagents/agents/*.md .claude/agents/

18 specialized agents + dynamic synthesis. Each agent integrates with Cortex memory automatically when available.

Configuration

Variable | Default | What It Controls
DATABASE_URL | postgresql://localhost:5432/cortex | PostgreSQL connection string
CORTEX_RUNTIME | auto-detected | cli (strict) or cowork (SQLite fallback)
CORTEX_MEMORY_DECAY_FACTOR | 0.95 | Per-session heat decay rate
CORTEX_MEMORY_HOT_THRESHOLD | 0.7 | Heat level considered “hot”
CORTEX_MEMORY_WRRF_VECTOR_WEIGHT | 1.0 | Vector similarity weight in fusion
CORTEX_MEMORY_WRRF_FTS_WEIGHT | 0.5 | Full-text search weight in fusion
CORTEX_MEMORY_WRRF_HEAT_WEIGHT | 0.3 | Thermodynamic heat weight in fusion
CORTEX_MEMORY_DEFAULT_RECALL_LIMIT | 10 | Max memories returned per query

~40 tunable parameters total. See mcp_server/infrastructure/memory_config.py for the full list.
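
For example, any of these can be overridden in the shell before launching Claude Code (values shown are the documented defaults):

```shell
$ export DATABASE_URL="postgresql://localhost:5432/cortex"
$ export CORTEX_MEMORY_DECAY_FACTOR=0.95         # per-session cooling rate
$ export CORTEX_MEMORY_HOT_THRESHOLD=0.7         # heat level treated as hot
$ export CORTEX_MEMORY_WRRF_VECTOR_WEIGHT=1.0    # fusion weight for vector similarity
$ export CORTEX_MEMORY_DEFAULT_RECALL_LIMIT=10   # max memories per recall
```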

What gets installed

Component | What It Does
MCP Server | 33 tools for memory, retrieval, profiling, navigation
SessionStart hook | Injects anchors + hot memories + team decisions + checkpoint
UserPromptSubmit hook | Auto-recalls relevant memories based on user’s prompt
PostToolUse hooks (x2) | Auto-captures significant actions; primes related memories via spreading activation
SessionEnd hook | Runs dream cycle: decay, compress, CLS based on activity
Compaction hook | Saves checkpoint; restores context after compaction
SubagentStart hook | Briefs spawned agents with prior work + team decisions
14 Skills | Workflow guides (invoke via /cortex-*)

Skills

Command | What It Does
/cortex-remember | Store a memory with full write gate
/cortex-recall | Search memories with intent-adaptive retrieval
/cortex-consolidate | Run maintenance (decay, compress, CLS)
/cortex-explore-memory | Navigate memory by entity/domain
/cortex-navigate-knowledge | Traverse knowledge graph
/cortex-debug-memory | Diagnose memory system health
/cortex-visualize | Launch neural graph in browser
/cortex-profile | View cognitive methodology profile
/cortex-setup-project | Bootstrap a new project
/cortex-develop | Memory-assisted development workflow
/cortex-automate | Create prospective triggers

Security audit: 91/100.

Cortex runs locally — MCP over stdio, PostgreSQL on localhost, visualization on 127.0.0.1. No data leaves your machine.

Category | Score | Notes
SQL Injection | 95 | All queries parameterized. Dynamic columns via sql.Identifier()
Network Behavior | 92 | Model download on first run only. Viz servers bind 127.0.0.1
Data Flow | 90 | No external data exfiltration. Embeddings computed locally
Code Quality | 90 | Pydantic validation on all tools. Input length limits
Secrets Management | 90 | .env/credentials in .gitignore. No hardcoded secrets
Prompt Injection | 88 | Memory content escaped in HTML. Session injection uses data delimiters
Auth & Access | 85 | Docker PG uses scram-sha-256. MCP over stdio (no network auth needed)
Dependency Health | 80 | Floor-pinned deps. Background install version-bounded

Free & Open Source

MIT licensed. 2,080 tests. 41 citations. 18 zetetic agents + dynamic synthesis. Give your agent a brain.