PRODUCTS
Two ways to give your agent persistent memory. Both open source. Both local-first.
ORI MCP
Plug persistent memory into any MCP client
A local MCP server that gives any AI agent persistent, searchable memory backed by markdown files. Three-signal retrieval — vector, keyword, and graph — with no cloud dependencies, no API keys, and no vendor lock-in.
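As a rough illustration of how three retrieval signals can be blended, here is a minimal sketch in Python. The scoring names, weights, and combination function are assumptions for illustration only, not Ori's actual API or ranking formula.

```python
# Hypothetical sketch: each candidate memory gets a score from vector
# similarity, keyword match, and wiki-link graph proximity, blended
# with a weighted sum. Weights here are illustrative, not Ori's.
from dataclasses import dataclass

@dataclass
class Scored:
    note: str
    vector: float   # cosine similarity from an embedding index, 0..1
    keyword: float  # normalized keyword-match score, 0..1
    graph: float    # proximity in the link graph, 0..1

def combine(hits, w_vec=0.5, w_kw=0.3, w_graph=0.2):
    """Rank candidates by a weighted blend of the three signals."""
    return sorted(
        hits,
        key=lambda h: w_vec * h.vector + w_kw * h.keyword + w_graph * h.graph,
        reverse=True,
    )

hits = [
    Scored("decisions/auth.md", vector=0.82, keyword=0.40, graph=0.10),
    Scored("patterns/errors.md", vector=0.55, keyword=0.90, graph=0.60),
]
best = combine(hits)[0]  # the note strongest across all three signals
```

The point of blending is that a note weak on embedding similarity can still surface if keyword and graph evidence agree.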
Works with Claude Code, Cursor, Windsurf, and any MCP-compatible client. Your agent remembers decisions, patterns, and context across sessions — and retrieval quality compounds over time through the Recursive Memory Harness architecture.
Everything is stored as plain markdown files on your machine. Open them in Obsidian. Grep them. Version them in Git. If Ori disappears tomorrow, every file still works.
npm i -g ori-memory
ori init
ori serve --mcp

ORI CLI
Memory-native coding agent
A coding agent built on top of Ori's memory layer. The CLI gives your development workflow persistent memory — your agent carries context across sessions, learns your codebase patterns, and compounds knowledge over time.
Ships with the full Ori memory engine: three-signal retrieval, wiki-link knowledge graph, Ebbinghaus decay, and the Recursive Memory Harness. Every session makes the next one smarter.
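The Ebbinghaus decay mentioned above can be sketched with the classic forgetting curve, where a memory's weight falls off exponentially with time since last access and each recall strengthens it. The formula, constants, and function names below are illustrative assumptions, not Ori's actual parameters.

```python
# Illustrative Ebbinghaus-style decay: retention R = e^(-t/S), where t is
# days since last access and S is the memory's stability. Each recall
# raises S, so frequently used memories decay more slowly. All constants
# here are assumptions for the sketch.
import math

def retention(days_since_access: float, stability: float) -> float:
    """Forgetting curve R = e^(-t/S): higher stability means slower decay."""
    return math.exp(-days_since_access / stability)

def on_recall(stability: float, boost: float = 1.5) -> float:
    """A successful recall strengthens the memory, flattening future decay."""
    return stability * boost

s = 5.0
r_fresh = retention(1, s)    # high: accessed yesterday
r_stale = retention(30, s)   # near zero: untouched for a month
s = on_recall(s)             # recalled now, so it will fade more slowly
```

Under a scheme like this, stale notes sink in the ranking rather than being deleted, which fits the plain-markdown storage model: the file stays on disk either way.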
Coming soon

DEMO VIDEO
Coming soon