Requirements
- Target platform: OpenClaw
- Install method: Manual import
- Extraction: Extract archive
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
Provides structured long-term memory with semantic, keyword, and knowledge graph retrieval, entity extraction, temporal versioning, and experiential learning.
Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.
Structured long-term memory for OpenClaw agents, powered by maasv. Replaces the default memory backend with a cognition layer that includes 3-signal retrieval (semantic + keyword + knowledge graph), entity extraction, temporal versioning, and experiential learning. maasv is entirely self-hosted. There is no maasv cloud service. You run the server on your own machine, and all data is stored in a SQLite file on your local disk that you own and control. Nothing is sent to maasv.
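The 3-signal retrieval can be pictured as score fusion across the three channels. A toy sketch of the idea (the weights and the weighted-sum formula here are assumptions for illustration, not maasv's actual ranking):

```python
# Toy illustration of fusing three retrieval signals into one ranking.
# The weights are hypothetical; maasv's real scoring is internal.
def fuse_scores(semantic: float, keyword: float, graph: float,
                weights: tuple = (0.5, 0.3, 0.2)) -> float:
    """Weighted sum of per-signal relevance scores in [0, 1]."""
    return (weights[0] * semantic
            + weights[1] * keyword
            + weights[2] * graph)

# Rank candidate memories by their fused score.
candidates = {
    "mem-1": (0.9, 0.2, 0.1),   # strong semantic match only
    "mem-2": (0.4, 0.8, 0.6),   # decent across all three signals
}
ranked = sorted(candidates, key=lambda m: fuse_scores(*candidates[m]),
                reverse=True)
```

A memory that scores moderately on all three signals can outrank one that scores high on a single signal, which is the point of combining them.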
This skill requires the @maasv/openclaw-memory plugin and a running maasv server.
```shell
pip install "maasv[server,anthropic,voyage]"
cp server.env.example .env   # fill in API keys (see below)
maasv-server
```
```shell
openclaw plugins install @maasv/openclaw-memory
```
```jsonc
// ~/.openclaw/openclaw.json
{
  plugins: {
    slots: { memory: "memory-maasv" },
    entries: {
      "memory-maasv": {
        enabled: true,
        config: {
          serverUrl: "http://127.0.0.1:18790",
          autoRecall: true,
          autoCapture: true,
          enableGraph: true
        }
      }
    }
  }
}
```
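If you want to sanity-check the plugin entry before restarting OpenClaw, a quick sketch (note the config file uses JSON5-style unquoted keys and comments, so the strict-JSON equivalent is inlined here; the key names match the snippet above):

```python
import json

# Strict-JSON form of the memory-maasv plugin config.
config_text = """
{
  "plugins": {
    "slots": { "memory": "memory-maasv" },
    "entries": {
      "memory-maasv": {
        "enabled": true,
        "config": {
          "serverUrl": "http://127.0.0.1:18790",
          "autoRecall": true,
          "autoCapture": true,
          "enableGraph": true
        }
      }
    }
  }
}
"""
cfg = json.loads(config_text)
entry = cfg["plugins"]["entries"]["memory-maasv"]
ok = (
    cfg["plugins"]["slots"]["memory"] == "memory-maasv"
    and entry["enabled"]
    and entry["config"]["serverUrl"].startswith("http://127.0.0.1")
)
```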
The maasv server needs an LLM provider (for entity extraction) and an embedding provider (for semantic search). Configure these in your .env file:

| Variable | Required | Purpose |
| --- | --- | --- |
| `MAASV_LLM_PROVIDER` | Yes | `anthropic` or `openai` |
| `MAASV_ANTHROPIC_API_KEY` | If using Anthropic | LLM calls for entity extraction |
| `MAASV_OPENAI_API_KEY` | If using OpenAI | LLM calls for entity extraction |
| `MAASV_EMBED_PROVIDER` | Yes | `voyage`, `openai`, or `ollama` |
| `MAASV_VOYAGE_API_KEY` | If using Voyage | Embedding generation |
| `MAASV_API_KEY` | Optional | Protects maasv server endpoints with auth |

For fully local operation (no cloud calls), use ollama as your embed provider and a local LLM. maasv is optimized for Qwen3-Embedding-8B via Ollama, with built-in Matryoshka dimension truncation. See the maasv README for local setup.
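A filled-in .env for an Anthropic + Voyage setup might look like the following (the variable names come from the table above; the values are placeholders):

```shell
# .env — example Anthropic + Voyage configuration
MAASV_LLM_PROVIDER=anthropic
MAASV_ANTHROPIC_API_KEY=<your-anthropic-key>
MAASV_EMBED_PROVIDER=voyage
MAASV_VOYAGE_API_KEY=<your-voyage-key>
# Optional: require this key on requests to the maasv server
MAASV_API_KEY=<shared-secret>
```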
maasv has no cloud service. The server runs on your machine, and the database is a SQLite file on your disk. You own all of it. The only external calls are to your own LLM/embedding provider (Anthropic, OpenAI, Voyage), using your own API keys, from your own machine. If you use ollama, zero data leaves your machine.

The plugin talks only to localhost (127.0.0.1:18790) and makes no external network calls:
- autoCapture sends conversation summaries to your local maasv server for entity extraction; extracted entities are stored in your local SQLite database.
- autoRecall reads from your local SQLite database and injects relevant memories into the agent's context.

No telemetry, no analytics, no phone-home. maasv does not collect or transmit any data.
- `memory_search`: 3-signal retrieval across your memory store
- `memory_store`: dedup-aware memory storage
- `memory_forget`: permanent deletion
- `memory_graph`: knowledge graph entity search, profiles, and relationships
- `memory_wisdom`: log reasoning, record outcomes, search past decisions
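As an illustration of what "dedup-aware" storage means, here is a toy exact-match model (illustrative only; this is not maasv's implementation):

```python
import hashlib

class ToyMemoryStore:
    """Toy dedup-aware store: an identical memory is kept only once.
    Illustrative only; not the maasv implementation."""

    def __init__(self):
        self._memories: dict[str, str] = {}

    def store(self, text: str) -> bool:
        # Normalize, then hash; an existing hash means a duplicate.
        key = hashlib.sha256(text.strip().lower().encode()).hexdigest()
        if key in self._memories:
            return False  # duplicate: not stored again
        self._memories[key] = text
        return True

store = ToyMemoryStore()
first = store.store("User prefers dark mode")
second = store.store("  user prefers dark mode ")  # normalizes to the same text
```

A real memory system would also have to decide what counts as "the same" memory when the wording differs, which is why dedup awareness is a feature worth calling out rather than a one-liner.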
- Plugin (npm): `@maasv/openclaw-memory`
- Server + core (PyPI): `maasv`
- Source: github.com/ascottbell/maasv