Requirements
- Target platform: OpenClaw
- Install method: Manual import
- Extraction: Extract archive
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
Cloud knowledge backup and retrieval using Supermemory.ai's free tier. Store high-value insights to the cloud and search them back when local memory is insufficient.
Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.
> I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Then review README.md for any prerequisites, environment setup, or post-install checks. Tell me what you changed and call out any manual steps you could not complete.
> I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Then review README.md for any prerequisites, environment setup, or post-install checks. Summarize what changed and any follow-up checks I should run.
Backs up important knowledge and insights to Supermemory.ai's cloud using the free tier API. Uses only `/v3/documents` (store) and `/v3/search` (retrieve) — no Pro-only endpoints.
Set in `.env`:

```
SUPERMEMORY_OPENCLAW_API_KEY="sm_..."
```
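A minimal sketch of how a script could read that key, assuming plain `KEY="value"` lines in `.env` — the package's actual scripts may use a dotenv library instead:

```python
import os
import re


def load_env_key(path=".env", name="SUPERMEMORY_OPENCLAW_API_KEY"):
    """Read a single KEY="value" entry from a .env file, falling back
    to the process environment if the file or key is absent."""
    try:
        with open(path) as f:
            for line in f:
                m = re.match(rf'{name}\s*=\s*"?([^"\n]+?)"?\s*$', line.strip())
                if m:
                    return m.group(1)
    except FileNotFoundError:
        pass
    return os.environ.get(name)
```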
Store a knowledge string to the cloud.

```bash
python3 skills/supermemory-free/store.py "Your knowledge string here"

# With optional container tag (namespace/filter)
python3 skills/supermemory-free/store.py "knowledge string" --tag openclaw

# With metadata
python3 skills/supermemory-free/store.py "knowledge string" --tag fixes --source "session"

# Output raw JSON
python3 skills/supermemory-free/store.py "knowledge string" --json
```

When to use:
- User asks to "remember" something permanently
- Important configuration/setup knowledge
- Resolved problems / solutions discovered
- Key facts you want cross-session persistence for
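Under the hood, a store is a single authenticated POST to `/v3/documents`. A hedged sketch of the request such a call might build — the payload field names (`content`, `containerTags`, `metadata`) are assumptions about the v3 schema, not confirmed from the package source:

```python
API_BASE = "https://api.supermemory.ai"


def build_store_request(api_key, content, tag=None, source=None):
    """Build (url, headers, payload) for a POST /v3/documents call.
    Field names are assumed; no network request is made here."""
    url = f"{API_BASE}/v3/documents"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {"content": content}
    if tag:
        payload["containerTags"] = [tag]      # --tag: namespace/filter
    if source:
        payload["metadata"] = {"source": source}  # --source: free-form metadata
    return url, headers, payload
```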
Search the cloud memory for relevant knowledge.

```bash
python3 skills/supermemory-free/search.py "your query"

# With container tag filter
python3 skills/supermemory-free/search.py "your query" --tag openclaw

# More results
python3 skills/supermemory-free/search.py "your query" --limit 10

# Higher precision (less noise)
python3 skills/supermemory-free/search.py "your query" --threshold 0.7

# Search across ALL tags
python3 skills/supermemory-free/search.py "your query" --no-tag
```

When to use:
- Local memory (MEMORY.md, daily logs) doesn't have the answer
- User references something from "a long time ago"
- Cross-session knowledge lookup
- "Do you remember when..." queries
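Search maps to a POST against `/v3/search`. A sketch of how the CLI flags above might translate into a request body — again, the field names (`q`, `limit`, `containerTags`, `documentThreshold`) are assumptions, not verified against the package:

```python
def build_search_request(api_key, query, tag="openclaw", limit=5, threshold=None):
    """Build (url, headers, payload) for a POST /v3/search call.
    tag=None models --no-tag (search across all tags); field names assumed."""
    url = "https://api.supermemory.ai/v3/search"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {"q": query, "limit": limit}
    if tag:
        payload["containerTags"] = [tag]          # --tag filter
    if threshold is not None:
        payload["documentThreshold"] = threshold  # --threshold precision knob
    return url, headers, payload
```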
Scans recent session memory logs and automatically pushes high-value insights to Supermemory cloud.

```bash
# Run manually
python3 skills/supermemory-free/auto_capture.py

# Dry run (show what would be captured, no upload)
python3 skills/supermemory-free/auto_capture.py --dry-run

# Scan last N days (default: 3)
python3 skills/supermemory-free/auto_capture.py --days 7

# Force re-upload even if already seen
python3 skills/supermemory-free/auto_capture.py --force

# Verbose mode
python3 skills/supermemory-free/auto_capture.py --verbose
```

Install cron job (runs daily at 2:00 AM UTC):

```bash
bash skills/supermemory-free/install_cron.sh
```

Remove cron job:

```bash
bash skills/supermemory-free/install_cron.sh --remove
```

Check cron status:

```bash
bash skills/supermemory-free/install_cron.sh --status
```
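The `--force` flag and the safe re-run behavior hinge on a dedup state file. A plausible sketch keyed by content hash — the real `auto_capture.py` may track seen items differently:

```python
import hashlib
import json
import os


def load_state(path=".capture_state.json"):
    """Load the set of already-uploaded content hashes (empty set if absent)."""
    if os.path.exists(path):
        with open(path) as f:
            return set(json.load(f))
    return set()


def mark_uploaded(insight, state, path=".capture_state.json", force=False):
    """Return True if this insight should be uploaded (unseen, or force=True),
    recording its hash in the state file; False if it was already uploaded."""
    digest = hashlib.sha256(insight.encode()).hexdigest()
    if digest in state and not force:
        return False
    state.add(digest)
    with open(path, "w") as f:
        json.dump(sorted(state), f)
    return True
```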
The auto-capture script identifies "high-value" insights from memory logs using these heuristics:

| Pattern | Label | Example |
| --- | --- | --- |
| Resolved errors / fixes | fix | Fixed: SSL cert error by running... |
| Error context | error | Exception: Connection refused on port 5432 |
| Configuration paths | config | /etc/nginx/sites-available/default |
| API/endpoint info | api | Endpoint: POST /v3/documents for storage |
| User preferences | preference | User prefers Python over Node for scripts |
| Decisions made | decision | Decided to use PostgreSQL because... |
| Learned facts | insight | Learned that cron syntax for... |
| Installs / setup | setup | Installed nginx, configured with... |
| Bullet-point blocks | bullet | - Key finding: X works better than Y |

Deduplication: Already-uploaded items are tracked in `.capture_state.json` — re-running is safe.
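A heuristic table like the one above is naturally an ordered list of regex patterns checked first-match-wins. This is an illustrative sketch, not the package's actual rules:

```python
import re

# Hypothetical pattern list mirroring the heuristic table; order matters,
# since the first matching pattern determines the label.
PATTERNS = [
    ("fix", re.compile(r"^Fixed:", re.I)),
    ("error", re.compile(r"Exception:|Error:", re.I)),
    ("config", re.compile(r"/etc/|/usr/|\.conf\b")),
    ("api", re.compile(r"Endpoint:|\b(GET|POST|PUT|DELETE) /", re.I)),
    ("preference", re.compile(r"\bprefers?\b", re.I)),
    ("decision", re.compile(r"^Decided", re.I)),
    ("insight", re.compile(r"^Learned", re.I)),
    ("setup", re.compile(r"^Installed", re.I)),
    ("bullet", re.compile(r"^- Key finding:", re.I)),
]


def classify(line):
    """Return the label of the first matching heuristic, or None if the
    line does not look like a high-value insight."""
    for label, pattern in PATTERNS:
        if pattern.search(line):
            return label
    return None
```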
Use `--tag` to namespace your memories:

| Tag | Purpose |
| --- | --- |
| openclaw | General OpenClaw session knowledge (default) |
| fixes | Bug fixes and solutions |
| config | Configuration and setup |
| user-prefs | User preferences |
| projects | Project-specific knowledge |
| File | Purpose |
| --- | --- |
| store.py | CLI tool: upload knowledge to cloud |
| search.py | CLI tool: search cloud knowledge |
| auto_capture.py | Cron script: auto-analyze memory logs |
| install_cron.sh | Install/remove/status of cron job |
| .capture_state.json | Dedup state (auto-generated, gitignored) |
| SKILL.md | This file |
| _meta.json | Skill metadata |
- Base URL: `https://api.supermemory.ai`
- Store endpoint: `POST /v3/documents`
- Search endpoint: `POST /v3/search`
- Auth: Bearer token from `SUPERMEMORY_OPENCLAW_API_KEY`
- Free tier limits: check https://console.supermemory.ai for current quotas

Note: Cloudflare-compatible headers are included — this avoids 1010 access-denial errors.
- HTTP 403 / 1010 Access Denied: the scripts include proper User-Agent, Origin, and Referer headers to satisfy Cloudflare. If it recurs, verify the API key is valid at https://console.supermemory.ai.
- No memory files found: auto-capture looks in `memory/YYYY-MM-DD.md`. Ensure your memory skill is writing daily logs there.
- Re-upload everything: delete `.capture_state.json` or use `--force` to ignore the dedup state.
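The Cloudflare workaround above amounts to sending browser-like headers alongside the Bearer token. An illustrative sketch — the specific header values are assumptions, not copied from the scripts:

```python
def cloudflare_friendly_headers(api_key):
    """Browser-like headers that help avoid Cloudflare 1010 blocks.
    User-Agent, Origin, and Referer values here are illustrative."""
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
        "User-Agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36",
        "Origin": "https://console.supermemory.ai",
        "Referer": "https://console.supermemory.ai/",
    }
```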