Requirements
- Target platform: OpenClaw
- Install method: Manual import
- Extraction: Extract archive
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
Preserve conversation continuity across token compaction cycles by extracting and archiving all prompts with date-wise entries. Automatically triggers at 95% token usage (pre-compaction) and 1% (new sprint start) to export session history, then ingests archived summaries on session restart to restore context.
Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.
This skill maintains conversation continuity across token budget cycles by automatically archiving your session history before compaction and restoring it when a new session begins.
When token usage approaches 95%:
- Runs export_prompts.py to extract the current session history
- Formats all prompts and responses with timestamps
- Appends a date-wise entry to memory/remember-all-prompts-daily.md
- Marks the archive point so compaction can proceed
When a new session starts (fresh 1% token usage):
- Checks whether memory/remember-all-prompts-daily.md exists
- Reads the most recent entry
- Ingests it as a "past conversation summary" to restore context
- Continues naturally from where the previous session ended
```markdown
# Remember All Prompts Daily

## [DATE: 2026-01-26]

### Session 1 (09:00 - 09:47)
[All prompts and responses from session]

### Session 2 (10:15 - 11:30)
[All prompts and responses from session]
```
Extracts all prompts and responses from the current session and archives them.

Usage: python scripts/export_prompts.py

What it does:
- Uses sessions_history() to fetch all messages from the current session
- Formats them with timestamps and message IDs
- Appends to memory/remember-all-prompts-daily.md
- Includes metadata (token count, duration, etc.)
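The export step above can be sketched roughly as follows. This is a minimal illustration, not the skill's actual script: the real `sessions_history()` is an OpenClaw call whose signature is not documented here, so it is stubbed, and the message shape `(timestamp, role, text)` is an assumption.

```python
from datetime import datetime
from pathlib import Path

ARCHIVE = Path("memory/remember-all-prompts-daily.md")

def sessions_history():
    # Stub for illustration only; the real skill fetches messages from
    # the current OpenClaw session. Assumed shape: (timestamp, role, text).
    return [("09:00", "user", "Hello"), ("09:01", "assistant", "Hi!")]

def export_prompts(history):
    """Format a session's messages as a date-wise markdown entry."""
    today = datetime.now().strftime("%Y-%m-%d")
    lines = [f"## [DATE: {today}]", ""]
    for ts, role, text in history:
        lines.append(f"- [{ts}] {role}: {text}")
    return "\n".join(lines) + "\n"

def append_entry(entry, path=ARCHIVE):
    """Append an entry to the daily archive, creating the file if needed."""
    path.parent.mkdir(parents=True, exist_ok=True)
    header = "" if path.exists() else "# Remember All Prompts Daily\n\n"
    with path.open("a", encoding="utf-8") as f:
        f.write(header + entry + "\n")
```

Appending (rather than overwriting) is what makes the archive accumulate one entry per date, matching the file format shown above.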
Reads the daily archive and injects it as context on session start.

Usage: python scripts/ingest_prompts.py

What it does:
- Reads memory/remember-all-prompts-daily.md (if it exists)
- Extracts the most recent session entry
- Returns a formatted summary for ingestion into the new session
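A minimal sketch of the ingest side, assuming the archive uses the `## [DATE: ...]` headings shown in the file-format example. The function names and return format are illustrative, not the script's real API.

```python
import re
from pathlib import Path

def latest_entry(text):
    """Return the most recent '## [DATE: ...]' block from archive text."""
    parts = re.split(r"(?m)^## ", text)
    blocks = [p for p in parts if p.startswith("[DATE:")]
    return "## " + blocks[-1] if blocks else ""

def ingest_prompts(path=Path("memory/remember-all-prompts-daily.md")):
    """Read the archive (if present) and return a context summary string."""
    if not path.exists():
        return ""
    summary = latest_entry(path.read_text(encoding="utf-8"))
    return f"Past conversation summary:\n{summary}" if summary else ""
```

Returning an empty string when the archive is missing lets a fresh session start cleanly with no restored context.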
Add to HEARTBEAT.md to monitor token usage:

Check token usage - if >95%, export session history
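The heartbeat check above amounts to a simple threshold decision. A sketch of that logic, under the assumption that the heartbeat can read the current token count and budget (the function name and parameters here are hypothetical):

```python
EXPORT_THRESHOLD = 0.95  # export just before compaction kicks in

def should_export(tokens_used, token_budget, already_exported=False):
    """Decide whether this heartbeat tick should trigger an export.

    Guards against re-exporting the same session once the archive
    point has been marked.
    """
    usage = tokens_used / token_budget
    return usage >= EXPORT_THRESHOLD and not already_exported
```

The `already_exported` guard matters because a 15-minute cron will fire repeatedly while usage stays above 95%.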
For automatic triggers, check token usage at regular intervals:

```
clawdbot cron add --text "Check token usage and export if needed" --schedule "*/15 * * * *"
```
Session 1:
- Chat normally
- Token usage reaches 95%
- export_prompts.py runs automatically
- All prompts archived to the daily file
- Session compacts

Session 2 (new sprint):
- Fresh 1% token budget
- ingest_prompts.py reads the archive
- "Here's what we discussed yesterday..."
- Context restored; the conversation continues seamlessly
```
# Manually export the current session
python skills/remember-all-prompts-daily/scripts/export_prompts.py

# Inspect the most recent archive entries
tail -n 100 memory/remember-all-prompts-daily.md

# Manually ingest the latest entry
python skills/remember-all-prompts-daily/scripts/ingest_prompts.py
```
Monitor token usage via:

session_status  # Shows current token usage %

When token usage approaches 95%, the skill can auto-trigger, or you can export manually.
- Runs only in the main session (direct chat with Ateeb)
- Respects privacy: only stores your actual prompts and responses
- Daily file auto-rotates at midnight (one entry per date)
- Can be manually triggered at any time