Requirements
- Target platform: OpenClaw
- Install method: Manual import (extract the archive)
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
Back up and restore OpenClaw state. Creates local archives and uploads to S3-compatible cloud storage (AWS S3, Cloudflare R2, Backblaze B2, MinIO, DigitalOcean Spaces, or any other S3-compatible service).
Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.
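The manual-import flow above can be sketched as a shell session. Everything here is a stand-in: the archive name, folder layout, and SKILL.md contents are illustrative, not the real Yavira package.

```shell
# Build a stand-in package; the real archive comes from the Yavira download.
workdir=$(mktemp -d) && cd "$workdir"
mkdir cloud-backup && echo "# cloud-backup skill" > cloud-backup/SKILL.md
tar -czf cloud-backup.tar.gz cloud-backup && rm -r cloud-backup

# Manual import: extract the archive, then read SKILL.md and follow it.
tar -xzf cloud-backup.tar.gz
cat cloud-backup/SKILL.md
```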
Backs up OpenClaw state to a local archive and uploads it to cloud storage. Follow the steps below. Inform the user about implicit defaults applied after execution.
bash "{baseDir}/scripts/cloud-backup.sh" backup full

Default mode is full. Use workspace, skills, or settings only when the user explicitly asks for a narrower backup scope.
Look at the script output from Step 1. If it contains:

WARN: Encryption is disabled - backup archive will be stored in plaintext.

ask the user: "Your backups are not encrypted. Archives contain config, credentials, and API keys in cleartext. Want to set a passphrase? (AES-256, just the passphrase needed to restore - no key files.)"

- If the user provides a passphrase → write both keys via gateway config.patch and tell the user what was saved:
  - skills.entries.cloud-backup.config.encrypt = true
  - skills.entries.cloud-backup.env.GPG_PASSPHRASE = "<passphrase>"
  Then re-run the backup so the archive is encrypted.
- If the user says no / skip → continue to Step 3.
- If the warning is not present (encryption already enabled) → continue to Step 3.

Always execute this step and report the result. Backups may contain secrets, so prompt every time encryption is off.
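As a sketch, the patch written when the user opts into encryption might look like the fragment below. The key paths come from the step above; the passphrase value is a placeholder, and the exact shape accepted by gateway config.patch is an assumption.

```json
{
  "skills": {
    "entries": {
      "cloud-backup": {
        "config": { "encrypt": true },
        "env": { "GPG_PASSPHRASE": "<passphrase>" }
      }
    }
  }
}
```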
Look at the script output. If it contains WARN: Cloud storage is not configured, go to Step 4. Otherwise go to Step 5.
Tell the user the backup was saved locally, then ask: "Cloud storage isn't configured yet - backups are local only. Want to set up cloud upload? I support AWS S3, Cloudflare R2, Backblaze B2, MinIO, DigitalOcean Spaces, or any other S3-compatible service."

- If the user says yes → go to the Cloud setup section below, then re-run the backup.
- If the user says no / local-only → set config.upload=false via gateway config.patch and confirm the change to the user. Done.

Always execute this step: offer cloud setup when it's missing and report what was configured.
Report the backup paths from the script output to the user. Then check whether a daily cron job already exists for cloud-backup (cron action=list). If no cron job exists, create one with the default schedule and tell the user:

```json
{
  "schedule": { "kind": "cron", "expr": "0 2 * * *" },
  "payload": { "kind": "agentTurn", "message": "Run cloud-backup: backup full" },
  "sessionTarget": "isolated"
}
```

Default: daily at 2 AM. Use a different schedule only if the user explicitly asked for one. Tell the user: "Scheduled daily backup at 2 AM." (or whatever time was set). Execute this step by default unless the user asked to skip it, and report the result: cron job created or failed to be created.
When the user agrees to configure cloud storage:
- Ask which provider: AWS S3, Cloudflare R2, Backblaze B2, MinIO, DigitalOcean Spaces, or other.
- Read the matching provider guide from references/providers/ - it has the exact config keys, endpoint format, and credential steps.
- Collect and write config via gateway config.patch: bucket, credentials, and endpoint (if non-AWS).
- Run status to verify connectivity, then re-run backup.
bash "{baseDir}/scripts/cloud-backup.sh" <command>

| Command | What it does |
| --- | --- |
| backup [full\|workspace\|skills\|settings] | Create archive + upload if configured. Default: full |
| list | Show local + remote backups |
| restore <name> [--dry-run] [--yes] | Restore from local or cloud. Always --dry-run first |
| cleanup | Prune old archives (local: capped at 7; cloud: count + age) |
| status | Show current config and dependency check |
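The cleanup rule ("local: capped at 7") can be illustrated with a minimal sketch. The directory layout and archive naming below are assumptions for demonstration, not the script's actual behavior.

```shell
dir=$(mktemp -d)
# Fake archives standing in for local backups (newest sorts last by name).
for i in 01 02 03 04 05 06 07 08 09 10; do
  touch "$dir/backup-$i.tar.gz"
done
# Keep the 7 newest archives and prune the rest (the documented local cap).
ls "$dir" | sort -r | tail -n +8 | while read -r f; do
  rm "$dir/$f"
done
echo "remaining: $(ls "$dir" | wc -l)"
```

The real script may sort by modification time rather than filename; the cap itself is the documented behavior.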
All keys live under skills.entries.cloud-backup in the OpenClaw config. Don't write defaults - the script handles them.
| Key | Default | Description |
| --- | --- | --- |
| bucket | (none) | Storage bucket name (required for cloud) |
| region | us-east-1 | Region hint |
| endpoint | (none) | S3-compatible endpoint (required for non-AWS) |
| profile | (none) | Named AWS CLI profile (alternative to keys) |
| upload | true | Upload to cloud after backup |
| encrypt | false | GPG-encrypt archives |
| retentionCount | 10 | Cloud: keep N backups. Local: capped at 7 |
| retentionDays | 30 | Cloud only: delete archives older than N days |
| Key | Description |
| --- | --- |
| ACCESS_KEY_ID | S3-compatible access key |
| SECRET_ACCESS_KEY | S3-compatible secret key |
| SESSION_TOKEN | Optional temporary token |
| GPG_PASSPHRASE | For automated encryption/decryption |
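Putting the two tables together: a minimal non-AWS setup that follows the "don't write defaults" rule would patch only the required keys. All values below are placeholders, and the exact patch shape accepted by gateway config.patch is an assumption.

```json
{
  "skills": {
    "entries": {
      "cloud-backup": {
        "config": {
          "bucket": "my-openclaw-backups",
          "endpoint": "https://s3.example.com"
        },
        "env": {
          "ACCESS_KEY_ID": "<access-key>",
          "SECRET_ACCESS_KEY": "<secret-key>"
        }
      }
    }
  }
}
```

region, upload, encrypt, and the retention keys are omitted because their defaults apply; endpoint is included only because the example targets a non-AWS provider.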
Read the relevant one only during setup:
- references/providers/aws-s3.md
- references/providers/cloudflare-r2.md
- references/providers/backblaze-b2.md
- references/providers/minio.md
- references/providers/digitalocean-spaces.md
- references/providers/other.md (any S3-compatible service)
See references/security.md for credential handling and troubleshooting.