Requirements
- Target platform: OpenClaw
- Install method: Manual import (extract the archive)
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
Deploy a serverless Spark Bitcoin L2 proxy on Vercel with spending limits, auth, and Redis logging. Use when user wants to set up a new proxy, configure env...
Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.
You are an expert in deploying and managing the sparkbtcbot-proxy, a serverless middleware that wraps the Spark Bitcoin L2 SDK behind authenticated REST endpoints on Vercel.
Gives AI agents scoped wallet access without exposing the mnemonic:
- Role-based token auth (admin for full access, invoice for read + create invoices only)
- Token management via API: create, list, revoke without redeploying
- Per-transaction and daily spending caps
- Activity logging to Redis
- Lazy detection of paid Lightning invoices
Ask the user for these upfront:
- Vercel account (free Hobby tier works)
- Upstash account email and API key (from https://console.upstash.com/account/api), OR existing UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN if they already have a database
- BIP39 mnemonic for the Spark wallet (or generate one in step 3)
- Node.js 20+

Generated during setup (don't ask for these):
- UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN: created by the Upstash management API in step 2
- API_AUTH_TOKEN: generated in step 4
Step 1: Clone and install

```bash
git clone https://github.com/echennells/sparkbtcbot-proxy.git
cd sparkbtcbot-proxy
npm install
```
Step 2: Create an Upstash Redis database

If the user already has UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN, skip to step 3. Otherwise, create a database via the Upstash API. The user needs their Upstash email and API key from https://console.upstash.com/account/api:

```bash
curl -X POST "https://api.upstash.com/v2/redis/database" \
  -u "UPSTASH_EMAIL:UPSTASH_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name": "sparkbtcbot-proxy", "region": "global", "primary_region": "us-east-1"}'
```

Note: Regional database creation is deprecated. You must use `"region": "global"` with a `"primary_region"` field. The Upstash docs may not reflect this yet.

The response includes `rest_url` and `rest_token`; save these for step 5.
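If you would rather make this call from Node than shell out to curl, the request can be built like this. This is an illustrative sketch, not code from the repo; `buildUpstashCreateRequest` is a hypothetical helper whose output mirrors the curl command above.

```javascript
// Sketch: build the Upstash create-database request for Node 20+'s built-in fetch.
// Endpoint, basic auth, and body fields match the curl command in this step.
function buildUpstashCreateRequest(email, apiKey, name = "sparkbtcbot-proxy") {
  const auth = Buffer.from(`${email}:${apiKey}`).toString("base64");
  return {
    url: "https://api.upstash.com/v2/redis/database",
    options: {
      method: "POST",
      headers: {
        Authorization: `Basic ${auth}`,
        "Content-Type": "application/json",
      },
      // Regional creation is deprecated: "global" plus a primary_region is required.
      body: JSON.stringify({ name, region: "global", primary_region: "us-east-1" }),
    },
  };
}

// Usage:
//   const { url, options } = buildUpstashCreateRequest(email, apiKey);
//   const res = await fetch(url, options);  // response body carries rest_url and rest_token
```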
Step 3: Generate a wallet mnemonic

SparkWallet.initialize() returns `{ mnemonic, wallet }` when called without a mnemonic. One-liner:

```bash
node -e "import('@buildonspark/spark-sdk').then(({SparkWallet}) => SparkWallet.initialize({mnemonicOrSeed: null, options: {network: 'MAINNET'}}).then(r => { console.log(r.mnemonic); r.wallet.cleanupConnections() }))"
```

Save the 12-word mnemonic securely; it controls all funds in the wallet. There is no getMnemonic() method, so the mnemonic can only be retrieved at initialization time. Alternatively, use any BIP39 mnemonic generator (12 or 24 words).
Step 4: Generate the admin token

```bash
openssl rand -base64 30
```
Step 5: Deploy to Vercel and set environment variables

```bash
npx vercel --prod
```

When prompted, accept the defaults. Then set environment variables. All 7 are required:

| Variable | Description | Example |
|---|---|---|
| SPARK_MNEMONIC | 12-word BIP39 mnemonic | fence connect trigger ... |
| SPARK_NETWORK | Spark network | MAINNET |
| API_AUTH_TOKEN | Admin fallback bearer token | output of step 4 |
| UPSTASH_REDIS_REST_URL | Redis REST endpoint | https://xxx.upstash.io |
| UPSTASH_REDIS_REST_TOKEN | Redis auth token | from step 2 |
| MAX_TRANSACTION_SATS | Per-transaction spending cap | 10000 |
| DAILY_BUDGET_SATS | Daily spending cap (resets midnight UTC) | 100000 |

Important: Do NOT use `vercel env add` with heredoc/`<<<` input; it appends newlines that break the Spark SDK. Either use the Vercel dashboard or the REST API:

```bash
curl -X POST "https://api.vercel.com/v10/projects/<PROJECT_ID>/env?teamId=<TEAM_ID>" \
  -H "Authorization: Bearer <VERCEL_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{"type":"encrypted","key":"SPARK_MNEMONIC","value":"your mnemonic here","target":["production","preview","development"]}'
```

Redeploy after setting env vars: `npx vercel --prod`
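The trailing-newline failure mode is easy to catch before deploying. A minimal sketch; `checkEnvValue` is a hypothetical helper, not part of the repo:

```javascript
// Flag env values that would break the Spark SDK at runtime: a mnemonic set via
// heredoc/<<< typically ends with "\n", and stray whitespace is just as bad.
function checkEnvValue(name, value) {
  const problems = [];
  if (/[\r\n]/.test(value)) problems.push(`${name} contains a newline`);
  else if (value !== value.trim()) problems.push(`${name} has leading/trailing whitespace`);
  return problems;
}

// Example: a heredoc-set mnemonic
console.log(checkEnvValue("SPARK_MNEMONIC", "fence connect trigger\n"));
```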
Step 6: Verify the deployment

```bash
curl -H "Authorization: Bearer <your-token>" https://<your-deployment>.vercel.app/api/balance
```

Should return `{"success":true,"data":{"balance":"0","tokenBalances":{}}}`.
Step 7: Create scoped tokens for agents

Use the admin token to create limited tokens for agents:

```bash
curl -X POST -H "Authorization: Bearer <admin-token>" \
  -H "Content-Type: application/json" \
  -d '{"role": "invoice", "label": "my-agent"}' \
  https://<your-deployment>.vercel.app/api/tokens
```

The response includes the full token string; save it, as it is only shown once. See the Token Roles section below for details.
| Method | Route | Description |
|---|---|---|
| GET | /llms.txt | API documentation for bots (no auth required) |
| GET | /api/balance | Wallet balance (sats + tokens) |
| GET | /api/info | Spark address and identity pubkey |
| GET | /api/transactions | Transfer history (?limit=&offset=) |
| GET | /api/deposit-address | Bitcoin L1 deposit address |
| GET | /api/fee-estimate | Lightning send fee estimate (?invoice=) |
| GET | /api/logs | Recent activity logs (?limit=) |
| POST | /api/invoice/create | Create Lightning invoice ({amountSats, memo?, expirySeconds?}) |
| POST | /api/invoice/spark | Create Spark invoice ({amount?, memo?}) |
| POST | /api/pay | Pay Lightning invoice, admin only ({invoice, maxFeeSats}) |
| POST | /api/transfer | Spark transfer, admin only ({receiverSparkAddress, amountSats}) |
| POST | /api/l402 | Pay L402 paywall, admin only ({url, method?, headers?, body?, maxFeeSats?}) |
| GET | /api/l402/status | Check/complete pending L402 (?id=<pendingId>) |
| GET | /api/tokens | List API tokens, admin only |
| POST | /api/tokens | Create a new token, admin only ({role, label}) |
| DELETE | /api/tokens | Revoke a token, admin only ({token}) |
There are two token roles:

| Role | Permissions |
|---|---|
| admin | Everything: read, create invoices, pay, transfer, manage tokens |
| invoice | Read (balance, info, transactions, logs, fee-estimate, deposit-address) + create invoices. Cannot pay or transfer. |

The API_AUTH_TOKEN env var is a hardcoded admin fallback: it always works even if Redis is down or tokens get wiped. Use it to bootstrap: create scoped tokens via the API, then hand those out to agents.
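The role model above can be sketched as a small permission table. This is illustrative only, with made-up names (`ROLE_PERMS`, `canPerform`), not the proxy's actual authorization code:

```javascript
// Which role may perform which action, per the roles table above.
const ROLE_PERMS = {
  admin:   { read: true, createInvoice: true, pay: true,  transfer: true,  manageTokens: true },
  invoice: { read: true, createInvoice: true, pay: false, transfer: false, manageTokens: false },
};

function canPerform(role, action) {
  // Unknown roles and unknown actions are denied by default.
  return Boolean(ROLE_PERMS[role]?.[action]);
}
```

For example, an invoice token can create invoices but any call to /api/pay or /api/transfer is rejected.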
Create an invoice-only token for a merchant bot:

```bash
curl -X POST -H "Authorization: Bearer <admin-token>" \
  -H "Content-Type: application/json" \
  -d '{"role": "invoice", "label": "merchant-bot"}' \
  https://<deployment>/api/tokens
```

List all tokens (shows prefixes, labels, and roles, not full token strings):

```bash
curl -H "Authorization: Bearer <admin-token>" https://<deployment>/api/tokens
```

Revoke a token:

```bash
curl -X DELETE -H "Authorization: Bearer <admin-token>" \
  -H "Content-Type: application/json" \
  -d '{"token": "<full-token-string>"}' \
  https://<deployment>/api/tokens
```

Tokens are stored in Redis (hash `spark:tokens`). They survive redeploys but not Redis flushes.
The proxy can pay L402 Lightning paywalls automatically. Send a URL, and the proxy will:
1. Fetch the URL
2. If a 402 is returned, parse the invoice and macaroon
3. Pay the Lightning invoice
4. Retry the request with the L402 Authorization header
5. Return the protected content
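Step 2 of that flow hinges on parsing the 402 challenge. A minimal sketch, assuming the common L402/LSAT header shape `L402 macaroon="...", invoice="..."`; the proxy's actual parser may differ, and `parseL402Challenge` is an illustrative name:

```javascript
// Extract the macaroon and Lightning invoice from a 402 response's
// WWW-Authenticate header (assumed L402/LSAT challenge format).
function parseL402Challenge(header) {
  const macaroon = /macaroon="([^"]+)"/.exec(header)?.[1];
  const invoice = /invoice="([^"]+)"/.exec(header)?.[1];
  if (!macaroon || !invoice) throw new Error("not an L402 challenge");
  return { macaroon, invoice };
}
```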
curl -X POST -H "Authorization: Bearer <admin-token>" \ -H "Content-Type: application/json" \ -d '{"url": "https://lightningfaucet.com/api/l402/joke"}' \ https://<deployment>/api/l402
Lightning payments via Spark are asynchronous. The proxy polls for up to ~7.5 seconds, but if the preimage isn't available in time, it returns a pending status:

```json
{
  "success": true,
  "data": {
    "status": "pending",
    "pendingId": "a1b2c3d4e5f6...",
    "message": "Payment sent but preimage not yet available. Poll GET /api/l402/status?id=<pendingId> to complete.",
    "priceSats": 21
  }
}
```

Your agent MUST handle this case. The payment has already been sent; if you don't poll for completion, you lose the sats without getting the content.

Retry loop (pseudocode):

```text
response = POST /api/l402 { url: "..." }
if response.data.status == "pending":
    pendingId = response.data.pendingId
    for attempt in 1..10:
        sleep(3 seconds)
        status = GET /api/l402/status?id={pendingId}
        if status.data.status != "pending":
            return status.data  # success or failure
    # Give up after ~30 seconds
    raise "L402 payment timed out"
else:
    return response.data  # immediate success
```

Key points:
- Token caching: paid L402 tokens are cached per-domain (up to 24 hours). Subsequent requests to the same domain reuse the cached token without paying again. If the token expires, the proxy pays for a new one automatically.
- Pending records expire after 1 hour.
- The /api/l402/status endpoint polls Spark for up to 5 seconds per call.
- If the payment failed on Spark's side, status will return an error.
- Once complete, the pending record is deleted from Redis.
- The proxy automatically retries the final fetch up to 3 times (200ms delay) if the response is empty; some servers don't return content immediately after payment.
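The retry loop above can be written as a runnable helper. A sketch under stated assumptions: the response shapes match the JSON shown, `payL402` is an illustrative name, and `fetchJson` is whatever HTTP helper your agent already uses (injected here so it can be swapped or stubbed):

```javascript
// Pay an L402 paywall via the proxy and poll /api/l402/status until the
// pending payment resolves, giving up after ~30 seconds (10 x 3s).
async function payL402(base, token, url, fetchJson,
                       sleep = (ms) => new Promise((r) => setTimeout(r, ms))) {
  const headers = { Authorization: `Bearer ${token}`, "Content-Type": "application/json" };
  const first = await fetchJson(`${base}/api/l402`, {
    method: "POST", headers, body: JSON.stringify({ url }),
  });
  if (first.data.status !== "pending") return first.data; // immediate success
  // The payment is already sent: not polling here means losing the sats.
  for (let attempt = 1; attempt <= 10; attempt++) {
    await sleep(3000);
    const status = await fetchJson(`${base}/api/l402/status?id=${first.data.pendingId}`, { headers });
    if (status.data.status !== "pending") return status.data; // success or failure
  }
  throw new Error("L402 payment timed out");
}
```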
To rotate the admin token:
1. Generate a new token: `openssl rand -base64 30`
2. Update API_AUTH_TOKEN in the Vercel env vars
3. Redeploy: `npx vercel --prod`
4. Update any agents using the old token

Redis-stored tokens are not affected by this; they continue working.
Update MAX_TRANSACTION_SATS and DAILY_BUDGET_SATS in Vercel env vars and redeploy. Budget resets daily at midnight UTC.
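One way to get "resets at midnight UTC" behavior is to key the daily spend counter by UTC date, so the counter simply rolls over with the date. This is an illustrative sketch of that idea, with made-up names (`dailySpendKey`, `withinDailyBudget`), not the repo's actual code:

```javascript
// Redis key for today's spend counter, keyed by UTC date: when the UTC date
// changes, reads start hitting a fresh (zero) counter automatically.
function dailySpendKey(date = new Date()) {
  return `spark:spend:${date.toISOString().slice(0, 10)}`; // e.g. spark:spend:2024-05-01
}

// Enforce the cap: allow a payment only if it fits in what's left of today's budget.
function withinDailyBudget(spentTodaySats, amountSats, dailyBudgetSats) {
  return spentTodaySats + amountSats <= dailyBudgetSats;
}
```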
```bash
curl -H "Authorization: Bearer <token>" "https://<deployment>/api/logs?limit=20"
```
- Vercel serverless functions: each request spins up, initializes the Spark SDK (~1.5s), handles the request, and shuts down. No always-on process, no billing when idle.
- Upstash Redis: stores daily spend counters, activity logs, pending invoice tracking, and API tokens. Accessed over HTTP REST (no persistent connection needed). Free tier is limited to 1 database.
- Spark SDK: @buildonspark/spark-sdk connects to Spark Signing Operators via gRPC over HTTP/2. Pure JavaScript, no native addons.
- Lazy invoice check: on every request, the middleware checks Redis for pending invoices and compares against recent wallet transfers. Expired invoices are cleaned up, paid ones are logged. Max 5 checks per request, wrapped in try/catch so failures never affect the main request.
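The lazy invoice check can be sketched as a pure function over pending invoices and recent transfers. Illustrative shapes and names only (`lazyInvoiceCheck`, `invoiceId`, `expiresAt`), not the repo's actual data model:

```javascript
// Classify pending invoices: paid (a matching transfer exists), expired
// (past expiry), or still pending. Work is capped at `maxChecks` per request,
// mirroring the max-5-checks behavior described above.
function lazyInvoiceCheck(pending, transfers, now = Date.now(), maxChecks = 5) {
  const paidIds = new Set(transfers.map((t) => t.invoiceId));
  const paid = [], expired = [], stillPending = [];
  for (const inv of pending.slice(0, maxChecks)) {
    if (paidIds.has(inv.id)) paid.push(inv);          // log as paid
    else if (inv.expiresAt <= now) expired.push(inv); // clean up
    else stillPending.push(inv);                      // check again next request
  }
  stillPending.push(...pending.slice(maxChecks));     // unchecked this request
  return { paid, expired, stillPending };
}
```

In the real middleware this runs inside a try/catch so a Redis or Spark hiccup never fails the main request.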