{
  "schemaVersion": "1.0",
  "item": {
    "slug": "sparkbtcbot-proxy-deploy",
    "name": "Deploy Spark Bitcoin L2 Proxy",
    "source": "tencent",
    "type": "skill",
    "category": "开发工具",
    "sourceUrl": "https://clawhub.ai/echennells/sparkbtcbot-proxy-deploy",
    "canonicalUrl": "https://clawhub.ai/echennells/sparkbtcbot-proxy-deploy",
    "targetPlatform": "OpenClaw"
  },
  "install": {
    "downloadMode": "redirect",
    "downloadUrl": "/downloads/sparkbtcbot-proxy-deploy",
    "sourceDownloadUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=sparkbtcbot-proxy-deploy",
    "sourcePlatform": "tencent",
    "targetPlatform": "OpenClaw",
    "installMethod": "Manual import",
    "extraction": "Extract archive",
    "prerequisites": [
      "OpenClaw"
    ],
    "packageFormat": "ZIP package",
    "includedAssets": [
      "SKILL.md"
    ],
    "primaryDoc": "SKILL.md",
    "quickSetup": [
      "Download the package from Yavira.",
      "Extract the archive and review SKILL.md first.",
      "Import or place the package into your OpenClaw setup."
    ],
    "agentAssist": {
      "summary": "Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.",
      "steps": [
        "Download the package from Yavira.",
        "Extract it into a folder your agent can access.",
        "Paste one of the prompts below and point your agent at the extracted folder."
      ],
      "prompts": [
        {
          "label": "New install",
          "body": "I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete."
        },
        {
          "label": "Upgrade existing",
          "body": "I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run."
        }
      ]
    },
    "sourceHealth": {
      "source": "tencent",
      "status": "healthy",
      "reason": "direct_download_ok",
      "recommendedAction": "download",
      "checkedAt": "2026-04-30T16:55:25.780Z",
      "expiresAt": "2026-05-07T16:55:25.780Z",
      "httpStatus": 200,
      "finalUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=network",
      "contentType": "application/zip",
      "probeMethod": "head",
      "details": {
        "probeUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=network",
        "contentDisposition": "attachment; filename=\"network-1.0.0.zip\"",
        "redirectLocation": null,
        "bodySnippet": null
      },
      "scope": "source",
      "summary": "Source download looks usable.",
      "detail": "Yavira can redirect you to the upstream package for this source.",
      "primaryActionLabel": "Download for OpenClaw",
      "primaryActionHref": "/downloads/sparkbtcbot-proxy-deploy"
    },
    "validation": {
      "installChecklist": [
        "Use the Yavira download entry.",
        "Review SKILL.md after the package is downloaded.",
        "Confirm the extracted package contains the expected setup assets."
      ],
      "postInstallChecks": [
        "Confirm the extracted package includes the expected docs or setup files.",
        "Validate the skill or prompts are available in your target agent workspace.",
        "Capture any manual follow-up steps the agent could not complete."
      ]
    },
    "downloadPageUrl": "https://openagent3.xyz/downloads/sparkbtcbot-proxy-deploy",
    "agentPageUrl": "https://openagent3.xyz/skills/sparkbtcbot-proxy-deploy/agent",
    "manifestUrl": "https://openagent3.xyz/skills/sparkbtcbot-proxy-deploy/agent.json",
    "briefUrl": "https://openagent3.xyz/skills/sparkbtcbot-proxy-deploy/agent.md"
  },
  "agentAssist": {
    "summary": "Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.",
    "steps": [
      "Download the package from Yavira.",
      "Extract it into a folder your agent can access.",
      "Paste one of the prompts below and point your agent at the extracted folder."
    ],
    "prompts": [
      {
        "label": "New install",
        "body": "I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete."
      },
      {
        "label": "Upgrade existing",
        "body": "I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run."
      }
    ]
  },
  "documentation": {
    "source": "clawhub",
    "primaryDoc": "SKILL.md",
    "sections": [
      {
        "title": "Deploy sparkbtcbot-proxy",
        "body": "You are an expert in deploying and managing the sparkbtcbot-proxy — a serverless middleware that wraps the Spark Bitcoin L2 SDK behind authenticated REST endpoints on Vercel."
      },
      {
        "title": "What This Proxy Does",
        "body": "Gives AI agents scoped wallet access without exposing the mnemonic:\n\nRole-based token auth (admin for full access, invoice for read + create invoices only)\nToken management via API — create, list, revoke without redeploying\nPer-transaction and daily spending caps\nActivity logging to Redis\nLazy detection of paid Lightning invoices"
      },
      {
        "title": "What You Need",
        "body": "Ask the user for these upfront:\n\nVercel account (free Hobby tier works)\nUpstash account email and API key (from https://console.upstash.com/account/api) — OR existing UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN if they already have a database\nBIP39 mnemonic for the Spark wallet (or generate one in step 3)\nNode.js 20+\n\nGenerated during setup (don't ask for these):\n\nUPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN — created by the Upstash management API in step 2\nAPI_AUTH_TOKEN — generated in step 4"
      },
      {
        "title": "1. Clone and install",
        "body": "git clone https://github.com/echennells/sparkbtcbot-proxy.git\ncd sparkbtcbot-proxy\nnpm install"
      },
      {
        "title": "2. Create Upstash Redis",
        "body": "If the user already has UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN, skip to step 3.\n\nOtherwise, create a database via the Upstash API. The user needs their Upstash email and API key from https://console.upstash.com/account/api:\n\ncurl -X POST \"https://api.upstash.com/v2/redis/database\" \\\n  -u \"UPSTASH_EMAIL:UPSTASH_API_KEY\" \\\n  -H \"Content-Type: application/json\" \\\n  -d '{\"name\": \"sparkbtcbot-proxy\", \"region\": \"global\", \"primary_region\": \"us-east-1\"}'\n\nNote: Regional database creation is deprecated. You must use \"region\": \"global\" with a \"primary_region\" field. The Upstash docs may not reflect this yet.\n\nThe response includes rest_url and rest_token — save these for step 5."
      },
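      {
        "title": "Extract the Redis credentials (optional)",
        "body": "A convenience variant of the call above, assuming jq is installed (jq is not required by the proxy itself); it prints rest_url and rest_token straight from the creation response:\n\ncurl -s -X POST \"https://api.upstash.com/v2/redis/database\" \\\n  -u \"UPSTASH_EMAIL:UPSTASH_API_KEY\" \\\n  -H \"Content-Type: application/json\" \\\n  -d '{\"name\": \"sparkbtcbot-proxy\", \"region\": \"global\", \"primary_region\": \"us-east-1\"}' \\\n  | jq -r '.rest_url, .rest_token'"
      },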
      {
        "title": "3. Generate a wallet mnemonic (if needed)",
        "body": "SparkWallet.initialize() returns { mnemonic, wallet } when called without a mnemonic. One-liner:\n\nnode -e \"import('@buildonspark/spark-sdk').then(({SparkWallet}) => SparkWallet.initialize({mnemonicOrSeed: null, options: {network: 'MAINNET'}}).then(r => { console.log(r.mnemonic); r.wallet.cleanupConnections() }))\"\n\nSave the 12-word mnemonic securely — it controls all funds in the wallet. There is no getMnemonic() method; you can only retrieve the mnemonic at initialization time.\n\nOr use any BIP39 mnemonic generator. 12 or 24 words."
      },
      {
        "title": "4. Generate an API auth token",
        "body": "openssl rand -base64 30"
      },
      {
        "title": "5. Deploy to Vercel",
        "body": "npx vercel --prod\n\nWhen prompted, accept the defaults. Then set environment variables. All 7 are required:\n\nVariableDescriptionExampleSPARK_MNEMONIC12-word BIP39 mnemonicfence connect trigger ...SPARK_NETWORKSpark networkMAINNETAPI_AUTH_TOKENAdmin fallback bearer tokenoutput of step 4UPSTASH_REDIS_REST_URLRedis REST endpointhttps://xxx.upstash.ioUPSTASH_REDIS_REST_TOKENRedis auth tokenfrom step 2MAX_TRANSACTION_SATSPer-transaction spending cap10000DAILY_BUDGET_SATSDaily spending cap (resets midnight UTC)100000\n\nImportant: Do NOT use vercel env add with heredoc/<<< input — it appends newlines that break the Spark SDK. Either use the Vercel dashboard or the REST API:\n\ncurl -X POST \"https://api.vercel.com/v10/projects/<PROJECT_ID>/env?teamId=<TEAM_ID>\" \\\n  -H \"Authorization: Bearer <VERCEL_TOKEN>\" \\\n  -H \"Content-Type: application/json\" \\\n  -d '{\"type\":\"encrypted\",\"key\":\"SPARK_MNEMONIC\",\"value\":\"your mnemonic here\",\"target\":[\"production\",\"preview\",\"development\"]}'\n\nRedeploy after setting env vars:\n\nnpx vercel --prod"
      },
      {
        "title": "6. Test",
        "body": "curl -H \"Authorization: Bearer <your-token>\" https://<your-deployment>.vercel.app/api/balance\n\nShould return {\"success\":true,\"data\":{\"balance\":\"0\",\"tokenBalances\":{}}}."
      },
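      {
        "title": "Scripted balance check (optional)",
        "body": "A sketch for gating a setup script on the same check, assuming jq is installed and the response shape shown above; jq -e exits nonzero when the expression is false:\n\ncurl -s -H \"Authorization: Bearer <your-token>\" https://<your-deployment>.vercel.app/api/balance \\\n  | jq -e '.success == true'"
      },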
      {
        "title": "7. Create scoped tokens (optional)",
        "body": "Use the admin token to create limited tokens for agents:\n\ncurl -X POST -H \"Authorization: Bearer <admin-token>\" \\\n  -H \"Content-Type: application/json\" \\\n  -d '{\"role\": \"invoice\", \"label\": \"my-agent\"}' \\\n  https://<your-deployment>.vercel.app/api/tokens\n\nThe response includes the full token string — save it, it's only shown once. See the Token Roles section below for details."
      },
      {
        "title": "API Routes",
        "body": "MethodRouteDescriptionGET/llms.txtAPI documentation for bots (no auth required)GET/api/balanceWallet balance (sats + tokens)GET/api/infoSpark address and identity pubkeyGET/api/transactionsTransfer history (?limit=&offset=)GET/api/deposit-addressBitcoin L1 deposit addressGET/api/fee-estimateLightning send fee estimate (?invoice=)GET/api/logsRecent activity logs (?limit=)POST/api/invoice/createCreate Lightning invoice ({amountSats, memo?, expirySeconds?})POST/api/invoice/sparkCreate Spark invoice ({amount?, memo?})POST/api/payPay Lightning invoice — admin only ({invoice, maxFeeSats})POST/api/transferSpark transfer — admin only ({receiverSparkAddress, amountSats})POST/api/l402Pay L402 paywall — admin only ({url, method?, headers?, body?, maxFeeSats?})GET/api/l402/statusCheck/complete pending L402 (?id=<pendingId>)GET/api/tokensList API tokens — admin onlyPOST/api/tokensCreate a new token — admin only ({role, label})DELETE/api/tokensRevoke a token — admin only ({token})"
      },
      {
        "title": "Token Roles",
        "body": "There are two token roles:\n\nRolePermissionsadminEverything — read, create invoices, pay, transfer, manage tokensinvoiceRead (balance, info, transactions, logs, fee-estimate, deposit-address) + create invoices. Cannot pay or transfer.\n\nThe API_AUTH_TOKEN env var is a hardcoded admin fallback — it always works even if Redis is down or tokens get wiped. Use it to bootstrap: create scoped tokens via the API, then hand those out to agents."
      },
      {
        "title": "Managing tokens",
        "body": "Create an invoice-only token for a merchant bot:\n\ncurl -X POST -H \"Authorization: Bearer <admin-token>\" \\\n  -H \"Content-Type: application/json\" \\\n  -d '{\"role\": \"invoice\", \"label\": \"merchant-bot\"}' \\\n  https://<deployment>/api/tokens\n\nList all tokens (shows prefixes, labels, roles — not full token strings):\n\ncurl -H \"Authorization: Bearer <admin-token>\" https://<deployment>/api/tokens\n\nRevoke a token:\n\ncurl -X DELETE -H \"Authorization: Bearer <admin-token>\" \\\n  -H \"Content-Type: application/json\" \\\n  -d '{\"token\": \"<full-token-string>\"}' \\\n  https://<deployment>/api/tokens\n\nTokens are stored in Redis (hash spark:tokens). They survive redeploys but not Redis flushes."
      },
      {
        "title": "L402 Paywall Support",
        "body": "The proxy can pay L402 Lightning paywalls automatically. Send a URL, and the proxy will:\n\nFetch the URL\nIf 402 returned, parse the invoice and macaroon\nPay the Lightning invoice\nRetry the request with the L402 Authorization header\nReturn the protected content"
      },
      {
        "title": "Basic usage",
        "body": "curl -X POST -H \"Authorization: Bearer <admin-token>\" \\\n  -H \"Content-Type: application/json\" \\\n  -d '{\"url\": \"https://lightningfaucet.com/api/l402/joke\"}' \\\n  https://<deployment>/api/l402"
      },
      {
        "title": "Handling pending payments (important for agents)",
        "body": "Lightning payments via Spark are asynchronous. The proxy polls for up to ~7.5 seconds, but if the preimage isn't available in time, it returns a pending status:\n\n{\n  \"success\": true,\n  \"data\": {\n    \"status\": \"pending\",\n    \"pendingId\": \"a1b2c3d4e5f6...\",\n    \"message\": \"Payment sent but preimage not yet available. Poll GET /api/l402/status?id=<pendingId> to complete.\",\n    \"priceSats\": 21\n  }\n}\n\nYour agent MUST handle this case. The payment has already been sent — if you don't poll for completion, you lose the sats without getting the content.\n\nRetry loop (pseudocode):\n\nresponse = POST /api/l402 { url: \"...\" }\n\nif response.data.status == \"pending\":\n    pendingId = response.data.pendingId\n    for attempt in 1..10:\n        sleep(3 seconds)\n        status = GET /api/l402/status?id={pendingId}\n        if status.data.status != \"pending\":\n            return status.data  # Success or failure\n    # Give up after ~30 seconds\n    raise \"L402 payment timed out\"\nelse:\n    return response.data  # Immediate success\n\nKey points:\n\nToken caching: Paid L402 tokens are cached per-domain (up to 24 hours). Subsequent requests to the same domain reuse the cached token without paying again. If the token expires, the proxy pays for a new one automatically.\nPending records expire after 1 hour\nThe /api/l402/status endpoint polls Spark for up to 5 seconds per call\nIf the payment failed on Spark's side, status will return an error\nOnce complete, the pending record is deleted from Redis\nThe proxy automatically retries the final fetch up to 3 times (200ms delay) if the response is empty — some servers don't return content immediately after payment"
      },
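      {
        "title": "Retry loop as a shell sketch",
        "body": "The pseudocode above, written out as a shell sketch. This assumes jq is installed; <admin-token> and <deployment> are placeholders, and the target URL is the example from Basic usage:\n\nresp=$(curl -s -X POST -H \"Authorization: Bearer <admin-token>\" \\\n  -H \"Content-Type: application/json\" \\\n  -d '{\"url\": \"https://lightningfaucet.com/api/l402/joke\"}' \\\n  https://<deployment>/api/l402)\n\n# Poll /api/l402/status until the payment leaves the pending state\nif [ \"$(echo \"$resp\" | jq -r '.data.status')\" = \"pending\" ]; then\n  id=$(echo \"$resp\" | jq -r '.data.pendingId')\n  for i in $(seq 1 10); do\n    sleep 3\n    resp=$(curl -s -H \"Authorization: Bearer <admin-token>\" \\\n      \"https://<deployment>/api/l402/status?id=$id\")\n    [ \"$(echo \"$resp\" | jq -r '.data.status')\" != \"pending\" ] && break\n  done\nfi\n\necho \"$resp\" | jq ."
      },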
      {
        "title": "Rotate the admin fallback token",
        "body": "Generate a new token: openssl rand -base64 30\nUpdate API_AUTH_TOKEN in Vercel env vars\nRedeploy: npx vercel --prod\nUpdate any agents using the old token\n\nRedis-stored tokens are not affected by this — they continue working."
      },
      {
        "title": "Adjust spending limits",
        "body": "Update MAX_TRANSACTION_SATS and DAILY_BUDGET_SATS in Vercel env vars and redeploy. Budget resets daily at midnight UTC."
      },
      {
        "title": "Check logs",
        "body": "curl -H \"Authorization: Bearer <token>\" https://<deployment>/api/logs?limit=20"
      },
      {
        "title": "Architecture",
        "body": "Vercel serverless functions — each request spins up, initializes the Spark SDK (~1.5s), handles the request, and shuts down. No always-on process, no billing when idle.\nUpstash Redis — stores daily spend counters, activity logs, pending invoice tracking, and API tokens. Accessed over HTTP REST (no persistent connection needed). Free tier is limited to 1 database.\nSpark SDK — @buildonspark/spark-sdk connects to Spark Signing Operators via gRPC over HTTP/2. Pure JavaScript, no native addons.\nLazy invoice check — on every request, the middleware checks Redis for pending invoices and compares against recent wallet transfers. Expired invoices are cleaned up, paid ones are logged. Max 5 checks per request, wrapped in try/catch so failures never affect the main request."
      }
    ]
  },
  "trust": {
    "sourceLabel": "tencent",
    "provenanceUrl": "https://clawhub.ai/echennells/sparkbtcbot-proxy-deploy",
    "publisherUrl": "https://clawhub.ai/echennells/sparkbtcbot-proxy-deploy",
    "owner": "echennells",
    "version": "1.0.0",
    "license": null,
    "verificationStatus": "Indexed source record"
  },
  "links": {
    "detailUrl": "https://openagent3.xyz/skills/sparkbtcbot-proxy-deploy",
    "downloadUrl": "https://openagent3.xyz/downloads/sparkbtcbot-proxy-deploy",
    "agentUrl": "https://openagent3.xyz/skills/sparkbtcbot-proxy-deploy/agent",
    "manifestUrl": "https://openagent3.xyz/skills/sparkbtcbot-proxy-deploy/agent.json",
    "briefUrl": "https://openagent3.xyz/skills/sparkbtcbot-proxy-deploy/agent.md"
  }
}