{
  "schemaVersion": "1.0",
  "item": {
    "slug": "lite-sqlite",
    "name": "Lite Sqlite",
    "source": "tencent",
    "type": "skill",
    "category": "Other",
    "sourceUrl": "https://clawhub.ai/omprasad122007-rgb/lite-sqlite",
    "canonicalUrl": "https://clawhub.ai/omprasad122007-rgb/lite-sqlite",
    "targetPlatform": "OpenClaw"
  },
  "install": {
    "downloadMode": "redirect",
    "downloadUrl": "/downloads/lite-sqlite",
    "sourceDownloadUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=lite-sqlite",
    "sourcePlatform": "tencent",
    "targetPlatform": "OpenClaw",
    "installMethod": "Manual import",
    "extraction": "Extract archive",
    "prerequisites": [
      "OpenClaw"
    ],
    "packageFormat": "ZIP package",
    "includedAssets": [
      "scripts/sqlite_cli.py",
      "scripts/sqlite_connector.py",
      "SKILL.md"
    ],
    "primaryDoc": "SKILL.md",
    "quickSetup": [
      "Download the package from Yavira.",
      "Extract the archive and review SKILL.md first.",
      "Import or place the package into your OpenClaw setup."
    ],
    "agentAssist": {
      "summary": "Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.",
      "steps": [
        "Download the package from Yavira.",
        "Extract it into a folder your agent can access.",
        "Paste one of the prompts below and point your agent at the extracted folder."
      ],
      "prompts": [
        {
          "label": "New install",
          "body": "I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete."
        },
        {
          "label": "Upgrade existing",
          "body": "I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run."
        }
      ]
    },
    "sourceHealth": {
      "source": "tencent",
      "slug": "lite-sqlite",
      "status": "healthy",
      "reason": "direct_download_ok",
      "recommendedAction": "download",
      "checkedAt": "2026-04-29T07:25:12.071Z",
      "expiresAt": "2026-05-06T07:25:12.071Z",
      "httpStatus": 200,
      "finalUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=lite-sqlite",
      "contentType": "application/zip",
      "probeMethod": "head",
      "details": {
        "probeUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=lite-sqlite",
        "contentDisposition": "attachment; filename=\"lite-sqlite-1.0.0.zip\"",
        "redirectLocation": null,
        "bodySnippet": null,
        "slug": "lite-sqlite"
      },
      "scope": "item",
      "summary": "Item download looks usable.",
      "detail": "Yavira can redirect you to the upstream package for this item.",
      "primaryActionLabel": "Download for OpenClaw",
      "primaryActionHref": "/downloads/lite-sqlite"
    },
    "validation": {
      "installChecklist": [
        "Use the Yavira download entry.",
        "Review SKILL.md after the package is downloaded.",
        "Confirm the extracted package contains the expected setup assets."
      ],
      "postInstallChecks": [
        "Confirm the extracted package includes the expected docs or setup files.",
        "Validate the skill or prompts are available in your target agent workspace.",
        "Capture any manual follow-up steps the agent could not complete."
      ]
    },
    "downloadPageUrl": "https://openagent3.xyz/downloads/lite-sqlite",
    "agentPageUrl": "https://openagent3.xyz/skills/lite-sqlite/agent",
    "manifestUrl": "https://openagent3.xyz/skills/lite-sqlite/agent.json",
    "briefUrl": "https://openagent3.xyz/skills/lite-sqlite/agent.md"
  },
  "agentAssist": {
    "summary": "Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.",
    "steps": [
      "Download the package from Yavira.",
      "Extract it into a folder your agent can access.",
      "Paste one of the prompts below and point your agent at the extracted folder."
    ],
    "prompts": [
      {
        "label": "New install",
        "body": "I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete."
      },
      {
        "label": "Upgrade existing",
        "body": "I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run."
      }
    ]
  },
  "documentation": {
    "source": "clawhub",
    "primaryDoc": "SKILL.md",
    "sections": [
      {
        "title": "Lite SQLite - Lightweight Local Database",
        "body": "Ultra-lightweight SQLite database management optimized for OpenClaw agents with minimal RAM (~2-5MB) and storage overhead."
      },
      {
        "title": "Why SQLite?",
        "body": "✅ Zero setup - No server, no configuration, file-based\n✅ Minimal RAM - 2-5MB typical usage\n✅ Fast - Millions of simple queries per second, in-process\n✅ Portable - Single .db file\n✅ Reliable - ACID compliant, crash-safe\n✅ Cross-platform - Works everywhere Python works"
      },
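      {
        "title": "Zero-Setup Sanity Check",
        "body": "To see the zero-setup claim in action, the following minimal sketch uses only Python's built-in sqlite3 module (no wrapper required); the file name demo.db is an arbitrary example:\n\nimport sqlite3\n\n# connect() creates the .db file if it does not exist - no server, no config\nconn = sqlite3.connect(\"demo.db\")\nconn.execute(\"CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT)\")\nconn.execute(\"INSERT OR REPLACE INTO kv VALUES (?, ?)\", (\"greeting\", \"hello\"))\nconn.commit()\nprint(conn.execute(\"SELECT value FROM kv WHERE key = ?\", (\"greeting\",)).fetchone()[0])\nconn.close()"
      },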
      {
        "title": "Core Features",
        "body": "In-memory mode for temporary data (even faster!)\nWAL mode for concurrent access\nConnection pooling\nAutomatic schema migration\nBuilt-in backup/restore\nQuery optimization hints"
      },
      {
        "title": "Basic Database Operations",
        "body": "from sqlite_connector import SQLiteDB\n\n# Create database (WAL mode enabled automatically)\ndb = SQLiteDB(\"agent_data.db\")\n\n# Create table\ndb.create_table(\"memos\", {\n    \"id\": \"INTEGER PRIMARY KEY AUTOINCREMENT\",\n    \"title\": \"TEXT NOT NULL\",\n    \"content\": \"TEXT\",\n    \"created_at\": \"TEXT DEFAULT CURRENT_TIMESTAMP\",\n    \"tags\": \"TEXT\"\n})\n\n# Insert data (column-to-value mapping)\ndb.insert(\"memos\", {\"title\": \"First memo\", \"content\": \"Hello world\", \"tags\": \"test\"})\n\n# Query data\nresults = db.query(\"SELECT * FROM memos WHERE tags = ?\", (\"test\",))\n\n# Update data\ndb.update(\"memos\", \"id = ?\", {\"content\": \"Updated content\"}, (1,))\n\n# Delete data\ndb.delete(\"memos\", \"id = ?\", (1,))\n\n# Close connection\ndb.close()"
      },
      {
        "title": "In-Memory Database (Fastest)",
        "body": "# Fastest mode - RAM only, no disk I/O\ndb = SQLiteDB(\":memory:\")\n\n# Perfect for temporary operations\ndb.create_table(\"temp\", {...})\n\n# Data persists only during session\n# Use for caching, computations, temporary storage"
      },
      {
        "title": "Essential Settings",
        "body": "import sqlite3\n\n# WAL mode (Write-Ahead Logging) - typically 3-4x faster writes\nconn = sqlite3.connect(\"agent_data.db\")\nconn.execute(\"PRAGMA journal_mode=WAL\")\n\n# NORMAL sync - faster writes than FULL, still crash-safe in WAL mode\nconn.execute(\"PRAGMA synchronous=NORMAL\")\n\n# Memory optimization\nconn.execute(\"PRAGMA cache_size=-64000\")  # 64MB cache (negative value = KiB)\nconn.execute(\"PRAGMA page_size=4096\")\n\n# Keep temporary tables and indexes in RAM\nconn.execute(\"PRAGMA temp_store=MEMORY\")"
      },
      {
        "title": "Query Optimization",
        "body": "# Use indexes for frequent queries\ndb.create_index(\"memos\", \"tags\")\ndb.create_index(\"memos\", \"created_at\")\n\n# Use prepared statements (automatic in our wrapper)\ndb.query(\"SELECT * FROM memos WHERE id = ?\", (id,))\n\n# Batch inserts for large datasets\ndb.batch_insert(\"memos\", rows_data)"
      },
      {
        "title": "Agent Memo Schema (Memory Store)",
        "body": "db.create_table(\"agent_memos\", {\n    \"id\": \"INTEGER PRIMARY KEY AUTOINCREMENT\",\n    \"agent_id\": \"TEXT NOT NULL\",           # Which agent created it\n    \"key\": \"TEXT NOT NULL\",               # Lookup key\n    \"value\": \"TEXT\",                      # Stored value\n    \"priority\": \"INTEGER DEFAULT 0\",       # For retrieval ordering\n    \"created_at\": \"TEXT DEFAULT CURRENT_TIMESTAMP\",\n    \"expires_at\": \"TEXT\"                  # Optional TTL\n})\n\n# Create indexes\ndb.create_index(\"agent_memos\", \"agent_id\")\ndb.create_index(\"agent_memos\", \"key\")\ndb.create_index(\"agent_memos\", \"expires_at\")"
      },
      {
        "title": "Session Log Schema",
        "body": "db.create_table(\"session_logs\", {\n    \"id\": \"INTEGER PRIMARY KEY AUTOINCREMENT\",\n    \"session_id\": \"TEXT NOT NULL\",\n    \"agent\": \"TEXT NOT NULL\",\n    \"message\": \"TEXT\",\n    \"metadata\": \"TEXT\",                   # JSON\n    \"created_at\": \"TEXT DEFAULT CURRENT_TIMESTAMP\"\n})\n\ndb.create_index(\"session_logs\", \"session_id\")\ndb.create_index(\"session_logs\", \"created_at\")"
      },
      {
        "title": "Cache Schema (TTL-based)",
        "body": "db.create_table(\"cache\", {\n    \"id\": \"INTEGER PRIMARY KEY AUTOINCREMENT\",\n    \"key\": \"TEXT UNIQUE NOT NULL\",\n    \"value\": \"BLOB\",                      # Supports binary data\n    \"created_at\": \"TEXT DEFAULT CURRENT_TIMESTAMP\",\n    \"expires_at\": \"TEXT NOT NULL\"\n})\n\n# Auto-cleanup expired entries\ndb.query(\"DELETE FROM cache WHERE expires_at < ?\", (datetime.now().isoformat(),))\n\ndb.create_index(\"cache\", \"key\")\ndb.create_index(\"cache\", \"expires_at\")"
      },
      {
        "title": "Connection Pooling",
        "body": "from sqlite_connector import ConnectionPool\n\n# Pool of connections for concurrent access\npool = ConnectionPool(\"agent_data.db\", max_connections=5)\n\n# Get connection\nconn = pool.get_connection()\n# Use conn...\npool.release_connection(conn)"
      },
      {
        "title": "Automatic Backup",
        "body": "# Backup database\ndb.backup(\"agent_data_backup.db\")\n\n# Automatic daily backup\ndb.auto_backup(\"backups/\", \"daily\")"
      },
      {
        "title": "Schema Migration",
        "body": "# Add column if not exists\ndb.add_column(\"memos\", \"updated_at\", \"TEXT DEFAULT CURRENT_TIMESTAMP\")\n\n# Migrate data\ndb.migrate(\"memos\", {\n    \"old_column\": \"new_column\"\n})"
      },
      {
        "title": "Typical Performance",
        "body": "Operation | Rows | Time (In-Memory) | Time (Disk)\nInsert | 10,000 | 0.05s | 0.3s\nSelect (indexed) | 10,000 | 0.001s | 0.01s\nSelect (full scan) | 10,000 | 0.05s | 0.5s\nUpdate | 1,000 | 0.01s | 0.1s\nDelete | 1,000 | 0.01s | 0.1s"
      },
      {
        "title": "Memory Usage",
        "body": "Base Memory: 2-5MB\nWith 100K rows: ~10-15MB\nWith 1M rows: ~50-100MB\nIn-memory mode: Same as data size + overhead"
      },
      {
        "title": "1. Choose the Right Mode",
        "body": "# Use :memory: for temporary operations\ntemp_db = SQLiteDB(\":memory:\")\n\n# Use file DB for persistent storage\npersist_db = SQLiteDB(\"agent_storage.db\")"
      },
      {
        "title": "2. Use Proper Indexes",
        "body": "# Always index columns used in WHERE clauses\ndb.create_index(\"table\", \"column_name\")\n\n# Index multiple columns for composite queries\ndb.create_index(\"table\", \"col1, col2\")"
      },
      {
        "title": "3. Batch Operations",
        "body": "# Instead of individual inserts:\nfor row in rows:\n    db.insert(\"table\", row)  # Slow!\n\n# Use batch insert:\ndb.batch_insert(\"table\", rows)  # Fast!"
      },
      {
        "title": "4. Use TTL for Expiring Data",
        "body": "# Auto-cleanup old data\ndb.cleanup_expired(\"cache\", \"expires_at\")\ndb.cleanup_old(\"logs\", \"created_at\", days=7)"
      },
      {
        "title": "5. Compact Database Periodically",
        "body": "# Reclaim space after many deletes\ndb.vacuum()  # Should be run during downtime"
      },
      {
        "title": "DuckDB Alternative (Analytics)",
        "body": "For analytical queries (aggregations, joins on large datasets), consider DuckDB:\n\nimport duckdb\n\nconn = duckdb.connect(\":memory:\")\n\n# Faster than SQLite for complex analytics\nconn.execute(\"\"\"\n    SELECT COUNT(*) as rows,\n           AVG(value) as avg_value\n    FROM large_table\n\"\"\").fetchall()\n\nWhen to use DuckDB:\n\nAnalytics on large datasets (>100M rows)\nComplex aggregations and joins\nColumnar data operations\nStatistical analysis\n\nWhen to use SQLite:\n\nTransactional operations\nSmall to medium datasets (<100M rows)\nPoint queries and updates\nGeneral-purpose storage"
      },
      {
        "title": "1. Memo Storage",
        "body": "def save_memo(db, agent_id, key, value, ttl_hours=24):\n    expires_at = (datetime.now() + timedelta(hours=ttl_hours)).isoformat()\n    db.insert(\"agent_memos\", {\n        \"agent_id\": agent_id,\n        \"key\": key,\n        \"value\": json.dumps(value),\n        \"expires_at\": expires_at\n    })"
      },
      {
        "title": "2. Session Persistence",
        "body": "def save_session(db, session_id, agent, message, metadata=None):\n    db.insert(\"session_logs\", {\n        \"session_id\": session_id,\n        \"agent\": agent,\n        \"message\": message,\n        \"metadata\": json.dumps(metadata) if metadata else None\n    })"
      },
      {
        "title": "3. Caching Layer",
        "body": "def cache_get(db, key):\n    # Only return entries that have not yet expired\n    if row := db.query_one(\n        \"SELECT value FROM cache WHERE key = ? AND expires_at > ?\",\n        (key, datetime.now().isoformat())\n    ):\n        return json.loads(row)\n    return None\n\ndef cache_set(db, key, value, ttl_seconds=3600):\n    expires_at = (datetime.now() + timedelta(seconds=ttl_seconds)).isoformat()\n    db.insert_or_replace(\"cache\", {\n        \"key\": key,\n        \"value\": json.dumps(value),\n        \"expires_at\": expires_at\n    })"
      },
      {
        "title": "Error Handling",
        "body": "try:\n    db.insert(\"metrics\", {...})\nexcept sqlite3.IntegrityError:\n    # Duplicate key violation\n    pass\nexcept sqlite3.OperationalError:\n    # Table doesn't exist or database locked\n    pass"
      },
      {
        "title": "Reduce Storage",
        "body": "Use appropriate data types:\n\nINTEGER instead of TEXT for numbers\nREAL instead of TEXT for floats\nCHECK constraints for validation\n\nAvoid redundant data:\n\nStore JSON as TEXT\nUse TEXT for variable-length strings\nDo not duplicate the same data across tables\n\nVacuum regularly:\ndb.vacuum()  # Reclaims space after deletes\n\nUse WAL instead of the rollback journal:\nconn.execute(\"PRAGMA journal_mode=WAL\")"
      },
      {
        "title": "From JSON Files",
        "body": "# Load JSON into SQLite\nimport json\n\nwith open(\"data.json\") as f:\n    data = json.load(f)\n\ndb.create_table(\"json_data\", {key: \"TEXT\" for key in data[0].keys()})\ndb.batch_insert(\"json_data\", data)"
      },
      {
        "title": "From CSV Files",
        "body": "import sqlite3\n\nimport pandas as pd\n\nconn = sqlite3.connect(\"agent_data.db\")\ndf = pd.read_csv(\"data.csv\")\ndf.to_sql(\"csv_data\", conn, if_exists=\"replace\", index=False)"
      },
      {
        "title": "Database Locked Error",
        "body": "# Use WAL mode for concurrent access\nconn.execute(\"PRAGMA journal_mode=WAL\")\n\n# Or use connection pool\npool = ConnectionPool(\"db.db\", timeout=5.0)"
      },
      {
        "title": "Slow Queries",
        "body": "# Check query plan\nplan = conn.execute(\"EXPLAIN QUERY PLAN SELECT * FROM ...\").fetchall()\n\n# Add indexes\ndb.create_index(\"table\", \"column\")\n\n# Use ANALYZE\nconn.execute(\"ANALYZE\")"
      },
      {
        "title": "Large Database Size",
        "body": "# Check size info (each PRAGMA must be queried separately)\npage_count = conn.execute(\"PRAGMA page_count\").fetchone()[0]\npage_size = conn.execute(\"PRAGMA page_size\").fetchone()[0]\nprint(f\"Size: {(page_count * page_size) / (1024*1024):.2f} MB\")\n\n# Vacuum to reclaim space\ndb.vacuum()"
      },
      {
        "title": "CLI Tool",
        "body": "The bundled sqlite_cli.py provides command-line access:\n\n# Create database\npython scripts/sqlite_cli.py create agent_data.db\n\n# Add table\npython scripts/sqlite_cli.py create-table agent_memos -c id:INTEGER:P -c title:TEXT -c content:TEXT\n\n# Insert data\npython scripts/sqlite_cli.py insert agent_memos '{\"title\": \"Test\", \"content\": \"Hello\"}'\n\n# Query data\npython scripts/sqlite_cli.py query \"SELECT * FROM agent_memos\"\n\n# Optimize\npython scripts/sqlite_cli.py optimize agent_data.db"
      },
      {
        "title": "Resources",
        "body": "SQLite Documentation: https://www.sqlite.org/docs.html\nPython sqlite3: https://docs.python.org/3/library/sqlite3.html\nDuckDB: https://duckdb.org/docs/\nPerformance: https://www.sqlite.org/optoverview.html"
      }
    ]
  },
  "trust": {
    "sourceLabel": "tencent",
    "provenanceUrl": "https://clawhub.ai/omprasad122007-rgb/lite-sqlite",
    "publisherUrl": "https://clawhub.ai/omprasad122007-rgb/lite-sqlite",
    "owner": "omprasad122007-rgb",
    "version": "1.0.0",
    "license": null,
    "verificationStatus": "Indexed source record"
  },
  "links": {
    "detailUrl": "https://openagent3.xyz/skills/lite-sqlite",
    "downloadUrl": "https://openagent3.xyz/downloads/lite-sqlite",
    "agentUrl": "https://openagent3.xyz/skills/lite-sqlite/agent",
    "manifestUrl": "https://openagent3.xyz/skills/lite-sqlite/agent.json",
    "briefUrl": "https://openagent3.xyz/skills/lite-sqlite/agent.md"
  }
}