{
  "schemaVersion": "1.0",
  "item": {
    "slug": "nodetool",
    "name": "Nodetool",
    "source": "tencent",
    "type": "skill",
    "category": "AI 智能",
    "sourceUrl": "https://clawhub.ai/georgi/nodetool",
    "canonicalUrl": "https://clawhub.ai/georgi/nodetool",
    "targetPlatform": "OpenClaw"
  },
  "install": {
    "downloadMode": "redirect",
    "downloadUrl": "/downloads/nodetool",
    "sourceDownloadUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=nodetool",
    "sourcePlatform": "tencent",
    "targetPlatform": "OpenClaw",
    "installMethod": "Manual import",
    "extraction": "Extract archive",
    "prerequisites": [
      "OpenClaw"
    ],
    "packageFormat": "ZIP package",
    "includedAssets": [
      "package.json",
      "SKILL.md"
    ],
    "primaryDoc": "SKILL.md",
    "quickSetup": [
      "Download the package from Yavira.",
      "Extract the archive and review SKILL.md first.",
      "Import or place the package into your OpenClaw setup."
    ],
    "agentAssist": {
      "summary": "Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.",
      "steps": [
        "Download the package from Yavira.",
        "Extract it into a folder your agent can access.",
        "Paste one of the prompts below and point your agent at the extracted folder."
      ],
      "prompts": [
        {
          "label": "New install",
          "body": "I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete."
        },
        {
          "label": "Upgrade existing",
          "body": "I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run."
        }
      ]
    },
    "sourceHealth": {
      "source": "tencent",
      "status": "healthy",
      "reason": "direct_download_ok",
      "recommendedAction": "download",
      "checkedAt": "2026-04-30T16:55:25.780Z",
      "expiresAt": "2026-05-07T16:55:25.780Z",
      "httpStatus": 200,
      "finalUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=network",
      "contentType": "application/zip",
      "probeMethod": "head",
      "details": {
        "probeUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=network",
        "contentDisposition": "attachment; filename=\"network-1.0.0.zip\"",
        "redirectLocation": null,
        "bodySnippet": null
      },
      "scope": "source",
      "summary": "Source download looks usable.",
      "detail": "Yavira can redirect you to the upstream package for this source.",
      "primaryActionLabel": "Download for OpenClaw",
      "primaryActionHref": "/downloads/nodetool"
    },
    "validation": {
      "installChecklist": [
        "Use the Yavira download entry.",
        "Review SKILL.md after the package is downloaded.",
        "Confirm the extracted package contains the expected setup assets."
      ],
      "postInstallChecks": [
        "Confirm the extracted package includes the expected docs or setup files.",
        "Validate the skill or prompts are available in your target agent workspace.",
        "Capture any manual follow-up steps the agent could not complete."
      ]
    },
    "downloadPageUrl": "https://openagent3.xyz/downloads/nodetool",
    "agentPageUrl": "https://openagent3.xyz/skills/nodetool/agent",
    "manifestUrl": "https://openagent3.xyz/skills/nodetool/agent.json",
    "briefUrl": "https://openagent3.xyz/skills/nodetool/agent.md"
  },
  "agentAssist": {
    "summary": "Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.",
    "steps": [
      "Download the package from Yavira.",
      "Extract it into a folder your agent can access.",
      "Paste one of the prompts below and point your agent at the extracted folder."
    ],
    "prompts": [
      {
        "label": "New install",
        "body": "I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete."
      },
      {
        "label": "Upgrade existing",
        "body": "I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run."
      }
    ]
  },
  "documentation": {
    "source": "clawhub",
    "primaryDoc": "SKILL.md",
    "sections": [
      {
        "title": "NodeTool",
        "body": "Visual AI workflow builder combining ComfyUI's node-based flexibility with n8n's automation power. Build LLM agents, RAG pipelines, and multimodal data flows on your local machine."
      },
      {
        "title": "Quick Start",
        "body": "# See system info\nnodetool info\n\n# List workflows\nnodetool workflows list\n\n# Run a workflow interactively\nnodetool run <workflow_id>\n\n# Start of chat interface\nnodetool chat\n\n# Start of web server\nnodetool serve"
      },
      {
        "title": "Linux / macOS",
        "body": "Quick one-line installation:\n\ncurl -fsSL https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.sh | bash\n\nWith custom directory:\n\ncurl -fsSL https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.sh | bash --prefix ~/.nodetool\n\nNon-interactive mode (automatic, no prompts):\n\nBoth scripts support silent installation:\n\n# Linux/macOS - use -y\ncurl -fsSL https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.sh | bash -y\n\n# Windows - use -Yes\nirm https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.ps1 | iex; .\\install.ps1 -Yes\n\nWhat happens with non-interactive mode:\n\nAll confirmation prompts are skipped automatically\nInstallation proceeds without requiring user input\nPerfect for CI/CD pipelines or automated setups"
      },
      {
        "title": "Windows",
        "body": "Quick one-line installation:\n\nirm https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.ps1 | iex\n\nWith custom directory:\n\n.\\install.ps1 -Prefix \"C:\\nodetool\"\n\nNon-interactive mode:\n\n.\\install.ps1 -Yes"
      },
      {
        "title": "Workflows",
        "body": "Manage and execute NodeTool workflows:\n\n# List all workflows (user + example)\nnodetool workflows list\n\n# Get details for a specific workflow\nnodetool workflows get <workflow_id>\n\n# Run workflow by ID\nnodetool run <workflow_id>\n\n# Run workflow from file\nnodetool run workflow.json\n\n# Run with JSONL output (for automation)\nnodetool run <workflow_id> --jsonl"
      },
      {
        "title": "Run Options",
        "body": "Execute workflows in different modes:\n\n# Interactive mode (default) - pretty output\nnodetool run workflow_abc123\n\n# JSONL mode - streaming JSON for subprocess use\nnodetool run workflow_abc123 --jsonl\n\n# Stdin mode - pipe RunJobRequest JSON\necho '{\"workflow_id\":\"abc\",\"user_id\":\"1\",\"auth_token\":\"token\",\"params\":{}}' | nodetool run --stdin --jsonl\n\n# With custom user ID\nnodetool run workflow_abc123 --user-id \"custom_user_id\"\n\n# With auth token\nnodetool run workflow_abc123 --auth-token \"my_auth_token\""
      },
      {
        "title": "Assets",
        "body": "Manage workflow assets (nodes, models, files):\n\n# List all assets\nnodetool assets list\n\n# Get asset details\nnodetool assets get <asset_id>"
      },
      {
        "title": "Packages",
        "body": "Manage NodeTool packages (export workflows, generate docs):\n\n# List packages\nnodetool package list\n\n# Generate documentation\nnodetool package docs\n\n# Generate node documentation\nnodetool package node-docs\n\n# Generate workflow documentation (Jekyll)\nnodetool package workflow-docs\n\n# Scan directory for nodes and create package\nnodetool package scan\n\n# Initialize new package project\nnodetool package init"
      },
      {
        "title": "Jobs",
        "body": "Manage background job executions:\n\n# List jobs for a user\nnodetool jobs list\n\n# Get job details\nnodetool jobs get <job_id>\n\n# Get job logs\nnodetool jobs logs <job_id>\n\n# Start background job for workflow\nnodetool jobs start <workflow_id>"
      },
      {
        "title": "Deployment",
        "body": "Deploy NodeTool to cloud platforms (RunPod, GCP, Docker):\n\n# Initialize deployment.yaml\nnodetool deploy init\n\n# List deployments\nnodetool deploy list\n\n# Add new deployment\nnodetool deploy add\n\n# Apply deployment configuration\nnodetool deploy apply\n\n# Check deployment status\nnodetool deploy status <deployment_name>\n\n# View deployment logs\nnodetool deploy logs <deployment_name>\n\n# Destroy deployment\nnodetool deploy destroy <deployment_name>\n\n# Manage collections on deployed instance\nnodetool deploy collections\n\n# Manage database on deployed instance\nnodetool deploy database\n\n# Manage workflows on deployed instance\nnodetool deploy workflows\n\n# See what changes will be made\nnodetool deploy plan"
      },
      {
        "title": "Model Management",
        "body": "Discover and manage AI models (HuggingFace, Ollama):\n\n# List cached HuggingFace models by type\nnodetool model list-hf <hf_type>\n\n# List all HuggingFace cache entries\nnodetool model list-hf-all\n\n# List supported HF types\nnodetool model hf-types\n\n# Inspect HuggingFace cache\nnodetool model hf-cache\n\n# Scan cache for info\nnodetool admin scan-cache"
      },
      {
        "title": "Admin",
        "body": "Maintain model caches and clean up:\n\n# Calculate total cache size\nnodetool admin cache-size\n\n# Delete HuggingFace model from cache\nnodetool admin delete-hf <model_name>\n\n# Download HuggingFace models with progress\nnodetool admin download-hf <model_name>\n\n# Download Ollama models\nnodetool admin download-ollama <model_name>"
      },
      {
        "title": "Chat & Server",
        "body": "Interactive chat and web interface:\n\n# Start CLI chat\nnodetool chat\n\n# Start chat server (WebSocket + SSE)\nnodetool chat-server\n\n# Start FastAPI backend server\nnodetool serve --host 0.0.0.0 --port 8000\n\n# With static assets folder\nnodetool serve --static-folder ./static --apps-folder ./apps\n\n# Development mode with auto-reload\nnodetool serve --reload\n\n# Production mode\nnodetool serve --production"
      },
      {
        "title": "Proxy",
        "body": "Start reverse proxy with HTTPS:\n\n# Start proxy server\nnodetool proxy\n\n# Check proxy status\nnodetool proxy-status\n\n# Validate proxy config\nnodetool proxy-validate-config\n\n# Run proxy daemon with ACME HTTP + HTTPS\nnodetool proxy-daemon"
      },
      {
        "title": "Other Commands",
        "body": "# View settings and secrets\nnodetool settings show\n\n# Generate custom HTML app for workflow\nnodetool vibecoding\n\n# Run workflow and export as Python DSL\nnodetool dsl-export\n\n# Export workflow as Gradio app\nnodetool gradio-export\n\n# Regenerate DSL\nnodetool codegen\n\n# Manage database migrations\nnodetool migrations\n\n# Synchronize database with remote\nnodetool sync"
      },
      {
        "title": "Workflow Execution",
        "body": "Run a NodeTool workflow and get structured output:\n\n# Run workflow interactively\nnodetool run my_workflow_id\n\n# Run and stream JSONL output\nnodetool run my_workflow_id --jsonl | jq -r '.[] | \"\\(.status) | \\(.output)\"'"
      },
      {
        "title": "Package Creation",
        "body": "Generate documentation for a custom package:\n\n# Scan for nodes and create package\nnodetool package scan\n\n# Generate complete documentation\nnodetool package docs"
      },
      {
        "title": "Deployment",
        "body": "Deploy a NodeTool instance to the cloud:\n\n# Initialize deployment config\nnodetool deploy init\n\n# Add RunPod deployment\nnodetool deploy add\n\n# Deploy and start\nnodetool deploy apply"
      },
      {
        "title": "Model Management",
        "body": "Check and manage cached AI models:\n\n# List all available models\nnodetool model list-hf-all\n\n# Inspect cache\nnodetool model hf-cache"
      },
      {
        "title": "What Gets Installed",
        "body": "The installer sets up:\n\nmicromamba — Python package manager (conda replacement)\nNodeTool environment — Conda env at ~/.nodetool/env\nPython packages — nodetool-core, nodetool-base from NodeTool registry\nWrapper scripts — nodetool CLI available from any terminal"
      },
      {
        "title": "Environment Setup",
        "body": "After installation, these variables are automatically configured:\n\n# Conda environment\nexport MAMBA_ROOT_PREFIX=\"$HOME/.nodetool/micromamba\"\nexport PATH=\"$HOME/.nodetool/env/bin:$HOME/.nodetool/env/Library/bin:$PATH\"\n\n# Model cache directories\nexport HF_HOME=\"$HOME/.nodetool/cache/huggingface\"\nexport OLLAMA_MODELS=\"$HOME/.nodetool/cache/ollama\""
      },
      {
        "title": "System Info",
        "body": "Check NodeTool environment and installed packages:\n\nnodetool info\n\nOutput shows:\n\nVersion\nPython version\nPlatform/Architecture\nInstalled AI packages (OpenAI, Anthropic, Google, HF, Ollama, fal-client)\nEnvironment variables\nAPI key status"
      }
    ],
    "body": "NodeTool\n\nVisual AI workflow builder combining ComfyUI's node-based flexibility with n8n's automation power. Build LLM agents, RAG pipelines, and multimodal data flows on your local machine.\n\nQuick Start\n# See system info\nnodetool info\n\n# List workflows\nnodetool workflows list\n\n# Run a workflow interactively\nnodetool run <workflow_id>\n\n# Start of chat interface\nnodetool chat\n\n# Start of web server\nnodetool serve\n\nInstallation\nLinux / macOS\n\nQuick one-line installation:\n\ncurl -fsSL https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.sh | bash\n\n\nWith custom directory:\n\ncurl -fsSL https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.sh | bash --prefix ~/.nodetool\n\n\nNon-interactive mode (automatic, no prompts):\n\nBoth scripts support silent installation:\n\n# Linux/macOS - use -y\ncurl -fsSL https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.sh | bash -y\n\n# Windows - use -Yes\nirm https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.ps1 | iex; .\\install.ps1 -Yes\n\n\nWhat happens with non-interactive mode:\n\nAll confirmation prompts are skipped automatically\nInstallation proceeds without requiring user input\nPerfect for CI/CD pipelines or automated setups\nWindows\n\nQuick one-line installation:\n\nirm https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.ps1 | iex\n\n\nWith custom directory:\n\n.\\install.ps1 -Prefix \"C:\\nodetool\"\n\n\nNon-interactive mode:\n\n.\\install.ps1 -Yes\n\nCore Commands\nWorkflows\n\nManage and execute NodeTool workflows:\n\n# List all workflows (user + example)\nnodetool workflows list\n\n# Get details for a specific workflow\nnodetool workflows get <workflow_id>\n\n# Run workflow by ID\nnodetool run <workflow_id>\n\n# Run workflow from file\nnodetool run workflow.json\n\n# Run with JSONL output (for automation)\nnodetool run <workflow_id> --jsonl\n\nRun 
Options\n\nExecute workflows in different modes:\n\n# Interactive mode (default) - pretty output\nnodetool run workflow_abc123\n\n# JSONL mode - streaming JSON for subprocess use\nnodetool run workflow_abc123 --jsonl\n\n# Stdin mode - pipe RunJobRequest JSON\necho '{\"workflow_id\":\"abc\",\"user_id\":\"1\",\"auth_token\":\"token\",\"params\":{}}' | nodetool run --stdin --jsonl\n\n# With custom user ID\nnodetool run workflow_abc123 --user-id \"custom_user_id\"\n\n# With auth token\nnodetool run workflow_abc123 --auth-token \"my_auth_token\"\n\nAssets\n\nManage workflow assets (nodes, models, files):\n\n# List all assets\nnodetool assets list\n\n# Get asset details\nnodetool assets get <asset_id>\n\nPackages\n\nManage NodeTool packages (export workflows, generate docs):\n\n# List packages\nnodetool package list\n\n# Generate documentation\nnodetool package docs\n\n# Generate node documentation\nnodetool package node-docs\n\n# Generate workflow documentation (Jekyll)\nnodetool package workflow-docs\n\n# Scan directory for nodes and create package\nnodetool package scan\n\n# Initialize new package project\nnodetool package init\n\nJobs\n\nManage background job executions:\n\n# List jobs for a user\nnodetool jobs list\n\n# Get job details\nnodetool jobs get <job_id>\n\n# Get job logs\nnodetool jobs logs <job_id>\n\n# Start background job for workflow\nnodetool jobs start <workflow_id>\n\nDeployment\n\nDeploy NodeTool to cloud platforms (RunPod, GCP, Docker):\n\n# Initialize deployment.yaml\nnodetool deploy init\n\n# List deployments\nnodetool deploy list\n\n# Add new deployment\nnodetool deploy add\n\n# Apply deployment configuration\nnodetool deploy apply\n\n# Check deployment status\nnodetool deploy status <deployment_name>\n\n# View deployment logs\nnodetool deploy logs <deployment_name>\n\n# Destroy deployment\nnodetool deploy destroy <deployment_name>\n\n# Manage collections on deployed instance\nnodetool deploy collections\n\n# Manage database on deployed 
instance\nnodetool deploy database\n\n# Manage workflows on deployed instance\nnodetool deploy workflows\n\n# See what changes will be made\nnodetool deploy plan\n\nModel Management\n\nDiscover and manage AI models (HuggingFace, Ollama):\n\n# List cached HuggingFace models by type\nnodetool model list-hf <hf_type>\n\n# List all HuggingFace cache entries\nnodetool model list-hf-all\n\n# List supported HF types\nnodetool model hf-types\n\n# Inspect HuggingFace cache\nnodetool model hf-cache\n\n# Scan cache for info\nnodetool admin scan-cache\n\nAdmin\n\nMaintain model caches and clean up:\n\n# Calculate total cache size\nnodetool admin cache-size\n\n# Delete HuggingFace model from cache\nnodetool admin delete-hf <model_name>\n\n# Download HuggingFace models with progress\nnodetool admin download-hf <model_name>\n\n# Download Ollama models\nnodetool admin download-ollama <model_name>\n\nChat & Server\n\nInteractive chat and web interface:\n\n# Start CLI chat\nnodetool chat\n\n# Start chat server (WebSocket + SSE)\nnodetool chat-server\n\n# Start FastAPI backend server\nnodetool serve --host 0.0.0.0 --port 8000\n\n# With static assets folder\nnodetool serve --static-folder ./static --apps-folder ./apps\n\n# Development mode with auto-reload\nnodetool serve --reload\n\n# Production mode\nnodetool serve --production\n\nProxy\n\nStart reverse proxy with HTTPS:\n\n# Start proxy server\nnodetool proxy\n\n# Check proxy status\nnodetool proxy-status\n\n# Validate proxy config\nnodetool proxy-validate-config\n\n# Run proxy daemon with ACME HTTP + HTTPS\nnodetool proxy-daemon\n\nOther Commands\n# View settings and secrets\nnodetool settings show\n\n# Generate custom HTML app for workflow\nnodetool vibecoding\n\n# Run workflow and export as Python DSL\nnodetool dsl-export\n\n# Export workflow as Gradio app\nnodetool gradio-export\n\n# Regenerate DSL\nnodetool codegen\n\n# Manage database migrations\nnodetool migrations\n\n# Synchronize database with remote\nnodetool sync\n\nUse 
Cases\nWorkflow Execution\n\nRun a NodeTool workflow and get structured output:\n\n# Run workflow interactively\nnodetool run my_workflow_id\n\n# Run and stream JSONL output\nnodetool run my_workflow_id --jsonl | jq -r '.[] | \"\\(.status) | \\(.output)\"'\n\nPackage Creation\n\nGenerate documentation for a custom package:\n\n# Scan for nodes and create package\nnodetool package scan\n\n# Generate complete documentation\nnodetool package docs\n\nDeployment\n\nDeploy a NodeTool instance to the cloud:\n\n# Initialize deployment config\nnodetool deploy init\n\n# Add RunPod deployment\nnodetool deploy add\n\n# Deploy and start\nnodetool deploy apply\n\nModel Management\n\nCheck and manage cached AI models:\n\n# List all available models\nnodetool model list-hf-all\n\n# Inspect cache\nnodetool model hf-cache\n\nInstallation\nLinux / macOS\n\nQuick one-line installation:\n\ncurl -fsSL https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.sh | bash\n\n\nWith custom directory:\n\ncurl -fsSL https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.sh | bash --prefix ~/.nodetool\n\n\nNon-interactive mode (automatic, no prompts):\n\nBoth scripts support silent installation:\n\n# Linux/macOS - use -y\ncurl -fsSL https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.sh | bash -y\n\n# Windows - use -Yes\nirm https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.ps1 | iex; .\\install.ps1 -Yes\n\n\nWhat happens with non-interactive mode:\n\nAll confirmation prompts are skipped automatically\nInstallation proceeds without requiring user input\nPerfect for CI/CD pipelines or automated setups\nWindows\n\nQuick one-line installation:\n\nirm https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.ps1 | iex\n\n\nWith custom directory:\n\n.\\install.ps1 -Prefix \"C:\\nodetool\"\n\n\nNon-interactive mode:\n\n.\\install.ps1 -Yes\n\nWhat Gets Installed\n\nThe installer 
sets up:\n\nmicromamba — Python package manager (conda replacement)\nNodeTool environment — Conda env at ~/.nodetool/env\nPython packages — nodetool-core, nodetool-base from NodeTool registry\nWrapper scripts — nodetool CLI available from any terminal\nEnvironment Setup\n\nAfter installation, these variables are automatically configured:\n\n# Conda environment\nexport MAMBA_ROOT_PREFIX=\"$HOME/.nodetool/micromamba\"\nexport PATH=\"$HOME/.nodetool/env/bin:$HOME/.nodetool/env/Library/bin:$PATH\"\n\n# Model cache directories\nexport HF_HOME=\"$HOME/.nodetool/cache/huggingface\"\nexport OLLAMA_MODELS=\"$HOME/.nodetool/cache/ollama\"\n\nSystem Info\n\nCheck NodeTool environment and installed packages:\n\nnodetool info\n\n\nOutput shows:\n\nVersion\nPython version\nPlatform/Architecture\nInstalled AI packages (OpenAI, Anthropic, Google, HF, Ollama, fal-client)\nEnvironment variables\nAPI key status"
  },
  "trust": {
    "sourceLabel": "tencent",
    "provenanceUrl": "https://clawhub.ai/georgi/nodetool",
    "publisherUrl": "https://clawhub.ai/georgi/nodetool",
    "owner": "georgi",
    "version": "0.6.3",
    "license": null,
    "verificationStatus": "Indexed source record"
  },
  "links": {
    "detailUrl": "https://openagent3.xyz/skills/nodetool",
    "downloadUrl": "https://openagent3.xyz/downloads/nodetool",
    "agentUrl": "https://openagent3.xyz/skills/nodetool/agent",
    "manifestUrl": "https://openagent3.xyz/skills/nodetool/agent.json",
    "briefUrl": "https://openagent3.xyz/skills/nodetool/agent.md"
  }
}