Tencent SkillHub · AI

NodeTool

Visual AI workflow builder - ComfyUI meets n8n for LLM agents, RAG pipelines, and multimodal data flows. Local-first, open source (AGPL-3.0).



Install for OpenClaw

Quick setup
  1. Download the package from Yavira.
  2. Extract the archive and review SKILL.md first.
  3. Import or place the package into your OpenClaw setup.

Requirements

Target platform
OpenClaw
Install method
Manual import
Extraction
Extract archive
Prerequisites
OpenClaw
Primary doc
SKILL.md

Package facts

Download mode
Yavira redirect
Package format
ZIP package
Source platform
Tencent SkillHub
What's included
package.json, SKILL.md

Validation

  • Use the Yavira download entry.
  • Review SKILL.md after the package is downloaded.
  • Confirm the extracted package contains the expected setup assets.

Install with your agent

Agent handoff

Hand the extracted package to your coding agent with a concrete install brief rather than working out the steps manually.

  1. Download the package from Yavira.
  2. Extract it into a folder your agent can access.
  3. Paste one of the prompts below and point your agent at the extracted folder.
New install

I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.

Upgrade existing

I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.

Trust & source

Release facts

Source
Tencent SkillHub
Verification
Indexed source record
Version
0.6.3

Documentation

Primary doc: SKILL.md (24 sections)

NodeTool

Visual AI workflow builder combining ComfyUI's node-based flexibility with n8n's automation power. Build LLM agents, RAG pipelines, and multimodal data flows on your local machine.

Quick Start

# See system info
nodetool info
# List workflows
nodetool workflows list
# Run a workflow interactively
nodetool run <workflow_id>
# Start the chat interface
nodetool chat
# Start the web server
nodetool serve

Linux / macOS

Quick one-line installation:

curl -fsSL https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.sh | bash

With a custom directory (use bash -s -- to pass arguments through the pipe):

curl -fsSL https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.sh | bash -s -- --prefix ~/.nodetool

Non-interactive mode (automatic, no prompts). Both scripts support silent installation:

# Linux/macOS - use -y
curl -fsSL https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.sh | bash -s -- -y

# Windows - use -Yes (download the script first so the flag can be passed)
irm https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.ps1 -OutFile install.ps1; .\install.ps1 -Yes

What happens in non-interactive mode:
  • All confirmation prompts are skipped automatically
  • Installation proceeds without requiring user input
  • Well suited to CI/CD pipelines and automated setups

Windows

Quick one-line installation:

irm https://raw.githubusercontent.com/nodetool-ai/nodetool/refs/heads/main/install.ps1 | iex

With a custom directory:

.\install.ps1 -Prefix "C:\nodetool"

Non-interactive mode:

.\install.ps1 -Yes

Workflows

Manage and execute NodeTool workflows:

# List all workflows (user + example)
nodetool workflows list
# Get details for a specific workflow
nodetool workflows get <workflow_id>
# Run workflow by ID
nodetool run <workflow_id>
# Run workflow from file
nodetool run workflow.json
# Run with JSONL output (for automation)
nodetool run <workflow_id> --jsonl

Run Options

Execute workflows in different modes:

# Interactive mode (default) - pretty output
nodetool run workflow_abc123
# JSONL mode - streaming JSON for subprocess use
nodetool run workflow_abc123 --jsonl
# Stdin mode - pipe RunJobRequest JSON
echo '{"workflow_id":"abc","user_id":"1","auth_token":"token","params":{}}' | nodetool run --stdin --jsonl
# With custom user ID
nodetool run workflow_abc123 --user-id "custom_user_id"
# With auth token
nodetool run workflow_abc123 --auth-token "my_auth_token"
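For stdin mode, hand-writing the JSON inside an echo gets fragile once values contain quotes. A minimal sketch of building the same RunJobRequest payload with printf — the field names mirror the example above; the workflow ID, user ID, and token are placeholders:

```shell
# Build a RunJobRequest payload for --stdin mode.
# Field names mirror the stdin example above; the values are placeholders.
payload=$(printf '{"workflow_id":"%s","user_id":"%s","auth_token":"%s","params":{}}' \
  "workflow_abc123" "1" "my_auth_token")
echo "$payload"
# Then pipe it in (requires a NodeTool installation, so left commented out):
# echo "$payload" | nodetool run --stdin --jsonl
```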

Assets

Manage workflow assets (nodes, models, files):

# List all assets
nodetool assets list
# Get asset details
nodetool assets get <asset_id>

Packages

Manage NodeTool packages (export workflows, generate docs):

# List packages
nodetool package list
# Generate documentation
nodetool package docs
# Generate node documentation
nodetool package node-docs
# Generate workflow documentation (Jekyll)
nodetool package workflow-docs
# Scan directory for nodes and create package
nodetool package scan
# Initialize new package project
nodetool package init

Jobs

Manage background job executions:

# List jobs for a user
nodetool jobs list
# Get job details
nodetool jobs get <job_id>
# Get job logs
nodetool jobs logs <job_id>
# Start background job for workflow
nodetool jobs start <workflow_id>

Deployment

Deploy NodeTool to cloud platforms (RunPod, GCP, Docker):

# Initialize deployment.yaml
nodetool deploy init
# List deployments
nodetool deploy list
# Add new deployment
nodetool deploy add
# Apply deployment configuration
nodetool deploy apply
# Check deployment status
nodetool deploy status <deployment_name>
# View deployment logs
nodetool deploy logs <deployment_name>
# Destroy deployment
nodetool deploy destroy <deployment_name>
# Manage collections on deployed instance
nodetool deploy collections
# Manage database on deployed instance
nodetool deploy database
# Manage workflows on deployed instance
nodetool deploy workflows
# See what changes will be made
nodetool deploy plan

Model Management

Discover and manage AI models (HuggingFace, Ollama):

# List cached HuggingFace models by type
nodetool model list-hf <hf_type>
# List all HuggingFace cache entries
nodetool model list-hf-all
# List supported HF types
nodetool model hf-types
# Inspect HuggingFace cache
nodetool model hf-cache
# Scan cache for info
nodetool admin scan-cache

Admin

Maintain model caches and clean up:

# Calculate total cache size
nodetool admin cache-size
# Delete HuggingFace model from cache
nodetool admin delete-hf <model_name>
# Download HuggingFace models with progress
nodetool admin download-hf <model_name>
# Download Ollama models
nodetool admin download-ollama <model_name>

Chat & Server

Interactive chat and web interface:

# Start CLI chat
nodetool chat
# Start chat server (WebSocket + SSE)
nodetool chat-server
# Start FastAPI backend server
nodetool serve --host 0.0.0.0 --port 8000
# With static assets folder
nodetool serve --static-folder ./static --apps-folder ./apps
# Development mode with auto-reload
nodetool serve --reload
# Production mode
nodetool serve --production

Proxy

Start reverse proxy with HTTPS:

# Start proxy server
nodetool proxy
# Check proxy status
nodetool proxy-status
# Validate proxy config
nodetool proxy-validate-config
# Run proxy daemon with ACME HTTP + HTTPS
nodetool proxy-daemon

Other Commands

# View settings and secrets
nodetool settings show
# Generate custom HTML app for workflow
nodetool vibecoding
# Run workflow and export as Python DSL
nodetool dsl-export
# Export workflow as Gradio app
nodetool gradio-export
# Regenerate DSL
nodetool codegen
# Manage database migrations
nodetool migrations
# Synchronize database with remote
nodetool sync

Workflow Execution

Run a NodeTool workflow and get structured output:

# Run workflow interactively
nodetool run my_workflow_id
# Run and stream JSONL output
nodetool run my_workflow_id --jsonl | jq -r '.[] | "\(.status) | \(.output)"'
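When jq is not installed, the JSONL stream can still be post-processed with standard tools. A minimal sketch — the sample lines and their status/output fields are illustrative stand-ins modeled on the jq filter above, not captured NodeTool output:

```shell
# Sketch: extract the "status" field from a JSONL stream without jq.
# The sample lines are illustrative stand-ins for `nodetool run --jsonl` output.
jsonl='{"status":"running","output":null}
{"status":"completed","output":"done"}'
# sed pulls out the quoted value after "status": on each line.
statuses=$(printf '%s\n' "$jsonl" | sed -n 's/.*"status":"\([^"]*\)".*/\1/p')
echo "$statuses"
```

This only works for flat, one-object-per-line records; for anything nested, install jq.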

Package Creation

Generate documentation for a custom package:

# Scan for nodes and create package
nodetool package scan
# Generate complete documentation
nodetool package docs

Deployment

Deploy a NodeTool instance to the cloud:

# Initialize deployment config
nodetool deploy init
# Add RunPod deployment
nodetool deploy add
# Deploy and start
nodetool deploy apply

Model Management

Check and manage cached AI models:

# List all available models
nodetool model list-hf-all
# Inspect cache
nodetool model hf-cache
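For a rough, CLI-independent view of disk usage, du over the cache directory gives a number comparable to `nodetool admin cache-size`. A sketch assuming the installer's default HuggingFace cache path (the directory is created here only so the sketch runs on a fresh machine):

```shell
# Approximate `nodetool admin cache-size` with du.
# Assumes the installer's default cache path; adjust for --prefix installs.
HF_CACHE="${HF_HOME:-$HOME/.nodetool/cache/huggingface}"
mkdir -p "$HF_CACHE"   # ensure the directory exists for this sketch
du -sh "$HF_CACHE"
```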


What Gets Installed

The installer sets up:
  • micromamba — Python package manager (conda replacement)
  • NodeTool environment — conda env at ~/.nodetool/env
  • Python packages — nodetool-core and nodetool-base from the NodeTool registry
  • Wrapper scripts — the nodetool CLI, available from any terminal

Environment Setup

After installation, these variables are automatically configured:

# Conda environment
export MAMBA_ROOT_PREFIX="$HOME/.nodetool/micromamba"
export PATH="$HOME/.nodetool/env/bin:$HOME/.nodetool/env/Library/bin:$PATH"
# Model cache directories
export HF_HOME="$HOME/.nodetool/cache/huggingface"
export OLLAMA_MODELS="$HOME/.nodetool/cache/ollama"
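A quick sanity check that the cache variables point where you expect — the paths below mirror the defaults above, and the fallbacks only apply if your shell profile has not been sourced yet; adjust if you installed with --prefix:

```shell
# Fall back to the installer defaults if the profile has not been sourced.
export HF_HOME="${HF_HOME:-$HOME/.nodetool/cache/huggingface}"
export OLLAMA_MODELS="${OLLAMA_MODELS:-$HOME/.nodetool/cache/ollama}"
# Print each variable so a typo or empty value is immediately visible.
echo "HF_HOME=$HF_HOME"
echo "OLLAMA_MODELS=$OLLAMA_MODELS"
```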

System Info

Check NodeTool environment and installed packages:

nodetool info

Output shows:
  • Version
  • Python version
  • Platform/architecture
  • Installed AI packages (OpenAI, Anthropic, Google, HF, Ollama, fal-client)
  • Environment variables
  • API key status

Category context

Agent frameworks, memory systems, reasoning layers, and model-native orchestration.

Source: Tencent SkillHub

Largest current source with strong distribution and engagement signals.

Package contents

Included in package
1 doc, 1 config
  • SKILL.md Primary doc
  • package.json Config