# Send Ollama Local to your agent
Hand the extracted package to your coding agent with a concrete install brief instead of working through the setup manually.
## Fast path
- Download the package from Yavira.
- Extract it into a folder your agent can access.
- Paste one of the prompts below and point your agent at the extracted folder.
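The extraction step above can be sketched in a few lines of Python. The archive and destination names below are hypothetical — use whatever filename Yavira actually gave you:

```python
import pathlib
import zipfile


def extract_package(archive: str, dest: str) -> list[str]:
    """Unzip a downloaded skill package into dest and return its file list."""
    dest_path = pathlib.Path(dest)
    dest_path.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(dest_path)
        return zf.namelist()


# Hypothetical filename -- match it to your actual download:
# files = extract_package("ollama-local.zip", "skills/ollama-local")
```

After extracting, confirm `SKILL.md` is in the returned file list before pointing your agent at the folder.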
## Suggested prompts
### New install

```text
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.
```
### Upgrade existing

```text
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.
```
## Machine-readable fields
```json
{
  "schemaVersion": "1.0",
  "item": {
    "slug": "ollama-local",
    "name": "Ollama Local",
    "source": "tencent",
    "type": "skill",
    "category": "通讯协作",
    "sourceUrl": "https://clawhub.ai/Timverhoogt/ollama-local",
    "canonicalUrl": "https://clawhub.ai/Timverhoogt/ollama-local",
    "targetPlatform": "OpenClaw"
  },
  "install": {
    "downloadUrl": "/downloads/ollama-local",
    "sourceDownloadUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=ollama-local",
    "sourcePlatform": "tencent",
    "targetPlatform": "OpenClaw",
    "packageFormat": "ZIP package",
    "primaryDoc": "SKILL.md",
    "includedAssets": [
      "SKILL.md",
      "references/models.md",
      "scripts/ollama.py",
      "scripts/ollama_tools.py"
    ],
    "downloadMode": "redirect",
    "sourceHealth": {
      "source": "tencent",
      "status": "healthy",
      "reason": "direct_download_ok",
      "recommendedAction": "download",
      "checkedAt": "2026-04-30T16:55:25.780Z",
      "expiresAt": "2026-05-07T16:55:25.780Z",
      "httpStatus": 200,
      "finalUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=network",
      "contentType": "application/zip",
      "probeMethod": "head",
      "details": {
        "probeUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=network",
        "contentDisposition": "attachment; filename=\"network-1.0.0.zip\"",
        "redirectLocation": null,
        "bodySnippet": null
      },
      "scope": "source",
      "summary": "Source download looks usable.",
      "detail": "Yavira can redirect you to the upstream package for this source.",
      "primaryActionLabel": "Download for OpenClaw",
      "primaryActionHref": "/downloads/ollama-local"
    },
    "validation": {
      "installChecklist": [
        "Use the Yavira download entry.",
        "Review SKILL.md after the package is downloaded.",
        "Confirm the extracted package contains the expected setup assets."
      ],
      "postInstallChecks": [
        "Confirm the extracted package includes the expected docs or setup files.",
        "Validate the skill or prompts are available in your target agent workspace.",
        "Capture any manual follow-up steps the agent could not complete."
      ]
    }
  },
  "links": {
    "detailUrl": "https://openagent3.xyz/skills/ollama-local",
    "downloadUrl": "https://openagent3.xyz/downloads/ollama-local",
    "agentUrl": "https://openagent3.xyz/skills/ollama-local/agent",
    "manifestUrl": "https://openagent3.xyz/skills/ollama-local/agent.json",
    "briefUrl": "https://openagent3.xyz/skills/ollama-local/agent.md"
  }
}
```
## Documentation

### Ollama Local

Work with local Ollama models for inference, embeddings, and tool use.

### Configuration

Set your Ollama host (defaults to `http://localhost:11434`):

```shell
export OLLAMA_HOST="http://localhost:11434"
# Or for a remote server:
export OLLAMA_HOST="http://192.168.1.100:11434"
```

### Quick Reference

```shell
# List models
python3 scripts/ollama.py list

# Pull a model
python3 scripts/ollama.py pull llama3.1:8b

# Remove a model
python3 scripts/ollama.py rm modelname

# Show model details
python3 scripts/ollama.py show qwen3:4b

# Chat with a model
python3 scripts/ollama.py chat qwen3:4b "What is the capital of France?"

# Chat with a system prompt
python3 scripts/ollama.py chat llama3.1:8b "Review this code" -s "You are a code reviewer"

# Generate a completion (non-chat)
python3 scripts/ollama.py generate qwen3:4b "Once upon a time"

# Get embeddings
python3 scripts/ollama.py embed bge-m3 "Text to embed"
```

### Model Selection

See `references/models.md` for the full model list and selection guide.

Quick picks:

- Fast answers: `qwen3:4b`
- Coding: `qwen2.5-coder:7b`
- General: `llama3.1:8b`
- Reasoning: `deepseek-r1:8b`
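If you route tasks programmatically, the picks above can be captured as a small lookup table. The model names mirror the list; adjust the table to whatever you actually have pulled:

```python
# Task-to-model quick picks, mirroring the selection guide above.
QUICK_PICKS = {
    "fast": "qwen3:4b",
    "coding": "qwen2.5-coder:7b",
    "general": "llama3.1:8b",
    "reasoning": "deepseek-r1:8b",
}


def pick_model(task_kind: str) -> str:
    """Return a suggested model tag, defaulting to the general-purpose pick."""
    return QUICK_PICKS.get(task_kind, QUICK_PICKS["general"])
```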

### Tool Use

Some local models support function calling. Use `scripts/ollama_tools.py`:

```shell
# Single request with tools
python3 scripts/ollama_tools.py single qwen2.5-coder:7b "What's the weather in Amsterdam?"

# Full tool loop (model calls tools, gets results, responds)
python3 scripts/ollama_tools.py loop qwen3:4b "Search for Python tutorials and summarize"

# Show available example tools
python3 scripts/ollama_tools.py tools
```

Tool-capable models: `qwen2.5-coder`, `qwen3`, `llama3.1`, `mistral`.

### OpenClaw Sub-Agents

Spawn local model sub-agents with `sessions_spawn`:

```python
# Example: spawn a coding agent
sessions_spawn(
    task="Review this Python code for bugs",
    model="ollama/qwen2.5-coder:7b",
    label="code-review",
)
```

Model path format: `ollama/<model-name>`

### Parallel Agents (Think Tank Pattern)

Spawn multiple local agents for collaborative tasks:

```python
agents = [
    {"label": "architect", "model": "ollama/gemma3:12b", "task": "Design the system architecture"},
    {"label": "coder", "model": "ollama/qwen2.5-coder:7b", "task": "Implement the core logic"},
    {"label": "reviewer", "model": "ollama/llama3.1:8b", "task": "Review for bugs and improvements"},
]

for a in agents:
    sessions_spawn(task=a["task"], model=a["model"], label=a["label"])
```

### Direct API

For custom integrations, use the Ollama API directly:

```shell
# Chat
curl "$OLLAMA_HOST/api/chat" -d '{
  "model": "qwen3:4b",
  "messages": [{"role": "user", "content": "Hello"}],
  "stream": false
}'

# Generate
curl "$OLLAMA_HOST/api/generate" -d '{
  "model": "qwen3:4b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'

# List models
curl "$OLLAMA_HOST/api/tags"

# Pull a model
curl "$OLLAMA_HOST/api/pull" -d '{"name": "phi3:mini"}'
```

### Troubleshooting

Connection refused?

- Check that Ollama is running: `ollama serve`
- Verify `OLLAMA_HOST` is correct
- For remote servers, ensure the firewall allows port 11434

Model not loading?

- Check VRAM: larger models may need CPU offload
- Try a smaller model first

Slow responses?

- The model may be running on CPU
- Use a smaller quantization (e.g., `:7b` instead of `:30b`)

OpenClaw sub-agent falls back to the default model?

- Ensure the `ollama:default` auth profile exists in your OpenClaw config
- Check the model path format: `ollama/modelname:tag`

## Trust
- Source: tencent
- Verification: Indexed source record
- Publisher: Timverhoogt
- Version: 1.1.0
## Source health
- Status: healthy
- Source download looks usable.
- Yavira can redirect you to the upstream package for this source.
- Health scope: source
- Reason: direct_download_ok
- Checked at: 2026-04-30T16:55:25.780Z
- Expires at: 2026-05-07T16:55:25.780Z
- Recommended action: Download for OpenClaw
## Links
- [Detail page](https://openagent3.xyz/skills/ollama-local)
- [Send to Agent page](https://openagent3.xyz/skills/ollama-local/agent)
- [JSON manifest](https://openagent3.xyz/skills/ollama-local/agent.json)
- [Markdown brief](https://openagent3.xyz/skills/ollama-local/agent.md)
- [Download page](https://openagent3.xyz/downloads/ollama-local)