# Send Offline Llama to your agent
Hand the extracted package to your coding agent with a concrete install brief instead of working out the installation steps manually.
## Fast path
- Download the package from Yavira.
- Extract it into a folder your agent can access.
- Paste one of the prompts below and point your agent at the extracted folder.
## Suggested prompts
### New install

```text
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.
```
### Upgrade existing

```text
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.
```
## Machine-readable fields
```json
{
  "schemaVersion": "1.0",
  "item": {
    "slug": "offline-llama",
    "name": "Offline Llama",
    "source": "tencent",
    "type": "skill",
    "category": "AI 智能",
    "sourceUrl": "https://clawhub.ai/and-ray-m/offline-llama",
    "canonicalUrl": "https://clawhub.ai/and-ray-m/offline-llama",
    "targetPlatform": "OpenClaw"
  },
  "install": {
    "downloadUrl": "/downloads/offline-llama",
    "sourceDownloadUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=offline-llama",
    "sourcePlatform": "tencent",
    "targetPlatform": "OpenClaw",
    "packageFormat": "ZIP package",
    "primaryDoc": "SKILL.md",
    "includedAssets": [
      "SKILL.md"
    ],
    "downloadMode": "redirect",
    "sourceHealth": {
      "source": "tencent",
      "slug": "offline-llama",
      "status": "healthy",
      "reason": "direct_download_ok",
      "recommendedAction": "download",
      "checkedAt": "2026-05-02T19:37:12.027Z",
      "expiresAt": "2026-05-09T19:37:12.027Z",
      "httpStatus": 200,
      "finalUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=offline-llama",
      "contentType": "application/zip",
      "probeMethod": "head",
      "details": {
        "probeUrl": "https://wry-manatee-359.convex.site/api/v1/download?slug=offline-llama",
        "contentDisposition": "attachment; filename=\"offline-llama-1.0.0.zip\"",
        "redirectLocation": null,
        "bodySnippet": null,
        "slug": "offline-llama"
      },
      "scope": "item",
      "summary": "Item download looks usable.",
      "detail": "Yavira can redirect you to the upstream package for this item.",
      "primaryActionLabel": "Download for OpenClaw",
      "primaryActionHref": "/downloads/offline-llama"
    },
    "validation": {
      "installChecklist": [
        "Use the Yavira download entry.",
        "Review SKILL.md after the package is downloaded.",
        "Confirm the extracted package contains the expected setup assets."
      ],
      "postInstallChecks": [
        "Confirm the extracted package includes the expected docs or setup files.",
        "Validate the skill or prompts are available in your target agent workspace.",
        "Capture any manual follow-up steps the agent could not complete."
      ]
    }
  },
  "links": {
    "detailUrl": "https://openagent3.xyz/skills/offline-llama",
    "downloadUrl": "https://openagent3.xyz/downloads/offline-llama",
    "agentUrl": "https://openagent3.xyz/skills/offline-llama/agent",
    "manifestUrl": "https://openagent3.xyz/skills/offline-llama/agent.json",
    "briefUrl": "https://openagent3.xyz/skills/offline-llama/agent.md"
  }
}
```
## Documentation

### offline-llama

Autonomously manage and use local Ollama models for continuous operation without internet dependency. Includes model health monitoring, automatic fallback, and self-healing capabilities.

### Overview

This skill enables autonomous operation with local Ollama models. It monitors model health, automatically switches between models when issues occur, and maintains functionality even without internet connectivity. The skill includes self-healing capabilities to restart services and clear resources when needed.

### Model Management

- Health Monitoring: Continuously check model availability and performance
- Automatic Fallback: Switch to alternative models when the primary fails
- Model Switching: Dynamically select the best available model for the task
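
The monitor-and-fall-back behavior above can be sketched as a preference-ordered selection. This is a minimal illustration, not the skill's actual implementation: the `is_healthy` probe is a placeholder for whatever check the skill runs against Ollama, and the preference order mirrors the Models list in this document.

```python
from typing import Callable, List, Optional

# Preference-ordered models, mirroring the Models section of this brief.
MODEL_PREFERENCE = [
    "llama-3.1-8b-instruct",  # primary: general tasks
    "mistral-7b-instruct",    # secondary: faster responses
    "code-llama-7b",          # specialized: coding tasks
]

def pick_model(is_healthy: Callable[[str], bool],
               preference: List[str] = MODEL_PREFERENCE) -> Optional[str]:
    """Return the first healthy model in preference order, or None
    if every health check fails (degraded mode)."""
    for model in preference:
        if is_healthy(model):
            return model
    return None
```

For example, if the primary model's health check fails, `pick_model(lambda m: m != "llama-3.1-8b-instruct")` falls back to `mistral-7b-instruct`.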

### Self-Healing

- Service Restart: Automatically restart Ollama when models become unavailable
- Resource Management: Clear cache and temporary files to free resources
- Model Reinstallation: Reinstall problematic models automatically

### Connectivity Awareness

- Internet Detection: Monitor internet connectivity status
- Smart Fallback: Switch to remote models when local models are unavailable and the internet is reachable
- Offline Mode: Maintain full functionality without internet

### Models

- Primary: llama-3.1-8b-instruct (general tasks)
- Secondary: mistral-7b-instruct (faster responses)
- Specialized: code-llama-7b (coding tasks)

### Health Checks

- Model Status: Monitor availability every 30 seconds
- Latency Tracking: Monitor response times every minute
- Resource Usage: Monitor GPU/CPU and memory every 5 minutes
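
The staggered schedule above amounts to tracking, per check, when it last ran. A small sketch of that bookkeeping (the check names are shorthand for this illustration):

```python
from typing import Dict, List

# Intervals in seconds, mirroring the schedule above.
CHECK_INTERVALS = {
    "model_status": 30,   # availability
    "latency": 60,        # response times
    "resources": 300,     # GPU/CPU and memory
}

def checks_due(last_run: Dict[str, float], now: float) -> List[str]:
    """Return the health checks whose interval has elapsed since their
    last run; checks that have never run are always due."""
    return [name for name, interval in CHECK_INTERVALS.items()
            if now - last_run.get(name, float("-inf")) >= interval]
```

A monitoring loop would call `checks_due` each tick, run the returned checks, and record the current time as their new `last_run`.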

### Fallback Strategies

- Model Switching: Automatically switch to alternative local models
- Response Retry: Retry failed requests with exponential backoff
- Degraded Mode: Continue with limited functionality if all models are unavailable
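
Retry with exponential backoff can be sketched as follows; the attempt count and base delay are illustrative defaults, and the injectable `sleep` keeps the helper testable:

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def retry_with_backoff(fn: Callable[[], T], attempts: int = 4,
                       base_delay: float = 0.5,
                       sleep: Callable[[float], None] = time.sleep) -> T:
    """Call fn, retrying on any exception with delays of
    base_delay * 2**attempt; re-raise after the final attempt."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))
```

A request that succeeds on the third try therefore waits `base_delay` and then `2 * base_delay` between attempts.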

### When Internet is Available

- Use local models primarily
- Fall back to remote models if local models are unavailable
- Maintain optimal performance

### When Internet is Unavailable

- Use local models exclusively
- Continue all operations without interruption
- Provide degraded functionality if needed

### Model Management Commands

- `model_status` - Check current model health
- `switch_model` - Manually switch between models
- `restart_ollama` - Restart Ollama service

### Health Monitoring Commands

- `check_health` - Run comprehensive health check
- `monitor_resources` - Monitor system resources
- `clear_cache` - Clear model cache and temporary files

### Automatic Actions

- Service Restart: Triggered when a model becomes unavailable
- Resource Cleanup: Triggered when high memory usage is detected
- Model Reinstallation: Triggered when persistent failures occur
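
The trigger-to-action mapping above can be sketched as a simple decision function. The thresholds here (three consecutive failures, 90% memory) are assumptions for illustration only; the skill's actual thresholds are not documented in this brief.

```python
from typing import Optional

def automatic_action(model_available: bool, memory_pct: float,
                     consecutive_failures: int) -> Optional[str]:
    """Map observed conditions to the automatic actions described above.
    Thresholds are illustrative assumptions, not documented values."""
    if consecutive_failures >= 3:       # persistent failures -> reinstall
        return "reinstall_model"
    if not model_available:             # model unavailable -> restart service
        return "restart_service"
    if memory_pct > 90.0:               # high memory usage -> clear cache
        return "clear_cache"
    return None                         # healthy: nothing to do
```

Ordering matters: persistent failures outrank a single unavailability event, so reinstallation is checked first.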

### Manual Intervention

- Manual Restart: User can manually restart services
- Cache Clearing: User can manually clear resources
- Model Updates: User can update models as needed

### Security Considerations

- All operations performed locally
- No external dependencies required
- Secure model management
- Privacy-preserving by default

### Performance Optimization

- Resource Monitoring: Track GPU/CPU usage and memory
- Latency Tracking: Monitor response times and performance
- Model Selection: Choose optimal model based on task requirements
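
Latency-informed model selection can be sketched with a rolling window of response times per model. The window size is an illustrative assumption:

```python
from collections import defaultdict, deque
from typing import Optional

class LatencyTracker:
    """Keep a rolling window of response times per model and report
    the model with the lowest average latency."""

    def __init__(self, window: int = 20):
        # Each model gets a bounded deque; old samples fall off the left.
        self.samples = defaultdict(lambda: deque(maxlen=window))

    def record(self, model: str, seconds: float) -> None:
        self.samples[model].append(seconds)

    def fastest(self) -> Optional[str]:
        """Model with the lowest mean latency, or None with no data."""
        if not self.samples:
            return None
        return min(self.samples,
                   key=lambda m: sum(self.samples[m]) / len(self.samples[m]))
```

A router could consult `fastest()` when the task has no specialized requirement, while still preferring e.g. a code model for coding tasks.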

### Regular Tasks

- Health Checks: Run periodic health checks
- Cache Management: Clear unused cache regularly
- Model Updates: Keep models updated when possible

### Troubleshooting

- Log Analysis: Monitor logs for issues
- Performance Metrics: Track performance over time
- Error Handling: Graceful error handling and recovery

### Integration

This skill integrates with:

- Ollama: Local model management
- System Resources: Monitor and manage system resources
- Network: Detect internet connectivity
- OpenClaw: Seamless integration with existing tools

### Future Enhancements

- Model Training: Support for custom model training
- Advanced Routing: Intelligent model selection based on task
- Multi-GPU Support: Scale across multiple GPUs
- Cloud Sync: Optional cloud backup and synchronization

### License

This skill is part of the OpenClaw ecosystem and follows the same licensing terms as OpenClaw itself.
## Trust
- Source: tencent
- Verification: Indexed source record
- Publisher: and-ray-m
- Version: 1.0.0
## Source health
- Status: healthy
- Item download looks usable.
- Yavira can redirect you to the upstream package for this item.
- Health scope: item
- Reason: direct_download_ok
- Checked at: 2026-05-02T19:37:12.027Z
- Expires at: 2026-05-09T19:37:12.027Z
- Recommended action: Download for OpenClaw
## Links
- [Detail page](https://openagent3.xyz/skills/offline-llama)
- [Send to Agent page](https://openagent3.xyz/skills/offline-llama/agent)
- [JSON manifest](https://openagent3.xyz/skills/offline-llama/agent.json)
- [Markdown brief](https://openagent3.xyz/skills/offline-llama/agent.md)
- [Download page](https://openagent3.xyz/downloads/offline-llama)