Requirements
- Target platform: OpenClaw
- Install method: Manual import
- Extraction: Extract archive
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
How to use the coala-client CLI for chat with LLMs, MCP servers, and skills. Use when the user asks how to use coala, run coala chat, add MCP servers, import...
Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.
Part of the coala ecosystem. A CLI for chat with OpenAI-compatible LLMs (OpenAI, Gemini, Ollama) and MCP (Model Context Protocol) servers. Supports importing CWL toolsets as MCP servers and importing skills.
MCP config and toolsets: ~/.config/coala/mcps/
- mcp_servers.json → server definitions
- <toolset>/ → per-toolset dirs with run_mcp.py and CWL files
Skills: ~/.config/coala/skills/ (one subfolder per imported source)
Env: ~/.config/coala/env (optional; key=value for providers and MCP env)
Init (first time): coala init → creates ~/.config/coala/mcps/mcp_servers.json and env. Set an API key, e.g. export OPENAI_API_KEY=... or export GEMINI_API_KEY=... (Ollama needs no key).
Chat: coala or coala chat → interactive chat with MCP tools. coala ask "question" → single prompt with MCP.
Options: -p, --provider (openai|gemini|ollama|custom), -m, --model, --no-mcp.
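A minimal first-run sketch of the steps above, assuming coala is already installed; the question text is illustrative and the key value is a placeholder:

```shell
# One-time setup: creates ~/.config/coala/mcps/mcp_servers.json and env
coala init

# Provide a provider key (Ollama needs none)
export OPENAI_API_KEY=...

# Interactive chat with MCP tools, or a one-shot prompt
coala chat
coala ask "What tools are available?" -p openai
```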
No API key is needed for MCP import, list, or call; a key is only needed for chat/ask with an LLM.
Import (creates a toolset under ~/.config/coala/mcps/<TOOLSET>/ and registers the server):
- coala mcp-import <TOOLSET> <SOURCES...> (or the alias coala mcp ...)
- SOURCES: local .cwl files, a .zip, or http(s) URLs to a .cwl or .zip.
- Requires the coala package where the MCP server runs (for run_mcp.py).
List:
- coala mcp-list → list server names.
- coala mcp-list <SERVER_NAME> → print each tool's schema (name, description, inputSchema).
Call:
- coala mcp-call <SERVER>.<TOOL> --args '<JSON>'
- Example: coala mcp-call gene-variant.ncbi_datasets_gene --args '{"data": [{"gene": "TP53", "taxon": "human"}]}'
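The import → list → call flow above, sketched end to end; the source URL is a placeholder, and the final call reuses the documented example:

```shell
# Import CWL sources as a toolset named "gene-variant" (URL is a placeholder)
coala mcp-import gene-variant https://example.org/toolsets/gene-variant.zip

# Confirm the server is registered, then inspect its tool schemas
coala mcp-list
coala mcp-list gene-variant

# Call one tool with JSON arguments (example from the docs above)
coala mcp-call gene-variant.ncbi_datasets_gene \
  --args '{"data": [{"gene": "TP53", "taxon": "human"}]}'
```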
Import (into ~/.config/coala/skills/, one subfolder per source): coala skill <SOURCES...>
- SOURCES: a GitHub tree URL (e.g. https://github.com/owner/repo/tree/main/skills), a zip URL, or a local zip/dir.
In chat:
- /skill → list installed skills.
- /skill <name> → load the skill from ~/.config/coala/skills/<name>/ (e.g. SKILL.md) into context.
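A short sketch of skill import and loading; the repo URL and the skill name "my-skill" are hypothetical:

```shell
# Import every skill under a GitHub tree URL (owner/repo are placeholders)
coala skill https://github.com/owner/repo/tree/main/skills

# Then, inside `coala chat`:
#   /skill            list installed skills
#   /skill my-skill   load ~/.config/coala/skills/my-skill/ into context
```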
/help, /exit, /quit, /clear
/tools → list MCP tools
/servers → list connected MCP servers
/skill → list skills; /skill <name> → load a skill
/model → show model info
/switch <provider> → switch provider
All off: coala --no-mcp (or coala ask "..." --no-mcp).
One server off: remove its entry from ~/.config/coala/mcps/mcp_servers.json.
On: the default whenever --no-mcp is not used; add or restore servers in mcp_servers.json.
- Set the provider via -p or the PROVIDER env var.
- Set keys and URLs per provider (e.g. OPENAI_API_KEY, GEMINI_API_KEY, OLLAMA_BASE_URL).
- Optional: put vars in ~/.config/coala/env.
- coala config → print current config paths and provider/model info.
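A sketch of ~/.config/coala/env using the variables named above; it is a config fragment, and all values are placeholders:

```shell
# ~/.config/coala/env  (key=value; read for providers and MCP env)
PROVIDER=ollama
OLLAMA_BASE_URL=http://localhost:11434
# OPENAI_API_KEY=...
# GEMINI_API_KEY=...
```

Run coala config afterward to confirm which paths and provider/model settings are in effect.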
Categories: code helpers, APIs, CLIs, browser automation, testing, and developer operations.