โ† All skills
Tencent SkillHub · Developer Tools

Exa Search (Rust)

Neural web search, similar-page discovery, and URL content fetching via the Exa AI search engine. USE WHEN: user asks to search the web, find articles/repos/...



Install for OpenClaw

Quick setup
  1. Download the package from Yavira.
  2. Extract the archive and review SKILL.md first.
  3. Import or place the package into your OpenClaw setup.

Requirements

Target platform
OpenClaw
Install method
Manual import
Extraction
Extract archive
Prerequisites
OpenClaw
Primary doc
SKILL.md

Package facts

Download mode
Yavira redirect
Package format
ZIP package
Source platform
Tencent SkillHub
What's included
Cargo.toml, README.md, SKILL.md, install.sh, src/client.rs, src/main.rs

Validation

  • Use the Yavira download entry.
  • Review SKILL.md after the package is downloaded.
  • Confirm the extracted package contains the expected setup assets.

Install with your agent

Agent handoff

Hand the extracted package to your coding agent with a concrete install brief, rather than working through the setup manually.

  1. Download the package from Yavira.
  2. Extract it into a folder your agent can access.
  3. Paste one of the prompts below and point your agent at the extracted folder.
New install

I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Then review README.md for any prerequisites, environment setup, or post-install checks. Tell me what you changed and call out any manual steps you could not complete.

Upgrade existing

I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Then review README.md for any prerequisites, environment setup, or post-install checks. Summarize what changed and any follow-up checks I should run.

Trust & source

Release facts

Source
Tencent SkillHub
Verification
Indexed source record
Version
1.0.3

Documentation

Primary doc: SKILL.md (13 sections)

exa-search skill

Use this skill to search the web, find similar pages, or fetch page contents via the Exa AI search engine — fast, neural, and certificate-aware. The skill invokes a native Rust binary (bin/exa-search) via Bash. Run install.sh once to build it.

When to Use / When NOT to Use

✅ USE when:

  • Searching the web for articles, documentation, repos, papers, tools, people, companies
  • Finding recent news or announcements (use livecrawl: "fallback" or "always" for recency)
  • Fetching full text of a known URL without browser automation
  • Finding pages similar to a reference URL (competitor analysis, alternative tools)
  • Any web lookup that isn't a specific tweet or video

❌ NOT FOR:

  • Fetching tweets/X posts → use the fxtwitter skill (Exa can't fetch tweet URLs)
  • Downloading video/audio → use yt-dlp
  • Scraping dynamic or Cloudflare-protected pages → use scrapling
  • Local file/code search → use rg, find, or grep
  • Querying structured APIs (GitHub, weather, etc.) → use their dedicated skills
  • When the query is already a direct URL with known content → prefer the get_contents action

Prerequisites

  • EXA_API_KEY set in ~/.openclaw/workspace/.env (get one at exa.ai)
  • Rust installed (rustup) — only needed for the one-time build
  • bin/exa-search binary present (run bash install.sh to build)
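As a quick sanity check, the first and third items above can be verified before the first run. A minimal sketch — check_prereqs is a hypothetical helper, not part of the skill, and the demo runs it against throwaway fixtures so nothing real is touched:

```shell
# Hypothetical preflight helper: verify the .env key and the built binary.
check_prereqs() {
  env_file=$1; bin=$2; ok=1
  grep -qE '^EXA_API_KEY=' "$env_file" 2>/dev/null \
    || { echo "missing: EXA_API_KEY in $env_file"; ok=0; }
  [ -x "$bin" ] \
    || { echo "missing: binary at $bin (run: bash install.sh)"; ok=0; }
  [ "$ok" -eq 1 ] && echo "prerequisites satisfied"
}

# Demo against temp fixtures:
tmp=$(mktemp -d)
echo 'EXA_API_KEY="demo"' > "$tmp/.env"
touch "$tmp/exa-search" && chmod +x "$tmp/exa-search"
check_prereqs "$tmp/.env" "$tmp/exa-search"   # prints: prerequisites satisfied
```

For real use, point it at ~/.openclaw/workspace/.env and the built bin/exa-search instead of the fixtures.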

1. Search

```shell
echo '{"query":"your query here","num_results":5}' \
  | EXA_API_KEY=$(grep -E "^EXA_API_KEY=" ~/.openclaw/workspace/.env | cut -d= -f2 | tr -d '"') \
    ~/.openclaw/workspace/skills/exa-search/bin/exa-search
```

Full params:

```json
{
  "query": "rust async programming",
  "num_results": 5,
  "type": "neural",
  "livecrawl": "never",
  "include_domains": ["github.com", "docs.rs"],
  "exclude_domains": ["reddit.com"],
  "start_published_date": "2025-01-01",
  "end_published_date": "2026-12-31",
  "category": "research paper",
  "use_autoprompt": true,
  "text": { "max_characters": 2000 },
  "highlights": { "num_sentences": 3, "highlights_per_url": 2 },
  "summary": { "query": "key takeaways" }
}
```

type options: auto (default) · neural · keyword

livecrawl options:

  • "never" — fastest (~300-600ms), pure cached index. Best for reference material, docs, courses.
  • "fallback" — use cache, crawl live if not cached. Good default.
  • "preferred" — prefer live crawl. Slower but fresher.
  • "always" — always crawl live. For breaking news or rapidly-changing pages.
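A malformed payload only fails once it reaches the binary, so it can be worth validating the request JSON locally first. A small sketch using python3's stdlib json.tool — no network call is made, and the query and domain filter are illustrative:

```shell
# Build a domain-filtered search request and check that it parses as JSON.
REQUEST='{"query":"rust async programming","num_results":3,"type":"neural","livecrawl":"never","include_domains":["github.com","docs.rs"]}'
if echo "$REQUEST" | python3 -m json.tool >/dev/null 2>&1; then
  VERDICT="request is valid JSON"
else
  VERDICT="request is malformed"
fi
echo "$VERDICT"   # prints: request is valid JSON
```

Once it validates, pipe $REQUEST into bin/exa-search exactly as in the invocation above.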

2. Find Similar

Find pages similar to a given URL:

```shell
echo '{"action":"find_similar","url":"https://example.com","num_results":5}' \
  | EXA_API_KEY=$(grep -E "^EXA_API_KEY=" ~/.openclaw/workspace/.env | cut -d= -f2 | tr -d '"') \
    ~/.openclaw/workspace/skills/exa-search/bin/exa-search
```

Params: same contents options as search (text, highlights, summary, livecrawl).

3. Get Contents

Fetch full contents for one or more URLs:

```shell
echo '{"action":"get_contents","urls":["https://example.com","https://other.com"],"text":{"max_characters":1000}}' \
  | EXA_API_KEY=$(grep -E "^EXA_API_KEY=" ~/.openclaw/workspace/.env | cut -d= -f2 | tr -d '"') \
    ~/.openclaw/workspace/skills/exa-search/bin/exa-search
```

Output format

All actions return JSON on stdout:

```json
{
  "ok": true,
  "action": "search",
  "results": [
    {
      "url": "https://...",
      "title": "...",
      "score": 0.87,
      "author": "...",
      "published_date": "2026-01-15",
      "image": "https://...",
      "favicon": "https://...",
      "text": "...",
      "highlights": ["..."],
      "summary": "..."
    }
  ],
  "formatted": "## [Title](url)\n..."
}
```

On error:

```json
{ "ok": false, "error": "..." }
```

The formatted field is ready-to-use markdown — you can send it directly to the user.
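Since every action reports ok, a caller can branch on it and pull out formatted in one step. A sketch against a hand-written sample response — the values are fabricated for illustration; a real response comes from the binary:

```shell
# Sample response with made-up values, then extraction of the `formatted` field.
RESPONSE='{"ok":true,"action":"search","results":[{"url":"https://example.com","title":"Example"}],"formatted":"## [Example](https://example.com)"}'
FORMATTED=$(printf '%s' "$RESPONSE" | python3 -c 'import json, sys
d = json.load(sys.stdin)
# On success print the ready-made markdown; otherwise surface the error.
print(d["formatted"] if d.get("ok") else "error: " + d.get("error", "unknown"))')
echo "$FORMATTED"   # prints: ## [Example](https://example.com)
```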

Speed reference (same query, 3 runs)

| Mode | Avg | Peak |
| --- | --- | --- |
| livecrawl: "never" (instant) | ~440ms | 308ms |
| Default (no livecrawl) | ~927ms | 629ms |

~18.7× faster than Exa MCP at peak.

Helper: load API key

```shell
EXA_API_KEY=$(grep -E "^EXA_API_KEY=" ~/.openclaw/workspace/.env | cut -d= -f2 | tr -d '"')
```

Or export once at the top of a longer workflow:

```shell
export EXA_API_KEY=$(grep -E "^EXA_API_KEY=" ~/.openclaw/workspace/.env | cut -d= -f2 | tr -d '"')
echo '{"query":"..."}' | ~/.openclaw/workspace/skills/exa-search/bin/exa-search
```
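The grep | cut | tr pipeline assumes the key sits on a single KEY=value line, optionally double-quoted. A self-contained demo against a throwaway .env file (the path and value are fabricated for illustration):

```shell
# Write a sample .env and extract the key with the same pipeline.
tmpenv=$(mktemp)
cat > "$tmpenv" <<'EOF'
# comment line
OTHER_VAR=ignored
EXA_API_KEY="demo-key-123"
EOF
KEY=$(grep -E "^EXA_API_KEY=" "$tmpenv" | cut -d= -f2 | tr -d '"')
echo "$KEY"   # prints: demo-key-123
```

One caveat: cut -d= -f2 keeps only the text between the first and second "=", so a key value that itself contains "=" would be truncated; cut -d= -f2- avoids that.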

Invocation pattern

```shell
EXA_API_KEY=$(grep -E "^EXA_API_KEY=" ~/.openclaw/workspace/.env | cut -d= -f2 | tr -d '"')
echo '{"query":"...","num_results":5,"livecrawl":"never"}' \
  | EXA_API_KEY="$EXA_API_KEY" ~/.openclaw/workspace/skills/exa-search/bin/exa-search
```

The formatted field in the output is ready-to-use markdown — send it directly to the user.

Mode selection (be deliberate, every search)

| Situation | livecrawl | type |
| --- | --- | --- |
| Docs, tutorials, courses, reference material | "never" | "neural" |
| General research — people, tools, concepts, companies | "never" | "neural" |
| Exact function names, error messages, package names | "never" | "keyword" |
| Recent releases, changelogs, GitHub repos | "fallback" | "auto" |
| News or announcements from the last 1-2 weeks | "fallback" | "neural" |
| Breaking news, live prices, today's events | "always" | "neural" |
| Unsure | "fallback" | "auto" |

Default when in doubt: livecrawl: "never", type: "neural" — fastest, works for 80% of searches.
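If an agent script wants one canonical place for these defaults, the table can be encoded as a tiny dispatcher. pick_mode and its task labels are hypothetical, not part of the skill:

```shell
# Hypothetical mapping from a task label to the table's livecrawl/type pair.
pick_mode() {
  case "$1" in
    docs|reference|research) echo 'livecrawl=never type=neural' ;;
    exact)                   echo 'livecrawl=never type=keyword' ;;
    releases)                echo 'livecrawl=fallback type=auto' ;;
    news)                    echo 'livecrawl=fallback type=neural' ;;
    breaking)                echo 'livecrawl=always type=neural' ;;
    *)                       echo 'livecrawl=fallback type=auto' ;;
  esac
}
pick_mode docs     # prints: livecrawl=never type=neural
pick_mode exact    # prints: livecrawl=never type=keyword
```

The fall-through arm deliberately matches the table's "Unsure" row, so unknown labels get the safe default.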

When to use each action

search (default) — use for any information retrieval from a query string.

find_similar — use when you have a URL and want more like it: related articles, alternative tools, similar repos, competing products.

```shell
echo '{"action":"find_similar","url":"https://...","num_results":5,"livecrawl":"never"}' | EXA_API_KEY="$EXA_API_KEY" ...
```

get_contents — use when you have a specific URL and need its full text: docs pages, blog posts, GitHub READMEs, papers. Faster than search when the URL is already known.

```shell
echo '{"action":"get_contents","urls":["https://..."],"text":{"max_characters":3000}}' | EXA_API_KEY="$EXA_API_KEY" ...
```

Enrich results for research tasks

When writing reports, summaries, or comparing multiple results — request highlights or summary per result:

```json
{
  "query": "...",
  "num_results": 5,
  "highlights": { "num_sentences": 3, "highlights_per_url": 2 },
  "summary": { "query": "key takeaways for a developer" }
}
```

Note: highlights/summary add latency (~200-500ms extra). Only use them when you actually need them.

Category context

Code helpers, APIs, CLIs, browser automation, testing, and developer operations.

Source: Tencent SkillHub

Largest current source with strong distribution and engagement signals.

Package contents

Included in package
3 files · 2 docs · 1 script
  • SKILL.md Primary doc
  • README.md Docs
  • install.sh Scripts
  • Cargo.toml Files
  • src/client.rs Files
  • src/main.rs Files