
HARPA AI

Automate web browsers, scrape pages, search the web, and run AI prompts on live websites via HARPA AI Grid REST API



Install for OpenClaw

Quick setup
  1. Download the package from Yavira.
  2. Extract the archive and review SKILL.md first.
  3. Import or place the package into your OpenClaw setup.

Requirements

  • Target platform: OpenClaw
  • Install method: Manual import
  • Extraction: Extract archive
  • Prerequisites: OpenClaw
  • Primary doc: SKILL.md

Package facts

  • Download mode: Yavira redirect
  • Package format: ZIP package
  • Source platform: Tencent SkillHub
  • What's included: SKILL.md

Validation

  • Use the Yavira download entry.
  • Review SKILL.md after the package is downloaded.
  • Confirm the extracted package contains the expected setup assets.

Install with your agent

Agent handoff

Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.

  1. Download the package from Yavira.
  2. Extract it into a folder your agent can access.
  3. Paste one of the prompts below and point your agent at the extracted folder.
New install

I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.

Upgrade existing

I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.

Trust & source

Release facts

  • Source: Tencent SkillHub
  • Verification: Indexed source record
  • Version: 1.0.0

Documentation

Primary doc: SKILL.md (11 sections)

HARPA Grid — Browser Automation API

HARPA Grid lets you orchestrate real web browsers remotely. You can scrape pages, search the web, run built-in or custom AI commands, and send AI prompts with full page context — all through a single REST endpoint.

Prerequisites

The user must have:

  • The HARPA AI Chrome Extension installed from https://harpa.ai
  • At least one active Node — a browser with HARPA running, configured in the extension's AUTOMATE tab
  • A HARPA API key, obtained from the extension's AUTOMATE tab and provided as the HARPA_API_KEY environment variable

If the user hasn't set up HARPA yet, direct them to: https://harpa.ai/grid/browser-automation-node-setup

API Reference

Endpoint: POST https://api.harpa.ai/api/v1/grid
Auth: Authorization: Bearer $HARPA_API_KEY
Content-Type: application/json
Full reference: https://harpa.ai/grid/grid-rest-api-reference
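For repeated calls, the endpoint, auth header, and JSON body can be wrapped in a small client helper. This is an illustrative sketch using only the Python standard library; `build_grid_request` and `grid_call` are hypothetical names, not part of HARPA or the skill package.

```python
import json
import os
import urllib.request

GRID_URL = "https://api.harpa.ai/api/v1/grid"

def build_grid_request(payload, api_key):
    """Assemble an authenticated POST request for the Grid endpoint."""
    return urllib.request.Request(
        GRID_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def grid_call(payload):
    """Send an action to the Grid and return the parsed JSON response."""
    req = build_grid_request(payload, os.environ["HARPA_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Each action below then reduces to a single call with the payload shown in its curl example.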

1. Scrape a Web Page

Extract full page content (as markdown) or specific elements via CSS/XPath/text selectors.

Full page scrape:

```bash
curl -s -X POST https://api.harpa.ai/api/v1/grid \
  -H "Authorization: Bearer $HARPA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "action": "scrape",
    "url": "https://example.com",
    "timeout": 15000
  }'
```

Targeted element scrape (grab):

```bash
curl -s -X POST https://api.harpa.ai/api/v1/grid \
  -H "Authorization: Bearer $HARPA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "action": "scrape",
    "url": "https://example.com/products",
    "grab": [
      { "selector": ".product-title", "selectorType": "css", "at": "all", "take": "innerText", "label": "titles" },
      { "selector": ".product-price", "selectorType": "css", "at": "all", "take": "innerText", "label": "prices" }
    ],
    "timeout": 15000
  }'
```

Grab fields:

| Field | Required | Default | Values |
|---|---|---|---|
| selector | yes | — | CSS (.class, #id), XPath (//h2), or text content |
| selectorType | no | auto | auto, css, xpath, text |
| at | no | first | all, first, last, or a number |
| take | no | innerText | innerText, textContent, innerHTML, outerHTML, href, value, id, className, attributes, styles, [attrName], (styleName) |
| label | no | data | Custom label for extracted data |
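When building grab arrays programmatically, the defaults from the table can be encoded once. `grab_spec` is a hypothetical helper name used here for illustration; its defaults mirror the field table above.

```python
def grab_spec(selector, selector_type="auto", at="first", take="innerText", label="data"):
    """Build one entry for the scrape action's grab array, mirroring the API defaults."""
    return {
        "selector": selector,
        "selectorType": selector_type,
        "at": at,
        "take": take,
        "label": label,
    }

# Payload equivalent to the targeted element scrape example
payload = {
    "action": "scrape",
    "url": "https://example.com/products",
    "grab": [
        grab_spec(".product-title", at="all", label="titles"),
        grab_spec(".product-price", at="all", label="prices"),
    ],
    "timeout": 15000,
}
```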

2. Search the Web (SERP)

Perform a web search. Supports operators like site: and intitle:.

```bash
curl -s -X POST https://api.harpa.ai/api/v1/grid \
  -H "Authorization: Bearer $HARPA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "action": "serp",
    "query": "OpenClaw AI agent framework",
    "timeout": 15000
  }'
```

3. Run an AI Command

Execute one of 100+ built-in HARPA commands or a custom automation on a target page.

```bash
curl -s -X POST https://api.harpa.ai/api/v1/grid \
  -H "Authorization: Bearer $HARPA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "action": "command",
    "url": "https://example.com/article",
    "name": "Extract data",
    "inputs": "List all headings with their word counts",
    "connection": "HARPA AI",
    "resultParam": "message",
    "timeout": 30000
  }'
```

  • name — command name (e.g. "Summary", "Extract data", or any custom command)
  • inputs — pre-filled user inputs for multi-step commands
  • resultParam — HARPA parameter to return as the result (default: "message")
  • connection — AI model to use (e.g. "HARPA AI", "gpt-4o", "claude-3.5-sonnet")
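Since inputs is only needed for multi-step commands, a payload builder can apply the defaults and omit it when absent. `command_payload` is a hypothetical helper; the field names and defaults come from the parameter notes above.

```python
def command_payload(url, name, inputs=None, connection="HARPA AI",
                    result_param="message", timeout=30000):
    """Assemble a command action payload; resultParam defaults to "message"."""
    payload = {
        "action": "command",
        "url": url,
        "name": name,
        "connection": connection,
        "resultParam": result_param,
        "timeout": timeout,
    }
    if inputs is not None:
        payload["inputs"] = inputs  # pre-filled answers for multi-step commands
    return payload

p = command_payload("https://example.com/article", "Extract data",
                    inputs="List all headings with their word counts")
```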

4. Run an AI Prompt

Send a custom AI prompt with page context. Use {{page}} to inject the page content.

```bash
curl -s -X POST https://api.harpa.ai/api/v1/grid \
  -H "Authorization: Bearer $HARPA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "action": "prompt",
    "url": "https://example.com",
    "prompt": "Analyze the current page and extract all contact information. Webpage: {{page}}",
    "connection": "CHAT AUTO",
    "timeout": 30000
  }'
```

Common Parameters

| Parameter | Required | Default | Description |
|---|---|---|---|
| action | yes | — | scrape, serp, command, or prompt |
| url | no | — | Target page URL (ignored by serp) |
| node | no | — | Node ID ("r2d2"), multiple ("r2d2 c3po"), first N ("5"), or all ("*") |
| timeout | no | 300000 | Max wait time in ms (max 5 minutes) |
| resultsWebhook | no | — | URL to POST results to asynchronously (retained 30 days) |
| connection | no | — | AI model for command/prompt actions |

Node Targeting

  • Omit node to use the default node
  • "node": "mynode" — target a specific node by ID
  • "node": "node1 node2" — target multiple nodes
  • "node": "3" — use the first 3 available nodes
  • "node": "*" — broadcast to all nodes
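All four targeting forms are just values of the same node field, so they can be attached to any payload with a tiny helper. `with_node` is a hypothetical name used here for illustration.

```python
def with_node(payload, node=None):
    """Return a copy of the payload targeting specific node(s); omit node for the default."""
    out = dict(payload)
    if node is not None:
        out["node"] = node
    return out

base = {"action": "scrape", "url": "https://example.com"}
with_node(base, "mynode")       # a specific node by ID
with_node(base, "node1 node2")  # multiple nodes
with_node(base, "3")            # first 3 available nodes
with_node(base, "*")            # broadcast to all nodes
with_node(base)                 # default node (no "node" key sent)
```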

Async Results via Webhook

Set resultsWebhook to receive results asynchronously. The action stays alive for up to 30 days, which is useful when target nodes are temporarily offline.

```json
{
  "action": "scrape",
  "url": "https://example.com",
  "resultsWebhook": "https://your-server.com/webhook",
  "timeout": 15000
}
```
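The receiver can be any HTTP endpoint that accepts the POSTed result. Below is a minimal standard-library sketch, assuming the result arrives as a JSON body; `parse_grid_result` and `GridWebhookHandler` are illustrative names, and the exact delivery payload shape should be checked against the Grid API reference.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def parse_grid_result(body):
    """Decode a webhook delivery body into a result dict."""
    return json.loads(body.decode("utf-8"))

class GridWebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        result = parse_grid_result(self.rfile.read(length))
        print("Received Grid result:", result)
        self.send_response(200)  # acknowledge receipt
        self.end_headers()

# To listen on port 8080 (blocks forever):
#   HTTPServer(("", 8080), GridWebhookHandler).serve_forever()
```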

Tips

  • Scraping behind-login pages works because HARPA runs inside a real browser session with the user's cookies and auth state.
  • Use the grab array with multiple selectors to extract structured data in a single request.
  • For long-running AI commands, increase timeout (max 300000 ms / 5 min) or use resultsWebhook.
  • The {{page}} variable in prompts injects the full page content — use it to give the AI context about the current page.

Category context

Code helpers, APIs, CLIs, browser automation, testing, and developer operations.

Source: Tencent SkillHub

Largest current source with strong distribution and engagement signals.

Package contents

Included in package: 1 doc
  • SKILL.md (primary doc)