Requirements
- Target platform: OpenClaw
- Install method: Manual import
- Extraction: Extract archive
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
Score leads 0-100 by analyzing a domain's website, DNS, sitemap, and social presence. Uses customizable JSON scoring profiles so users can define what signal...
Hand the extracted package to your coding agent with a concrete install brief instead of working through the steps manually. For example:
- Fresh install: "I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete."
- Upgrade: "I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run."
Analyze a domain and return a 0-100 lead score with a detailed breakdown. The key feature is customizable scoring profiles: JSON configs that define which signals matter and their weights.
1. DNS Analysis: MX records (Google Workspace/M365 = real business), SPF/DMARC
2. Sitemap Parsing: URL count, last-modified dates, content volume
3. Website Scraping: blog detection, tech stack, meta tags, social links, contact info
4. Signal Scoring: each signal scored against the profile weights
5. Grade Assignment: A (80-100), B (60-79), C (40-59), D (20-39), F (0-19)
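The grade bands above map directly onto a small threshold function. A minimal sketch (the function name `assign_grade` is illustrative, not necessarily the script's actual API):

```python
def assign_grade(score: int) -> str:
    """Map a 0-100 lead score to a letter grade using the documented bands."""
    if score >= 80:
        return "A"
    if score >= 60:
        return "B"
    if score >= 40:
        return "C"
    if score >= 20:
        return "D"
    return "F"
```

For example, the score of 72 shown in the sample output below lands in the 60-79 band and gets a "B".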
pip3 install dnspython
python3 scripts/score_lead.py example.com
python3 scripts/score_lead.py example.com --profile clearscope.json
python3 scripts/score_lead.py domain1.com domain2.com domain3.com
python3 scripts/score_lead.py --csv leads.csv --domain-column "Website"
- --profile FILE: scoring profile JSON (default: default.json, resolved from scripts/profiles/)
- --csv FILE: CSV file with domains
- --domain-column NAME: column name for domains in the CSV (default: domain)
- --scrape-delay SECONDS: delay between HTTP requests (default: 0.5)
- --output FILE: write results to a file instead of stdout
JSON to stdout with overall score, per-signal breakdown, raw data, and summary:

{
  "domain": "example.com",
  "score": 72,
  "grade": "B",
  "profile": "default",
  "signals": {
    "has_blog": {"score": 20, "max": 20, "evidence": "Blog found at /blog; 234 URLs in sitemap"},
    "business_legitimacy": {"score": 15, "max": 20, "evidence": "MX: Google Workspace; SPF configured"}
  },
  "raw_data": {
    "sitemap_urls": 234,
    "mx_provider": "Google Workspace",
    "tech_stack": ["WordPress", "Cloudflare"]
  },
  "summary": "Strong in: has blog, business legitimacy. Good lead, worth pursuing."
}
Profiles are the key differentiator. They let you define what matters for YOUR use case.
{
  "name": "my-profile",
  "description": "What this profile scores for",
  "signals": {
    "signal_name": {
      "weight": 25,
      "description": "What this signal measures",
      "keywords": ["optional", "keyword", "list"]
    }
  }
}
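Because weights auto-normalize (they need not sum to 100), combining signals reduces to a weighted average. A sketch of that arithmetic; `score_profile` is a hypothetical helper, not the script's internal API, and it assumes each signal's result is expressed as a 0.0-1.0 fraction:

```python
def score_profile(signal_scores: dict, profile: dict) -> int:
    """Combine per-signal results (0.0-1.0 fractions) into a 0-100 score.

    Weights are normalized by their sum, so they need not total 100.
    """
    signals = profile["signals"]
    total_weight = sum(s["weight"] for s in signals.values())
    raw = sum(
        signals[name]["weight"] * signal_scores.get(name, 0.0)
        for name in signals
    )
    return round(100 * raw / total_weight) if total_weight else 0

profile = {
    "name": "my-profile",
    "signals": {
        "has_blog": {"weight": 25, "description": "Blog exists"},
        "tech_stack": {"weight": 25, "description": "Detectable CMS/analytics"},
    },
}

# has_blog fully satisfied, tech_stack half satisfied -> weighted average of 75
score_profile({"has_blog": 1.0, "tech_stack": 0.5}, profile)
```

The same profile with weights 50/50 or 2/2 would produce the identical score, which is what "auto-normalizes" means in practice.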
| Signal | What it checks |
| --- | --- |
| has_blog | Blog/content section existence + sitemap volume |
| business_legitimacy | MX provider, SPF/DMARC, about page, meta tags |
| content_velocity | Sitemap dates: recency and frequency of updates |
| tech_stack | CMS, analytics, chat tools detected in page source |
| audience_size | Social media links (Twitter, LinkedIn, YouTube, Facebook) |
| contact_findability | Contact page, emails on site, LinkedIn link |
| seo_tools | Keyword matching in homepage text (requires keywords array) |
Any signal with a keywords array will match those terms against the homepage text. This is how you detect competitors, tools, or industry terms:

{
  "name": "crm-seller",
  "signals": {
    "uses_crm": {
      "weight": 30,
      "description": "Already uses a CRM",
      "keywords": ["salesforce", "hubspot", "pipedrive", "zoho crm", "close.io"]
    },
    "has_sales_team": {
      "weight": 25,
      "description": "Mentions sales roles or team",
      "keywords": ["sales team", "account executive", "sdr", "business development"]
    }
  }
}
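The keyword check itself is a plain case-insensitive substring match against the homepage text. A minimal sketch of that behavior (`match_keywords` is illustrative, not the script's actual function name):

```python
def match_keywords(homepage_text: str, keywords: list[str]) -> list[str]:
    """Return the profile keywords found in the homepage text, case-insensitively."""
    text = homepage_text.lower()
    return [kw for kw in keywords if kw.lower() in text]

hits = match_keywords(
    "We sync deals from Salesforce and HubSpot nightly.",
    ["salesforce", "hubspot", "pipedrive"],
)
# hits == ["salesforce", "hubspot"]
```

This is why multi-word phrases like "sales team" work: the phrase just has to appear verbatim somewhere in the page text.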
- default.json: generic scoring for any SaaS/content company
- clearscope.json: example profile for SEO tool partnership leads

Create your own in scripts/profiles/ or pass any path with --profile.
The script is polite by default:
- --scrape-delay 0.5: 500 ms between HTTP requests (default)
- Each domain makes ~5-8 requests (homepage, blog, about, contact, sitemap, DNS)
- In batch mode there is an additional delay between domains
- Increase the delay for large batches: --scrape-delay 2
- All requests use a generic User-Agent string
| Batch size | Delay | Est. time |
| --- | --- | --- |
| 1-10 | 0.5s (default) | ~30s-2min |
| 10-50 | 1.0s | ~5-15min |
| 50+ | 2.0s | ~30min+ |
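These estimates follow from requests-per-domain multiplied by the delay. A rough back-of-envelope helper, assuming the documented ~5-8 requests per domain (the helper and its 6-request midpoint are illustrative, not part of the script):

```python
def estimate_seconds(n_domains: int, delay: float, requests_per_domain: int = 6) -> float:
    """Rough lower-bound wall-clock estimate for a batch run.

    Ignores network latency and the extra per-domain gap in batch mode,
    so real runs land somewhere above this number.
    """
    return n_domains * requests_per_domain * delay
```

For example, 10 domains at the default 0.5s delay gives 30 seconds, the low end of the table's first row.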
If a signal can't be gathered (site down, DNS timeout, etc.), it scores 0 with an explanation in the evidence field. The script never crashes on a single domain failure; it logs the issue to stderr and continues.
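That fail-soft behavior amounts to wrapping each signal gatherer in a catch-all that converts the error into a zero-score result. A sketch of the pattern (`safe_signal` is a hypothetical wrapper, not the script's real API):

```python
import sys


def safe_signal(name, gather, *args):
    """Run one signal gatherer; on any failure, score 0 and keep the reason as evidence."""
    try:
        return gather(*args)
    except Exception as exc:
        # Log to stderr and keep going with the rest of the batch.
        print(f"{name}: {exc}", file=sys.stderr)
        return {"score": 0, "evidence": f"Could not gather: {exc}"}
```

A DNS timeout on one domain therefore shows up as a zero-scored signal with an explanatory evidence string, while every other domain in the batch still gets scored.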
- Start with the default profile, review results, then customize
- Weights should sum to 100 for intuitive scoring (not required; they auto-normalize)
- Keywords are powerful: add competitor names, industry terms, technology mentions
- Pipe to jq for quick filtering: python3 scripts/score_lead.py domain.com | jq '.score'
- Batch + sort: score a CSV, then sort by score to prioritize outreach