Tencent SkillHub · Developer Tools

finviz-crawler

Continuous financial news crawler for finviz.com with SQLite storage, article extraction, and query tool. Use when monitoring financial markets, building new...

Skill by openclawclawhub · Free
0 Downloads · 0 Stars · 0 Installs · Score: 0 · High Signal



Install for OpenClaw

Quick setup
  1. Download the package from Yavira.
  2. Extract the archive and review SKILL.md first.
  3. Import or place the package into your OpenClaw setup.

Requirements

Target platform
OpenClaw
Install method
Manual import
Extraction
Extract archive
Prerequisites
OpenClaw
Primary doc
SKILL.md

Package facts

Download mode
Yavira redirect
Package format
ZIP package
Source platform
Tencent SkillHub
What's included
README.md, SKILL.md, scripts/finviz_crawler.py, scripts/finviz_query.py, scripts/install.py

Validation

  • Use the Yavira download entry.
  • Review SKILL.md after the package is downloaded.
  • Confirm the extracted package contains the expected setup assets.

Install with your agent

Agent handoff

Hand the extracted package to your coding agent with a concrete install brief instead of stepping through the setup manually.

  1. Download the package from Yavira.
  2. Extract it into a folder your agent can access.
  3. Paste one of the prompts below and point your agent at the extracted folder.
New install

I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Then review README.md for any prerequisites, environment setup, or post-install checks. Tell me what you changed and call out any manual steps you could not complete.

Upgrade existing

I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Then review README.md for any prerequisites, environment setup, or post-install checks. Summarize what changed and any follow-up checks I should run.

Trust & source

Release facts

Source
Tencent SkillHub
Verification
Indexed source record
Version
3.0.0

Documentation

Primary doc: SKILL.md (14 sections)

Why This Skill?

  • 📰 Your own financial news database: most finance skills just wrap an API for one-shot queries. This skill runs continuously, building a local archive of every headline and article from Finviz. Query your history anytime, with no API limits and no missing data.
  • 🆓 No API key, no subscription: scrapes finviz.com directly using Crawl4AI + RSS. Bloomberg, Reuters, Yahoo Finance, and CNBC articles are extracted automatically. Zero cost.
  • 🤖 Built for AI summarization: the query tool outputs clean text/JSON optimized for LLM digests. Pair it with an OpenClaw cron job for automated morning briefings, evening wrap-ups, or weekly investment summaries.
  • 💾 Auto-cleanup: configurable expiry automatically deletes old articles from both the database and disk. Set --expiry-days 30 to keep a month of history, or 0 to keep everything forever.
  • 🔄 Daemon architecture: runs as a background service that starts and stops with OpenClaw. No manual intervention is needed after setup. Works with systemd (Linux) and launchd (macOS).

Install

```shell
python3 scripts/install.py
```

Works on macOS, Linux, and Windows. Installs Python packages (crawl4ai, feedparser), sets up Playwright browsers, creates data directories, and verifies everything.

Manual install

```shell
pip install crawl4ai feedparser
crawl4ai-setup  # or: python -m playwright install chromium
```

Run the crawler

```shell
# Default: ~/workspace/finviz/, 7-day expiry
python3 scripts/finviz_crawler.py

# Custom paths and settings
python3 scripts/finviz_crawler.py --db /path/to/finviz.db --articles-dir /path/to/articles/

# Keep 30 days of articles
python3 scripts/finviz_crawler.py --expiry-days 30

# Never auto-delete (keep everything)
python3 scripts/finviz_crawler.py --expiry-days 0

# Custom crawl interval (default: 300s)
python3 scripts/finviz_crawler.py --sleep 600
```

Query articles

```shell
# Last 24 hours of headlines
python3 scripts/finviz_query.py --hours 24

# Titles only (compact, good for LLM summarization)
python3 scripts/finviz_query.py --hours 12 --titles-only

# With full article content
python3 scripts/finviz_query.py --hours 12 --with-content

# List downloaded articles with content status
python3 scripts/finviz_query.py --list-articles --hours 24

# Database stats
python3 scripts/finviz_query.py --stats
```
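If you want to run the same kind of time-window query from your own code rather than through the CLI, a read-only SQLite lookup is enough. This is a minimal sketch, not the tool's actual implementation: the `articles` table name and the `title`/`published_at` column names are assumptions, as is ISO-8601 timestamp storage.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

def recent_titles(db_path: str, hours: int) -> list[str]:
    """Return headline titles newer than the cutoff, newest first (read-only)."""
    cutoff = (datetime.now(timezone.utc) - timedelta(hours=hours)).isoformat()
    con = sqlite3.connect(db_path)
    try:
        # ISO-8601 strings in the same timezone compare correctly as text
        rows = con.execute(
            "SELECT title FROM articles WHERE published_at >= ? "
            "ORDER BY published_at DESC",
            (cutoff,),
        ).fetchall()
    finally:
        con.close()
    return [title for (title,) in rows]
```

Because the query is read-only, it is safe to run while the crawler daemon is writing to the same database.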

Manage tickers

```shell
# List all tracked tickers
python3 scripts/finviz_query.py --list-tickers

# Add single ticker (auto-generates keywords from symbol)
python3 scripts/finviz_query.py --add-ticker NVDA

# Add with custom keywords
python3 scripts/finviz_query.py --add-ticker "NVDA:nvidia,jensen huang"

# Add multiple tickers (batch)
python3 scripts/finviz_query.py --add-ticker NVDA TSLA AAPL
python3 scripts/finviz_query.py --add-ticker "NVDA:nvidia,jensen" "TSLA:tesla,elon musk"

# Remove tickers (batch)
python3 scripts/finviz_query.py --remove-ticker NVDA TSLA

# Custom DB path
python3 scripts/finviz_query.py --list-tickers --db /path/to/finviz.db
```

Tickers are stored in the tickers table inside finviz.db alongside articles. The crawler reads this table each cycle to know which ticker pages to scrape.
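The `SYMBOL:keyword,keyword` syntax above can be sketched as a small upsert against the tickers table. The docs confirm the table exists; the `symbol`/`keywords` column names and the exact auto-keyword rule here are illustrative assumptions, not the script's real schema.

```python
import sqlite3

def add_ticker(con: sqlite3.Connection, spec: str) -> tuple[str, str]:
    """Parse 'SYMBOL' or 'SYMBOL:kw1,kw2' and upsert it into the tickers table."""
    symbol, _, kw = spec.partition(":")
    symbol = symbol.strip().upper()
    # No explicit keywords: fall back to the lowercased symbol itself
    keywords = kw.strip() if kw else symbol.lower()
    con.execute(
        "INSERT OR REPLACE INTO tickers (symbol, keywords) VALUES (?, ?)",
        (symbol, keywords),
    )
    return symbol, keywords
```

`INSERT OR REPLACE` makes re-adding a ticker update its keywords instead of failing on the primary key.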

Configuration

| Setting | CLI flag | Env var | Default |
|---|---|---|---|
| Database path | `--db` | (none) | `~/workspace/finviz/finviz.db` |
| Articles directory | `--articles-dir` | (none) | `~/workspace/finviz/articles/` |
| Crawl interval | `--sleep` | (none) | 300 (5 min) |
| Article expiry | `--expiry-days` | `FINVIZ_EXPIRY_DAYS` | 7 days |
| Timezone | (none) | `FINVIZ_TZ` or `TZ` | System default |

💬 Chat Commands (OpenClaw Agent)

When this skill is installed, the agent recognizes /finviz as a shortcut:

| Command | Action |
|---|---|
| `/finviz list` | Show tracked tickers |
| `/finviz add NVDA, TSLA` | Add tickers to track |
| `/finviz remove NVDA` | Remove a ticker |
| `/finviz stats` | Show article/ticker counts |
| `/finviz help` | Show available commands |

The agent runs these via the finviz_query.py CLI internally.

📱 PrivateApp Dashboard

A companion mobile dashboard is available in PrivateApp, a personal PWA dashboard for your home server. The Finviz app provides:

  • Headlines browser with time-range filters (12h / 24h / Week)
  • Ticker-specific news filtering
  • LLM-powered summaries on demand

Install PrivateApp, and the Finviz dashboard is built in; no extra setup is needed.

Architecture

Crawler daemon (finviz_crawler.py):

  • Crawls finviz.com/news.ashx headlines every 5 minutes
  • Fetches article content via Crawl4AI (Playwright) or RSS (paywalled sites)
  • Bot/paywall detection rejects garbage content
  • Per-domain rate limiting and user-agent rotation
  • Deduplicates via SHA-256 title hash
  • Auto-expires old articles (configurable)
  • Clean shutdown on SIGTERM/SIGINT

Query tool (finviz_query.py):

  • Read-only SQLite queries (no HTTP, stdlib only)
  • Filter by time window, export titles or full content
  • Designed for LLM summarization pipelines
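The SHA-256 title-hash deduplication mentioned above can be sketched in a few lines. The hashing step is stated in the docs; the whitespace/case normalization before hashing is an assumption of this sketch.

```python
import hashlib

def title_hash(title: str) -> str:
    """Normalize a headline, then hash it so re-crawled duplicates match."""
    normalized = " ".join(title.split()).lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def dedupe(titles: list[str], seen: set[str]) -> list[str]:
    """Keep only headlines whose hash has not been stored yet."""
    fresh = []
    for t in titles:
        h = title_hash(t)
        if h not in seen:
            seen.add(h)
            fresh.append(t)
    return fresh
```

In the real daemon, `seen` would be backed by a column in the database rather than an in-memory set, so deduplication survives restarts.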

systemd (Linux)

```ini
[Unit]
Description=Finviz News Crawler

[Service]
ExecStart=python3 /path/to/scripts/finviz_crawler.py --expiry-days 30
Restart=on-failure
RestartSec=30

[Install]
WantedBy=default.target
```

launchd (macOS)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key><string>com.finviz.crawler</string>
  <key>ProgramArguments</key>
  <array>
    <string>python3</string>
    <string>/path/to/scripts/finviz_crawler.py</string>
    <string>--expiry-days</string>
    <string>30</string>
  </array>
  <key>RunAtLoad</key><true/>
  <key>KeepAlive</key><true/>
</dict>
</plist>
```

Data layout

```
~/workspace/finviz/
├── finviz.db      # SQLite: articles + tickers (single DB)
├── articles/      # Full article content as .md files
│   ├── market/    # General market headlines
│   ├── nvda/      # Per-ticker articles
│   └── tsla/
└── summaries/     # LLM summary cache (.json)
```

Cron integration

Pair with an OpenClaw cron job for automated digests:

  • Schedule: 0 6 * * * (6 AM daily)
  • Task: Query last 24h → LLM summarize → deliver to Matrix/Telegram/Discord
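The "query → summarize" step of such a job usually amounts to turning the query tool's titles-only output into a single LLM prompt. This is an illustrative sketch; the function name and prompt wording are assumptions, and the headline list would come from `finviz_query.py --hours 24 --titles-only` in practice.

```python
def digest_prompt(titles: list[str], hours: int = 24) -> str:
    """Assemble a compact summarization prompt from recent headlines."""
    bullet_list = "\n".join(f"- {t}" for t in titles)
    return (
        f"Summarize the last {hours}h of financial news into a short "
        f"briefing, grouped by theme:\n{bullet_list}"
    )
```

The resulting string is what the cron job would send to the LLM before delivering the reply to Matrix/Telegram/Discord.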

Category context

Code helpers, APIs, CLIs, browser automation, testing, and developer operations.

Source: Tencent SkillHub

Largest current source with strong distribution and engagement signals.

Package contents

Included in package
3 scripts · 2 docs
  • SKILL.md Primary doc
  • README.md Docs
  • scripts/finviz_crawler.py Scripts
  • scripts/finviz_query.py Scripts
  • scripts/install.py Scripts