Requirements
- Target platform: OpenClaw
- Install method: Manual import
- Extraction: Extract archive
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
Verify claims and information using professional fact-checking services. Use this skill when users want to verify facts, check claims in articles/videos/transcripts, validate news authenticity, cross-reference information with trusted fact-checkers, or investigate potentially false or misleading content. Triggers include requests to "fact check", "verify this", "is this true", "check if this is accurate", or when users share content they want validated against misinformation.
Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.
Verify claims and information using professional fact-checking services from around the world.
- Multiple sources: Cross-reference findings from several fact-checking organizations
- Regional relevance: Prioritize fact-checkers appropriate to the content's context
- Language matching: Use fact-checkers in the native language of the content when possible
- Credible sources only: Never use fraudulent or unreliable fact-checking services
- Balanced presentation: Present both confirming and contradicting findings fairly
Trigger this skill when the user:
- Explicitly asks to fact-check, verify, or validate information
- Shares an article, video transcript, or claim and asks "is this true?"
- Wants to check if something is misinformation or a hoax
- Asks about the credibility of specific claims or statements
- Requests verification of news, social media posts, or viral content
- Wants to cross-reference information with trusted sources

Do NOT trigger for:
- General research or information gathering (use web search instead)
- Checking grammar, spelling, or writing quality
- Verifying code functionality or technical documentation
- Questions about opinions rather than factual claims
Before beginning verification, analyze what needs to be checked:
1. Identify specific claims: extract concrete, verifiable statements from the content.
2. Note the context:
   - Geographic references (countries, regions, cities)
   - Named individuals (politicians, public figures, organizations)
   - Languages used in the content
   - Time period or dates mentioned
   - Subject matter (politics, health, science, etc.)
3. Determine user context:
   - User's native language (for selecting appropriate fact-checkers)
   - User's location if relevant

Example analysis:
Content: "Video claiming vaccines cause autism, mentions Andrew Wakefield, references UK study"
- Claims to verify: vaccine-autism link, Wakefield's research
- Context: medical/health topic, UK origin, English language
- Key entities: Andrew Wakefield, MMR vaccine, UK medical establishment
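The analysis above can be captured as a small structured record before any searching begins. This is a minimal sketch, assuming a Python implementation; the `ClaimContext` name and its fields are illustrative, not part of any existing API.

```python
from dataclasses import dataclass, field

@dataclass
class ClaimContext:
    """Hypothetical container for the pre-verification analysis."""
    claims: list                                    # concrete, verifiable statements
    regions: list = field(default_factory=list)     # countries/regions mentioned
    people: list = field(default_factory=list)      # named individuals/organizations
    languages: list = field(default_factory=list)   # languages used in the content
    dates: list = field(default_factory=list)       # time period or dates mentioned
    subject: str = ""                               # politics, health, science, ...
    user_lang: str = "en"                           # user's native language

# The vaccine example from the text, expressed in this structure:
wakefield = ClaimContext(
    claims=["MMR vaccine causes autism", "Wakefield's research was valid"],
    regions=["UK"],
    people=["Andrew Wakefield"],
    languages=["en"],
    subject="health",
)
```

Keeping the context explicit like this makes the next step, selecting fact-checkers by language, region, and subject, a straightforward lookup rather than an ad hoc judgment.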
CRITICAL: Begin by fetching the current list of fact-checking services:
Fetch: https://en.wikipedia.org/wiki/List_of_fact-checking_websites
From this list, select 3-7 relevant fact-checking services based on:

Selection criteria
- User's language/location: always include fact-checkers in the user's native language
- Content language/location: if different from the user's language, also include fact-checkers in the content's language and region
- Geographic relevance: if content mentions specific countries/regions, include fact-checkers from those countries (e.g., content about French politics → include French fact-checkers)
- Subject matter specialists: some fact-checkers specialize:
  - Health/medical claims → Health Feedback, Science Feedback
  - Politics → country-specific political fact-checkers
  - General → Snopes, FactCheck.org, Full Fact
- Person-specific: if content focuses on specific public figures, include fact-checkers from their home countries (e.g., claims about a US politician → include US fact-checkers)

Exclusion rule
NEVER use services listed under "Fraudulent fact-checking websites" on the Wikipedia page, regardless of how well they match other criteria.

Prioritization
When you must limit selections:
- Prioritize: user's language > content's language > geographic relevance
- Prefer well-established services (FactCheck.org, Snopes, Full Fact, AFP Fact Check, etc.)
- Include at least one international/general service

Example selection:
User: Polish speaker. Content: English article about US vaccines.
Selected services:
- Demagog.pl (Polish, for user)
- FactCheck.org (US, for content geography)
- Snopes (US, general/medical)
- Health Feedback (health specialist)
- Full Fact (UK, English-speaking, general)
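The selection and prioritization rules above can be sketched as a simple scoring function. This is an illustrative Python sketch, not an existing tool: the `FactChecker` record, the field names, and the score weights are assumptions chosen only to encode the stated priority order (user's language > content's language > geographic relevance > specialty), with the exclusion rule applied as a hard filter before any scoring.

```python
from dataclasses import dataclass

@dataclass
class FactChecker:
    name: str
    language: str           # e.g. "pl", "en"
    region: str             # e.g. "PL", "US", "INTL"
    specialties: set        # e.g. {"health"}; empty set means general
    fraudulent: bool = False  # listed under "Fraudulent fact-checking websites"

def select_services(checkers, user_lang, content_lang, content_regions,
                    topics=frozenset(), limit=7):
    """Rank fact-checkers by the priority rules and return up to `limit`."""
    def score(c):
        s = 0
        if c.language == user_lang:
            s += 100                  # user's language comes first
        if c.language == content_lang:
            s += 50                   # then the content's language
        if c.region in content_regions:
            s += 25                   # then geographic relevance
        if topics & c.specialties:
            s += 10                   # subject-matter specialists last
        return s
    # Exclusion rule: fraudulent services never qualify, whatever their score.
    eligible = [c for c in checkers if not c.fraudulent]
    return sorted(eligible, key=score, reverse=True)[:limit]
```

Run against the worked example (Polish user, English article about US vaccines), a Polish-language service outranks the US general and health-specialist services, matching the selection shown above.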
For each fact-checking service:
1. Review search results: examine the first 5-10 results from each search.
2. Select relevant articles: choose articles where:
   - The headline directly addresses the claim being verified
   - The content appears substantial (not just brief mentions)
   - The publication date is relevant (recent for ongoing issues, any date for historical debunks)
3. Fetch and read articles: use web_fetch to retrieve the full text of the 2-4 most relevant articles per fact-checker.
4. Extract key findings for each article:
   - Verdict: what did the fact-checker conclude? (True, False, Misleading, Mixed, Unproven, etc.)
   - Evidence: what evidence did they cite?
   - Context: any important nuance or context
   - Relevance: how directly does this address the user's claim?
User request: "Is it true that 5G causes COVID-19?" Approach: Identify claim: 5G technology causes or spreads COVID-19 Select 4-5 general fact-checkers (international scope, tech/health focus) Search for "5G COVID" or "5G coronavirus" Expected result: Multiple fact-checkers will have debunked this Present: Clear consensus with explanation of why the claim is false
User request: "Can you fact-check this article about climate change?" Approach: Extract 3-5 specific verifiable claims from the article Select fact-checkers: user's language + climate-focused services Search each claim separately Present: Findings organized by claim, with overall assessment
User request: "Did [politician] really say/do [thing]?" Approach: Identify the specific claim and context Select fact-checkers from politician's country + user's language Search politician's name + key terms Present: Direct answer with context, including if statement was taken out of context
User request: "I saw this video on TikTok claiming [X], is it real?" Approach: Identify what's being claimed in the video Select broad, well-known fact-checkers (viral content often fact-checked widely) Search for key terms from the claim Present: Whether it's been debunked, original context if misrepresented
User request: "Did [historical event] really happen this way?" Approach: Note that this is historical verification, may need broader research Select fact-checkers + consider using general web search for historical records Present: What fact-checkers say if available, acknowledge if claim is outside typical fact-checking scope
User request: "I just saw this article published today claiming [X]. Is it true?" Approach: Check publication date: is it 3 days old or less? Search fact-checkers anyway (sometimes they work very quickly on major stories) If no fact-checks found: With task scheduling: Schedule follow-up check for 3 days later, notify user of the scheduled check Without task scheduling: Inform user that content is too fresh, suggest returning in 3 days Offer preliminary analysis using general web search Present: "This is very recent content. Fact-checkers haven't had time to verify yet. Here's what I found from general sources, but I recommend waiting for professional fact-checking." Example response: This article was published just [X hours/days] ago, which is too recent for professional fact-checkers to have verified the claims yet. They typically need a few days to conduct thorough research. I've scheduled a follow-up fact-check for [date in 3 days]. I'll notify you automatically if fact-checkers publish verification by then. In the meantime, here's what I found through general web research: [preliminary findings with appropriate caveats] Note: These are preliminary findings only. Professional fact-checkers may provide more thorough verification in the coming days.
If searches return no relevant results: Try broader search terms Try related claims that fact-checkers may have covered If still no results, check if the content is recent (3 days or less) For fresh content (β€3 days old): Acknowledge: "This is very recent content. Professional fact-checkers typically need a few days to verify claims." If scheduling tools are available: Schedule a follow-up fact-check for 3 days later If scheduling is not available: Suggest the user returns in 3 days for updated verification Offer to do preliminary general web research in the meantime For older content: Acknowledge "Professional fact-checkers haven't specifically addressed this claim" Offer to do general web research instead Consider if the claim is too obscure or too local for major fact-checkers
If fact-checkers disagree: Present all perspectives fairly Note the disagreement explicitly Consider if they're addressing slightly different aspects Look for consensus on specific sub-points Don't force a conclusion if the evidence is genuinely mixed
If fact-checks are old but the claim is current: Note the publication dates Search for more recent fact-checks Consider if circumstances have changed Acknowledge if using older sources due to lack of recent coverage
If key fact-checkers are in languages you don't fully understand: Use web_fetch to retrieve the content Focus on verdicts, ratings, and conclusion sections which are often clear Use any English summaries or abstracts Acknowledge limitations if language creates uncertainty
Users may question fact-checker reliability: Stick to well-established, internationally recognized services Present findings from multiple fact-checkers to show consensus Note if you're using fact-checkers from multiple countries/perspectives Acknowledge that no source is perfect, but these are professional verification services
Before presenting results, verify: Checked at least 3 different fact-checking services Included fact-checkers relevant to the user's language/location Included fact-checkers relevant to the content's context Excluded any fraudulent fact-checking services Read full articles, not just headlines or snippets Provided direct links to all sources cited Presented findings objectively without adding personal judgment Acknowledged any uncertainty or disagreement between sources Organized response clearly with specific findings, not vague summaries Used natural prose for main findings, lists only where truly helpful If content is β€3 days old with no fact-checks: Noted this and scheduled follow-up OR suggested user return in 3 days If providing preliminary analysis: Clearly distinguished it from professional fact-checking
International/English: FactCheck.org (US, general) Snopes (US, general) Full Fact (UK, general) AFP Fact Check (International, multilingual) PolitiFact (US, politics) Regional/Language-Specific: Demagog.pl (Poland, Polish) Les DΓ©codeurs (France, French) Correctiv (Germany, German) Maldita.es (Spain, Spanish) Aos Fatos (Brazil, Portuguese) Alt News (India, English/Hindi) Africa Check (Africa, multilingual) Specialized: Health Feedback (health/medical claims) Climate Feedback (climate science claims) Science Feedback (general science claims) Note: This is not exhaustive. Always fetch the current list from Wikipedia to see all available services.
When content is very recent (β€3 days old) and hasn't been fact-checked yet: If task scheduling tools are available: Automatically schedule a follow-up fact-check for 3 days later Store the original query, claims, and context When the scheduled task runs: Re-search the same fact-checking services Compare new findings to preliminary analysis Notify user only if new fact-checks were found Provide updated verification with links If task scheduling is NOT available: Inform the user that the content is too fresh Suggest they return in 3 days for updated verification Provide preliminary analysis from general sources with appropriate caveats Make it clear that preliminary findings are not from professional fact-checkers
This skill focuses on using professional fact-checking organizations rather than doing original research. These organizations employ journalists and researchers who specialize in verification. Your role is to: Find what they've already published Synthesize their findings Present them clearly to the user Schedule follow-ups for very recent content when possible If a topic hasn't been covered by fact-checkers, acknowledge this and offer to do general research instead. Don't try to replace professional fact-checking with web searches alone, but do provide preliminary information when users need it for fresh content.
Workflow acceleration for inboxes, docs, calendars, planning, and execution loops.
Largest current source with strong distribution and engagement signals.