## Requirements
- Target platform: OpenClaw
- Install method: Manual import
- Extraction: Extract archive
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
Turn your code scan findings into search queries — research existing implementations before consulting an attorney. NOT legal advice.
Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.
- **Role:** Help users explore existing implementations
- **Approach:** Generate comprehensive search strategies for self-directed research
- **Boundaries:** Equip users for research; never perform searches or draw conclusions
- **Tone:** Thorough, supportive, clear about next steps
This skill validates scanner findings; it does NOT re-score patterns.

- **Input:** Scanner output (patterns with scores, claim angles, patent signals)
- **Output:** Evidence maps, search strategies, differentiation questions

**Trust scanner scores:** The scanner has already assessed distinctiveness and patent signals. This validator links those findings to concrete evidence and generates research strategies.

**What this means for users:** Validators are simpler and faster. They trust scanner scores and focus on what they do best: building evidence chains and search queries.
Activate this skill when the user asks to:
- "Help me search for similar implementations"
- "Generate search queries for my findings"
- "Validate my code-patent-scanner results"
- "Create a research strategy for these patterns"
- This skill generates search queries only; it does NOT perform searches
- Cannot assess uniqueness or patentability
- Cannot replace a professional patent search
- Provides tools for research, not conclusions
For each pattern, generate queries for:

| Source | Query Type | Example |
|--------|------------|---------|
| Google Patents | Boolean combinations | "[A]" AND "[B]" [field] |
| USPTO Database | CPC codes + keywords | CPC:[code] AND [term] |
| GitHub | Implementation search | [algorithm] [language] implementation |
| Stack Overflow | Problem-solution | [problem] [approach] |

Query variations per pattern:
- Exact combination: "[A]" AND "[B]" AND "[C]"
- Functional: "[A]" FOR "[purpose]"
- Synonyms: "[A-synonym]" WITH "[B-synonym]"
- Broader category: "[A-category]" AND "[B-category]"
- Narrower: "[A]" AND "[B]" AND "[specific detail]"
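The five query variations above can be sketched as a small helper. This is an illustrative sketch, not part of the skill: the pattern terms, synonym map, and category map below are invented placeholder examples.

```python
# Sketch of the query-variation logic. All inputs here (terms,
# synonyms, categories, detail) are hypothetical placeholders.
def query_variations(a, b, c, purpose, synonyms, categories, detail):
    """Build the five query variants described for a single pattern."""
    return {
        "exact": f'"{a}" AND "{b}" AND "{c}"',
        "functional": f'"{a}" FOR "{purpose}"',
        "synonyms": f'"{synonyms[a]}" WITH "{synonyms[b]}"',
        "broader": f'"{categories[a]}" AND "{categories[b]}"',
        "narrower": f'"{a}" AND "{b}" AND "{detail}"',
    }

queries = query_variations(
    "bloom filter", "LRU cache", "sharded index",
    purpose="deduplication",
    synonyms={"bloom filter": "probabilistic set",
              "LRU cache": "eviction cache"},
    categories={"bloom filter": "membership structure",
                "LRU cache": "cache"},
    detail="lock-free",
)
```

Each variant widens or narrows the net: run the exact combination first, then fall back to synonyms and broader categories when it returns nothing.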
Suggest which sources to search first based on pattern type:

| Pattern Type | Priority Order |
|--------------|----------------|
| Algorithmic | GitHub -> Patents -> Publications |
| Architectural | Publications -> GitHub -> Patents |
| Data Structure | GitHub -> Publications -> Patents |
| Integration | Stack Overflow -> GitHub -> Publications |
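The priority ordering above is simple enough to express as a lookup table; a minimal sketch, with key names chosen here for illustration:

```python
# Source prioritisation per pattern type, mirroring the table above.
# The snake_case keys are an assumption of this sketch.
PRIORITY = {
    "algorithmic": ["github", "patents", "publications"],
    "architectural": ["publications", "github", "patents"],
    "data_structure": ["github", "publications", "patents"],
    "integration": ["stackoverflow", "github", "publications"],
}

def first_source(pattern_type: str) -> str:
    """Return the highest-priority source for a pattern type."""
    return PRIORITY[pattern_type][0]
```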
For each scanner pattern, build a provenance chain linking claim angles to evidence:

| Evidence Type | What to Document | Why It Matters |
|---------------|------------------|----------------|
| Source lines | file.go:45-120 | Proves implementation exists |
| Commit history | abc123 (2026-01-15) | Establishes timeline |
| Design docs | RFC-042 | Shows intentional innovation |
| Benchmarks | 40% faster | Quantifies benefit |

Provenance chain: each claim angle (from the scanner) traces to specific evidence, creating a clear trail from abstract claim to concrete implementation.
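One way to assemble such a chain is as a plain dict matching the `evidence_map` shape in the output schema. The file paths, commit hashes, and metrics below are placeholder examples, not real evidence:

```python
# Build one evidence_map entry per claim angle. Values here are
# placeholders borrowed from the table above.
def evidence_entry(source_files, commits, design_docs, metrics):
    return {
        "source_files": list(source_files),
        "commits": list(commits),
        "design_docs": list(design_docs),
        "metrics": dict(metrics),
    }

chain = {
    "claim_angle_1": evidence_entry(
        ["file.go:45-120"], ["abc123"], ["RFC-042"],
        {"performance_gain": "40%"},
    ),
}
```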
Questions to guide the user's analysis of search results:

**Technical Differentiation:**
- What is different in your approach vs. the found results?
- What technical advantages does yours offer?
- What performance improvements exist?

**Problem-Solution Fit:**
- What problems does yours solve that others don't?
- Does your approach address limitations of existing solutions?
- Is the problem framing itself different?

**Synergy Assessment:**
- Does the combination produce unexpected benefits?
- Is the result greater than the sum of its parts (1 + 1 = 3)?
- What barriers existed before this approach?
```json
{
  "validation_metadata": {
    "scanner_output": "patterns.json",
    "validation_date": "2026-02-03T10:00:00Z",
    "patterns_processed": 7
  },
  "patterns": [
    {
      "scanner_input": {
        "pattern_id": "from-scanner",
        "claim_angles": ["Method for...", "System comprising..."],
        "patent_signals": {
          "market_demand": "high",
          "competitive_value": "medium",
          "novelty_confidence": "high"
        }
      },
      "title": "Pattern Title",
      "search_queries": {
        "problem_focused": ["[problem] solution approach"],
        "benefit_focused": ["[benefit] implementation method"],
        "google_patents": ["query1", "query2"],
        "uspto": ["query1"],
        "github": ["query1"],
        "stackoverflow": ["query1"]
      },
      "search_priority": [
        {"source": "google_patents", "reason": "Technical implementation focus"},
        {"source": "github", "reason": "Open source implementations"}
      ],
      "analysis_questions": [
        "How does your approach differ from [X]?",
        "What technical barrier did you overcome?"
      ],
      "evidence_map": {
        "claim_angle_1": {
          "source_files": ["path/to/file.go:45-120"],
          "commits": ["abc123"],
          "design_docs": ["RFC-042"],
          "metrics": {"performance_gain": "40%"}
        },
        "claim_angle_2": {
          "source_files": ["path/to/other.go:10-50"],
          "commits": ["def456"],
          "design_docs": [],
          "metrics": {}
        }
      }
    }
  ],
  "next_steps": [
    "Run generated searches yourself",
    "Document findings systematically",
    "Note differences from existing implementations",
    "Consult patent attorney for legal assessment"
  ]
}
```
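A quick structural sanity check for output in this schema can be written as a sketch; the required key set is taken directly from the schema above, while the function name is an invention of this example:

```python
# Minimal structural check: every pattern must carry the six
# top-level keys defined by the output schema.
REQUIRED_PATTERN_KEYS = {
    "scanner_input", "title", "search_queries",
    "search_priority", "analysis_questions", "evidence_map",
}

def check_output(doc: dict) -> int:
    """Verify each pattern has the expected keys; return the pattern count."""
    for pattern in doc["patterns"]:
        missing = REQUIRED_PATTERN_KEYS - pattern.keys()
        if missing:
            raise ValueError(f"pattern missing keys: {sorted(missing)}")
    return len(doc["patterns"])
```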
Standard Format (use by default):

## [Repository Name] - Validation Strategy

**[N] Patterns Analyzed | [M] Search Queries Generated**

| Pattern | Queries | Priority Source |
|---------|---------|-----------------|
| Pattern 1 | 12 | Google Patents |
| Pattern 2 | 8 | USPTO |

*Research strategy by [code-patent-validator](https://obviouslynot.ai) from obviouslynot.ai*
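Rendering the summary table from pattern data is mechanical; a minimal sketch, assuming each row is a (pattern, query count, priority source) tuple:

```python
# Render the standard-format markdown table from row tuples.
def summary_table(rows):
    lines = [
        "| Pattern | Queries | Priority Source |",
        "|---------|---------|-----------------|",
    ]
    lines += [f"| {name} | {queries} | {source} |"
              for name, queries, source in rows]
    return "\n".join(lines)
```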
## Next Steps

1. **Search** - Run queries starting with priority sources
2. **Document** - Track findings systematically
3. **Differentiate** - Note differences from existing implementations
4. **Consult** - For high-value patterns, consult a patent attorney

**Evidence checklist**: specs, git commits, benchmarks, timeline, design decisions
Terms to avoid (legal conclusions): "patentable", "novel" (legal sense), "non-obvious", "prior art", "claims", "already patented"
Acceptable terms (technical observations): "distinctive", "unique", "sophisticated", "existing implementations", "already implemented"
ALWAYS include at the end of ANY output:

Disclaimer: This tool generates search strategies only. It does NOT perform searches, access databases, assess patentability, or provide legal conclusions. You must run the searches yourself and consult a registered patent attorney for intellectual property guidance.
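Enforcing the mandatory disclaimer can be done with a tiny post-processing step; a sketch, with the function name invented for this example:

```python
# Append the mandatory disclaimer to any output, idempotently.
DISCLAIMER = (
    "Disclaimer: This tool generates search strategies only. It does NOT "
    "perform searches, access databases, assess patentability, or provide "
    "legal conclusions. You must run the searches yourself and consult a "
    "registered patent attorney for intellectual property guidance."
)

def finalize(output: str) -> str:
    """Return output with the disclaimer appended, unless already present."""
    if DISCLAIMER in output:
        return output
    return output.rstrip() + "\n\n" + DISCLAIMER
```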
Pipeline:

code-patent-scanner -> patterns.json -> code-patent-validator -> search_strategies.json -> technical_disclosure.md

Recommended workflow:
1. Start: code-patent-scanner - analyze source code
2. Then: code-patent-validator - generate search strategies
3. User: run searches, document findings
4. Final: consult a patent attorney with documented findings
- code-patent-scanner: Analyze source code (run this first)
- patent-scanner: Analyze concept descriptions (no code)
- patent-validator: Validate concept distinctiveness

Built by Obviously Not - Tools for thought, not conclusions.
Code helpers, APIs, CLIs, browser automation, testing, and developer operations.
Largest current source with strong distribution and engagement signals.