Requirements
- Target platform
- OpenClaw
- Install method
- Manual import
- Extraction
- Extract archive
- Prerequisites
- OpenClaw
- Primary doc
- SKILL.md
Scrapes viral content from TikTok, Instagram, YouTube, and Reddit, analyzes hooks with AI, generates scripts and captions, and auto-schedules posts across platforms.
Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.
This skill builds a fully automated content factory that runs 24/7:

- Apify scrapes the most viral content across TikTok, Instagram, YouTube, and Reddit
- Claude (OpenClaw) extracts the hooks, reverse-engineers why each post went viral, and generates scripts, captions, carousels, and threads
- A scheduler batches all posts and queues them for auto-publishing

The result: a near fully automated content channel that feeds itself.

Apify: https://www.apify.com/?fpr=dx06p
- Scrape the top viral content across multiple platforms every few hours
- Extract the exact hooks, structures, and formats that made content go viral
- Repurpose viral content into original scripts, captions, carousels, and threads
- Generate a full weekly content calendar automatically
- Batch and schedule posts across platforms (Instagram, TikTok, LinkedIn, Twitter/X)
- Track which generated content performs best and feed that signal back into the pipeline
- Run completely autonomously once configured; minimal human input needed
AUTOMATED CONTENT GENERATION PIPELINE

  LAYER 1 - VIRAL CONTENT SCRAPING (Apify)
    TikTok | Instagram | YouTube | Reddit | Twitter/X
    Top posts by hashtag, views, engagement, shares
                        |
                        v
  LAYER 2 - AI CONTENT ENGINE (Claude / OpenClaw)
    • Hook Extractor   → why did this go viral?
    • Script Generator → original video scripts
    • Caption Writer   → post captions + hashtags
    • Carousel Builder → slide-by-slide content
    • Thread Writer    → Twitter/X and LinkedIn threads
    • Calendar Planner → weekly posting schedule
                        |
                        v
  LAYER 3 - SCHEDULED PUBLISHING
    Buffer | Later | Hootsuite | Custom Webhook
    Posts queued, timed, and published automatically
- Sign up at https://www.apify.com/?fpr=dx06p
- Go to Settings → Integrations
- Copy your token:

export APIFY_TOKEN=apify_api_xxxxxxxxxxxxxxxx
- Get your API key from your OpenClaw or Anthropic account
- Store it:

export CLAUDE_API_KEY=sk-ant-xxxxxxxxxxxxxxxx
npm install apify-client axios node-cron dotenv
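Before wiring the pipeline together, it can help to fail fast on missing credentials rather than erroring halfway through a run. A minimal sketch (`requireEnv` is a hypothetical helper, not part of any package installed above):

```javascript
// Throws at startup if any required env var is unset or empty,
// instead of failing mid-pipeline. Hypothetical helper.
function requireEnv(names, env = process.env) {
  const missing = names.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required env vars: ${missing.join(', ')}`);
  }
  return names.map((name) => env[name]);
}

// Usage: call once at startup, before any scraping begins.
// requireEnv(['APIFY_TOKEN', 'CLAUDE_API_KEY']);
```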
import { ApifyClient } from 'apify-client';

const apify = new ApifyClient({ token: process.env.APIFY_TOKEN });

// Define your niche and topics
const NICHE_TOPICS = [
  "productivity",
  "entrepreneurship",
  "ai tools",
  "personal finance",
  "self improvement",
  "marketing"
];

async function scrapeViralContent() {
  console.log("Scraping viral content...");

  const [tiktok, instagram, reddit] = await Promise.all([
    // TikTok - top videos by hashtag
    apify.actor("apify/tiktok-hashtag-scraper").call({
      hashtags: NICHE_TOPICS,
      resultsPerPage: 30,
      shouldDownloadVideos: false
    }).then(run => apify.dataset(run.defaultDatasetId).listItems()),

    // Instagram - top posts by hashtag
    apify.actor("apify/instagram-hashtag-scraper").call({
      hashtags: NICHE_TOPICS,
      resultsLimit: 30
    }).then(run => apify.dataset(run.defaultDatasetId).listItems()),

    // Reddit - hottest posts in relevant subreddits
    apify.actor("apify/reddit-scraper").call({
      startUrls: [
        { url: "https://www.reddit.com/r/Entrepreneur/" },
        { url: "https://www.reddit.com/r/productivity/" },
        { url: "https://www.reddit.com/r/personalfinance/" }
      ],
      maxPostCount: 20,
      sort: "hot"
    }).then(run => apify.dataset(run.defaultDatasetId).listItems())
  ]);

  // Normalize all platforms to a common schema
  const normalized = [
    ...tiktok.items.map(p => ({
      platform: "tiktok",
      text: p.text,
      likes: p.diggCount,
      shares: p.shareCount,
      comments: p.commentCount,
      views: p.playCount,
      engagementScore: p.diggCount + p.shareCount * 3 + p.commentCount * 2,
      url: p.webVideoUrl,
      author: p.authorMeta?.name
    })),
    ...instagram.items.map(p => ({
      platform: "instagram",
      text: p.caption,
      likes: p.likesCount,
      comments: p.commentsCount,
      engagementScore: p.likesCount + p.commentsCount * 2,
      url: p.url,
      author: p.ownerUsername
    })),
    ...reddit.items.map(p => ({
      platform: "reddit",
      text: p.title + " " + (p.selftext || ""),
      likes: p.score,
      comments: p.numComments,
      engagementScore: p.score + p.numComments * 3,
      url: p.url,
      author: p.author
    }))
  ];

  // Return top 20 by engagement score
  return normalized
    .sort((a, b) => b.engagementScore - a.engagementScore)
    .slice(0, 20);
}
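The three per-platform score formulas in the scraper can be factored into one helper. A sketch using the weights from the TikTok branch (likes ×1, comments ×2, shares ×3); note the Reddit branch weights comments ×3, so pass per-platform weights if you want to preserve that difference exactly:

```javascript
// Weighted engagement score: shares and comments count for more than
// likes, since they are stronger virality signals. Missing metrics
// (e.g. shares on Reddit) count as 0.
const WEIGHTS = { likes: 1, comments: 2, shares: 3 };

function engagementScore(post, weights = WEIGHTS) {
  return Object.entries(weights).reduce(
    (sum, [metric, weight]) => sum + (post[metric] ?? 0) * weight,
    0
  );
}

// Sort a mixed-platform list and keep the top N.
function topByEngagement(posts, limit = 20) {
  return [...posts]
    .sort((a, b) => engagementScore(b) - engagementScore(a))
    .slice(0, limit);
}
```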
import axios from 'axios';

const claude = axios.create({
  baseURL: 'https://api.anthropic.com/v1',
  headers: {
    'x-api-key': process.env.CLAUDE_API_KEY,
    'anthropic-version': '2023-06-01',
    'Content-Type': 'application/json'
  }
});

async function extractHooks(viralPosts) {
  const prompt = `
You are an expert viral content analyst. Analyze these top-performing posts
and extract the exact patterns that made them go viral.

VIRAL POSTS:
${JSON.stringify(viralPosts.slice(0, 10), null, 2)}

Respond ONLY in this JSON format, no preamble:
{
  "hookPatterns": [
    {
      "pattern": "pattern name",
      "template": "reusable template with [BRACKETS] for variables",
      "example": "real example from the data",
      "whyItWorks": "psychological reason this triggers engagement",
      "bestPlatforms": ["tiktok", "instagram"]
    }
  ],
  "commonStructures": [
    {
      "format": "format name (list | storytime | tutorial | controversy | etc)",
      "openingFormula": "how these posts start",
      "bodyFormula": "how they build",
      "closingFormula": "how they end / CTA",
      "avgEngagementBoost": "estimated % above average"
    }
  ],
  "topEmotions": ["curiosity", "surprise", "..."],
  "keyInsight": "single most important lesson from this batch of viral content"
}
`;

  const { data } = await claude.post('/messages', {
    model: "claude-opus-4-5",
    max_tokens: 2000,
    messages: [{ role: "user", content: prompt }]
  });

  return JSON.parse(data.content[0].text.replace(/```json|```/g, '').trim());
}
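Each Claude call in this skill strips code fences before `JSON.parse`, which still breaks if the model prepends a sentence despite the "no preamble" instruction. A slightly more defensive sketch (`parseModelJson` is a hypothetical helper, not part of the original code):

```javascript
// Strip code fences, then slice from the first "{" to the last "}"
// so a stray preamble or trailing note doesn't break JSON.parse.
function parseModelJson(text) {
  const cleaned = text.replace(/```json|```/g, '').trim();
  const start = cleaned.indexOf('{');
  const end = cleaned.lastIndexOf('}');
  if (start === -1 || end === -1 || end < start) {
    throw new Error('No JSON object found in model output');
  }
  return JSON.parse(cleaned.slice(start, end + 1));
}
```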
async function generateScripts(hookAnalysis, niche, count = 5) {
  const prompt = `
You are a viral content creator. Use these proven hook patterns to generate
${count} original video scripts.

NICHE: ${niche}

HOOK PATTERNS:
${JSON.stringify(hookAnalysis.hookPatterns, null, 2)}

BEST STRUCTURES:
${JSON.stringify(hookAnalysis.commonStructures, null, 2)}

Respond ONLY in this JSON format:
{
  "scripts": [
    {
      "id": 1,
      "title": "video title",
      "platform": "tiktok | instagram | youtube_shorts",
      "hookPattern": "which pattern was used",
      "hook": "opening line - first 3 seconds",
      "fullScript": "complete word-for-word script (120-180 words)",
      "estimatedDuration": "30s",
      "hashtags": ["#tag1", "#tag2", "#tag3", "#tag4", "#tag5"],
      "cta": "call to action",
      "thumbnailIdea": "thumbnail concept",
      "viralPotential": "high | medium",
      "bestPostTime": "morning | afternoon | evening"
    }
  ]
}
`;

  const { data } = await claude.post('/messages', {
    model: "claude-opus-4-5",
    max_tokens: 3000,
    messages: [{ role: "user", content: prompt }]
  });

  return JSON.parse(data.content[0].text.replace(/```json|```/g, '').trim());
}
async function generatePostCaptions(scripts) {
  const prompt = `
Transform these video scripts into platform-optimized social media captions.

SCRIPTS:
${JSON.stringify(scripts, null, 2)}

Respond ONLY in this JSON format:
{
  "posts": [
    {
      "scriptId": 1,
      "platforms": {
        "instagram": {
          "caption": "full caption with line breaks and emojis",
          "hashtags": ["#tag1", "#tag2"],
          "firstComment": "hashtags to put in first comment"
        },
        "tiktok": {
          "caption": "shorter, punchy tiktok caption",
          "hashtags": ["#fyp", "#tag2"]
        },
        "linkedin": {
          "caption": "professional angle of the same content, 150-200 words",
          "hashtags": ["#tag1"]
        },
        "twitter": {
          "thread": [
            "tweet 1 (hook)",
            "tweet 2",
            "tweet 3",
            "tweet 4 (CTA)"
          ]
        }
      }
    }
  ]
}
`;

  const { data } = await claude.post('/messages', {
    model: "claude-opus-4-5",
    max_tokens: 3000,
    messages: [{ role: "user", content: prompt }]
  });

  return JSON.parse(data.content[0].text.replace(/```json|```/g, '').trim());
}
async function buildContentCalendar(scripts, captions) {
  const today = new Date();

  const prompt = `
Build a 7-day content calendar from these generated posts. Maximize reach by
distributing smartly across platforms and times.

AVAILABLE CONTENT:
Scripts: ${scripts.scripts.length} video scripts
Captions: ready for Instagram, TikTok, LinkedIn, Twitter

Today is ${today.toDateString()}.

Respond ONLY in this JSON format:
{
  "calendar": [
    {
      "day": "Monday",
      "date": "YYYY-MM-DD",
      "posts": [
        {
          "time": "08:00",
          "platform": "instagram",
          "contentType": "reel | carousel | story | post",
          "scriptId": 1,
          "caption": "caption preview",
          "hashtags": ["#tag1"],
          "status": "scheduled",
          "notes": "optional tip for this post"
        }
      ]
    }
  ],
  "weekSummary": {
    "totalPosts": 0,
    "platformBreakdown": { "instagram": 0, "tiktok": 0, "linkedin": 0, "twitter": 0 },
    "estimatedReach": "rough estimate",
    "bestDayToPost": "day name",
    "strategy": "brief summary of the week strategy"
  }
}
`;

  const { data } = await claude.post('/messages', {
    model: "claude-opus-4-5",
    max_tokens: 3000,
    messages: [{ role: "user", content: prompt }]
  });

  return JSON.parse(data.content[0].text.replace(/```json|```/g, '').trim());
}
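The calendar prompt asks the model to fill in the `YYYY-MM-DD` dates itself; computing them locally and injecting them into the prompt is more reliable. A sketch, not part of the original code (it derives dates in UTC, so shift if your posting timezone differs):

```javascript
// Return the next 7 calendar dates as "YYYY-MM-DD" strings (UTC),
// starting from `from`, to inject into the calendar prompt instead
// of asking the model to do date arithmetic.
function nextSevenDates(from = new Date()) {
  return Array.from({ length: 7 }, (_, i) => {
    const d = new Date(from);
    d.setDate(d.getDate() + i);
    return d.toISOString().slice(0, 10); // YYYY-MM-DD
  });
}
```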
async function publishToScheduler(calendar) {
  // Example: send to Buffer API
  const BUFFER_TOKEN = process.env.BUFFER_ACCESS_TOKEN;

  for (const day of calendar.calendar) {
    for (const post of day.posts) {
      const scheduledTime = new Date(`${day.date}T${post.time}:00`);

      if (BUFFER_TOKEN) {
        await axios.post(
          'https://api.bufferapp.com/1/updates/create.json',
          {
            text: post.caption,
            profile_ids: [process.env[`BUFFER_${post.platform.toUpperCase()}_ID`]],
            scheduled_at: scheduledTime.toISOString(),
            hashtags: post.hashtags.join(' ')
          },
          { headers: { Authorization: `Bearer ${BUFFER_TOKEN}` } }
        );
      }

      // Or push to your own webhook / CMS
      if (process.env.PUBLISH_WEBHOOK_URL) {
        await axios.post(process.env.PUBLISH_WEBHOOK_URL, {
          platform: post.platform,
          caption: post.caption,
          hashtags: post.hashtags,
          scheduledAt: scheduledTime.toISOString(),
          scriptId: post.scriptId
        });
      }

      console.log(`Scheduled: [${post.platform}] ${day.date} ${post.time}`);
    }
  }
}
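The publishing loop issues each Buffer/webhook call exactly once, so a transient rate limit or timeout aborts the whole run. A hypothetical retry wrapper with exponential backoff that could wrap those `axios.post` calls (not part of the original code):

```javascript
// Retry an async operation with exponential backoff.
// Delays between attempts: baseDelayMs, 2x, 4x, ...
async function withRetry(fn, { attempts = 3, baseDelayMs = 500 } = {}) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError; // all attempts exhausted
}

// Usage sketch inside the loop:
// await withRetry(() => axios.post(url, payload, config));
```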
import cron from 'node-cron';

async function runContentPipeline(niche = "entrepreneurship") {
  console.log(`\nContent Pipeline started - ${new Date().toISOString()}`);
  const report = {};

  try {
    // STEP 1 - Scrape viral content
    console.log("\n[1/5] Scraping viral content with Apify...");
    const viralContent = await scrapeViralContent();
    report.postsScraped = viralContent.length;
    console.log(`  ${viralContent.length} viral posts collected`);

    // STEP 2 - Extract hooks and patterns
    console.log("\n[2/5] Extracting viral hooks with Claude...");
    const hookAnalysis = await extractHooks(viralContent);
    report.hookPatterns = hookAnalysis.hookPatterns.length;
    console.log(`  ${hookAnalysis.hookPatterns.length} hook patterns identified`);
    console.log(`  Key insight: ${hookAnalysis.keyInsight}`);

    // STEP 3 - Generate scripts
    console.log("\n[3/5] Generating video scripts...");
    const scripts = await generateScripts(hookAnalysis, niche, 7);
    report.scriptsGenerated = scripts.scripts.length;
    console.log(`  ${scripts.scripts.length} scripts generated`);

    // STEP 4 - Write captions for all platforms
    console.log("\n[4/5] Writing multi-platform captions...");
    const captions = await generatePostCaptions(scripts.scripts);
    report.captionsWritten = captions.posts.length;
    console.log(`  Captions written for ${captions.posts.length} posts`);

    // STEP 5 - Build weekly calendar and schedule
    console.log("\n[5/5] Building content calendar and scheduling...");
    const calendar = await buildContentCalendar(scripts, captions);
    report.calendarBuilt = true;
    report.totalPostsScheduled = calendar.weekSummary.totalPosts;
    await publishToScheduler(calendar);
    console.log(`  ${calendar.weekSummary.totalPosts} posts scheduled for the week`);

    // Summary
    console.log("\nPIPELINE COMPLETE:");
    console.log(`  - Viral posts scraped: ${report.postsScraped}`);
    console.log(`  - Hook patterns found: ${report.hookPatterns}`);
    console.log(`  - Scripts generated: ${report.scriptsGenerated}`);
    console.log(`  - Posts scheduled: ${report.totalPostsScheduled}`);
    console.log(`  - Best day this week: ${calendar.weekSummary.bestDayToPost}`);
    console.log(`  - Strategy: ${calendar.weekSummary.strategy}`);

    return { success: true, report, calendar };
  } catch (err) {
    console.error("Pipeline error:", err.message);
    throw err;
  }
}

// Run every Sunday at 8:00 AM - generates the full week ahead
cron.schedule('0 8 * * 0', () => {
  runContentPipeline("entrepreneurship");
});

// Run every morning at 6:00 AM for daily fresh content
cron.schedule('0 6 * * *', () => {
  runContentPipeline("productivity");
});

// Run immediately on startup
runContentPipeline("ai tools");
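Note that on Sundays the 06:00 daily job and the 08:00 weekly job both fire, and since the cron callbacks do not await the pipeline, a second run can start while the first is still in flight. A minimal in-process guard, as a sketch (`runExclusive` is a hypothetical helper, not part of node-cron):

```javascript
// Skip a pipeline trigger while another run is still in progress.
// The flag is set synchronously before the first await, so two
// triggers in the same tick cannot both pass the check.
let pipelineRunning = false;

async function runExclusive(task) {
  if (pipelineRunning) {
    console.log('Pipeline already running; skipping this trigger.');
    return null;
  }
  pipelineRunning = true;
  try {
    return await task();
  } finally {
    pipelineRunning = false;
  }
}

// Usage sketch:
// cron.schedule('0 6 * * *', () => runExclusive(() => runContentPipeline("productivity")));
```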
# .env
APIFY_TOKEN=apify_api_xxxxxxxxxxxxxxxx
CLAUDE_API_KEY=sk-ant-xxxxxxxxxxxxxxxx

# Publishing (optional - pick one or more)
BUFFER_ACCESS_TOKEN=your_buffer_token
BUFFER_INSTAGRAM_ID=your_ig_profile_id
BUFFER_TIKTOK_ID=your_tiktok_profile_id
BUFFER_LINKEDIN_ID=your_linkedin_profile_id
PUBLISH_WEBHOOK_URL=https://your-app.com/webhooks/publish

# Alerts (optional)
SLACK_WEBHOOK_URL=https://hooks.slack.com/services/xxx/xxx/xxx
{
  "runAt": "2025-02-25T06:00:00Z",
  "niche": "entrepreneurship",
  "postsScraped": 90,
  "hookPatterns": 6,
  "scriptsGenerated": 7,
  "totalPostsScheduled": 21,
  "calendar": {
    "Monday": [
      { "time": "08:00", "platform": "instagram", "type": "reel", "scriptId": 1 },
      { "time": "18:00", "platform": "tiktok", "type": "video", "scriptId": 1 },
      { "time": "12:00", "platform": "linkedin", "type": "post", "scriptId": 2 }
    ]
  },
  "weekSummary": {
    "totalPosts": 21,
    "platformBreakdown": { "instagram": 7, "tiktok": 7, "linkedin": 4, "twitter": 3 },
    "bestDayToPost": "Tuesday",
    "strategy": "Lead with curiosity hooks on TikTok early week, repurpose as LinkedIn insights mid-week, close with engagement posts on weekends"
  }
}
- Scrape wide, publish narrow: collect 50+ viral posts, produce 5-7 pieces of original content
- Never copy: use viral posts as structural inspiration only, always generate original text
- Set cron to run on Sunday evening to pre-fill the full week ahead
- Use 3-5 niches max to keep the content focused and the audience growing
- Track which posts actually perform and feed that back as additional context to Claude
- Combine with the Trend Radar skill to inject real-time trend data into the pipeline
- For maximum automation, connect the video scripts to InVideo (see Short Video Creator skill)
- Apify account: https://www.apify.com/?fpr=dx06p
- Claude / OpenClaw API key
- Node.js 18+ with apify-client, axios, node-cron
- Optional: Buffer, Later, or Hootsuite account for automated publishing
- Optional: InVideo account for auto video production from generated scripts
Writing, remixing, publishing, visual generation, and marketing content production.
Largest current source with strong distribution and engagement signals.