Requirements
- Target platform: OpenClaw
- Install method: Manual import
- Extraction: Extract archive
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
Interact with Twitter/X via TwitterAPI.io — search tweets, get user info, post tweets, like, retweet, follow, send DMs, and more. Covers all 59 endpoints. Us...
Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Then review README.md for any prerequisites, environment setup, or post-install checks. Tell me what you changed and call out any manual steps you could not complete.
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Then review README.md for any prerequisites, environment setup, or post-install checks. Summarize what changed and any follow-up checks I should run.
Access Twitter/X data and perform actions via the TwitterAPI.io REST API: read, write, webhook, and stream operations. Docs: https://docs.twitterapi.io | Dashboard: https://twitterapi.io/dashboard
- Get an API key: https://twitterapi.io/dashboard ($0.10 in free credits, no credit card required).
- Store the key in a .env file or your shell's secure config. Do not `export` the raw key in the terminal; it gets saved to shell history.
- Write actions additionally require login_cookies from a v2 login plus a residential proxy.
- Base URL: https://api.twitterapi.io
- Auth header (all requests): `X-API-Key: $TWITTERAPI_IO_KEY`
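The auth setup above can be sketched in Python. This is a minimal sketch, assuming the key is loaded into `TWITTERAPI_IO_KEY` by your .env loader; the helper name is illustrative, only the base URL and header name come from the doc:

```python
import os
import urllib.request

BASE_URL = "https://api.twitterapi.io"
os.environ.setdefault("TWITTERAPI_IO_KEY", "demo-key")  # for illustration only

def build_request(path: str, query: str = "") -> urllib.request.Request:
    """Build an authenticated GET request; every call carries X-API-Key."""
    key = os.environ["TWITTERAPI_IO_KEY"]  # loaded from .env, never hard-coded
    url = f"{BASE_URL}{path}" + (f"?{query}" if query else "")
    return urllib.request.Request(url, headers={"X-API-Key": key})

req = build_request("/twitter/user/info", "userName=elonmusk")
# req is ready to send with urllib.request.urlopen(req)
```

The same header applies to POST requests; only the method and body change.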
| Resource | Credits | Approx $ |
|---|---|---|
| Tweets (per returned tweet) | 15 | $0.15 per 1K |
| Profiles (per returned profile) | 18 | $0.18 per 1K |
| Profiles batch 100+ (per profile) | 10 | $0.10 per 1K |
| Followers (per returned follower) | 15 | $0.15 per 1K |
| Verified followers (per follower) | 30 | $0.30 per 1K |
| Minimum per API call | 15 | $0.00015 per call |
| List endpoint calls | 150 | $0.0015 per call |
| Check follow relationship | 100 | $0.001 per call |
| Get article | 100 | $0.001 per call |
| Community info | 20 | $0.0002 per call |
| Write actions (tweet, like, RT, follow) | 200-300 | $0.002-0.003 per call |
| Login | 300 | $0.003 per call |

Note: If the API returns 0 or 1 items, you are still charged the 15-credit minimum.
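The pricing table reduces to one rule: 1 credit = $0.00001, with a 15-credit floor per call. A small estimator sketch; the function name is illustrative, the constants come from the table above:

```python
CREDITS_PER_DOLLAR = 100_000   # 15 credits = $0.00015, so 1 credit = $0.00001
MIN_CREDITS_PER_CALL = 15      # charged even when 0 or 1 items come back

def call_cost(items: int, credits_per_item: int) -> tuple[int, float]:
    """Credits and dollars for one API call returning `items` results."""
    credits = max(MIN_CREDITS_PER_CALL, items * credits_per_item)
    return credits, credits / CREDITS_PER_DOLLAR

# 1,000 tweets at 15 credits each -> 15,000 credits = $0.15, matching the table
print(call_cost(1000, 15))   # (15000, 0.15)
# An empty search result still costs the 15-credit minimum
print(call_cost(0, 15))      # (15, 0.00015)
```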
| Account balance (credits) | QPS limit |
|---|---|
| < 1,000 (free tier) | 1 req / 5 sec |
| >= 1,000 | 3 |
| >= 5,000 | 6 |
| >= 10,000 | 10 |
| >= 50,000 | 20 |
All V1 endpoints have been removed from the API. Use only V2 endpoints (_v2 suffix) for write operations. V2 requires login_cookies (from user_login_v2) + residential proxy.
The API has a naming inconsistency: the user_login_v2 response returns the field as login_cookie (singular), but all v2 action endpoints expect the field as login_cookies (plural). Always use login_cookies (plural) in request bodies; the value is the same string.
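A defensive way to handle the singular/plural quirk above is to normalize the login response before building any action payload. This helper is an assumption, not part of the API:

```python
def action_payload(login_response: dict, **fields) -> dict:
    """Copy the cookie from user_login_v2 output into the plural key v2 actions expect."""
    cookie = login_response.get("login_cookies") or login_response["login_cookie"]
    return {"login_cookies": cookie, **fields}

login_resp = {"login_cookie": "abc123"}   # singular, as returned by user_login_v2
payload = action_payload(login_resp, tweet_text="hello")
print(payload["login_cookies"])           # abc123
```

Using the helper everywhere means the rest of your code never touches the singular field.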
```json
{
  "type": "tweet",
  "id": "1234567890",
  "url": "https://x.com/user/status/1234567890",
  "text": "Tweet content...",
  "source": "Twitter Web App",
  "retweetCount": 5,
  "replyCount": 2,
  "likeCount": 42,
  "quoteCount": 1,
  "viewCount": 1500,
  "bookmarkCount": 3,
  "createdAt": "Sun Feb 08 12:00:00 +0000 2026",
  "lang": "en",
  "isReply": false,
  "inReplyToId": null,
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "conversationId": "1234567890",
  "displayTextRange": [0, 280],
  "isLimitedReply": false,
  "author": { "...User Object..." },
  "entities": {
    "hashtags": [{ "text": "AI", "indices": [10, 13] }],
    "urls": [{ "display_url": "example.com", "expanded_url": "https://example.com", "url": "https://t.co/xxx" }],
    "user_mentions": [{ "id_str": "123", "name": "Someone", "screen_name": "someone" }]
  },
  "quoted_tweet": null,
  "retweeted_tweet": null
}
```
```json
{
  "type": "user",
  "id": "999888777",
  "userName": "elonmusk",
  "name": "Elon Musk",
  "url": "https://x.com/elonmusk",
  "isBlueVerified": true,
  "verifiedType": "none",
  "profilePicture": "https://pbs.twimg.com/...",
  "coverPicture": "https://pbs.twimg.com/...",
  "description": "Bio text...",
  "location": "Mars",
  "followers": 200000000,
  "following": 800,
  "canDm": false,
  "favouritesCount": 50000,
  "mediaCount": 2000,
  "statusesCount": 30000,
  "createdAt": "Tue Jun 02 20:12:29 +0000 2009",
  "pinnedTweetIds": ["1234567890"],
  "isAutomated": false,
  "possiblySensitive": false,
  "profile_bio": {
    "description": "Bio text...",
    "entities": {
      "description": { "urls": [] },
      "url": { "urls": [{ "display_url": "example.com", "expanded_url": "https://example.com" }] }
    }
  }
}
```
```json
{
  "tweets": [ "...array of Tweet Objects..." ],
  "has_next_page": true,
  "next_cursor": "cursor_string...",
  "status": "success",
  "msg": null
}
```
For detailed endpoint documentation with curl examples, consult the reference files:
- READ endpoints (33): references/read-endpoints.md
- WRITE V2 endpoints (19): references/write-endpoints.md
- Webhook and Stream endpoints (6): references/webhook-stream-endpoints.md
- Complete endpoint index table (all 58 endpoints): references/endpoint-index.md
CRITICAL: Around March 5, 2026, Twitter/X disabled or degraded several advanced search features due to high platform usage. This affects ALL Twitter API providers (not just TwitterAPI.io) because TwitterAPI.io proxies Twitter's own search infrastructure.
| Feature | Status | Impact |
|---|---|---|
| since:DATE / until:DATE in search | DEGRADED | Returns incomplete results (often only 7-20 tweets per query regardless of actual volume) |
| Search pagination | BROKEN | Cursor-based pagination returns the SAME page of results repeatedly instead of advancing |
| since_time:UNIX / until_time:UNIX | WORKS | Alternative date format using Unix timestamps; returns the correct date range |
| within_time:Nh | WORKS | Relative time filter (e.g., within_time:72h) |
| get_user_last_tweets pagination | WORKS | User timeline cursor pagination is unaffected |
| get_user_mentions sinceTime/untilTime | WORKS | Server-side Unix timestamp parameters (not search operators) |
| Webhook filter rules | WORKS | Real-time collection unaffected (but webhook URL may be lost during API key rotation) |
Instead of (broken):
`$BTC since:2026-03-06_00:00:00_UTC until:2026-03-07_00:00:00_UTC`

Use (working):
`$BTC since_time:1772755200 until_time:1772841600`

Convert dates to Unix timestamps with `date -d "2026-03-06T00:00:00Z" +%s`, or in Python: `int(datetime(2026, 3, 6, tzinfo=timezone.utc).timestamp())`.

Pagination workaround: since search pagination is broken, use hourly time windows instead of paginating through a large result set. Each 1-hour window returns a unique set of ~7-16 tweets (page 1 only), which yields roughly 250 unique tweets per coin per day across 24 windows.

For user timelines: use GET /twitter/user/last_tweets with cursor pagination (works normally). Paginate backwards through the user's timeline, then filter client-side by createdAt date. This bypasses search operators entirely.
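The workaround above can be sketched as: convert the UTC date to Unix seconds, then slice the day into 24 one-hour `since_time`/`until_time` windows. The window generator and function names are illustrative:

```python
from datetime import datetime, timezone

def unix_utc(y: int, m: int, d: int, h: int = 0) -> int:
    """Unix timestamp for a UTC date, as since_time:/until_time: expect."""
    return int(datetime(y, m, d, h, tzinfo=timezone.utc).timestamp())

def hourly_windows(y: int, m: int, d: int) -> list[tuple[int, int]]:
    """24 (since_time, until_time) pairs covering one UTC day."""
    start = unix_utc(y, m, d)
    return [(start + i * 3600, start + (i + 1) * 3600) for i in range(24)]

windows = hourly_windows(2026, 3, 6)
# Query for the first window; issue one search (page 1 only) per window
query = f"$BTC since_time:{windows[0][0]} until_time:{windows[0][1]}"
```

Deduplicate collected tweets by id across windows, since adjacent windows can overlap at the boundary.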
When a TwitterAPI.io API key is rotated (e.g., after account data reset), the webhook filter rules may be restored but the webhook URL is NOT automatically restored. You must manually re-set the webhook URL in the dashboard at https://twitterapi.io/tweet-filter-rules after any key rotation event. Monitoring tip: Check that collection_type='webhook' tweets are still arriving. If rules are active but zero webhook tweets arrive for 30+ minutes, verify the webhook URL is configured.
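The monitoring tip above amounts to a staleness check. A sketch, assuming you record the arrival time of the last webhook tweet yourself (the function and threshold constant are illustrative; the 30-minute figure comes from the doc):

```python
import time

STALE_AFTER_SECONDS = 30 * 60  # alert threshold: 30+ minutes of silence

def webhook_stale(last_webhook_tweet_at: float, now=None) -> bool:
    """True when rules are active but no webhook tweet arrived for 30+ minutes."""
    now = time.time() if now is None else now
    return now - last_webhook_tweet_at >= STALE_AFTER_SECONDS

# 45 minutes of silence -> re-check the webhook URL in the dashboard
print(webhook_stale(last_webhook_tweet_at=0, now=45 * 60))  # True
```

Run a check like this on a timer; when it fires, verify the URL at https://twitterapi.io/tweet-filter-rules before assuming Twitter-side problems.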
| Operator | Example | Description | Status (Mar 2026) |
|---|---|---|---|
| from: | from:elonmusk | Tweets by user | Working |
| to: | to:elonmusk | Replies to user | Working |
| "..." | "exact phrase" | Exact match | Working |
| OR | cat OR dog | Either term | Working |
| - | -spam | Exclude term | Working |
| since: / until: | since:2026-01-01_00:00:00_UTC | Date range (UTC) | DEGRADED; use since_time: instead |
| since_time: / until_time: | since_time:1741219200 | Date range (Unix timestamp) | Working |
| within_time: | within_time:24h | Relative time window | Working |
| min_faves: | min_faves:100 | Min likes | Working |
| min_retweets: | min_retweets:50 | Min RTs | Working |
| filter:media | filter:media | Has media | Working |
| filter:links | filter:links | Has links | Working |
| lang: | lang:en | Language | Working |
| is:reply | is:reply | Only replies | Working |
| -is:retweet | -is:retweet | Exclude RTs | Working |

More examples: https://github.com/igorbrigadir/twitter-advanced-search
Most list endpoints return `{ "has_next_page": true, "next_cursor": "cursor_string..." }`. Pass `cursor=NEXT_CURSOR` to get the next page; for the first page, omit `cursor` or pass `cursor=""`.

Known issues (March 2026):
- advanced_search pagination is broken; it returns the same results on every page. Use hourly time windows (one page per window) instead of deep pagination.
- get_user_last_tweets pagination works normally; the cursor advances through the user's timeline chronologically.
- has_next_page may return true even when no more data exists (a Twitter API limitation). If a subsequent request returns empty or identical results, stop paginating.
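Because `has_next_page` can lie, a pagination loop should stop on empty or repeated pages as described above. A sketch with a stubbed `fetch_page` standing in for the real HTTP call:

```python
def paginate(fetch_page):
    """Collect tweets across pages; stop on empty or identical pages."""
    tweets, cursor, prev_ids = [], "", None
    while True:
        page = fetch_page(cursor)              # real code: GET ...&cursor={cursor}
        ids = [t["id"] for t in page["tweets"]]
        if not ids or ids == prev_ids:         # empty page or stuck cursor: stop
            break
        tweets.extend(page["tweets"])
        prev_ids = ids
        if not page.get("has_next_page"):
            break
        cursor = page["next_cursor"]
    return tweets

# Stub simulating a broken cursor that repeats the same page forever
def stuck(cursor):
    return {"tweets": [{"id": "1"}, {"id": "2"}], "has_next_page": True, "next_cursor": "x"}

print(len(paginate(stuck)))  # 2 -- the repeat is detected and the loop stops
```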
Error responses look like `{ "status": "error", "msg": "Error message" }`.

| Error | Cause | Fix |
|---|---|---|
| Invalid API key | Wrong or missing X-API-Key header | Check key in dashboard |
| Invalid login_cookie | Expired or faulty cookie | Re-login via user_login_v2 with valid totp_secret |
| 400 on v2 actions | Faulty cookie from login without proper totp_secret | Re-login with 16-char string totp_secret |
| Proxy error | Bad proxy format or dead proxy | Format: http://user:pass@host:port; use residential |
| Rate limited | Exceeded QPS for your balance tier | Back off; add balance for higher QPS |
| Account suspended | Twitter account banned | Use a different account |
| 404 on endpoint | Wrong path | Check the correct path in this doc |
1. GET /twitter/user/info?userName=TARGET and extract data.id
2. Use that numeric ID in follow/DM calls

Note: get_user_mentions accepts userName directly; no ID lookup needed.
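The two-step flow above can be sketched as follows; `http_get` is a stand-in for your authenticated GET helper (the stub and its response shape below are illustrative, based on the User Object's `id` field):

```python
def resolve_user_id(user_name: str, http_get) -> str:
    """Step 1: look up the numeric ID that follow/DM endpoints require."""
    resp = http_get(f"/twitter/user/info?userName={user_name}")
    return resp["data"]["id"]

# Stub standing in for an authenticated GET against api.twitterapi.io
fake_get = lambda path: {"data": {"id": "999888777"}}
print(resolve_user_id("elonmusk", fake_get))  # 999888777
```

Cache the username-to-ID mapping locally; each lookup costs credits.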
1. Upload: POST /twitter/upload_media_v2 to get a media_id
2. Tweet: POST /twitter/create_tweet_v2 with media_ids: ["media_id"]
POST /twitter/create_tweet_v2 with tweet_text + reply_to_tweet_id
POST /twitter/create_tweet_v2 with tweet_text + attachment_url (full tweet URL)
POST /twitter/create_tweet_v2 with tweet_text + community_id
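The create_tweet_v2 variants above differ only by one extra field each. A payload-builder sketch; the helper and its keyword names are illustrative, while the body field names (tweet_text, media_ids, reply_to_tweet_id, attachment_url, community_id, login_cookies) come from the doc, and including proxy in the body is an assumption based on the proxy-mandatory rule:

```python
def tweet_payload(login_cookies: str, proxy: str, text: str, *,
                  media_ids=None, reply_to=None, quote_url=None, community_id=None) -> dict:
    """Build a create_tweet_v2 body; only the requested optional field is added."""
    body = {"login_cookies": login_cookies, "proxy": proxy, "tweet_text": text}
    if media_ids:
        body["media_ids"] = media_ids           # from upload_media_v2
    if reply_to:
        body["reply_to_tweet_id"] = reply_to    # reply
    if quote_url:
        body["attachment_url"] = quote_url      # quote tweet (full tweet URL)
    if community_id:
        body["community_id"] = community_id     # community post
    return body

reply = tweet_payload("COOKIES", "http://user:pass@host:port", "Nice!", reply_to="123")
```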
Use Stream endpoints instead of polling /twitter/user/last_tweets:
1. POST /oapi/x_user_stream/add_user_to_monitor_tweet for each account
2. Set up a webhook to receive notifications
`claude mcp add twitterapi-io -- npx -y twitterapi-io-mcp`

npm: https://www.npmjs.com/package/twitterapi-io-mcp | GitHub: https://github.com/dorukardahan/twitterapi-io-mcp

Also available: the twitterapi-docs MCP server for querying this documentation programmatically.
- Read endpoints need only the API key; no Twitter account needed.
- Write endpoints need login_cookies from a v2 login plus a residential proxy.
- V2 cookies only work with v2 endpoints (`_v2` suffix).
- 2FA strongly recommended; use a 16-character string totp_secret for reliable login.
- A proxy is mandatory for all write actions. Use high-quality residential proxies.
- Credits never expire once recharged; bonus credits are valid for 30 days.