Tencent SkillHub · AI

android-agent

Control a real Android phone via USB or network using GPT-4o vision to run tasks like opening apps, typing, tapping, and automation scripts.


Install for OpenClaw

Quick setup
  1. Download the package from Yavira.
  2. Extract the archive and review SKILL.md first.
  3. Import or place the package into your OpenClaw setup.

Requirements

Target platform
OpenClaw
Install method
Manual import
Extraction
Extract archive
Prerequisites
OpenClaw
Primary doc
SKILL.md

Package facts

Download mode
Yavira redirect
Package format
ZIP package
Source platform
Tencent SkillHub
What's included
SKILL.md, examples/tasks.md, requirements.txt, scripts/connect.sh, scripts/run-task.py, scripts/screenshot.sh

Validation

  • Use the Yavira download entry.
  • Review SKILL.md after the package is downloaded.
  • Confirm the extracted package contains the expected setup assets.

Install with your agent

Agent handoff

Hand the extracted package to your coding agent with a concrete install brief instead of walking through the steps manually.

  1. Download the package from Yavira.
  2. Extract it into a folder your agent can access.
  3. Paste one of the prompts below and point your agent at the extracted folder.
New install

I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.

Upgrade existing

I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.

Trust & source

Release facts

Source
Tencent SkillHub
Verification
Indexed source record
Version
1.1.1

Documentation

Primary doc: SKILL.md (33 sections)

android-agent β€” AI-Powered Android Phone Control

Plug your old Android phone into your Mac/PC. Now your AI assistant can use it. Got an old Android in a drawer? Connect it to any machine running OpenClaw: your gateway, a Mac Mini, a Raspberry Pi. Your AI can now open apps, tap buttons, type text, and complete tasks on a real phone. Book a cab, order food, check your bank app: anything you'd do with your thumbs.

How It Works

Your AI agent sees the phone screen (via screenshots), decides what to tap, type, or swipe, and executes actions via ADB. Under the hood it uses DroidRun with GPT-4o vision.

┌─────────────┐   screenshots   ┌──────────────┐   ADB commands   ┌─────────────┐
│   GPT-4o    │◄────────────────│   DroidRun   │─────────────────►│   Android   │
│   Vision    │────────────────►│    Agent     │◄─────────────────│    Phone    │
└─────────────┘  tap/type/swipe └──────────────┘   screen state   └─────────────┘
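The loop above can be sketched in a few lines. This is an illustrative outline, not DroidRun's actual API: `observe`, `decide`, and `execute` are hypothetical stand-ins for the screenshot capture, the GPT-4o vision call, and the ADB action.

```python
# Illustrative sketch of the observe -> decide -> act loop described above.
# `observe`, `decide`, and `execute` are hypothetical stand-ins, not
# DroidRun's real API.

def run_task(goal, observe, decide, execute, max_steps=20):
    """Drive the phone until the model reports the goal is reached."""
    for _ in range(max_steps):
        screen = observe()             # e.g. an ADB screenshot (PNG bytes)
        action = decide(goal, screen)  # e.g. a GPT-4o vision call
        if action["type"] == "done":   # model reports task complete
            return True
        execute(action)                # e.g. adb shell input tap/text/swipe
    return False                       # gave up after max_steps
```

The `max_steps` cap matters in practice: a confused vision model can otherwise loop forever on the same screen.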

Direct Mode

Phone plugged into your OpenClaw gateway machine via USB. Zero networking required.

[Gateway Machine] ──USB──► [Android Phone]

Node Mode

Phone plugged into a Mac Mini, Raspberry Pi, or any OpenClaw node. The gateway controls it over the network.

[Gateway] ──network──► [Mac Mini / Pi node] ──USB──► [Android Phone]

For Node mode, connect ADB over TCP/WiFi so the node can forward commands.

1. Enable USB Debugging

On your Android phone:
  1. Go to Settings → About Phone
  2. Tap Build Number 7 times to enable Developer Options
  3. Go to Settings → Developer Options
  4. Enable USB Debugging

2. Connect & Install

# Plug phone in via USB, then:
pip install -r requirements.txt
adb devices   # verify phone shows up; authorize on phone if prompted

3. Run Your First Task

export OPENAI_API_KEY="sk-..."
python scripts/run-task.py "Open Settings and turn on Dark Mode"

That's it. The script handles everything: waking the screen, unlocking, keeping the display on, and running your task.

📱 Daily Life

python scripts/run-task.py "Order an Uber to the airport"
python scripts/run-task.py "Set an alarm for 6 AM tomorrow"
python scripts/run-task.py "Check my bank balance on PhonePe"
python scripts/run-task.py "Open Google Maps and navigate to the nearest coffee shop"

💬 Messaging

python scripts/run-task.py "Send a WhatsApp message to Mom saying I'll be late"
python scripts/run-task.py "Read my latest SMS messages"
python scripts/run-task.py "Open Telegram and check unread messages"

🛒 Shopping

python scripts/run-task.py "Open Amazon and search for wireless earbuds under 2000 rupees"
python scripts/run-task.py "Add milk and bread to my Instamart cart"

📅 Productivity

python scripts/run-task.py "Open Google Calendar and check my schedule for tomorrow"
python scripts/run-task.py "Create a new note in Google Keep: Buy groceries"

🎵 Entertainment

python scripts/run-task.py "Play my Discover Weekly playlist on Spotify"
python scripts/run-task.py "Open YouTube and search for lo-fi study music"

βš™οΈ Settings & Setup

python scripts/run-task.py "Turn on Dark Mode" python scripts/run-task.py "Connect to my home WiFi network" python scripts/run-task.py "Enable Do Not Disturb mode" python scripts/run-task.py "Turn off Bluetooth"

📸 Utilities

python scripts/run-task.py "Take a screenshot"
python scripts/run-task.py "Open the camera and take a photo"
python scripts/run-task.py "Clear all notifications"

Environment Variables

Variable           Required  Description
OPENAI_API_KEY     Yes       API key for GPT-4o vision
ANDROID_SERIAL     No        Device serial number. Auto-detected if only one device is connected
ANDROID_PIN        No        Phone PIN/password for auto-unlock. If not set, unlock is skipped
DROIDRUN_TIMEOUT   No        Task timeout in seconds (default: 120)
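As a sketch of how a script might consume these variables (the helper below is an assumption about run-task.py's internals, but the names and defaults match the table):

```python
import os

def load_config():
    """Read the skill's environment variables with the documented defaults."""
    return {
        "api_key": os.environ["OPENAI_API_KEY"],     # required; KeyError if unset
        "serial": os.environ.get("ANDROID_SERIAL"),  # None -> auto-detect device
        "pin": os.environ.get("ANDROID_PIN"),        # None -> skip auto-unlock
        "timeout": int(os.environ.get("DROIDRUN_TIMEOUT", "120")),
    }
```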

Direct Mode (USB)

Install ADB:

  # macOS
  brew install android-platform-tools
  # Ubuntu/Debian
  sudo apt install android-tools-adb
  # Windows: download from https://developer.android.com/tools/releases/platform-tools

Connect phone via USB and verify:

  ./scripts/connect.sh usb

Install the DroidRun Portal APK on the phone:

  1. Download from DroidRun releases, or sideload: adb install droidrun-portal.apk
  2. Open the Portal app on the phone and grant accessibility permissions

Install Python dependencies:

  pip install -r requirements.txt

Node Mode (Remote via WiFi/TCP)

On the node machine (Mac Mini, Pi, etc.), connect the phone via USB and enable WiFi ADB:

  adb tcpip 5555
  adb connect <phone-ip>:5555

From your gateway, connect to the node's ADB:

  # If using an SSH tunnel:
  ssh -L 15555:<phone-ip>:5555 user@node-ip
  export ANDROID_SERIAL="127.0.0.1:15555"

  # Or direct WiFi (same network):
  ./scripts/connect.sh wifi <phone-ip>

Run tasks as normal; the script uses whatever ANDROID_SERIAL points to.

DroidRun Portal Setup

The DroidRun Portal APK must be installed and running on the phone. It provides the accessibility service that allows DroidRun to read screen content and interact with UI elements.

  1. Install the APK (download from DroidRun GitHub releases)
  2. Open the Portal app
  3. Grant Accessibility Service permission when prompted
  4. Keep it running in the background
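You can confirm the Portal is present from the host by querying the package manager over ADB. A sketch; the package id below is an assumption, so check it against the APK you actually installed.

```python
# Hypothetical check that the DroidRun Portal APK is installed.
# PORTAL_PKG is an assumed package id; verify it against your APK.
import subprocess

PORTAL_PKG = "com.droidrun.portal"

def parse_pm_list(pm_output, pkg):
    """True if `pm list packages` output names exactly this package."""
    return f"package:{pkg}" in pm_output.split()

def portal_installed(pkg=PORTAL_PKG):
    out = subprocess.run(
        ["adb", "shell", "pm", "list", "packages", pkg],
        capture_output=True, text=True,
    ).stdout
    return parse_pm_list(out, pkg)
```

Matching on whole tokens avoids a false positive when another package merely shares the prefix.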

scripts/run-task.py β€” The Main Script

# Basic usage
python scripts/run-task.py "Your task description here"

# With options
python scripts/run-task.py --timeout 180 "Install Spotify from Play Store"
python scripts/run-task.py --model gpt-4o "Open Chrome and search for weather"
python scripts/run-task.py --no-unlock "Take a screenshot"
python scripts/run-task.py --serial 127.0.0.1:15555 "Check notifications"
python scripts/run-task.py --verbose "Open Settings"

Options:

Flag         Description
goal         Task description (positional, required)
--timeout    Timeout in seconds (default: 120, or DROIDRUN_TIMEOUT env)
--model      LLM model to use (default: gpt-4o)
--no-unlock  Skip the auto-unlock step
--serial     Device serial (default: ANDROID_SERIAL env or auto-detect)
--verbose    Show detailed debug output
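A minimal argparse setup matching that flag table might look like this (a sketch; the shipped run-task.py may differ in detail):

```python
import argparse
import os

def build_parser():
    """CLI mirroring the documented flags and their defaults."""
    p = argparse.ArgumentParser(description="Run a task on the connected phone")
    p.add_argument("goal", help="Task description (positional, required)")
    p.add_argument("--timeout", type=int,
                   default=int(os.environ.get("DROIDRUN_TIMEOUT", "120")),
                   help="Timeout in seconds")
    p.add_argument("--model", default="gpt-4o", help="LLM model to use")
    p.add_argument("--no-unlock", action="store_true",
                   help="Skip the auto-unlock step")
    p.add_argument("--serial", default=os.environ.get("ANDROID_SERIAL"),
                   help="Device serial (else auto-detect)")
    p.add_argument("--verbose", action="store_true",
                   help="Show detailed debug output")
    return p
```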

scripts/connect.sh β€” Setup & Verify Connection

./scripts/connect.sh                      # Auto-detect USB device
./scripts/connect.sh usb                  # USB mode (explicit)
./scripts/connect.sh wifi 192.168.1.100   # WiFi/TCP mode

scripts/screenshot.sh β€” Screenshot (ADB screencap, reliable)

DroidRun's internal screenshot sometimes fails on certain devices. Use this to bypass DroidRun and capture a PNG directly via ADB.

  # Save to /tmp/android-screenshot.png
  ./scripts/screenshot.sh

  # Explicit serial + output path
  ./scripts/screenshot.sh 127.0.0.1:15555 /tmp/a03.png

You can also do it from Python:

  python scripts/run-task.py --screenshot --serial 127.0.0.1:15555 --screenshot-path /tmp/a03.png
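The same capture can be done directly from Python with subprocess; `adb exec-out screencap -p` streams the PNG to stdout. A sketch, not the shipped script:

```python
import subprocess

def screencap_cmd(serial=None):
    """Build the ADB command that streams a PNG screenshot to stdout."""
    cmd = ["adb"]
    if serial:
        cmd += ["-s", serial]       # target a specific device
    return cmd + ["exec-out", "screencap", "-p"]

def take_screenshot(out_path="/tmp/android-screenshot.png", serial=None):
    """Capture the screen and write the PNG bytes to out_path."""
    png = subprocess.run(screencap_cmd(serial),
                         capture_output=True, check=True).stdout
    with open(out_path, "wb") as f:
        f.write(png)
    return out_path
```

`exec-out` (rather than `shell`) avoids the newline mangling that corrupts binary output on some devices.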

scripts/status.sh β€” Device Status

./scripts/status.sh

# Output:
# 📱 Device: Samsung Galaxy A03 (SM-A035F)
# 🤖 Android: 11 (API 30)
# 🔋 Battery: 87%
# 📺 Screen: ON (unlocked)
# 🔌 Connection: USB
# 📦 DroidRun Portal: installed (v0.5.5)
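Fields like the battery level come from standard ADB queries. For example, a sketch of parsing `adb shell dumpsys battery` output (the script's actual implementation may differ):

```python
import re

def battery_level(dumpsys_output):
    """Extract the percentage from `adb shell dumpsys battery` output."""
    m = re.search(r"^\s*level:\s*(\d+)", dumpsys_output, re.MULTILINE)
    return int(m.group(1)) if m else None
```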

"no devices/emulators found"

  • Check USB cable (use a data cable, not charge-only)
  • Authorize the computer on your phone's USB debugging prompt
  • Try adb kill-server && adb start-server

"device unauthorized"

  • Disconnect and reconnect USB
  • Check the phone screen for an authorization dialog
  • If no dialog appears, revoke USB debugging authorizations in Developer Options and reconnect

Phone screen turns off during task

  • The script sets keep-awake mode automatically, but some phones override this
  • Manually: Settings → Developer Options → Stay Awake (while charging)

Task fails with dialog/popup blocking

  • The script tries to dismiss common dialogs automatically
  • For persistent popups, dismiss them manually first, then retry
  • Use --verbose to see what the agent is seeing

WiFi ADB disconnects after reboot

  • WiFi ADB mode resets on phone reboot; you need to re-enable it via USB
  • Run ./scripts/connect.sh usb first, then ./scripts/connect.sh wifi <ip>

DroidRun agent seems confused

  • Make sure DroidRun Portal is running and the accessibility service is enabled
  • Close unnecessary apps to reduce screen complexity
  • Try a simpler task first to verify the setup works

PIN unlock fails

  • PIN pad button coordinates vary by device and screen resolution
  • To find your device's coordinates, run adb shell getevent -l and tap each digit
  • Or use adb shell input text <PIN> as a fallback on some devices
  • Set the ANDROID_PIN environment variable (never hardcode it)
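The `input text` fallback amounts to a fixed keyevent sequence. A hedged sketch: keycodes 82 and 66 are the standard KEYCODE_MENU and KEYCODE_ENTER values, but whether MENU brings up the PIN pad varies by device and lockscreen.

```python
import os

def unlock_command_sequence(pin):
    """ADB commands for the `input text` unlock fallback described above."""
    return [
        ["adb", "shell", "input", "keyevent", "KEYCODE_WAKEUP"],  # wake screen
        ["adb", "shell", "input", "keyevent", "82"],  # KEYCODE_MENU: show PIN pad
        ["adb", "shell", "input", "text", pin],       # type the PIN
        ["adb", "shell", "input", "keyevent", "66"],  # KEYCODE_ENTER: confirm
    ]

# Read the PIN from the environment, never hardcode it:
# cmds = unlock_command_sequence(os.environ["ANDROID_PIN"])
```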

Security

  • ADB grants full device access; only connect devices you trust and own
  • Screenshots are sent to your LLM provider (OpenAI by default); be mindful of sensitive content on screen (banking apps, private messages)
  • The PIN is read from an environment variable only, never stored in files or logs
  • WiFi ADB is unencrypted; use USB or an SSH tunnel on untrusted networks
  • DroidRun Portal requires accessibility permissions; this is powerful access, so understand what it enables

Requirements

  • Python 3.10+
  • ADB (Android Debug Bridge)
  • Android 8.0+ phone with Developer Options and USB Debugging enabled
  • DroidRun Portal APK installed on the phone
  • OpenAI API key (GPT-4o for vision capabilities)
  • USB data cable (not charge-only)

⚠️ Security Notes

  • Use a dedicated test device, not your primary phone.
  • Screenshots & screen text go to OpenAI. Every screenshot the agent takes is sent to GPT-4o for vision processing. Don't run this on devices with sensitive data visible (banking apps, 2FA tokens, private messages, medical info). If it's on screen, it's sent to the cloud.
  • ANDROID_PIN is stored as an environment variable. While it's never written to files or logs, anyone with access to the host's environment can read it. Use a disposable device PIN you don't use elsewhere, or accept the risk.
  • Only install DroidRun Portal from official sources. Download the APK exclusively from DroidRun GitHub releases. Never sideload APKs from third-party sites.
  • ADB grants full device access. Combined with accessibility permissions, this is effectively root-level control. Only connect devices you own and are comfortable exposing.
  • WiFi ADB is unencrypted. If using TCP/WiFi mode on an untrusted network, wrap it in an SSH tunnel.

Bottom line: treat the connected phone as a "work device for AI." Don't leave personal accounts logged in. Don't store secrets on it. If you wouldn't hand your unlocked phone to a stranger, don't point this skill at it.

License

MIT β€” see LICENSE

Category context

Agent frameworks, memory systems, reasoning layers, and model-native orchestration.

Source: Tencent SkillHub

Largest current source with strong distribution and engagement signals.

Package contents

Included in package
3 Scripts · 2 Docs · 1 File
  • SKILL.md Primary doc
  • examples/tasks.md Docs
  • scripts/connect.sh Scripts
  • scripts/run-task.py Scripts
  • scripts/screenshot.sh Scripts
  • requirements.txt Files