AI-native Android automation via the Model Context Protocol
Give any AI agent (Claude, GPT, Gemini, or your own) full control over an Android device.
Observe the UI. Tap. Swipe. Type. Manage apps. Capture screenshots.
All with sub-100ms latency. No root required.
Existing Android automation falls into two camps, and both share the same bottleneck:
- Test frameworks (Appium, Maestro): built for QA, not AI agents. Every action routes through ADB or UIAutomator IPC, costing 200-1000ms per operation.
- MCP wrappers (mobile-mcp, droidrun): AI-aware, but thin ADB wrappers underneath. 200-2000ms per action.
NeuralBridge takes a different approach: an on-device AccessibilityService that executes actions in-process, eliminating IPC overhead entirely.
| | NeuralBridge | Appium | Maestro | mobile-mcp | droidrun | ADB Shell |
|---|---|---|---|---|---|---|
| Tap latency | ~2ms | 300-1000ms | 750ms-2s | 500-2000ms | 200-1000ms | 300-1000ms |
| Text input | ~1.4ms | 500-3000ms | 750ms-2s | 500-2000ms | 200-1000ms | 500-2000ms |
| UI tree read | 18-33ms | 500-2000ms | 750ms-2s | 1-5s | 200-500ms | 1-5s |
| Screenshot | ~60ms | 300-500ms | ~1s | 300-500ms | ~250ms | 300-500ms |
| MCP native | Yes | Add-on | Yes (stdio) | Yes | Via adapter | No |
| MCP tools | 43 | 30+ (add-on) | 14+ | ~19 | ~11 | N/A |
| Token optimization | Yes (73%) | No | No | No | No | No |
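The token-optimization row refers to shrinking what the agent has to read. As an illustration only (the source does not show NeuralBridge's actual wire format), flattening an accessibility tree into one terse line per node instead of nested JSON cuts the payload, and therefore the tokens, dramatically:

```python
import json

def compact_tree(node, depth=0, out=None):
    """Flatten a UI-tree dict into terse 'depth class text bounds' lines.

    Illustrative sketch only -- not NeuralBridge's real encoding.
    """
    if out is None:
        out = []
    out.append(f"{depth} {node['class']} {node.get('text', '')!r} {node['bounds']}")
    for child in node.get("children", []):
        compact_tree(child, depth + 1, out)
    return out

# A tiny made-up UI tree for comparison
tree = {
    "class": "FrameLayout", "text": "", "bounds": [0, 0, 1080, 2400],
    "children": [
        {"class": "Button", "text": "OK", "bounds": [40, 2200, 540, 2320], "children": []},
    ],
}

verbose = json.dumps(tree, indent=2)     # nested JSON an agent might otherwise read
compact = "\n".join(compact_tree(tree))  # one line per node
print(len(compact), "vs", len(verbose))  # compact is a fraction of the size
```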
> [!TIP]
> 100x faster on average: NeuralBridge averages 6.4ms per action vs ~800ms for Appium and ~1.5s for mobile-mcp. See the full benchmarks →
Your AI agent speaks MCP over HTTP directly to the companion app: no middleware, no ADB, no intermediate server.
> [!NOTE]
> 1 hop, not 5. Appium routes through HTTP → Appium Server → ADB → UIAutomator2 → Device. NeuralBridge goes Agent → HTTP → Companion App. The AccessibilityService runs in-process, like the difference between a phone call and a note on your own desk. Deep dive →
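Because the transport is plain JSON-RPC 2.0 over HTTP, any HTTP client can talk to the companion app directly. A minimal sketch of discovering the available tools (the endpoint comes from the config below; the request shape follows the MCP spec, nothing NeuralBridge-specific):

```python
import json
import urllib.request

def mcp_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 request body as used by MCP."""
    body = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        body["params"] = params
    return body

def list_tools(endpoint):
    """POST tools/list to an MCP endpoint and return the parsed response."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(mcp_request("tools/list")).encode(),
        headers={"Content-Type": "application/json", "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (requires the companion app running on your device):
# tools = list_tools("http://<device-ip>:7474/mcp")
# print([t["name"] for t in tools["result"]["tools"]])
```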
Prerequisites: Android SDK (API 24+), Java JDK 17, Android device or emulator (7.0+)
```shell
git clone https://github.com/dondetir/NeuralBridge_mcp.git
cd NeuralBridge_mcp/android
./gradlew assembleDebug

# Install the companion app
adb install -r app/build/outputs/apk/debug/app-debug.apk

# Enable the AccessibilityService
adb shell settings put secure enabled_accessibility_services \
  com.neuralbridge.companion/.service.NeuralBridgeAccessibilityService
adb shell settings put secure accessibility_enabled 1
```

> [!IMPORTANT]
> Android 15+ requires an extra step: Settings → Apps → NeuralBridge → Enable "Allow restricted settings"
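One caveat with the `settings put` command above: `enabled_accessibility_services` is a colon-separated list, and `put` overwrites the whole value, which would disable any accessibility service you already use. If you script the setup, read the current value first (`adb shell settings get secure enabled_accessibility_services`) and merge; a sketch of that merge logic:

```python
SERVICE = "com.neuralbridge.companion/.service.NeuralBridgeAccessibilityService"

def merged_services(current, service=SERVICE):
    """Return the colon-separated settings value with `service` added once.

    `current` is the output of
      adb shell settings get secure enabled_accessibility_services
    which prints 'null' when the setting is unset.
    """
    existing = [s for s in (current or "").split(":") if s and s != "null"]
    if service not in existing:
        existing.append(service)
    return ":".join(existing)

# merged_services("null")           -> just the NeuralBridge service
# merged_services("com.other/.Svc") -> keeps the existing service and appends ours
```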
The app shows its IP on the main screen. Add it to your agent's MCP config:
```shell
# Claude Code
claude mcp add neuralbridge http://<device-ip>:7474/mcp --transport http
```

Manual config (Claude Desktop / any MCP client):

```json
{
  "mcpServers": {
    "neuralbridge": {
      "type": "http",
      "url": "http://<device-ip>:7474/mcp"
    }
  }
}
```

Ask your AI agent: "Take a screenshot of the Android device". If you see a screenshot, you're connected!
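The same endpoint serves tool invocations via `tools/call`. A hedged sketch of that smoke test as code; the tool name `take_screenshot` is a guess, since the source does not list tool names, so confirm the real one via `tools/list` first:

```python
import json
import urllib.request

def tool_call_body(name, arguments=None, req_id=2):
    """JSON-RPC 2.0 body for an MCP tools/call request."""
    return {
        "jsonrpc": "2.0", "id": req_id, "method": "tools/call",
        "params": {"name": name, "arguments": arguments or {}},
    }

def call_tool(endpoint, name, arguments=None):
    """POST a tool call to the companion app and return the parsed response."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(tool_call_body(name, arguments)).encode(),
        headers={"Content-Type": "application/json", "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Hypothetical tool name -- discover the real one via tools/list:
# call_tool("http://<device-ip>:7474/mcp", "take_screenshot")
```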
| Document | What's inside |
|---|---|
| Architecture | Data flow, command paths, selector system, limitations |
| Tools Reference | All 32 MCP tools with descriptions and latencies |
| Performance | Benchmarks, architecture comparison, token optimization |
| Troubleshooting | Connection issues, permissions, screenshots, crashes |
| Development | Building from source, project structure, protobuf, logs |
| Android App | App-specific setup and permissions |
| Contributing | Code style, PR process, guidelines |
| Security | Vulnerability reporting policy |
- Core MVP: 16 tools, TCP protocol, basic gestures
- Advanced gestures, selectors, event streaming, notifications
- Semantic resolver, scroll-to-element, accessibility audit, screenshot diff
- Multi-device, WebView tools, CI/CD integration, visual regression
See CONTRIBUTING.md for guidelines. High-priority areas: additional MCP tools, performance optimizations, cross-platform support (iOS), and example demos.
Apache 2.0. Copyright 2026 dondetir
"Powered by NeuralBridge"