First MCP-compatible embodiment runtime on mobile ALPHA

Give AI
a body

VAGUS is a runtime that connects AI agents to the physical world through the devices you already carry. Senses in. Inference up. Action out. Governed by you.

Where AI meets the physical world.

Download Alpha Read the Docs

What is VAGUS?

VAGUS is a local runtime that gives AI agents access to your phone's senses, on-device inference, and physical outputs.

It acts as a nervous system layer between AI and the real world — senses flow up, actions flow down, and an inference layer adds meaning in between.

AI can reason, write, and code.
It cannot feel the sun on your skin, know you're walking, or tap your wrist to get your attention.

VAGUS gives AI a body.

Every phone is already a dense array of sensors and actuators — accelerometers, GPS, barometers, haptic motors, speakers. VAGUS opens them to AI agents through a standard protocol, with an inference layer that transforms raw signals into understanding. The result isn't a chatbot that knows facts about the world. It's an agent that perceives the world — and can act back into it.

Mind
AI Agent
OpenClaw, n8n, Claude, GPT, custom agent, etc.
↑ sense · ↓ act
Nervous system
VAGUS
Senses · Inference · Action · Governance
↑ signals · ↓ output
Physical world
You + Environment
Motion, light, pressure, location, sound

See it in action

An agent that perceives you

OpenClaw — WhatsApp
you Install the VAGUS skill from github.com/embodiedsystems-org/VAGUS-MCP/openclaw
📱 Installing VAGUS... fetching skill files.
✓ Skill installed. Open the VAGUS app and give me your pairing code.

you K7X2M9
✓ Connected to Galaxy S26 · VAGUS 1.0.0
Senses: motion, activity, location, environment
Inference: attention, sleep, indoor, notification timing
I/O: haptics, tts, notify, clipboard

you What am I doing right now?
You're walking at a moderate pace, phone in your pocket.
Outdoors — bright light, 1013 hPa. Battery at 67%, not charging.
Looks like a nice walk. *gentle haptic tap*

The agent didn't ask you. It felt your motion, inferred you were outdoors, and reached back through touch. Three layers working — sense, inference, action.

The Shift

From information to embodiment

Current AI integrations pass data. VAGUS creates continuous coupling between an agent and the physical world — a fundamentally different relationship.

AI today

Agent asks user for info
Text in, text out
Stateless per-request interactions
Knows facts about the world
User describes their context

With VAGUS

Agent perceives user directly
Senses in, actions out
Continuous embodied coupling
Perceives the world itself
Agent reads context from sensors + inference

Senses · Inference · I/O

Raw perception.
Contextual understanding.
Physical action.

Three layers: raw sensor data flows up, on-device inference adds meaning, and output tools let the agent act back into the physical world. Every capability individually governed.

Sensors — raw perception
🧭

Motion & Position

Raw accelerometer, gyroscope, and GPS. Proprioceptive awareness and spatial grounding — the foundation the inference layer builds on.

vagus://sensors/motion · vagus://sensors/location RESOURCE
🔋

Battery

Charge level and charging state. The agent knows when the device — its own substrate — is running low.

vagus://device/battery RESOURCE
📶

Connectivity

WiFi or cellular, validated, metered, roaming, carrier. Network awareness for bandwidth-sensitive decisions.

vagus://device/connectivity RESOURCE
📱

Screen State

Screen on/off, locked/unlocked, last event type. Context for choosing the right output modality.

vagus://device/screen RESOURCE
🔔

Notifications

Stream of incoming app notifications — package, title, body. The agent sees what's competing for the user's attention.

vagus://device/notifications RESOURCE
📋

Clipboard

Current clipboard content. Passive awareness of what the user has copied.

vagus://device/clipboard RESOURCE
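In MCP terms, each sense above is a resource the agent reads on demand. A minimal sketch of the JSON-RPC request an agent sends, using the standard MCP `resources/read` method and the `vagus://` URIs from the cards above (the request shape follows the MCP spec; the exact response payloads are up to the device):

```python
import json

def read_resource_request(uri: str, request_id: int) -> str:
    """Build a JSON-RPC 2.0 request for MCP's resources/read method."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "resources/read",
        "params": {"uri": uri},
    })

# Poll two of the resources listed above.
for i, uri in enumerate(["vagus://device/battery", "vagus://sensors/motion"]):
    print(read_resource_request(uri, i + 1))
```

The same one-method pattern covers every sensor card on this page: only the URI changes.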
Inference — contextual understanding
🚶

Activity Recognition

Classified activity state with confidence — still, walking, running, cycling, in vehicle. Inferred from raw IMU via Google Play Services.

vagus://sensors/activity INFERENCE
🌡️

Environment Context

Indoor, outdoor, or vehicle — fused from ambient light, pressure, proximity, activity, connectivity, and time of day. Debounced with hysteresis.

vagus://sensors/environment INFERENCE
🧠

Attention Availability

Is the user available, busy, or away? Inferred from screen state, activity, charging, and time of day. Confidence-scored.

vagus://inference/attention INFERENCE
🏠

Indoor Confidence

How likely is the user indoors? Fused from ambient light, proximity, WiFi presence, activity state, and time. Probabilistic output.

vagus://inference/indoor_confidence INFERENCE
😴

Sleep Likelihood

Low, medium, or high sleep probability — from time window, screen state, activity, ambient light, and charging. Don't wake the user.

vagus://inference/sleep_likelihood INFERENCE

⏰

Notification Timing

Should the agent reach out right now? Good, neutral, or poor — inferred from attention, sleep, activity, network, and time. The agent learns when to be quiet.

vagus://inference/notification_timing INFERENCE
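The fusion behind a channel like notification timing can be pictured as a small scoring function over the other inference outputs. This is an illustrative sketch only, with invented rules and thresholds, not the app's actual algorithm:

```python
def notification_timing(attention: str, sleep: str, activity: str) -> str:
    """Illustrative fusion: classify outreach timing as good, neutral, or poor.

    Inputs mirror the inference channels above:
    attention in {available, busy, away}, sleep in {low, medium, high},
    activity in {still, walking, running, cycling, in_vehicle}.
    """
    if sleep == "high":
        return "poor"          # never wake the user
    if attention == "available" and sleep == "low":
        return "good"
    if activity in ("running", "cycling", "in_vehicle"):
        return "poor"          # hands and eyes are busy
    return "neutral"
```

For example, an available user at rest with low sleep likelihood scores "good", while a sleeping or driving user scores "poor" regardless of attention.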
I/O — physical action
📳

Haptic Pulse

Single vibration burst, configurable duration. A tactile signal from agent to human.

haptic/pulse TOOL
📳

Haptic Pattern

Custom vibration sequences — arbitrary timing arrays. Build a tactile vocabulary.

haptic/pattern TOOL
🔊

Speak

Text-to-speech through the phone speaker. Language, rate, pitch configurable. Can interrupt current speech.

speak TOOL
🔔

Notify

Push a notification to the phone. Title and body. The standard way to reach the user without interrupting.

notify TOOL
📋

Clipboard Write

Place text on the clipboard — links, codes, content the user can paste anywhere.

clipboard/set TOOL
💬

Send SMS

Send a text message to any phone number. The agent can reach people in the real world on the user's behalf.

sms/send TOOL
🌐

Open URL

Launch any link in the phone's browser. Surface articles, maps, tickets — anything with a URL.

intent/open_url TOOL
📅

Create Calendar Event

Schedule events directly on the device calendar — title, time, location, description. The agent manages your time.

calendar/create_event TOOL
🪪

Agent Identity

Set the agent's display name on the device. The phone knows who it's talking to.

agent/set_name TOOL
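Every output above is invoked the same way: MCP's standard `tools/call` method with a tool name and an arguments object. A sketch for the haptic pattern tool (the `timings` argument name and millisecond semantics are illustrative assumptions, not VAGUS's actual schema):

```python
import json

def call_tool_request(name: str, arguments: dict, request_id: int) -> str:
    """Build a JSON-RPC 2.0 request for MCP's tools/call method."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Two short buzzes, then a long one. The "timings" key is a placeholder
# for whatever argument schema the haptic/pattern tool actually declares.
print(call_tool_request("haptic/pattern", {"timings": [100, 50, 100, 50, 300]}, 7))
```

Swap the tool name and arguments to reach any other card in this section, from `speak` to `calendar/create_event`.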

How It Works

Three minutes to embodied AI

1

Install the app

Download the VAGUS APK from GitHub. It runs a foreground service with an embedded MCP server. Toggle which senses, inference channels, and outputs your agent can access.

2

Pair with a code

Tap Pair in the app. Tell your AI agent the 6-character code. The agent connects through the relay — no port forwarding, no network config.

3

Your agent has a body

The agent discovers available senses, inference, and I/O through MCP capability negotiation. It perceives your motion and environment, understands context through on-device inference, and reaches back through haptics, voice, and notifications.
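The negotiation in step 3 can be sketched as three standard MCP calls issued in order after the relay connection opens. Params are elided for brevity; a real `initialize` call also carries protocol version and client info per the MCP spec:

```python
import json

def discovery_requests() -> list[str]:
    """The capability-discovery sequence, using standard MCP method names."""
    methods = ["initialize", "resources/list", "tools/list"]
    return [
        json.dumps({"jsonrpc": "2.0", "id": i, "method": m, "params": {}})
        for i, m in enumerate(methods, start=1)
    ]

# After this exchange the agent knows every vagus:// resource and tool
# the user has toggled on in the app.
for request in discovery_requests():
    print(request)
```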

Architecture

MCP native, open protocol

VAGUS speaks standard MCP over WebSocket. No proprietary APIs, no vendor lock-in. Any MCP-compatible agent connects to the same embodiment layer.

Named for the vagus nerve — the body's longest cranial nerve, carrying sensory signals from gut, heart, and lungs to the brain and motor signals back. VAGUS does the same for AI: sensory data flows up from the phone, inference adds meaning, actions flow back down.

AI Agent
(OpenClaw, n8n, Claude, GPT, custom agent, etc.)
MCP-over-WSS
Relay
relay.withvagus.com — stateless forwarder
MCP-over-WSS
VAGUS Core Android App
Sensors
motion (raw IMU), location · ↑ sense
battery, connectivity, screen · ↑ sense
notifications, clipboard · ↑ sense
Inference
activity recognition · ↑ infer
environment context · ↑ infer
attention availability · ↑ infer
indoor confidence · ↑ infer
sleep likelihood · ↑ infer
notification timing · ↑ infer
I/O
haptic/pulse, pattern · ↓ act
speak, notify · ↓ act
sms/send · ↓ act
intent/open_url · ↓ act
calendar/create_event · ↓ act
clipboard/set · ↓ act
agent/set_name · ↓ act
Governance → per-capability toggles

Governance

Your agent can already act.
VAGUS makes it safer.

AI agents can already send texts, create events, and push notifications through direct integrations. VAGUS puts a physical device in your hand that gates every action — the human body governs the agent body.

🛡️

Ask Each Time

Any tool can require approval before it fires. The agent wants to send an SMS? Your phone prompts you first. Human-in-the-loop for every action that matters.

🕑

Time-of-Day Windows

Set when each tool is allowed to run. SMS only during business hours. TTS never after 10pm. The agent doesn't need to know the rules — VAGUS enforces them.

📊

Rate Limiting

Per-tool rate limits prevent runaway agents. If the agent hallucinates a loop, governance catches it before your phone does something you didn't want.

📜

Full Access Logs

Every resource read and every tool call logged with timestamps. See exactly what your agent perceived and did. 7-day rolling history.

🔴

One-Tap Kill Switch

Persistent notification with quick-suspend. One tap disconnects the agent instantly. The physical device in your hand is always the final authority.
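Two of the guards above, time-of-day windows and per-tool rate limits, compose into a single gate in front of each tool. A minimal sketch under an invented policy shape (VAGUS's actual configuration format may differ):

```python
class ToolGate:
    """Illustrative per-tool governance: a time-of-day window plus a
    sliding-window rate limit."""

    def __init__(self, allowed_hours: range, max_calls_per_minute: int):
        self.allowed_hours = allowed_hours
        self.max_calls = max_calls_per_minute
        self.recent: list[float] = []   # timestamps of recently allowed calls

    def allow(self, now: float, hour: int) -> bool:
        """Return True if a tool call at (now, hour) may fire."""
        if hour not in self.allowed_hours:
            return False                # outside the time-of-day window
        self.recent = [t for t in self.recent if now - t < 60]
        if len(self.recent) >= self.max_calls:
            return False                # rate limit: catches runaway loops
        self.recent.append(now)
        return True

# SMS only during business hours, at most 3 sends per minute.
sms_gate = ToolGate(allowed_hours=range(9, 18), max_calls_per_minute=3)
```

The agent never sees this policy; it just receives a refusal when a call lands outside the window or over the limit, which is exactly the "the agent doesn't need to know the rules" property described above.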

Open Ecosystem

Built for the agent ecosystem

VAGUS doesn't compete with AI agents — it gives them a body.

Any Agent

OpenClaw, n8n, Claude, GPT, custom frameworks. If it speaks MCP, it connects to the physical world.

User Governed

Per-tool rate limits, time-of-day windows, approval prompts, and full access logs. You decide what the agent can sense and do.

Self-Hostable

The relay server is open source. Run it on your own infrastructure for full control. Or use ours at relay.withvagus.com.

Coming next: wearable and smart glasses pairing, external inference pipelines, web sensor API integration, and more. See the roadmap →

The Bigger Picture

Infrastructure for
embodied intelligence

VAGUS is the infrastructure layer of Embodied Systems — a research program investigating what changes when AI is continuously coupled to physical substrates. The body isn't just an interface. It may be the curriculum.

Explore the research at embodiedsystems.org →

AI has a mind.
Now give it a body.

Your phone is already a dense sensory array. VAGUS opens it — and the intelligence between the signals — to the agents that need it.

Download Alpha → Star on GitHub