VAGUS is a runtime that connects AI agents to the physical world through the devices you already carry. Senses in. Inference up. Action out. Governed by you.
Where AI meets the physical world.
VAGUS is a local runtime that gives AI agents access to the senses, inference channels, and output tools of the phone in your pocket. It acts as a nervous system layer between AI and the real world — senses flow up, actions flow down, and an inference layer adds meaning in between.
AI can reason, write, and code.
It cannot feel the sun on your skin, know you're walking, or tap your wrist to get your attention.
VAGUS gives AI a body.
Every phone is already a dense array of sensors and actuators — accelerometers, GPS, barometers, haptic motors, speakers. VAGUS opens them to AI agents through a standard protocol, with an inference layer that transforms raw signals into understanding. The result isn't a chatbot that knows facts about the world. It's an agent that perceives the world — and can act back into it.
See it in action
The agent didn't ask you. It felt your motion, inferred you were outdoors, and reached back through touch. Three layers working — sense, inference, action.
The Shift
Current AI integrations pass data. VAGUS creates continuous coupling between an agent and the physical world — a fundamentally different relationship.
Senses · Inference · I/O
Three layers: raw sensor data flows up, on-device inference adds meaning, and output tools let the agent act back into the physical world. Every capability individually governed.
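Since VAGUS exposes these layers over MCP, the three-layer loop can be sketched as the JSON-RPC 2.0 messages an agent would send: read a raw sense, read an inference channel, call an output tool. The resource URIs and tool name come from the capability list below; the `duration_ms` argument name is an assumption for illustration, not the tool's confirmed schema.

```python
import json

def mcp_request(req_id, method, params):
    """Build a JSON-RPC 2.0 request, the wire format MCP uses."""
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}

# Layer 1, sense: read a raw sensor resource.
sense = mcp_request(1, "resources/read", {"uri": "vagus://sensors/motion"})

# Layer 2, inference: read a derived channel that carries meaning.
infer = mcp_request(2, "resources/read", {"uri": "vagus://sensors/activity"})

# Layer 3, action: call an output tool. The "duration_ms" argument name
# is illustrative; check the tool's declared input schema.
act = mcp_request(3, "tools/call",
                  {"name": "haptic/pulse", "arguments": {"duration_ms": 200}})

for msg in (sense, infer, act):
    print(json.dumps(msg))
```

Each request here would be governed individually, so any one of the three can be approved, windowed, or rate-limited on the device.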
vagus://sensors/motion · vagus://sensors/location · RESOURCE
Raw accelerometer, gyroscope, and GPS. Proprioceptive awareness and spatial grounding — the foundation the inference layer builds on.

vagus://device/battery · RESOURCE
Charge level and charging state. The agent knows when the device — its own substrate — is running low.

vagus://device/connectivity · RESOURCE
WiFi or cellular, validated, metered, roaming, carrier. Network awareness for bandwidth-sensitive decisions.

vagus://device/screen · RESOURCE
Screen on/off, locked/unlocked, last event type. Context for choosing the right output modality.

vagus://device/notifications · RESOURCE
Stream of incoming app notifications — package, title, body. The agent sees what's competing for the user's attention.

vagus://device/clipboard · RESOURCE
Current clipboard content. Passive awareness of what the user has copied.

vagus://sensors/activity · INFERENCE
Classified activity state with confidence — still, walking, running, cycling, in vehicle. Inferred from raw IMU via Google Play Services.

vagus://sensors/environment · INFERENCE
Indoor, outdoor, or vehicle — fused from ambient light, pressure, proximity, activity, connectivity, and time of day. Debounced with hysteresis.

vagus://inference/attention · INFERENCE
Is the user available, busy, or away? Inferred from screen state, activity, charging, and time of day. Confidence-scored.

vagus://inference/indoor_confidence · INFERENCE
How likely is the user indoors? Fused from ambient light, proximity, WiFi presence, activity state, and time. Probabilistic output.

vagus://inference/sleep_likelihood · INFERENCE
Low, medium, or high sleep probability — from time window, screen state, activity, ambient light, and charging. Don't wake the user.

vagus://inference/notification_timing · INFERENCE
Should the agent reach out right now? Good, neutral, or poor — inferred from attention, sleep, activity, network, and time. The agent learns when to be quiet.

haptic/pulse · TOOL
Single vibration burst, configurable duration. A tactile signal from agent to human.

haptic/pattern · TOOL
Custom vibration sequences — arbitrary timing arrays. Build a tactile vocabulary.

speak · TOOL
Text-to-speech through the phone speaker. Language, rate, pitch configurable. Can interrupt current speech.

notify · TOOL
Push a notification to the phone. Title and body. The standard way to reach the user without interrupting.

clipboard/set · TOOL
Place text on the clipboard — links, codes, content the user can paste anywhere.

sms/send · TOOL
Send a text message to any phone number. The agent can reach people in the real world on the user's behalf.

intent/open_url · TOOL
Launch any link in the phone's browser. Surface articles, maps, tickets — anything with a URL.

calendar/create_event · TOOL
Schedule events directly on the device calendar — title, time, location, description. The agent manages your time.

agent/set_name · TOOL
Set the agent's display name on the device. The phone knows who it's talking to.

How It Works
Download the VAGUS APK from GitHub. It runs a foreground service with an embedded MCP server. Toggle which senses, inference channels, and outputs your agent can access.
Tap Pair in the app. Tell your AI agent the 6-character code. The agent connects through the relay — no port forwarding, no network config.
The agent discovers available senses, inference, and I/O through MCP capability negotiation. It perceives your motion and environment, understands context through on-device inference, and reaches back through haptics, voice, and notifications.
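The discovery step above follows standard MCP capability negotiation: an `initialize` handshake, then `resources/list` and `tools/list` to enumerate senses, inference channels, and outputs. A minimal sketch of those three messages, where the `protocolVersion` and `clientInfo` values are illustrative placeholders:

```python
def rpc(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request; params are optional for list calls."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# 1. Capability negotiation: announce the client and learn what the
#    server supports. Version string shown is a placeholder.
handshake = rpc(1, "initialize", {
    "protocolVersion": "2024-11-05",
    "clientInfo": {"name": "my-agent", "version": "0.1"},
    "capabilities": {},
})

# 2. Discovery: enumerate senses/inference (MCP resources) and
#    outputs (MCP tools).
list_resources = rpc(2, "resources/list")
list_tools = rpc(3, "tools/list")
```

The server's `resources/list` response is where the `vagus://` URIs appear, so the agent never needs them hard-coded.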
Architecture
VAGUS speaks standard MCP over WebSocket. No proprietary APIs, no vendor lock-in. Any MCP-compatible agent connects to the same embodiment layer.
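Because the protocol is plain JSON-RPC over WebSocket, the only client-side bookkeeping a connector really needs is correlating requests with responses by `id`. A transport-agnostic sketch (pair it with any WebSocket library; the class name is my own, not a VAGUS API):

```python
import json

class JsonRpcChannel:
    """Tracks outstanding JSON-RPC requests by id so responses arriving
    over a WebSocket can be matched back to the call that caused them."""

    def __init__(self):
        self._next_id = 0
        self._pending = {}  # id -> method name, awaiting a response

    def request(self, method, params=None):
        """Return the JSON frame to send for a new request."""
        self._next_id += 1
        msg = {"jsonrpc": "2.0", "id": self._next_id, "method": method}
        if params is not None:
            msg["params"] = params
        self._pending[self._next_id] = method
        return json.dumps(msg)

    def receive(self, frame):
        """Match an incoming response frame to its originating method."""
        msg = json.loads(frame)
        method = self._pending.pop(msg["id"], None)
        return method, msg.get("result"), msg.get("error")
```

Any MCP-compatible agent framework already does this internally; the point is that nothing proprietary sits between the agent and the device.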
Named for the vagus nerve — the body's longest cranial nerve, carrying sensory signals from gut, heart, and lungs to the brain and motor signals back. VAGUS does the same for AI: sensory data flows up from the phone, inference adds meaning, actions flow back down.
Governance
AI agents can already send texts, create events, and push notifications through direct integrations. VAGUS puts a physical device in your hand that gates every action — the human body governs the agent body.
Any tool can require approval before it fires. The agent wants to send an SMS? Your phone prompts you first. Human-in-the-loop for every action that matters.
Set when each tool is allowed to run. SMS only during business hours. TTS never after 10pm. The agent doesn't need to know the rules — VAGUS enforces them.
Per-tool rate limits prevent runaway agents. If the agent hallucinates a loop, governance catches it before your phone does something you didn't want.
Every resource read and every tool call logged with timestamps. See exactly what your agent perceived and did. 7-day rolling history.
Persistent notification with quick-suspend. One tap disconnects the agent instantly. The physical device in your hand is always the final authority.
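The governance rules above (approval prompts, time windows, rate limits, a 7-day audit log, quick-suspend) compose naturally into a single policy check per tool call. A minimal sketch under assumed names and structure, not the VAGUS implementation:

```python
import time
from collections import deque

class ToolGovernor:
    """Illustrative policy gate evaluated before every tool call."""

    def __init__(self, clock=time.time):
        self.clock = clock
        self.suspended = False       # quick-suspend kill switch
        self.needs_approval = set()  # tools gated by a human prompt
        self.allowed_hours = {}      # tool -> (start_hour, end_hour)
        self.rate_limits = {}        # tool -> (max_calls, per_seconds)
        self._calls = {}             # tool -> deque of call timestamps
        self.audit_log = deque()     # (timestamp, tool, verdict)

    def check(self, tool, approved=False):
        now = self.clock()
        verdict = self._evaluate(tool, approved, now)
        # Log every decision, then prune to a 7-day rolling history.
        self.audit_log.append((now, tool, verdict))
        week = 7 * 24 * 3600
        while self.audit_log and self.audit_log[0][0] < now - week:
            self.audit_log.popleft()
        if verdict == "allow":
            self._calls.setdefault(tool, deque()).append(now)
        return verdict

    def _evaluate(self, tool, approved, now):
        if self.suspended:
            return "deny:suspended"
        if tool in self.needs_approval and not approved:
            return "deny:approval_required"
        hours = self.allowed_hours.get(tool)
        if hours:
            hour = time.localtime(now).tm_hour
            start, end = hours
            # Windows may wrap past midnight, e.g. (22, 7).
            inside = start <= hour < end if start < end else (hour >= start or hour < end)
            if not inside:
                return "deny:outside_window"
        limit = self.rate_limits.get(tool)
        if limit:
            max_calls, per = limit
            calls = self._calls.setdefault(tool, deque())
            while calls and calls[0] < now - per:
                calls.popleft()
            if len(calls) >= max_calls:
                return "deny:rate_limited"
        return "allow"
```

For example, adding `sms/send` to `needs_approval` makes its first check return `deny:approval_required` until the user confirms on the device, and `rate_limits["haptic/pulse"] = (2, 60)` denies a third pulse inside a minute. The agent never sees these rules; it just sees tool calls succeed or fail.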
Open Ecosystem
VAGUS doesn't compete with AI agents — it gives them a body.
Openclaw, n8n, Claude, GPT, custom frameworks. If it speaks MCP, it connects to the physical world.
Per-tool rate limits, time-of-day windows, approval prompts, and full access logs. You decide what the agent can sense and do.
The relay server is open source. Run it on your own infrastructure for full control. Or use ours at relay.withvagus.com.
Coming next: wearable and smart glasses pairing, external inference pipelines, web sensor API integration, and more. See the roadmap →
The Bigger Picture
VAGUS is the infrastructure layer of Embodied Systems — a research program investigating what changes when AI is continuously coupled to physical substrates. The body isn't just an interface. It may be the curriculum.
Explore the research at embodiedsystems.org →

Your phone is already a dense sensory array. VAGUS opens it — and the intelligence between the signals — to the agents that need it.