One MCP server — robot-md-mcp — runs behind every surface.
Install once per agent, then ask your robot anything in plain language.
Purpose-built for Anthropic surfaces; open to any MCP-aware agent. Each surface page has a full install walkthrough, a capabilities table, and worked examples.
Claude Code
The primary surface. Native MCP over stdio — read capabilities, validate your manifest, and run compliance commands from any Claude session.
pip install robot-md
robot-md init claude
Full guide →
Gemini CLI
Google Gemini CLI via MCP. Add one config line and your robot manifest is readable from every Gemini session.
gemini mcp add robot-md \
  npx -- --yes \
  robot-md-mcp ./ROBOT.md
Full guide →
OpenAI Codex
Codex CLI via MCP stdio. Requires --dangerously-bypass to unlock tool calls in the Codex sandbox.
codex --mcp-server \
  "npx --yes robot-md-mcp \
  ./ROBOT.md" \
  --dangerously-bypass
Full guide →
ChatGPT
OpenAPI bridge via robot-md-http. Expose your manifest as a Custom GPT Action endpoint.
npx -y robot-md-http
# OpenAPI bridge for Custom GPT Actions
Full guide →
Amazon Q
Q CLI MCP support is currently x86-only; aarch64 (Raspberry Pi) support is tracked upstream. Until then, robot-md falls back to the OpenAPI bridge.
# MCP: pending aarch64
# Use robot-md-http as fallback today
Full guide →
Don't see your surface?
New agent surface adapters are tracked as GitHub issues. If you need robot-md on a surface not listed here — an IDE extension, a custom orchestrator, or a new CLI — open a request and we'll add it to the roadmap.
Request a surface →
Every surface above runs the same robot-md-mcp stdio server.
The server reads your ROBOT.md at startup and exposes
capabilities, safety metadata, and compliance fields as MCP resources and tools.
Surface-specific wiring is a one-line config change per agent — the manifest file
and the MCP server stay identical.
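As a concrete sketch of that one-line wiring, here is what the stdio entry might look like in a Claude Code project's `.mcp.json`. The `mcpServers` shape is the standard MCP config format; the command and arguments mirror the `npx --yes robot-md-mcp ./ROBOT.md` invocation from the snippets above, so treat the exact package name and manifest path as assumptions for your setup.

```json
{
  "mcpServers": {
    "robot-md": {
      "command": "npx",
      "args": ["--yes", "robot-md-mcp", "./ROBOT.md"]
    }
  }
}
```

Pointing another surface at the same server means reproducing this command/args pair in that agent's config syntax — the ROBOT.md manifest itself never changes.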