Status: Pre-release

robot-md-http v0.1 is scaffolding: it runs locally only, binds to localhost, and has no auth. The full hosted version (v0.2), with RCAN 3.0 envelopes and a GPT Store entry, is tracked in issue #3. This page is informational; the install flow below requires manual steps.

§ 01 — How it works

ChatGPT doesn't speak MCP — it speaks OpenAPI.

robot-md-http launches a local Hono-based HTTP server that mirrors the MCP surface as REST endpoints. ChatGPT Custom GPT Actions import the OpenAPI 3.1 schema directly.
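The mirroring can be pictured as a route-to-payload mapping over the parsed ROBOT.md. A minimal sketch, assuming a parsed frontmatter object; the `Frontmatter` shape and field names below are illustrative guesses drawn from the endpoint list in § 02, not the bridge's actual types, and the real server wires these up as Hono handlers:

```typescript
// Illustrative frontmatter shape; field names are assumptions drawn from
// the endpoint descriptions on this page, not the actual parser's types.
type Frontmatter = {
  rrn?: string;
  robot_name?: string;
  capabilities?: string[];
  safety?: { estop?: boolean; hitl_gates?: string[] };
  [key: string]: unknown;
};

// Map a GET route to the slice of the ROBOT.md document it exposes.
// This pure function only shows the correspondence; the real bridge
// registers one Hono handler per route.
function routePayload(route: string, fm: Frontmatter, body: string): unknown {
  switch (route) {
    case "/frontmatter":
      return fm; // full frontmatter as JSON
    case "/capabilities":
      return fm.capabilities ?? [];
    case "/safety":
      return fm.safety ?? {};
    case "/body":
      return body; // markdown prose, served as text/markdown
    case "/identity":
      return { rrn: fm.rrn, robot_name: fm.robot_name };
    case "/context":
      return { ...fm, body }; // combined context object
    default:
      return undefined; // unknown route
  }
}
```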

Launch the HTTP bridge

$ npx -y robot-md-http /path/to/ROBOT.md

The server starts on http://localhost:8787 and serves its OpenAPI schema at /openapi.json. Then, in your Custom GPT editor: Actions → Add action → import the schema URL.
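What Actions imports is an ordinary OpenAPI 3.1 document enumerating the bridge's routes. A hedged sketch of building one; the paths and methods come from the endpoint list in § 02 below, while the operation ids, version string, and response schemas are illustrative assumptions, so the real /openapi.json will differ in detail:

```typescript
// Sketch of an OpenAPI 3.1 document covering the bridge's routes.
// Only paths and methods are taken from this page; everything else
// (operationIds, info block, response bodies) is assumed.
function buildOpenApiDoc(baseUrl = "http://localhost:8787") {
  const getRoutes = [
    "/frontmatter",
    "/capabilities",
    "/safety",
    "/body",
    "/identity",
    "/context",
  ];
  const paths: Record<string, unknown> = {};
  for (const p of getRoutes) {
    paths[p] = {
      get: {
        operationId: "get_" + p.slice(1),
        responses: { "200": { description: "OK" } },
      },
    };
  }
  for (const p of ["/validate", "/render"]) {
    paths[p] = {
      post: {
        operationId: "post_" + p.slice(1),
        responses: { "200": { description: "OK" } },
      },
    };
  }
  return {
    openapi: "3.1.0",
    info: { title: "robot-md-http", version: "0.1.0" },
    servers: [{ url: baseUrl }],
    paths,
  };
}
```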

§ 02 — HTTP endpoints

Available endpoints (v0.1).

| Method | Path | Description |
| --- | --- | --- |
| GET | `/frontmatter` | Full YAML frontmatter as JSON |
| GET | `/capabilities` | Capabilities array from frontmatter |
| GET | `/safety` | Safety block (`estop`, `hitl_gates`, etc.) |
| GET | `/body` | Markdown prose body as `text/markdown` |
| GET | `/identity` | RRN + `robot_name` + metadata fields |
| GET | `/context` | Combined context object (all fields) |
| POST | `/validate` | Validate the loaded ROBOT.md. Returns `{ ok, summary, errors }` |
| POST | `/render` | Canonical YAML of the frontmatter |

§ 03 — Capabilities

What works via ChatGPT.

| Feature | ChatGPT (OpenAPI) | Notes |
| --- | --- | --- |
| Read capabilities / safety / frontmatter | HTTP bridge | Requires local robot-md-http running |
| Validate ROBOT.md | HTTP bridge | `POST /validate` via Custom GPT Action |
| Compliance filing | Not yet | Planned for robot-md-http v0.2 |
| Hosted mode (no local server) | Pre-release | v0.2 roadmap, tracked in issue #3 |
| GPT Store entry | Pre-release | Deferred to v0.2 |

Track progress: robot-md issue #3 — robot-md-http v0.1 scaffold + v0.2 roadmap.