# Other runtimes (coming soon)

Carabase’s agent runtime is pluggable through the `AgentRuntimeProvider` interface in `src/services/agent-runtime/provider.ts`. Three providers exist in the v0.1 codebase, but only one ships:
| Provider | Module | Status in v0.1 |
|---|---|---|
| OpenClaw | openclaw-provider.ts | ✅ Supported — see OpenClaw |
| Claude | claude-provider.ts | Stub present, not advertised |
| Codex | codex-provider.ts | Stub present, not advertised |
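To make the pluggable contract concrete, here is an illustrative sketch of what a provider interface like this often looks like. The names and method signature below are assumptions for illustration, not the actual contents of `provider.ts`:

```typescript
// Hypothetical sketch of an agent-runtime provider contract.
// The real AgentRuntimeProvider interface in provider.ts may differ.
interface AgentRuntimeProvider {
  /** Stable identifier matching the `provider` field on agent tasks. */
  readonly id: string;
  /** Run a one-shot agentic task and resolve with the final output. */
  runTask(input: { instruction: string; originBlockId: string }): Promise<string>;
}

// A minimal in-memory implementation showing the shape a provider takes.
class EchoProvider implements AgentRuntimeProvider {
  readonly id = "echo";
  async runTask(input: { instruction: string; originBlockId: string }): Promise<string> {
    return `handled "${input.instruction}" for block ${input.originBlockId}`;
  }
}
```

Under this shape, adding a runtime means implementing the interface and registering it; callers never branch on which runtime is behind the task.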
## Why only OpenClaw for v0.1?

Two reasons:
- The chat surface is the most user-visible thing in the product. Shipping it with one runtime that we’ve end-to-end tested beats shipping with three runtimes where two have rough edges.
- OpenClaw is the only runtime whose memory + skill model we’ve validated against Carabase’s MCP server contract. The Claude / Codex providers work for one-shot agentic flows (issued via `POST /api/v1/agent-tasks`) but haven’t been hardened for the chat-loop pattern.
## What “stub present” means

The AI SDK provider modules are wired through `src/services/agent-runtime/registry.ts` and are selectable via the `provider` field on each agent task. So this works in v0.1:
```shell
curl -X POST http://localhost:3000/api/v1/agent-tasks \
  -H "x-workspace-id: $WORKSPACE_ID" \
  -d '{ "instruction": "...", "originBlockId": "...", "provider": "claude" }'
```

…as long as `utilityHigh` model routing is configured for an Anthropic key. But this is a developer-only path: no UI surface, no chat history persistence in the `chat-sessions` table, and no streaming over the existing `/api/v1/chat/stream` SSE endpoint.
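The registry lookup behind that request can be pictured roughly as follows. This is a sketch under assumptions, not the actual `registry.ts` code; the handler names and `dispatch` function are hypothetical:

```typescript
// Illustrative sketch: a registry keyed on the task's `provider` field.
// Each entry stands in for a provider module (openclaw/claude/codex).
type TaskHandler = (instruction: string) => Promise<string>;

const registry = new Map<string, TaskHandler>([
  ["openclaw", async (i) => `openclaw ran: ${i}`],
  ["claude", async (i) => `claude ran: ${i}`], // stub present, not advertised
  ["codex", async (i) => `codex ran: ${i}`],   // stub present, not advertised
]);

// Dispatch a task to whichever provider the request named.
async function dispatch(provider: string, instruction: string): Promise<string> {
  const handler = registry.get(provider);
  if (!handler) throw new Error(`unknown provider: ${provider}`);
  return handler(instruction);
}
```

The point of the sketch is that `"claude"` and `"codex"` resolve to real handlers at the API layer; what’s missing in v0.1 is everything above that layer (UI, persistence, streaming).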
## What’s coming in v0.2

- Claude as a fully supported chat runtime, with the same MCP tool surface OpenClaw gets
- Codex as a fully supported chat runtime
- A picker on the Admin SPA’s AI Engine page that lets you choose the runtime per workspace
- A “test connection” round-trip for each runtime, similar to the existing OpenClaw one
If you want to track this work or have a strong opinion on which runtime to prioritize, comment on the relevant issue in the GitHub repo.