P0.9.1 — ν MCP Bridge — Audit

Step 1 of the 5-step executor chain. Inventory of the surface that P0.9.1 must land into, the constraints that bound it, and the seams already in the codebase.

Task scope (from docs/guides/implementation/task-breakdown.md § P0.9.1)

  • Output: src/domains/integrations/mcp-bridge.ts + src/__tests__/domains/integrations/mcp-bridge.test.ts
  • Effort: M.
  • Depends on: P0.2.1 (SHIPPED Wave A at 40cd679d).

Acceptance criteria (1:1 mapping to Step 2 contract)

  1. McpBridge: wraps outbound MCP client calls to external servers.
  2. connectToServer(url): creates MCP client and returns a connected bridge.
  3. callTool(bridge, name, args): calls remote tool, returns result.
  4. Timeout: 30s default, configurable via COLIBRI_MCP_TIMEOUT.
  5. Retry: 3 attempts with exponential backoff on transient errors.
  6. Test: mock MCP server → verify roundtrip tool call.
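Read together, the six criteria pin down a small library surface. A hedged sketch of what Step 2's contract might export — only `McpBridge`, `connectToServer`, `callTool`, and the 30-second default come from the criteria; the field names and option shapes below are assumptions:

```typescript
// Sketch of the acceptance criteria as a TypeScript surface. Only the
// names McpBridge / connectToServer / callTool and the 30s default come
// from the spec; the exact field shapes are assumptions.

/** Criterion 4: default timeout, overridable via COLIBRI_MCP_TIMEOUT. */
export const DEFAULT_MCP_TIMEOUT_MS = 30_000;

/** Criterion 5: retry budget. */
export const MAX_ATTEMPTS = 3;

/** Connected-bridge handle returned by connectToServer (assumed shape). */
export interface McpBridge {
  url: string;
  close(): Promise<void>;
}

/** Criterion 2: create an MCP client and return a connected bridge. */
export declare function connectToServer(url: string): Promise<McpBridge>;

/** Criterion 3: call a remote tool and return its result. */
export declare function callTool(
  bridge: McpBridge,
  name: string,
  args: Record<string, unknown>,
): Promise<unknown>;
```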

Existing files this task depends on

| Path | Role here | Key lines |
|---|---|---|
| `src/domains/integrations/index.ts` | ν barrel. This task adds a new export line: `export * from './mcp-bridge.js'`. | L1-9 (whole file, short) |
| `src/domains/integrations/notifications.ts` | P0.9.3 precedent. Fire-and-forget pattern, COLIBRI_* namespace, stderr-only logging, library-only (no server coupling). Follow this pattern. | L1-322 |
| `src/config.ts` | Zod env validation. Needs one new optional field: `COLIBRI_MCP_TIMEOUT` (positive integer, default 30000). Must stay in COLIBRI_* namespace. | L43-73 schema block |
| `docs/reference/extractions/nu-integrations-extraction.md` | Donor-era ν spec (R45). "MCP Bridge: Stdio Transport + JSON-RPC + Tool Dispatch" section and "Webhook Delivery" section carry the retry/backoff algorithm reference. Bridge uses HTTP/SSE transport (not stdio) for outbound calls — see transport choice note below. | L14-56 bridge + pipeline; L58-114 retry mechanics reference |
| `package.json` | `@modelcontextprotocol/sdk` ^1.0.4 already present. No new runtime deps needed — SDK exposes Client, InMemoryTransport, SSE/HTTP transports out of the box. | L28 |

MCP SDK surface inventory

The already-installed `@modelcontextprotocol/sdk` package exposes:

| Export | Path | Use here |
|---|---|---|
| `Client` | `@modelcontextprotocol/sdk/client/index.js` | Core outbound MCP client |
| `InMemoryTransport` | `@modelcontextprotocol/sdk/inMemory.js` | Test-only transport — pairs two in-memory endpoints via `createLinkedPair()` |
| `Server` | `@modelcontextprotocol/sdk/server/index.js` | Used in tests only — creates a mock server |
| `CallToolResultSchema` | `@modelcontextprotocol/sdk/types.js` | Second arg to `client.callTool()` |

Key API shape from `dist/esm/client/index.d.ts`:

```typescript
class Client {
  constructor(clientInfo: Implementation, options?: ClientOptions)
  connect(transport: Transport, options?: RequestOptions): Promise<void>
  callTool(
    params: CallToolRequest['params'],
    resultSchema?: typeof CallToolResultSchema,
    options?: RequestOptions
  ): Promise<CallToolResult>
  close(): Promise<void>
}
```

InMemoryTransport.createLinkedPair() returns [clientTransport, serverTransport] — ideal for unit tests without spawning subprocesses or real network.

Transport choice for outbound connections

The spec says connectToServer(url). For Phase 0:

  • url will be a string. Real-world usage would use StreamableHTTPClientTransport or SSEClientTransport.
  • However, for Phase 0 scope, the bridge must be testable in isolation. Injecting the transport as a factory is the right abstraction — avoids spawning network processes in tests.
  • Contract will define TransportFactory as the seam; the default factory creates StreamableHTTPClientTransport when the SDK exports it, falling back to SSEClientTransport on SDK versions that predate streamable HTTP. Outbound connections never use stdio (see the transport note in the table above).
  • Tests override the factory with InMemoryTransport.
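The seam can be expressed without importing the SDK: a structural Transport type is enough for the factory signature. A sketch — the names `TransportFactory`, `ConnectOptions`, and the minimal `Transport` shape here are proposals for the Step 2 contract, not existing code:

```typescript
// Structural stand-in for the SDK's Transport interface -- just enough
// shape for the factory seam; the real bridge would use the SDK type.
interface Transport {
  start(): Promise<void>;
  send(message: unknown): Promise<void>;
  close(): Promise<void>;
}

/** The injection seam: production supplies an HTTP/SSE-backed factory,
 *  tests supply one backed by InMemoryTransport.createLinkedPair(). */
type TransportFactory = (url: URL) => Transport;

interface ConnectOptions {
  /** Defaults to an HTTP/SSE factory in production (assumption). */
  transportFactory?: TransportFactory;
}

// Example: a test factory that records which URL it was asked to dial.
const seen: string[] = [];
const testFactory: TransportFactory = (url) => {
  seen.push(url.href);
  return {
    start: async () => {},
    send: async () => {},
    close: async () => {},
  };
};
testFactory(new URL("http://localhost:9999/mcp"));
```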

Retry / error classification

Reference: nu-integrations-extraction.md L58-114 (webhook backoff) and docs/reference/extractions/*.md. The donor’s webhook retry pattern applies:

| Error type | Retryable |
|---|---|
| ETIMEDOUT / ECONNRESET / ECONNREFUSED | Yes |
| MCP `McpError` with INTERNAL_ERROR code | Yes |
| MCP `McpError` with INVALID_REQUEST or NOT_FOUND | No |
| Network `TypeError` (fetch failed) | Yes |
| Timeout (`AbortController` elapsed) | Yes |

Max 3 attempts (not 5 like donor webhook). Backoff: base 1000ms × 2^(attempt-1) (1s, 2s, 4s). Jitter not required for Phase 0.
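Concretely, the schedule and the classification table reduce to two pure helpers. A sketch — the helper names `backoffDelayMs` and `isTransient` are assumptions, and a real classifier would also inspect `McpError` codes and aborted timeouts:

```typescript
// Backoff per the audit: base 1000ms * 2^(attempt - 1).
const BASE_DELAY_MS = 1000;

/** Delay to sleep after the given 1-based attempt fails. */
function backoffDelayMs(attempt: number): number {
  return BASE_DELAY_MS * 2 ** (attempt - 1);
}

/** Transient socket-level codes from the classification table. */
const RETRYABLE_CODES = new Set(["ETIMEDOUT", "ECONNRESET", "ECONNREFUSED"]);

/** Sketch classifier: network TypeErrors and retryable codes are transient.
 *  (Real code would also check McpError INTERNAL_ERROR and abort timeouts.) */
function isTransient(err: unknown): boolean {
  if (err instanceof TypeError) return true; // fetch network failure
  const code = (err as { code?: unknown })?.code;
  return typeof code === "string" && RETRYABLE_CODES.has(code);
}

const schedule = [1, 2, 3].map(backoffDelayMs); // 1s, 2s, 4s as above
```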

Scope boundary

  • Library-only. No MCP tool registration in src/server.ts. No broker_* tool surface.
  • No migration needed — bridge state is in-memory (active connections are short-lived; no persistence required).
  • P0.9.2 (Claude API wrappers) is owned by a sibling sub-agent. Do NOT touch src/domains/integrations/claude.ts or 008_nu_anthropic.sql.

New files / directories this task creates

```
src/domains/integrations/mcp-bridge.ts              ← implementation
src/__tests__/domains/integrations/mcp-bridge.test.ts ← tests
docs/audits/p0-9-1-nu-mcp-bridge-audit.md           ← this file (Step 1)
docs/contracts/p0-9-1-nu-mcp-bridge-contract.md     ← Step 2
docs/packets/p0-9-1-nu-mcp-bridge-packet.md         ← Step 3
docs/verification/p0-9-1-nu-mcp-bridge-verification.md ← Step 5
```

Files modified (beyond new ones above)

  • src/domains/integrations/index.ts — add export * from './mcp-bridge.js'
  • src/config.ts — add COLIBRI_MCP_TIMEOUT: z.coerce.number().int().positive().default(30000) to the schema
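The schema addition might look like the fragment below; the field name, validators, and default come from this audit, while the surrounding `envSchema` object name is an assumption about how src/config.ts is organised:

```typescript
import { z } from "zod";

// Assumed shape of the existing schema object in src/config.ts; only the
// COLIBRI_MCP_TIMEOUT line is the actual addition this task makes.
const envSchema = z.object({
  // ...existing COLIBRI_* fields...
  COLIBRI_MCP_TIMEOUT: z.coerce.number().int().positive().default(30000),
});

// z.coerce.number() accepts the string values process.env provides, so
// COLIBRI_MCP_TIMEOUT="45000" parses to the number 45000, and an unset
// variable falls back to the 30000ms default.
```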

Donor-bug mitigations explicitly cited

  • mcp-bridge.ts must NEVER write to process.stdout — that pipe belongs to the inbound StdioServerTransport owned by src/server.ts.
  • All log output from the bridge must go to console.error (stderr).
  • Retries must not interfere with the MCP SDK’s own in-flight deduplication — create a fresh Client instance per connectToServer() call.
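The stderr-only rule is easiest to enforce with a single module-local helper so that no call site ever touches stdout. A sketch; the `[mcp-bridge]` prefix is an assumption mirroring the notifications.ts precedent:

```typescript
// All bridge logging funnels through this helper; console.error writes
// to stderr, keeping stdout free for the inbound StdioServerTransport.
const LOG_PREFIX = "[mcp-bridge]";

/** Logs to stderr and returns the emitted line parts (handy in tests). */
function logToStderr(...args: unknown[]): unknown[] {
  const line = [LOG_PREFIX, ...args];
  console.error(...line);
  return line;
}

logToStderr("connected", { url: "http://localhost:9999/mcp" });
```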

Carry-over tails (not in scope)

  • src/db/index.ts:15 / src/db/schema.sql:11 planning comments reference old migration numbers — leave as-is; the P0.9.1 bridge has no migration.
  • P0.9.3 notifications library is not wired to the bridge yet — that coupling is a future task.

