Conversation

@daniel-lxs daniel-lxs commented Dec 22, 2025

Summary

This PR implements raw API inference request/response logging behind ROO_CODE_API_LOGGING=true and emits logs to the VS Code Debug Console via console.log.

What changed (vs the PR’s original direction)

The original approach was trending toward updating each provider’s business logic to manually log provider-specific request/response shapes.

This PR instead lands on a middleware-style transport interception approach where possible:

  • Adds an HTTP wrapper (createLoggingFetch) that logs actual request bodies and response bodies at the SDK fetch boundary.
  • Injects that wrapped fetch into SDK clients at construction time (OpenAI-compatible + Anthropic SDK), avoiding per-provider stream reassembly.
  • For streaming text/event-stream responses, the wrapper tees the stream so the SDK can consume it normally while we buffer/log the other branch.
  • Streaming logs emit an assembled final object and retain a bounded raw-stream debug field (e.g. __sse.__rawSse).

For providers that don’t expose fetch injection or don’t use SSE/HTTP in an interceptable way, we fall back to provider-level logging at the API boundary.
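The wrapper described above can be sketched as follows. `createLoggingFetch` is the PR's name, but the body here is an illustrative reconstruction under stated assumptions — the tag format, the 2000-character truncation, and the exact tee handling are guesses, not the actual implementation:

```typescript
type FetchLike = typeof fetch

// Sketch: wrap a fetch implementation so request/response bodies are logged
// at the SDK transport boundary, without touching provider business logic.
export function createLoggingFetch(baseFetch: FetchLike, log: (line: string) => void): FetchLike {
	return async (input, init) => {
		const started = Date.now()
		const body = typeof init?.body === "string" ? init.body : ""
		log(`[API][request] ${String(input)} ${body}`)

		const response = await baseFetch(input, init)
		const elapsedMs = Date.now() - started // time to response headers

		const contentType = response.headers.get("content-type") ?? ""
		if (contentType.includes("text/event-stream") && response.body) {
			// Tee the stream: the SDK consumes one branch normally while the
			// other branch is buffered and logged in the background.
			const [forSdk, forLog] = response.body.tee()
			void (async () => {
				const reader = forLog.getReader()
				const decoder = new TextDecoder()
				let raw = ""
				for (;;) {
					const { done, value } = await reader.read()
					if (done) break
					raw += decoder.decode(value, { stream: true })
				}
				log(`[API][response][${elapsedMs}ms][streaming] ${raw.slice(0, 2000)}`)
			})()
			// Hand the SDK a fresh Response backed by its branch of the tee.
			return new Response(forSdk, {
				status: response.status,
				statusText: response.statusText,
				headers: new Headers(response.headers),
			})
		}

		// Non-streaming: clone so logging doesn't consume the body the SDK reads.
		const clone = response.clone()
		log(`[API][response][${elapsedMs}ms] ${(await clone.text()).slice(0, 2000)}`)
		return response
	}
}
```

Cloning (or teeing) is what lets the logger observe the body without interfering with the SDK's own consumption of it.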

Log format

  • Request: [API][request][Provider][model] + sanitized payload
  • Response: [API][response][Provider][model][Xms] (+ [streaming]) + sanitized payload
  • Error: [API][error][Provider][model][Xms] + sanitized payload

Payloads are sanitized (redaction, truncation, base64 image replacement) before logging.
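A minimal sketch of that sanitization step, assuming a recursive walk over the payload. The helper name `sanitizePayload`, the limits, and the key/data-URL patterns are hypothetical — the PR's actual rules may differ:

```typescript
const MAX_STRING = 200 // assumed truncation limit
const MAX_ARRAY = 20 // assumed array cap
const SECRET_KEY = /(api[-_]?key|authorization|token|secret)/i
const BASE64_IMAGE = /^data:image\/[a-z+]+;base64,/i

// Recursively redact secret-looking fields, replace inline base64 images,
// truncate long strings, and cap array lengths before logging.
export function sanitizePayload(value: unknown, key = ""): unknown {
	if (typeof value === "string") {
		if (SECRET_KEY.test(key)) return "[REDACTED]"
		if (BASE64_IMAGE.test(value)) return "[base64 image omitted]"
		return value.length > MAX_STRING ? value.slice(0, MAX_STRING) + "…[truncated]" : value
	}
	if (Array.isArray(value)) {
		return value.slice(0, MAX_ARRAY).map((v) => sanitizePayload(v))
	}
	if (value && typeof value === "object") {
		return Object.fromEntries(
			Object.entries(value as Record<string, unknown>).map(([k, v]) => [k, sanitizePayload(v, k)]),
		)
	}
	return value
}
```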

Provider coverage

Covered via HTTP middleware (createLoggingFetch)

  • OpenAI-compatible providers (OpenAI / OpenAI Native / OpenRouter / RouterProvider wrappers / LM Studio / xAI / Requesty / Qwen Code / and other BaseOpenAiCompatibleProvider derivatives)
  • Anthropic SDK providers (Anthropic / MiniMax)
  • Direct fetch provider: Cerebras
  • Claude Code: fetch wrapped inside the integration streaming client

Covered via provider-level logging (special transports)

  • Bedrock (AWS SDK / Smithy event streams)
  • Native Ollama (ollama npm client)
  • Anthropic Vertex (@anthropic-ai/vertex-sdk)
  • Gemini / Vertex (Google GenAI SDK)
  • VS Code LM (VS Code API)
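The construction-time injection used for the middleware-covered providers above can be sketched with a stand-in client. The openai and @anthropic-ai/sdk clients do accept a custom `fetch` option in their constructors; `buildClient` and `StubSdkClient` are hypothetical names for illustration:

```typescript
type FetchLike = typeof fetch

// Stand-in for an SDK client whose constructor accepts a custom `fetch`,
// the way the openai and @anthropic-ai/sdk clients do.
interface ClientOptions {
	baseURL: string
	apiKey: string
	fetch?: FetchLike
}

class StubSdkClient {
	constructor(readonly options: ClientOptions) {}
}

// Hypothetical factory: inject the logging fetch only when the env flag is
// set, so disabled runs pay no overhead and SDK behavior is unchanged.
export function buildClient(options: ClientOptions, loggingFetch: FetchLike): StubSdkClient {
	const enabled = process.env.ROO_CODE_API_LOGGING === "true"
	return new StubSdkClient({ ...options, fetch: enabled ? loggingFetch : options.fetch })
}
```

Gating at construction time keeps the decision in one place instead of checking the flag on every request.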

Tests

Adds/updates unit tests to validate:

  • createLoggingFetch (SSE + non-SSE logging behavior)
  • provider-level logging paths (Bedrock / Native Ollama / Anthropic Vertex / Gemini/Vertex / VS Code LM)

Important

Adds API inference logging for debugging, using middleware-style transport interception to log requests and responses across various providers, with structured logging to the VS Code Debug Console.

  • Behavior:
    • Adds API inference logging behind ROO_CODE_API_LOGGING=true, logging to VS Code Debug Console.
    • Uses middleware-style transport interception for logging requests/responses.
    • Handles streaming responses by teeing the stream and logging assembled objects.
  • Implementation:
    • Introduces createLoggingFetch for HTTP request/response logging.
    • Injects logging fetch into SDK clients (e.g., OpenAI, Anthropic) at construction.
    • Adds ApiInferenceLogger for structured logging of requests, responses, and errors.
  • Providers:
    • Covers OpenAI-compatible, Anthropic, Cerebras, Claude Code, Bedrock, Gemini, and others.
    • Falls back to provider-level logging for non-HTTP transports.
  • Tests:
    • Adds unit tests for createLoggingFetch and ApiInferenceLogger.
    • Tests logging behavior for various providers and scenarios.

This description was created by Ellipsis for e5b9558.

@dosubot dosubot bot added size:XXL This PR changes 1000+ lines, ignoring generated files. Enhancement New feature or request labels Dec 22, 2025
roomote bot commented Dec 22, 2025


All issues have been resolved. The latest commit (e5b9558) adds improved model extraction from payloads, fixes Headers cloning for SSE stream teeing, gates dotenv loading with existence checks, and removes unused code. LGTM!

  • anthropic.ts:382 - Replace confusing if (logHandle!) with standard if (logHandle) check (resolved: code removed in cleanup)

Mention @roomote in a comment to request specific changes to this pull request or fix all unresolved issues.

@hannesrudolph hannesrudolph added the Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. label Dec 22, 2025
@daniel-lxs daniel-lxs moved this from Triage to PR [Needs Review] in Roo Code Roadmap Dec 22, 2025
@hannesrudolph hannesrudolph added PR - Needs Review and removed Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. labels Dec 22, 2025
daniel-lxs and others added 4 commits December 29, 2025 18:21
- Add ApiInferenceLogger singleton class for logging API requests/responses
- Add inferenceLogger property to BaseProvider
- Update all providers to use the logger
- Configure logger in extension.ts with output channel sink
- Add payload sanitization (truncate strings, cap arrays, redact secrets)
- Enable via ROO_CODE_API_LOGGING=true environment variable

Closes ROO-271
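Based on the commit description above, a hedged sketch of the ApiInferenceLogger singleton with a pluggable sink. The real class writes to a VS Code output channel; the method names and exact formatting here are assumptions matching the log format described in the PR body:

```typescript
type Sink = (line: string) => void

// Singleton logger with the [API][kind][Provider][model] tag format from the
// PR description; the sink is pluggable so this sketch runs outside VS Code.
export class ApiInferenceLogger {
	private static instance?: ApiInferenceLogger
	private sink: Sink = () => {}

	private constructor() {}

	static get(): ApiInferenceLogger {
		return (this.instance ??= new ApiInferenceLogger())
	}

	configure(sink: Sink): void {
		this.sink = sink
	}

	private enabled(): boolean {
		return process.env.ROO_CODE_API_LOGGING === "true"
	}

	logRequest(provider: string, model: string, payload: unknown): void {
		if (!this.enabled()) return
		this.sink(`[API][request][${provider}][${model}] ${JSON.stringify(payload)}`)
	}

	logResponse(provider: string, model: string, ms: number, payload: unknown, streaming = false): void {
		if (!this.enabled()) return
		const tag = streaming ? "[streaming] " : ""
		this.sink(`[API][response][${provider}][${model}][${ms}ms] ${tag}${JSON.stringify(payload)}`)
	}

	logError(provider: string, model: string, ms: number, payload: unknown): void {
		if (!this.enabled()) return
		this.sink(`[API][error][${provider}][${model}][${ms}ms] ${JSON.stringify(payload)}`)
	}
}
```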
@github-project-automation github-project-automation bot moved this from PR [Needs Review] to Done in Roo Code Roadmap Dec 30, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Dec 30, 2025