feat(instrumentation): Add Forge LLM provider support #3689

Open

Yiiii0 wants to merge 1 commit into traceloop:main from Yiiii0:feature/add-forge-provider

Conversation


@Yiiii0 Yiiii0 commented Feb 16, 2026

  • I have added tests that cover my changes.
  • If adding a new instrumentation or changing an existing one, I've added screenshots from some observability platform showing the change.
  • PR name follows conventional commits format: feat(instrumentation): ... or fix(instrumentation): ....
  • (If applicable) I have updated the documentation accordingly.

What

Add Forge (TensorBlock) as a recognized LLM provider, following the same pattern as the existing OpenRouter integration.

Forge is an OpenAI-compatible LLM router that uses the Provider/model-name format (e.g., OpenAI/gpt-4o-mini), identical to OpenRouter.

Changes

  1. opentelemetry-semantic-conventions-ai: Added FORGE = "Forge" to GenAISystem enum
  2. opentelemetry-instrumentation-openai:
    • Added "forge.tensorblock.co" detection in _get_vendor_from_url() → returns "Forge"
    • Added Forge to the model name extraction condition alongside OpenRouter, so Provider/model-name is properly stripped to just model-name in spans
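The two changes above can be sketched as standalone helpers. This is a simplified, hypothetical sketch of the logic described in the bullets, not the actual code: the real logic lives in _get_vendor_from_url() and _set_request_attributes() in the instrumentation's shared/__init__.py, and the non-Forge branches here are illustrative.

```python
def get_vendor_from_url(base_url: str) -> str:
    """Map a known router base URL to a vendor label (simplified sketch)."""
    if "forge.tensorblock.co" in base_url:
        return "Forge"
    if "openrouter.ai" in base_url:  # illustrative existing branch
        return "OpenRouter"
    return "OpenAI"


def strip_provider_prefix(vendor: str, model: str) -> str:
    """For routers using the Provider/model-name format, keep only model-name."""
    if vendor in ("OpenRouter", "Forge") and "/" in model:
        return model.split("/", 1)[1]
    return model
```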

Notes

  • No new tests added — the changes are minimal vendor detection logic (string matching + enum value) that follows the exact same pattern as the existing provider support. The existing test suite (203 tests) passes without modification.
  • No screenshot included — the span attribute changes (gen_ai.system: "Forge", stripped model name) follow the same pattern as the already-established OpenRouter integration.
  • Documentation: N/A — no user-facing doc changes needed for vendor detection.

About Forge

Forge (https://github.com/TensorBlock/forge) is an open-source middleware that routes inference across 40+ upstream providers (including OpenAI, Anthropic, Gemini, DeepSeek, and OpenRouter). It is OpenAI API compatible — works with the standard OpenAI SDK by changing base_url and api_key.
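Since Forge is OpenAI API compatible, pointing the standard OpenAI SDK at it is a configuration change. A minimal configuration sketch, assuming the base URL and FORGE_API_KEY environment variable stated in this PR; the model identifier is an illustrative example:

```python
import os
from openai import OpenAI

# Standard OpenAI SDK client, redirected to a Forge deployment.
client = OpenAI(
    base_url="https://api.forge.tensorblock.co/v1",
    api_key=os.environ["FORGE_API_KEY"],
)

# Forge routes by Provider/model-name; the instrumentation strips the
# prefix so the span records just "gpt-4o-mini".
response = client.chat.completions.create(
    model="OpenAI/gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
```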

Motivation

We have seen growing interest from users who standardize on Forge for their model management and want to use it natively with OpenLLMetry. This integration aims to bridge that gap.

Key Benefits

  • Self-Hosted & Privacy-First: Forge is open-source and designed to be self-hosted, critical for users who require data sovereignty
  • Future-Proofing: acts as a decoupling layer — instead of maintaining individual adapters for every new provider, Forge users can access them immediately
  • Compatibility: supports established aggregators (like OpenRouter) as well as direct provider connections (BYOK)

Important

Add Forge LLM provider support by updating vendor detection and model name extraction logic.

  • Behavior:
    • Add Forge as a recognized LLM provider in GenAISystem enum in semconv_ai/__init__.py.
    • Update _get_vendor_from_url() in shared/__init__.py to detect forge.tensorblock.co and return "Forge".
    • Modify _set_request_attributes() in shared/__init__.py to handle Forge model name extraction using Provider/model-name format.
  • Misc:
    • No new tests added; existing test suite passes without modification.
    • No documentation changes required.

This description was created by Ellipsis for 9fe0517. You can customize this summary. It will automatically update as commits are pushed.

Summary by CodeRabbit

New Features

  • Added support for Forge as a new AI system provider with complete vendor detection and model identification capabilities.

## Changes

- Added Forge vendor detection in _get_vendor_from_url() and model name extraction, added FORGE to GenAISystem enum
- Environment variable: FORGE_API_KEY
- Base URL: https://api.forge.tensorblock.co/v1
- Model format: Provider/model-name (e.g., OpenAI/gpt-4o)
- Non-breaking: purely additive, existing providers are untouched
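The span-level outcome of the changes above can be illustrated with a small hypothetical helper. The attribute keys follow the OTel GenAI semantic conventions; the exact keys emitted by the instrumentation may differ, and forge_span_attributes is not a real function in the codebase.

```python
def forge_span_attributes(requested_model: str) -> dict:
    """Illustrate the span attributes produced for a Forge-routed request."""
    # Strip the Provider/ prefix, mirroring the OpenRouter handling.
    stripped = (
        requested_model.split("/", 1)[1]
        if "/" in requested_model
        else requested_model
    )
    return {
        "gen_ai.system": "Forge",
        "gen_ai.request.model": stripped,
    }
```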

## References

- Repo: https://github.com/TensorBlock/forge
- Docs: https://www.tensorblock.co/api-docs/overview
- Main Page: https://www.tensorblock.co/
@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.


@ellipsis-dev ellipsis-dev bot left a comment


Important

Looks good to me! 👍

Reviewed everything up to 9fe0517 in 13 seconds.
  • Reviewed 34 lines of code in 2 files
  • Skipped 0 files when reviewing.
  • Skipped posting 0 draft comments.

Workflow ID: wflow_qEJM0eRW0FYcMdGY



coderabbitai bot commented Feb 16, 2026

📝 Walkthrough

Support for Forge as a vendor is introduced across two packages. The OpenAI instrumentation now recognizes Forge URLs and treats Forge identically to OpenRouter when extracting model names from provider-formatted identifiers. A new Forge enum member is added to the semantic conventions AI system definitions.

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| Vendor detection and model extraction: packages/opentelemetry-instrumentation-openai/opentelemetry/instrumentation/openai/shared/__init__.py | Added Forge vendor recognition in _get_vendor_from_url for URLs containing "forge.tensorblock.co" and extended _set_request_attributes to treat Forge identically to OpenRouter when deriving model names from provider format. |
| Semantic conventions enum: packages/opentelemetry-semantic-conventions-ai/opentelemetry/semconv_ai/__init__.py | Added new FORGE = "Forge" member to the GenAISystem enum to register Forge as a supported AI system identifier. |
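The enum addition can be sketched minimally. The real GenAISystem enum in opentelemetry-semantic-conventions-ai contains many more members, and the non-Forge members and values shown here are illustrative only:

```python
from enum import Enum


class GenAISystem(Enum):
    """Sketch of the AI-system identifier enum (members abridged)."""

    OPENAI = "openai"          # illustrative existing member
    OPENROUTER = "OpenRouter"  # illustrative existing member
    FORGE = "Forge"            # new member added by this PR
```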

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~8 minutes

Poem

🐰✨ A new vendor joins the fold,
Forge arrives with arms so bold,
OpenRouter's twin in spirit true,
Semantic constants bright and new!
The telemetry rings with joy—
One more system to deploy! 🔧

🚥 Pre-merge checks | ✅ 3 | ❌ 1

❌ Failed checks (1 warning)

| Check name | Status | Explanation | Resolution |
| --- | --- | --- | --- |
| Docstring Coverage | ⚠️ Warning | Docstring coverage is 0.00%, below the required threshold of 80.00%. | Write docstrings for the functions missing them to satisfy the coverage threshold. |

✅ Passed checks (3 passed)

| Check name | Status | Explanation |
| --- | --- | --- |
| Description Check | ✅ Passed | Check skipped - CodeRabbit's high-level summary is enabled. |
| Title Check | ✅ Passed | The title clearly summarizes the main change: adding support for Forge as an LLM provider in the instrumentation. |
| Merge Conflict Detection | ✅ Passed | No merge conflicts detected when merging into main. |


@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (1)
packages/opentelemetry-instrumentation-openai/opentelemetry/instrumentation/openai/shared/__init__.py (1)

118-119: Prefer in membership test for readability and extensibility.

As more routing providers are added, chained or comparisons become harder to maintain. A tuple membership test is more idiomatic Python.

♻️ Suggested change
-    elif vendor == "OpenRouter" or vendor == "Forge":
+    elif vendor in ("OpenRouter", "Forge"):

