feat(instrumentation): Add Forge LLM provider support #3689
Yiiii0 wants to merge 1 commit into traceloop:main
Conversation
## Changes
- Added Forge vendor detection in `_get_vendor_from_url()` and model name extraction; added `FORGE` to the `GenAISystem` enum
- Environment variable: `FORGE_API_KEY`
- Base URL: https://api.forge.tensorblock.co/v1
- Model format: `Provider/model-name` (e.g., `OpenAI/gpt-4o`)
- Non-breaking: purely additive; existing providers are untouched

## About Forge
Forge (https://github.com/TensorBlock/forge) is an open-source middleware that routes inference across 40+ upstream providers (including OpenAI, Anthropic, Gemini, DeepSeek, and OpenRouter). It is OpenAI API compatible and works with the standard OpenAI SDK by changing `base_url` and `api_key`.

## Motivation
We have seen growing interest from users who standardize on Forge for their model management and want to use it natively with OpenLLMetry. This integration bridges that gap.

## Key Benefits
- Self-hosted & privacy-first: Forge is open source and designed to be self-hosted, which is critical for users who require data sovereignty
- Future-proofing: Forge acts as a decoupling layer; instead of maintaining individual adapters for every new provider, Forge users can access them immediately
- Compatibility: supports established aggregators (like OpenRouter) as well as direct provider connections (BYOK)

## References
- Repo: https://github.com/TensorBlock/forge
- Docs: https://www.tensorblock.co/api-docs/overview
- Main page: https://www.tensorblock.co/
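The URL-based detection described above can be sketched as follows. This is a hypothetical, simplified version for illustration only; the function and table names are not the actual OpenLLMetry internals, and only the Forge host fragment and returned name come from this PR.

```python
from urllib.parse import urlparse

# Illustrative sketch of URL-based vendor detection, not the actual
# OpenLLMetry code. The Forge host fragment and vendor name match the
# PR description; the other entry is an example.
_VENDOR_BY_HOST_FRAGMENT = {
    "openrouter.ai": "OpenRouter",
    "forge.tensorblock.co": "Forge",  # added by this PR
}

def vendor_from_url(base_url: str) -> str:
    """Map a client base URL to a GenAI system name (default: openai)."""
    host = urlparse(base_url).hostname or ""
    for fragment, vendor in _VENDOR_BY_HOST_FRAGMENT.items():
        if fragment in host:
            return vendor
    return "openai"

print(vendor_from_url("https://api.forge.tensorblock.co/v1"))  # Forge
```

With this shape, adding a new routing provider is a one-line table entry rather than a new branch in the detection logic.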
Important
Looks good to me! 👍
Reviewed everything up to 9fe0517 in 13 seconds.
- Reviewed 34 lines of code in 2 files
- Skipped 0 files when reviewing
- Skipped posting 0 draft comments
📝 Walkthrough
Support for Forge as a vendor is introduced across two packages. The OpenAI instrumentation now recognizes Forge URLs and treats Forge identically to OpenRouter when extracting model names from provider-formatted identifiers. A new Forge enum member is added to the semantic conventions AI system definitions.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~8 minutes
🚥 Pre-merge checks: ✅ 3 passed | ❌ 1 failed (1 warning)
🧹 Nitpick comments (1)
packages/opentelemetry-instrumentation-openai/opentelemetry/instrumentation/openai/shared/__init__.py (1)
118-119: Prefer an `in` membership test for readability and extensibility. As more routing providers are added, chained `or` comparisons become harder to maintain. A tuple membership test is more idiomatic Python.

♻️ Suggested change
- elif vendor == "OpenRouter" or vendor == "Forge":
+ elif vendor in ("OpenRouter", "Forge"):
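For context, the model-name handling this suggestion touches can be sketched like this. This is a hypothetical helper, assuming the `Provider/model-name` format described in the PR; the real logic lives in `_set_request_attributes()`, and the function name here is illustrative.

```python
def strip_provider_prefix(vendor: str, model: str) -> str:
    """Illustrative sketch, not the actual OpenLLMetry helper: for
    routing vendors that prefix model names with a provider
    (e.g. "OpenAI/gpt-4o"), keep only the model part for the span."""
    if vendor in ("OpenRouter", "Forge"):
        return model.split("/", 1)[-1]
    return model

print(strip_provider_prefix("Forge", "OpenAI/gpt-4o"))  # gpt-4o
print(strip_provider_prefix("openai", "gpt-4o"))        # gpt-4o
```

The membership test keeps this branch a single condition as further routing vendors are added.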
## What
Add Forge (TensorBlock) as a recognized LLM provider, following the same pattern as the existing OpenRouter integration.
Forge is an OpenAI-compatible LLM router that uses the `Provider/model-name` format (e.g., `OpenAI/gpt-4o-mini`), identical to OpenRouter.

## Changes
- `opentelemetry-semantic-conventions-ai`: Added `FORGE = "Forge"` to the `GenAISystem` enum
- `opentelemetry-instrumentation-openai`: Added `"forge.tensorblock.co"` detection in `_get_vendor_from_url()`, which returns `"Forge"`; `Provider/model-name` is properly stripped to just `model-name` in spans

## Notes
The new attributes (`gen_ai.system: "Forge"` and the stripped model name) follow the identical pattern as OpenRouter, which is already established.
Important
Add Forge LLM provider support by updating vendor detection and model name extraction logic.
- Added `FORGE` to the `GenAISystem` enum in `semconv_ai/__init__.py`.
- Updated `_get_vendor_from_url()` in `shared/__init__.py` to detect `forge.tensorblock.co` and return `"Forge"`.
- Updated `_set_request_attributes()` in `shared/__init__.py` to handle Forge model name extraction using the `Provider/model-name` format.

This description was created automatically for 9fe0517. It will automatically update as commits are pushed.
Summary by CodeRabbit
New Features