feat: detect OpenAI-compatible providers (MiniMax, Groq, etc.) in telemetry spans#1306

Open
octo-patch wants to merge 1 commit into AgentOps-AI:main from octo-patch:feature/openai-compatible-provider-detection

Conversation

@octo-patch

Summary

When users use the OpenAI SDK with a custom base_url pointing to an OpenAI-compatible provider (e.g., MiniMax, Groq, DeepSeek), the gen_ai.system span attribute was always hardcoded to "OpenAI". This made it impossible to distinguish calls to different providers in observability dashboards.

This PR adds automatic provider detection by inspecting the OpenAI client base_url at instrumentation time:

  • New file: provider_detection.py maps known API base URLs to provider names
  • Modified: stream_wrapper.py chat completions and Responses API wrappers (sync + async) now detect the actual provider and set gen_ai.system accordingly
  • 35 new tests covering all detection paths

Supported providers

| Provider | Base URL pattern |
| --- | --- |
| MiniMax | `api.minimax.io`, `api.minimax.chat` |
| Groq | `api.groq.com` |
| Together AI | `api.together.xyz`, `api.together.ai` |
| Fireworks AI | `api.fireworks.ai` |
| DeepSeek | `api.deepseek.com` |
| Mistral AI | `api.mistral.ai` |
| Perplexity AI | `api.perplexity.ai` |
| Google AI | `generativelanguage.googleapis.com` |
| xAI | `api.x.ai` |
| SambaNova | `api.sambanova.ai` |
| Cerebras | `api.cerebras.ai` |

Example

```python
from openai import OpenAI
import agentops

agentops.init()

# This call will now have gen_ai.system="MiniMax" instead of "OpenAI"
client = OpenAI(
    api_key="your-minimax-key",
    base_url="https://api.minimax.io/v1",
)
response = client.chat.completions.create(
    model="MiniMax-M2.7",
    messages=[{"role": "user", "content": "Hello!"}],
)
```

Design decisions

  • Detection only applies to chat completions and Responses API (wrapped via _custom_wrap), which are the primary LLM endpoints.
  • Unknown base URLs default to "OpenAI", so there is no behavior change for standard OpenAI usage.
  • The provider host map is easily extensible.
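As a rough illustration of the wrapper-side change, the sketch below uses a stubbed span and a simplified detection helper; only the `gen_ai.system` attribute key and the "default to OpenAI" behavior come from this PR, everything else is hypothetical:

```python
# Illustrative stand-in for the stream_wrapper.py change: resolve the
# provider from the client instance instead of hardcoding "OpenAI".

def detect_provider_from_instance(instance) -> str:
    # Simplified stand-in for the real helper, which inspects base_url.
    base_url = str(getattr(instance, "base_url", "") or "")
    return "MiniMax" if "api.minimax.io" in base_url else "OpenAI"

class FakeSpan:
    """Minimal span stub so the sketch runs without OpenTelemetry."""
    def __init__(self):
        self.attributes = {}

    def set_attribute(self, key, value):
        self.attributes[key] = value

def wrap_chat_completion(span, client_instance):
    # Set gen_ai.system per client rather than to a fixed "OpenAI".
    provider = detect_provider_from_instance(client_instance)
    span.set_attribute("gen_ai.system", provider)
```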

Test plan

  • 16 unit tests for _match_provider() covering all providers + edge cases
  • 5 unit tests for _extract_base_url() covering string URLs, URL objects, missing attributes
  • 8 unit tests for detect_provider_from_instance() end-to-end
  • 6 integration tests simulating the full wrapper flow
  • All 148 existing instrumentation tests still pass
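For flavor, the `_match_provider()` unit tests might resemble the following; the helper here is a minimal stand-in so the examples are runnable on their own, not the PR's actual test code:

```python
# Hypothetical examples of the _match_provider unit tests.
from urllib.parse import urlparse

# Trimmed stand-in for the real host map.
_PROVIDER_HOSTS = {"api.groq.com": "Groq", "api.deepseek.com": "DeepSeek"}

def _match_provider(base_url: str) -> str:
    host = urlparse(base_url).hostname or ""
    return _PROVIDER_HOSTS.get(host, "OpenAI")

def test_known_provider_matches():
    assert _match_provider("https://api.groq.com/openai/v1") == "Groq"

def test_unknown_host_defaults_to_openai():
    assert _match_provider("https://example.com/v1") == "OpenAI"

def test_scheme_and_path_are_ignored():
    assert _match_provider("http://api.deepseek.com") == "DeepSeek"
```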
