
chimera.providers

chimera.providers is the LLM-backend layer. Every provider implements the Provider ABC; create_provider() auto-detects the right one from a model name or explicit provider_type.

from chimera.providers import (
    Provider,
    Response,
    StreamEvent,
    create_provider,
    register_provider,
    get_provider_factory,
    list_providers,
    ThinkingLevel,
    budget_for_level,
    ProxyProvider,
)
from chimera.providers.catalog import ModelConfig, ProviderCatalog
| Symbol | Purpose |
|---|---|
| Provider | ABC. Implement complete(messages, tools=...) (sync) and/or stream(...). Accepts a thinking parameter. |
| Response | Dataclass with content, tool_calls, usage, cost, stop_reason. |
| StreamEvent | Streaming delta type yielded by Provider.stream. |
| create_provider(provider_type=None, model=None, **kwargs) | Factory with model-name autodetection (see below). |
| register_provider(name, factory) | Register a custom provider type at runtime. |
| get_provider_factory(name) / list_providers() | Inspect the registry. |
| ThinkingLevel | Enum: OFF / MINIMAL / LOW / MEDIUM / HIGH / MAX. |
| budget_for_level(level) | Map a ThinkingLevel to a token budget. |
| ProxyProvider | Wrap another provider to intercept/log/transform requests. |
| ModelConfig / ProviderCatalog | Catalog API for slash-namespaced models (bedrock/claude-sonnet-4, azure/gpt-4o, groq/llama-3.3-70b, …). |
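The register_provider / create_provider pair is a classic factory-registry pattern. The sketch below is a self-contained illustration of that pattern, not chimera's actual implementation; the EchoProvider class and its prefix parameter are invented for the example, and only the function names mirror the real API:

```python
from typing import Callable, Dict, List

# Hypothetical minimal registry mirroring the names of the
# chimera.providers API; the bodies here are illustrative only.
_registry: Dict[str, Callable[..., object]] = {}

def register_provider(name: str, factory: Callable[..., object]) -> None:
    """Register a custom provider factory under a type name."""
    _registry[name] = factory

def get_provider_factory(name: str) -> Callable[..., object]:
    """Look up a previously registered factory."""
    return _registry[name]

def list_providers() -> List[str]:
    """List all registered provider type names."""
    return sorted(_registry)

def create_provider(provider_type=None, **kwargs):
    """Instantiate a provider; this sketch only handles explicit types."""
    if provider_type is None:
        raise ValueError("this sketch requires an explicit provider_type")
    return _registry[provider_type](**kwargs)

class EchoProvider:
    """Toy provider: echoes the last user message back."""
    def __init__(self, prefix: str = "echo: "):
        self.prefix = prefix

    def complete(self, messages, tools=None):
        return self.prefix + messages[-1]["content"]

register_provider("echo", EchoProvider)
p = create_provider(provider_type="echo")
print(p.complete([{"role": "user", "content": "hi"}]))  # echo: hi
```

A registry keyed by string names is what lets third-party code plug in new backends at runtime without chimera knowing about them in advance.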
| Module | Class | Routes by name prefix |
|---|---|---|
| chimera.providers.anthropic | AnthropicProvider | claude-* |
| chimera.providers.openai_provider | OpenAIProvider | gpt-*, o1-*, o3-* |
| chimera.providers.google | GoogleProvider | gemini-* |
| chimera.providers.ollama | OllamaProvider | llama*, mistral*, qwen*, phi* |
| chimera.providers.modal | ModalProvider | (explicit) |
| chimera.providers.compatible | CompatibleProvider | OpenAI-compatible endpoints (vLLM, LiteLLM, Together, Groq, Fireworks); requires base_url |
| chimera.providers.proxy | ProxyProvider | Wraps any other provider |
create_provider() resolves the backend in this order:

  1. Explicit provider_type (e.g. "anthropic", "compatible").
  2. Known model-name prefix (the table above).
  3. ProviderCatalog.default() lookup (slash-namespaced ids).
  4. Environment-variable fallback (ANTHROPIC_BASE_URL / ANTHROPIC_AUTH_TOKEN → Anthropic; OPENAI_API_KEY → OpenAI).
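The four-step resolution order can be sketched as a single function. The prefix table and env-var names below come from this page; the function itself is an assumption-laden illustration, not chimera's real code:

```python
import os

# Prefix table taken from the routing table above.
PREFIX_TABLE = {
    "claude-": "anthropic",
    "gpt-": "openai", "o1-": "openai", "o3-": "openai",
    "gemini-": "google",
    "llama": "ollama", "mistral": "ollama", "qwen": "ollama", "phi": "ollama",
}

def detect_provider(provider_type=None, model=None, catalog=None, env=os.environ):
    """Sketch of create_provider's detection order (illustrative only)."""
    # 1. Explicit provider_type wins.
    if provider_type:
        return provider_type
    if model:
        # 2. Known model-name prefix.
        for prefix, name in PREFIX_TABLE.items():
            if model.startswith(prefix):
                return name
        # 3. Catalog lookup for slash-namespaced ids
        #    (a plain dict stands in for ProviderCatalog here).
        if catalog and model in catalog:
            return catalog[model]
    # 4. Environment-variable fallback.
    if env.get("ANTHROPIC_BASE_URL") or env.get("ANTHROPIC_AUTH_TOKEN"):
        return "anthropic"
    if env.get("OPENAI_API_KEY"):
        return "openai"
    raise ValueError("could not detect a provider")

print(detect_provider(model="claude-sonnet-4"))  # anthropic
print(detect_provider(model="bedrock/claude-sonnet-4",
                      catalog={"bedrock/claude-sonnet-4": "bedrock"}))  # bedrock
```

Note that the env-var fallback only fires when neither a prefix nor a catalog entry matched, so exported API keys never override an explicit model name.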

For the full list of catalog ids and their env-var contracts (DeepSeek, GLM, Kimi, Qwen3, GPT-OSS, Mistral Codestral, Gemma 3, Bedrock, Azure, Groq), see Use with Third-Party Providers.

from chimera.providers.cost import register_model_cost
from chimera.providers.cost_tracker import CostTracker

register_model_cost(model, input_per_mtok, output_per_mtok) plugs a custom price into the global table; CostTracker records granular per-step token counts (cache hits, reasoning tokens, regular input / output) and rolls them up into a session total.
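A minimal sketch of how these two pieces fit together, assuming a global price table keyed by model name; the field names, the record() signature, and the choice to bill reasoning tokens at the output rate are assumptions for illustration, not the real CostTracker contract:

```python
from dataclasses import dataclass

# Hypothetical global price table: model -> ($/Mtok input, $/Mtok output).
_COSTS = {}

def register_model_cost(model, input_per_mtok, output_per_mtok):
    """Plug a custom price into the (sketched) global table."""
    _COSTS[model] = (input_per_mtok, output_per_mtok)

@dataclass
class CostTracker:
    """Accumulates per-step token counts and rolls them into a session cost."""
    input_tokens: int = 0
    output_tokens: int = 0
    cache_read_tokens: int = 0
    reasoning_tokens: int = 0

    def record(self, *, input_tokens=0, output_tokens=0,
               cache_read=0, reasoning=0):
        self.input_tokens += input_tokens
        self.output_tokens += output_tokens
        self.cache_read_tokens += cache_read
        self.reasoning_tokens += reasoning

    def session_cost(self, model):
        in_rate, out_rate = _COSTS[model]
        # Assumption: reasoning tokens billed as output; cache-read
        # discounts are omitted from this sketch.
        return (self.input_tokens * in_rate
                + (self.output_tokens + self.reasoning_tokens) * out_rate
                ) / 1_000_000

register_model_cost("my-model", 3.0, 15.0)  # hypothetical $/Mtok prices
t = CostTracker()
t.record(input_tokens=1000, output_tokens=200, reasoning=300)
print(f"${t.session_cost('my-model'):.6f}")  # $0.010500
```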