# Providers
A Provider is an abstraction over an LLM backend. Any class that implements the Provider ABC (specifically, the complete() method) can power a Chimera agent. This design lets you swap between Anthropic, OpenAI, Google Gemini, Ollama, or any OpenAI-compatible endpoint without changing agent code.
## The Provider ABC

```python
from chimera.providers.base import Provider, Response

class Provider(ABC):
    @abstractmethod
    def complete(
        self,
        messages: list[Message],
        tools: list[ToolSchema] | None = None,
        temperature: float = 0.0,
        max_tokens: int | None = None,
    ) -> Response: ...

    @property
    @abstractmethod
    def context_window(self) -> int: ...

    @property
    @abstractmethod
    def supports_tool_use(self) -> bool: ...

    @property
    @abstractmethod
    def model_name(self) -> str: ...
```

The `complete()` method takes a list of messages and optional tool schemas, and returns a `Response`.
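Because any subclass of the ABC works, you can stub out a provider for tests or wire in an unsupported backend. The sketch below is illustrative, not part of chimera: `EchoProvider` is a made-up name, and it assumes `Message` exposes a `.content` attribute.

```python
from chimera.providers.base import Provider, Response

class EchoProvider(Provider):
    """Hypothetical provider that parrots the last user message (handy in tests)."""

    def complete(self, messages, tools=None, temperature=0.0, max_tokens=None) -> Response:
        # Assumption: Message exposes a .content attribute.
        last = messages[-1].content if messages else ""
        return Response(
            content=f"echo: {last}",
            tool_calls=[],
            usage={"input_tokens": 0, "output_tokens": 0},
        )

    @property
    def context_window(self) -> int:
        return 8192

    @property
    def supports_tool_use(self) -> bool:
        return False

    @property
    def model_name(self) -> str:
        return "echo-v0"
```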
## Response Dataclass

```python
@dataclass
class Response:
    content: str                # Text content of the response
    tool_calls: list[ToolCall]  # Tool invocations requested by the model
    usage: dict[str, int]       # {"input_tokens": N, "output_tokens": M}

    @property
    def has_tool_calls(self) -> bool:
        return len(self.tool_calls) > 0
```

There is also a `StreamEvent` dataclass for streaming responses, with event types `"text_delta"`, `"tool_call_start"`, `"tool_call_delta"`, and `"done"`.
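As a rough sketch of how agent code might branch on a `Response` (the `history`, `tool_schemas`, and `execute_tool` names are hypothetical stand-ins, not chimera APIs):

```python
response = provider.complete(messages=history, tools=tool_schemas)

if response.has_tool_calls:
    # The model asked to invoke one or more tools.
    for call in response.tool_calls:
        result = execute_tool(call)  # hypothetical dispatch helper
else:
    # Plain text turn.
    print(response.content)

print(response.usage)  # e.g. {"input_tokens": 812, "output_tokens": 64}
```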
## The create_provider() Factory

The recommended way to create a provider is through the factory function, which auto-detects the provider type from the model name. The `model` parameter is optional; when omitted, it falls back to the `ANTHROPIC_MODEL` environment variable:

```python
from chimera.providers.factory import create_provider

# Model from ANTHROPIC_MODEL env var (default fallback)
provider = create_provider()

# Auto-detected as Anthropic
provider = create_provider(model="claude-sonnet-4-20250514")

# Auto-detected as OpenAI
provider = create_provider(model="gpt-4o")

# Auto-detected as Google
provider = create_provider(model="gemini-2.0-flash")

# Explicit provider type
provider = create_provider(provider_type="ollama", model="llama3.1")

# OpenAI-compatible endpoint
provider = create_provider(
    provider_type="compatible",
    model="my-model",
    base_url="https://my-api.example.com/v1",
    api_key="sk-...",
)
```

## Auto-detection Rules
The factory infers the provider from the model name prefix:

| Model prefix | Provider |
|---|---|
| `claude*` | Anthropic |
| `gpt*`, `o1*`, `o3*`, `codex*` | OpenAI |
| `gemini*` | Google |
| `llama*`, `mistral*`, `qwen*`, `phi*` | Ollama |
If no prefix matches, the factory falls back to checking environment variables (`ANTHROPIC_BASE_URL`, `ANTHROPIC_AUTH_TOKEN`, `OPENAI_API_KEY`), as sketched below.
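For example (a sketch; the endpoint, token, and model name are made up), a model name with no matching prefix can still be routed when the relevant variables are set:

```python
import os

from chimera.providers.factory import create_provider

# No known prefix matches "internal-model-v2", so the factory falls back
# to the environment; with these set, it assumes an Anthropic-style endpoint.
os.environ["ANTHROPIC_BASE_URL"] = "https://llm-proxy.internal.example.com"
os.environ["ANTHROPIC_AUTH_TOKEN"] = "token-..."

provider = create_provider(model="internal-model-v2")
```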
## Supported Providers

| Provider | Class | Install extra | Model examples |
|---|---|---|---|
| Anthropic | `AnthropicProvider` | `chimera-run[anthropic]` | `claude-opus-4`, `claude-sonnet-4`, `claude-haiku-3.5` |
| OpenAI | `OpenAIProvider` | `chimera-run[openai]` | `gpt-4o`, `gpt-4o-mini`, `o1`, `o3-mini` |
| Google Gemini | `GoogleProvider` | `chimera-run[google]` | `gemini-2.0-flash`, `gemini-1.5-pro` |
| Ollama | `OllamaProvider` | (none) | `llama3.1`, `mistral`, `qwen2.5` |
| Modal | `ModalProvider` | `chimera-run[modal]` | Any model deployed on Modal |
| OpenAI-compatible | `OpenAICompatibleProvider` | (none) | Any model behind an OpenAI-compatible API |
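Whichever row you pick, the resulting object exposes the same ABC surface, so agent code can stay provider-agnostic. A small sketch using only the documented properties:

```python
from chimera.providers.factory import create_provider

provider = create_provider(model="llama3.1")  # resolves to OllamaProvider

# Capability checks come straight from the Provider ABC.
if not provider.supports_tool_use:
    print(f"{provider.model_name} will run without tools "
          f"(context window: {provider.context_window} tokens)")
```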
## Environment Variable Configuration

Each provider reads credentials from environment variables when no explicit `api_key` is passed:

| Variable | Provider |
|---|---|
| `ANTHROPIC_API_KEY` or `ANTHROPIC_AUTH_TOKEN` | Anthropic |
| `ANTHROPIC_BASE_URL` | Anthropic (custom endpoint) |
| `OPENAI_API_KEY` | OpenAI |
| `GOOGLE_API_KEY` | Google Gemini |
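In practice you would export these in your shell; the sketch below sets one inline purely for illustration, and the keys are placeholders:

```python
import os

from chimera.providers.factory import create_provider

# Normally exported in the shell; set inline here only for illustration.
os.environ["OPENAI_API_KEY"] = "sk-..."
provider = create_provider(model="gpt-4o")  # key read from OPENAI_API_KEY

# Per the note above, an explicit api_key skips the environment lookup.
provider = create_provider(model="gpt-4o", api_key="sk-...")
```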