# Providers

A Provider is an abstraction over an LLM backend. Any class that implements the `Provider` ABC (specifically, the `complete()` method) can power a Chimera agent. This design lets you swap between Anthropic, OpenAI, Google Gemini, Ollama, or any OpenAI-compatible endpoint without changing agent code.

```python
from chimera.providers.base import Provider, Response

class Provider(ABC):
    @abstractmethod
    def complete(
        self,
        messages: list[Message],
        tools: list[ToolSchema] | None = None,
        temperature: float = 0.0,
        max_tokens: int | None = None,
    ) -> Response: ...

    @property
    @abstractmethod
    def context_window(self) -> int: ...

    @property
    @abstractmethod
    def supports_tool_use(self) -> bool: ...

    @property
    @abstractmethod
    def model_name(self) -> str: ...
```

The `complete()` method takes a list of messages and optional tool schemas, and returns a `Response`.

```python
@dataclass
class Response:
    content: str                # Text content of the response
    tool_calls: list[ToolCall]  # Tool invocations requested by the model
    usage: dict[str, int]       # {"input_tokens": N, "output_tokens": M}

    @property
    def has_tool_calls(self) -> bool:
        return len(self.tool_calls) > 0
```
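To make the interface concrete, here is a toy provider that echoes the last user message. The `Response` definition is a self-contained stand-in mirroring the dataclass above, and `EchoProvider` is a hypothetical class (not part of Chimera) that implements the same method and property surface:

```python
from dataclasses import dataclass, field

# Stand-in mirroring chimera's Response dataclass, so this sketch runs on its own.
@dataclass
class Response:
    content: str
    tool_calls: list = field(default_factory=list)
    usage: dict = field(default_factory=dict)

    @property
    def has_tool_calls(self) -> bool:
        return len(self.tool_calls) > 0

class EchoProvider:
    """Toy provider: returns the last user message back, with no tool calls."""

    def complete(self, messages, tools=None, temperature=0.0, max_tokens=None) -> Response:
        last = messages[-1]["content"] if messages else ""
        return Response(
            content=f"echo: {last}",
            usage={"input_tokens": len(last), "output_tokens": len(last)},
        )

    @property
    def context_window(self) -> int:
        return 8192

    @property
    def supports_tool_use(self) -> bool:
        return False

    @property
    def model_name(self) -> str:
        return "echo-1"

provider = EchoProvider()
resp = provider.complete([{"role": "user", "content": "hello"}])
```

Any object with this shape can be handed to an agent in place of a real backend, which is useful for tests.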

There is also a `StreamEvent` dataclass for streaming responses, with event types `"text_delta"`, `"tool_call_start"`, `"tool_call_delta"`, and `"done"`.
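The docs above only name the event types, not the fields, so the following is a hypothetical sketch of how a stream of such events might be consumed (the `type` and `data` field names are assumptions):

```python
from dataclasses import dataclass

@dataclass
class StreamEvent:
    # Assumed shape; only the type values come from the documentation.
    type: str      # "text_delta" | "tool_call_start" | "tool_call_delta" | "done"
    data: str = ""

# Accumulate the text deltas from a finished stream.
events = [
    StreamEvent("text_delta", "Hel"),
    StreamEvent("text_delta", "lo"),
    StreamEvent("done"),
]
text = "".join(e.data for e in events if e.type == "text_delta")
```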

The recommended way to create a provider is through the factory function, which auto-detects the provider type from the model name. The `model` parameter is optional — when omitted, it falls back to the `ANTHROPIC_MODEL` environment variable:

```python
from chimera.providers.factory import create_provider

# Model from ANTHROPIC_MODEL env var (default fallback)
provider = create_provider()

# Auto-detected as Anthropic
provider = create_provider(model="claude-sonnet-4-20250514")

# Auto-detected as OpenAI
provider = create_provider(model="gpt-4o")

# Auto-detected as Google
provider = create_provider(model="gemini-2.0-flash")

# Explicit provider type
provider = create_provider(provider_type="ollama", model="llama3.1")

# OpenAI-compatible endpoint
provider = create_provider(
    provider_type="compatible",
    model="my-model",
    base_url="https://my-api.example.com/v1",
    api_key="sk-...",
)
```

The factory infers the provider from the model name prefix:

| Model prefix | Provider |
| --- | --- |
| `claude*` | Anthropic |
| `gpt*`, `o1*`, `o3*`, `codex*` | OpenAI |
| `gemini*` | Google |
| `llama*`, `mistral*`, `qwen*`, `phi*` | Ollama |

If no prefix matches, it falls back to checking environment variables (`ANTHROPIC_BASE_URL`, `ANTHROPIC_AUTH_TOKEN`, `OPENAI_API_KEY`).

| Provider | Class | Install extra | Model examples |
| --- | --- | --- | --- |
| Anthropic | `AnthropicProvider` | `chimera-run[anthropic]` | claude-opus-4, claude-sonnet-4, claude-haiku-3.5 |
| OpenAI | `OpenAIProvider` | `chimera-run[openai]` | gpt-4o, gpt-4o-mini, o1, o3-mini |
| Google Gemini | `GoogleProvider` | `chimera-run[google]` | gemini-2.0-flash, gemini-1.5-pro |
| Ollama | `OllamaProvider` | (none) | llama3.1, mistral, qwen2.5 |
| Modal | `ModalProvider` | `chimera-run[modal]` | Any model deployed on Modal |
| OpenAI-compatible | `OpenAICompatibleProvider` | (none) | Any model behind an OpenAI-compatible API |

Each provider reads credentials from environment variables when no explicit `api_key` is passed:

| Variable | Provider |
| --- | --- |
| `ANTHROPIC_API_KEY` or `ANTHROPIC_AUTH_TOKEN` | Anthropic |
| `ANTHROPIC_BASE_URL` | Anthropic (custom endpoint) |
| `OPENAI_API_KEY` | OpenAI |
| `GOOGLE_API_KEY` | Google Gemini |