Chimera is published on PyPI as `chimera-run` (the import path stays `chimera`):
```bash
pip install chimera-run             # core (zero dependencies)
pip install chimera-run[anthropic]  # + Claude support
pip install chimera-run[openai]     # + OpenAI support
pip install chimera-run[all]        # all providers
```
Or with uv:

```bash
uv add chimera-run               # core
uv add 'chimera-run[anthropic]'  # + Claude support
```
Requires Python 3.11+. Tested on 3.11, 3.12, and 3.13.
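If you are unsure whether your interpreter meets the floor, a trivial self-check (a sketch, not part of Chimera's API) is:

```python
import sys

def meets_requirement(version: tuple = None) -> bool:
    """Return True if (major, minor) satisfies Chimera's Python 3.11+ requirement."""
    major, minor = version if version else sys.version_info[:2]
    return (major, minor) >= (3, 11)

print(meets_requirement())
```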
Chimera auto-detects the provider from the model name. Set the appropriate
environment variables for your backend:

```bash
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."
export GOOGLE_API_KEY="..."
```

**Ollama:** no credentials needed; Chimera connects to `http://localhost:11434` by default.

**Anthropic-compatible endpoints** (for example, GLM served through z.ai):

```bash
export ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic"
export ANTHROPIC_AUTH_TOKEN="your-token-here"
export ANTHROPIC_MODEL="glm-5"
```
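To sanity-check which backends are configured before running anything, a small helper along these lines can be useful (`configured_providers` is a hypothetical utility, not part of Chimera):

```python
import os

# Environment variables each cloud backend expects.
CREDENTIAL_KEYS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "google": "GOOGLE_API_KEY",
}

def configured_providers(env=None):
    """Return the names of providers whose credential variable is set and non-empty."""
    env = os.environ if env is None else env
    return [name for name, key in CREDENTIAL_KEYS.items() if env.get(key)]
```

Note that Ollama will not appear in this list: it needs no credentials at all.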
| Provider | Model Prefixes | Auto-detected |
|---|---|---|
| Anthropic | `claude-*` | Yes |
| OpenAI | `gpt-*`, `o1-*`, `o3-*` | Yes |
| Google | `gemini-*` | Yes |
| Ollama | `llama-*`, `mistral-*`, `qwen-*`, `phi-*` | Yes |
| Modal | — | No (`provider_type="modal"`) |
| Compatible | — | No (`provider_type="compatible"`) |
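The detection logic the table describes amounts to a prefix lookup on the model name. A minimal sketch of that idea (a hypothetical helper, not Chimera's actual implementation):

```python
# Map of model-name prefixes to provider names, per the table above.
PREFIXES = {
    "claude-": "anthropic",
    "gpt-": "openai", "o1-": "openai", "o3-": "openai",
    "gemini-": "google",
    "llama-": "ollama", "mistral-": "ollama", "qwen-": "ollama", "phi-": "ollama",
}

def detect_provider(model):
    """Return the provider for a model name, or None if it cannot be auto-detected."""
    for prefix, provider in PREFIXES.items():
        if model.startswith(prefix):
            return provider
    return None  # Modal/compatible backends must pass provider_type explicitly
```

A `None` result corresponds to the last two table rows: those backends require an explicit `provider_type` argument.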
This example creates an agent with the default tool set, points it at a local
workspace directory, and asks it to generate a Python script:

```python
import chimera

# 1. Create a provider (auto-detects from env vars)
provider = chimera.create_provider()

# 2. Build an agent with the built-in tools
agent = chimera.Agent(provider=provider, tools=list(chimera.DEFAULT_TOOLS))

# 3. Run the agent in a local environment
env = chimera.LocalEnvironment("./workspace")
result = agent.run("Create a hello world Python script", environment=env)
```

The `Agent` constructor and `run` signature here are reconstructed from the numbered comments; consult the API reference for the exact names in your version.