Connect an LLM Provider

Chat-in-Bio supports Anthropic, OpenAI, Google AI, and OpenRouter. You can switch providers at any time.

1. Set the API key

Add the key to your .env:

# Pick one (or set multiple for flexibility)
CHATINBIO_ANTHROPIC_API_KEY=sk-ant-...
CHATINBIO_OPENAI_API_KEY=sk-...
CHATINBIO_GOOGLE_AI_API_KEY=AIza...
CHATINBIO_OPENROUTER_API_KEY=sk-or-...

Restart the backend after changing .env.
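A quick way to confirm which providers a given environment enables is to check the variables above directly. This sketch assumes only the env var names documented here; the `configured_providers` helper is illustrative and not part of Chat-in-Bio:

```python
import os

# Map each provider value to the env var Chat-in-Bio reads for it
# (variable names taken from the .env example above).
PROVIDER_KEYS = {
    "anthropic": "CHATINBIO_ANTHROPIC_API_KEY",
    "openai": "CHATINBIO_OPENAI_API_KEY",
    "google-gla": "CHATINBIO_GOOGLE_AI_API_KEY",
    "openrouter": "CHATINBIO_OPENROUTER_API_KEY",
}

def configured_providers(env=os.environ):
    """Return the providers whose API key is present and non-empty."""
    return [p for p, var in PROVIDER_KEYS.items() if env.get(var)]
```

Running `configured_providers()` after loading your `.env` tells you which `model_provider` values are usable before you touch the bot config.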

2. Update bot config

curl -X PUT http://localhost:8000/api/admin/bot \
  -H "Authorization: Bearer $ADMIN_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model_provider": "anthropic",
    "model_name": "claude-sonnet-4-20250514"
  }'
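If you prefer to script this step from Python instead of curl, the same PUT request can be built with the standard library. The `build_bot_update` helper is a sketch, not part of Chat-in-Bio; only the endpoint, headers, and body mirror the curl call above:

```python
import json
import urllib.request

def build_bot_update(admin_key, provider, model, base="http://localhost:8000"):
    """Build the PUT /api/admin/bot request; send it with urllib.request.urlopen()."""
    body = json.dumps({"model_provider": provider, "model_name": model}).encode()
    return urllib.request.Request(
        f"{base}/api/admin/bot",
        data=body,
        method="PUT",
        headers={
            "Authorization": f"Bearer {admin_key}",
            "Content-Type": "application/json",
        },
    )
```

Passing the request to `urllib.request.urlopen()` performs the update; any third-party HTTP client works the same way.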

Provider and model reference

| Provider value | Models | Notes |
|---|---|---|
| anthropic | claude-sonnet-4-20250514, claude-haiku-4-5-20251001 | Best for A2UI tool use |
| openai | gpt-4o, gpt-4o-mini | Good balance of cost/quality |
| google-gla | gemini-2.0-flash, gemini-2.5-pro-preview-06-05 | Cost-effective |
| openrouter | Any model on OpenRouter | Access 100+ models via one API key |

Using OpenRouter

OpenRouter gives you access to models from many providers (Llama, Mistral, Qwen, etc.) through a single API key:

# .env
CHATINBIO_OPENROUTER_API_KEY=sk-or-v1-...

# Set bot config to use an OpenRouter model
curl -X PUT http://localhost:8000/api/admin/bot \
  -H "Authorization: Bearer $ADMIN_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model_provider": "openrouter",
    "model_name": "meta-llama/llama-3.1-70b-instruct"
  }'
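OpenRouter model names are vendor/model slugs, as in the example above, which makes them easy to sanity-check before saving the config. `is_openrouter_slug` is a hypothetical helper for illustration, not Chat-in-Bio API:

```python
def is_openrouter_slug(model_name: str) -> bool:
    """True if the name looks like an OpenRouter 'vendor/model' slug."""
    parts = model_name.split("/")
    return len(parts) == 2 and all(parts)
```

A bare name like `gpt-4o` fails this check, which usually means the model belongs with a direct provider value instead of `openrouter`.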

Tool calling support

Chat-in-Bio relies on tool calling for A2UI components. Not all models support tool calling well. Models that work reliably: Claude, GPT-4o, Gemini, Llama 3.1+, Mistral Large. Smaller models may struggle with tool-heavy interactions.
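One way to catch an unsupported choice early is a heuristic check against the known-good list above. `likely_supports_tools` and its substring tags are assumptions for illustration, not an official capability lookup:

```python
def likely_supports_tools(model_name: str) -> bool:
    """Heuristic: does the model name match a family the docs list as
    reliable for tool calling? Not exhaustive; tags are an assumption."""
    known_good = ("claude", "gpt-4o", "gemini", "llama-3.1", "mistral-large")
    return any(tag in model_name.lower() for tag in known_good)
```

A warning when this returns False can save a round of debugging broken A2UI components after switching models.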

3. Set a locale (optional)

If your audience primarily speaks a specific language, set the bot locale:

curl -X PUT http://localhost:8000/api/admin/bot \
  -H "Authorization: Bearer $ADMIN_KEY" \
  -H "Content-Type: application/json" \
  -d '{"locale": "de"}'

The bot will default to responding in that language. If a visitor writes in a different language, the bot will try to match them.

4. Verify

Send a test message in the chat. If the provider is misconfigured, the agent will return a generic error message. Check the backend logs for details:

uv run litestar run --reload  # logs to stdout