Configure LLM provider endpoints and models for custom deployments
| Provider | Default Endpoint | Custom Endpoint Support |
|---|---|---|
| OpenAI | https://api.openai.com | ✅ (regional variants) |
| Anthropic | https://api.anthropic.com | ✅ (regional variants) |
| Gemini | https://generativelanguage.googleapis.com | ✅ |
| AWS Bedrock | Regional AWS endpoints | ✅ (cross-region) |
| VertexAI | Regional GCP endpoints | ✅ (cross-region) |
| Ollama | http://localhost:11434 | ✅ (any host/port) |
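The endpoint-resolution pattern the table implies can be sketched as follows. This is a minimal illustration, not this tool's actual configuration API: the `*_BASE_URL` environment variable names and the `resolve_endpoint` helper are assumptions chosen for the example.

```python
import os

# Default endpoints from the table above. Bedrock and Vertex AI are
# omitted because their endpoints are derived per region rather than
# from a single fixed URL.
DEFAULT_ENDPOINTS = {
    "openai": "https://api.openai.com",
    "anthropic": "https://api.anthropic.com",
    "gemini": "https://generativelanguage.googleapis.com",
    "ollama": "http://localhost:11434",
}

def resolve_endpoint(provider: str) -> str:
    """Return a custom endpoint if one is set, else the provider default.

    The environment variable name (e.g. OPENAI_BASE_URL) is an assumed
    convention for this sketch, not a documented setting.
    """
    env_var = f"{provider.upper()}_BASE_URL"
    return os.environ.get(env_var, DEFAULT_ENDPOINTS[provider])
```

For example, `resolve_endpoint("ollama")` yields the local default unless an override is exported, which is how a regional variant or a remote Ollama host would be wired in.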