Open WebUI Integration
Integrate Helicone with Open WebUI, an extensible, offline-capable interface for various LLM runners. Monitor interactions across Ollama, OpenAI-compatible APIs, and custom LLM setups.
Introduction
Open WebUI (formerly Ollama WebUI) is a feature-rich, self-hosted web interface designed for interacting with various LLM runners. It supports Ollama, OpenAI-compatible APIs, and custom LLM setups.
Integrating Helicone with Open WebUI lets you monitor and analyze interactions across all of these LLM backends from one place.
Integration Steps
Create an account + Generate an API Key
Log into Helicone or create an account. Once you have an account, you can generate an API key.
Make sure to generate a write-only API key.
Set OPENAI_API_BASE_URL as an environment variable when running the container
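To route Open WebUI's OpenAI-compatible traffic through Helicone, override OPENAI_API_BASE_URL when you start the container. The sketch below assumes the standard Open WebUI Docker image and Helicone's OpenAI-compatible gateway at oai.helicone.ai; the exact base URL format and where the Helicone API key goes are assumptions, so confirm them in your Helicone dashboard before deploying.

```bash
# A minimal sketch: run Open WebUI with its OpenAI-compatible requests routed
# through Helicone by overriding OPENAI_API_BASE_URL.
# NOTE: the gateway URL and the placement of the Helicone API key in the path
# are assumptions; verify the exact format in your Helicone dashboard.
docker run -d -p 3000:8080 \
  -v open-webui:/app/backend/data \
  -e OPENAI_API_BASE_URL="https://oai.helicone.ai/v1/<YOUR_HELICONE_API_KEY>" \
  -e OPENAI_API_KEY="<YOUR_OPENAI_API_KEY>" \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

Once the container is running, open http://localhost:3000, send a chat message, and check your Helicone dashboard to confirm the request was logged.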
For more advanced setups, including GPU support or custom Ollama configurations, refer to the Open WebUI GitHub repository and documentation.
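As one illustration of an advanced setup, a GPU-enabled run might look like the sketch below. It assumes Open WebUI's CUDA image tag and an NVIDIA GPU with the NVIDIA Container Toolkit installed, and reuses the same assumed Helicone base URL as above; see the Open WebUI documentation for the authoritative options.

```bash
# Sketch only: GPU-enabled variant of the command above.
# Assumes the ghcr.io/open-webui/open-webui:cuda image tag, an NVIDIA GPU,
# and the NVIDIA Container Toolkit for the --gpus flag.
docker run -d -p 3000:8080 --gpus all \
  -v open-webui:/app/backend/data \
  -e OPENAI_API_BASE_URL="https://oai.helicone.ai/v1/<YOUR_HELICONE_API_KEY>" \
  -e OPENAI_API_KEY="<YOUR_OPENAI_API_KEY>" \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:cuda
```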