Introduction

Open WebUI (formerly Ollama WebUI) is a feature-rich, self-hosted web interface designed for interacting with various LLM runners. It supports Ollama, OpenAI-compatible APIs, and custom LLM setups.

Integrating Helicone with Open WebUI allows you to monitor and analyze interactions across these diverse LLM interfaces and features.

Integration Steps

1

Create an account + Generate an API Key

Log into Helicone or create an account. Once you have an account, you can generate an API key.

Make sure to generate a write-only API key.

2

Set OPENAI_API_BASE_URL as an environment variable when running the container

export HELICONE_API_KEY=pk-<YOUR_API_KEY>

docker run -d -p 9842:8080 \
  -e OPENAI_API_BASE_URL="https://oai.helicone.ai/${HELICONE_API_KEY}/v1" \
  -e OPENAI_API_KEY="sk-api_key" \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
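Note that `${HELICONE_API_KEY}` is expanded by your host shell before Docker ever sees it, which is why the variable must be exported first. A minimal sketch of that expansion (the key shown is a placeholder, not a real Helicone key):

```shell
# Placeholder key for illustration only; use your real pk-... key.
export HELICONE_API_KEY=pk-example-key

# The double-quoted string is expanded by the shell, producing the
# final base URL that gets passed to the container.
OPENAI_API_BASE_URL="https://oai.helicone.ai/${HELICONE_API_KEY}/v1"
echo "$OPENAI_API_BASE_URL"
```

If the echoed URL still contains a literal `${HELICONE_API_KEY}`, the variable was not exported in the shell running `docker run`.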

For more advanced setups, including GPU support or custom Ollama configurations, refer to the Open WebUI GitHub repository and documentation.
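If you prefer Docker Compose, the `docker run` command above can be expressed as a compose file. This is a hypothetical equivalent, not taken from the Open WebUI docs; the image, ports, volume, and environment values simply mirror the command above (the API key values remain placeholders):

```yaml
# Hypothetical docker-compose equivalent of the `docker run` command above.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "9842:8080"
    environment:
      # HELICONE_API_KEY must be set in the host shell or a .env file.
      - OPENAI_API_BASE_URL=https://oai.helicone.ai/${HELICONE_API_KEY}/v1
      - OPENAI_API_KEY=sk-api_key
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - open-webui:/app/backend/data
    restart: always

volumes:
  open-webui:
```

Compose performs the same `${HELICONE_API_KEY}` substitution from the host environment that the shell does for `docker run`.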