Manual Logger - Python
Integrate any custom LLM with Helicone using the Python Manual Logger. Step-by-step guide for Python implementation to connect your proprietary or open-source models.
Python Manual Logger
Logging calls to custom models is supported via the Helicone Python SDK.
1. Install the Helicone helpers package
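The helpers live in the `helicone-helpers` package on PyPI (assuming a standard pip setup):

```bash
pip install helicone-helpers
```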
2. Set `HELICONE_API_KEY` as an environment variable
You can also set the Helicone API key in your code (see step 3 below).
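For example, in your shell (replace the placeholder with your actual key):

```bash
export HELICONE_API_KEY=<your-helicone-api-key>
```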
3. Create a new HeliconeManualLogger instance
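A minimal sketch, assuming the constructor accepts an `api_key` and an optional `headers` dictionary of default Helicone headers:

```python
import os

from helicone_helpers import HeliconeManualLogger

# Pass the key explicitly here, or omit it and rely on the
# HELICONE_API_KEY environment variable set in step 2.
helicone_logger = HeliconeManualLogger(
    api_key=os.environ["HELICONE_API_KEY"],
    headers={},  # optional default Helicone headers applied to every logged request
)
```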
4. Define your operation and make the request
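Continuing from the logger created in step 3, the sketch below uses the `openai` package as an example provider. It assumes the result recorder exposes an `append_results` method for attaching the response body, and that `log_request` returns your operation's return value; check the SDK if either differs in your version.

```python
from openai import OpenAI

client = OpenAI()

# The request body you want Helicone to record.
request_body = {
    "model": "gpt-4o-mini",  # placeholder model name
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
}

def chat_operation(result_recorder):
    # Make the actual provider call, record the response, and return it.
    response = client.chat.completions.create(**request_body)
    result_recorder.append_results(response.model_dump())
    return response

response = helicone_logger.log_request(
    request=request_body,
    operation=chat_operation,
    provider="openai",
)
print(response.choices[0].message.content)
```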
API Reference
HeliconeManualLogger
log_request
Parameters:
- `request`: A dictionary containing the request parameters
- `operation`: A callable that takes a HeliconeResultRecorder and returns a result
- `additional_headers`: Optional dictionary of additional headers
- `provider`: Optional provider specification ("openai", "anthropic", or None for custom)
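Putting the parameters together, a call might look like the sketch below (reusing `chat_operation` from step 4; the `Helicone-Session-Id` header is shown only as an illustrative optional header):

```python
response = helicone_logger.log_request(
    # request parameters recorded in Helicone
    request={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    # callable that receives a HeliconeResultRecorder and returns the result
    operation=chat_operation,
    # optional per-request Helicone headers
    additional_headers={"Helicone-Session-Id": "session-123"},
    # "openai", "anthropic", or None/omitted for custom models
    provider="openai",
)
```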
HeliconeResultRecorder
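An instance of this class is created by `log_request` and passed to your operation. The sketch below assumes the `append_results` method used in the examples above is how the response body gets attached to the logged request:

```python
def my_operation(result_recorder):
    # result_recorder is a HeliconeResultRecorder created by log_request.
    result = {"output": "model response goes here"}
    result_recorder.append_results(result)  # attach the response body to the Helicone log
    return result  # the return value is handed back to the caller of log_request
```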
Advanced Usage Examples
Streaming Responses
For streaming responses with Python, you can use the `log_stream` method:
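The exact `log_stream` signature isn't shown on this page; the sketch below assumes it mirrors `log_request` (a request dictionary plus an operation that receives a result recorder) and that collected chunks are recorded via `append_results`. Check the SDK source or the streaming cookbook linked below for the actual parameters.

```python
def stream_operation(result_recorder):
    # Assumes an OpenAI-style streaming call; any provider that yields chunks works similarly.
    stream = client.chat.completions.create(**request_body, stream=True)
    chunks = []
    for chunk in stream:
        chunks.append(chunk.model_dump())
    # Record the collected chunks so Helicone can reconstruct the streamed response.
    result_recorder.append_results({"chunks": chunks})
    return chunks

chunks = helicone_logger.log_stream(
    request={**request_body, "stream": True},
    operation=stream_operation,
)
```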
Using with Anthropic
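A sketch using the `anthropic` package, assuming the same `log_request`/`append_results` pattern as above (the model name is a placeholder):

```python
import anthropic

anthropic_client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

anthropic_request = {
    "model": "claude-3-5-sonnet-20241022",  # placeholder model name
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
}

def anthropic_operation(result_recorder):
    response = anthropic_client.messages.create(**anthropic_request)
    result_recorder.append_results(response.model_dump())
    return response

response = helicone_logger.log_request(
    request=anthropic_request,
    operation=anthropic_operation,
    provider="anthropic",  # tells Helicone to parse the body as an Anthropic response
)
```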
Custom Model Integration
For custom models that don’t have a specific provider integration:
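A sketch for a self-hosted model behind a plain HTTP endpoint (the URL, model name, and response shape are hypothetical). Omitting `provider` (None) tells Helicone to treat the payload as a custom model:

```python
import requests

custom_request = {
    "model": "my-custom-model",  # placeholder model name
    "prompt": "Why is the sky blue?",
}

def custom_operation(result_recorder):
    # Call your own inference endpoint however you normally would.
    resp = requests.post("http://localhost:8000/generate", json=custom_request)
    result = resp.json()
    result_recorder.append_results(result)  # log whatever shape your model returns
    return result

result = helicone_logger.log_request(
    request=custom_request,
    operation=custom_operation,
    # provider omitted so Helicone logs this as a custom model
)
```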
For more examples and detailed usage, check out our Manual Logger with Streaming cookbook.