OpenLLMetry Async Integration
Log LLM traces directly to Helicone, bypassing our proxy, with OpenLLMetry. Supports OpenAI, Anthropic, Azure OpenAI, Cohere, Bedrock, Google AI Platform, and more.
Overview
Async integration lets you log events and calls without placing Helicone in your app’s critical path. This ensures that an issue with Helicone will never cause an outage in your app.
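To make the “critical path” point concrete, here is a minimal stdlib-only sketch of the pattern (not the Helicone SDK itself; all names below are our own): the LLM call returns immediately, while a background worker drains a queue and sends traces. Even when the telemetry backend is completely unreachable, the request path is unaffected.

```python
# Illustrative sketch of out-of-band (async) logging. The Helicone SDK
# handles this for you; the names here are placeholders, not its API.
import queue
import threading

log_queue: "queue.Queue" = queue.Queue()

def send_to_backend(event: dict) -> None:
    # Simulate a telemetry backend that is down.
    raise ConnectionError("telemetry backend unreachable")

def worker() -> None:
    # Runs off the critical path; errors here never reach the caller.
    while True:
        event = log_queue.get()
        try:
            send_to_backend(event)
        except Exception:
            pass  # drop (or retry); the app itself is unaffected

threading.Thread(target=worker, daemon=True).start()

def call_llm(prompt: str) -> str:
    response = f"echo: {prompt}"  # stand-in for the real LLM call
    log_queue.put({"prompt": prompt, "response": response})  # non-blocking
    return response

print(call_llm("hello"))
```

The key property: `call_llm` never waits on, and never sees errors from, the logging side.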
Install Helicone Async
Initialize Logger
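A sketch of installing and initializing the async logger in Python. The package and class names below (`helicone-async`, `HeliconeAsyncLogger`) are assumptions; check the package README for the exact identifiers and constructor options.

```python
# Assumed package/class names; verify against the current SDK docs.
# pip install helicone-async openai

import os
from helicone_async import HeliconeAsyncLogger  # assumed import path
from openai import OpenAI

logger = HeliconeAsyncLogger(api_key=os.environ["HELICONE_API_KEY"])
logger.init()  # instruments supported SDK clients via OpenLLMetry

client = OpenAI()  # subsequent calls are traced asynchronously
```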
Properties
You can set properties on the logger with the withProperties method; these properties power features such as Sessions, User Metrics, and more.
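The idea behind withProperties is scoping: properties apply to every event emitted within the wrapped calls. A stdlib sketch of that scoping pattern (the function and property names here are illustrative, not the SDK’s API):

```python
# Sketch of property scoping using contextvars; not the Helicone SDK.
import contextvars

_props: contextvars.ContextVar = contextvars.ContextVar("props", default={})

def with_properties(props: dict, fn):
    # Merge props into the current context for the duration of fn().
    token = _props.set({**_props.get(), **props})
    try:
        return fn()
    finally:
        _props.reset(token)

def log_event(name: str) -> dict:
    # Every event emitted inside with_properties carries those properties.
    return {"event": name, "properties": _props.get()}

record = with_properties(
    {"Helicone-Session-Id": "abc-123", "Helicone-User-Id": "user-1"},
    lambda: log_event("chat.completion"),
)
print(record["properties"]["Helicone-User-Id"])
```

Events logged outside the wrapper carry no properties, so you can scope a session or user to exactly the calls that belong to it.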
Disabling Logging
You can completely disable all logging to Helicone when using the async integration mode. This is useful in development environments, or when you want to temporarily stop sending data to Helicone without changing your code structure.
When logging is disabled, no traces are sent to Helicone. This differs from disable_content_tracing(), which omits request and response content but still sends other metrics. Note that this feature is only available in Helicone’s async integration mode.
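The distinction between the two modes can be sketched as follows (the function and field names are ours; in the SDK, content_tracing=False corresponds to calling disable_content_tracing()):

```python
# Sketch of the two modes: fully disabled vs. content tracing off.
def build_trace(prompt: str, response: str, *,
                logging_enabled: bool = True,
                content_tracing: bool = True):
    if not logging_enabled:
        return None  # nothing is sent to Helicone at all
    trace = {"model": "example-model", "latency_ms": 412}  # metrics kept
    if content_tracing:
        trace["prompt"] = prompt      # content only included when
        trace["response"] = response  # content tracing is enabled
    return trace

# Fully disabled: no trace of any kind.
assert build_trace("hi", "hello", logging_enabled=False) is None

# Content tracing off: metrics are sent, but no prompt/response bodies.
t = build_trace("hi", "hello", content_tracing=False)
assert "prompt" not in t and "latency_ms" in t
```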
Supported Providers
- OpenAI
- Anthropic
- Azure OpenAI
- Cohere
- Bedrock
- Google AI Platform
Other Integrations