Async Logging
OpenLLMetry Async Integration
Log LLM traces directly to Helicone with OpenLLMetry, bypassing our proxy. Supports OpenAI, Anthropic, Azure OpenAI, Cohere, Bedrock, Google AI Platform, and more.
Overview
Async integration lets you log events and calls without placing Helicone in your app's critical path, ensuring that an issue with Helicone can never cause an outage in your app.
1. Install Helicone Async
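As a starting point, install the async SDK from npm. The package name `@helicone/async` used here and in the next step is an assumption and may differ for your setup: `npm install @helicone/async`.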
2. Initialize Logger
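Below is a minimal TypeScript sketch of initializing the logger. The `@helicone/async` package, the `HeliconeAsyncLogger` class, and its `apiKey`/`providers` options and `init()` method are assumptions; the general pattern is that you hand the logger the provider modules it should instrument, then use those clients as usual.

```typescript
// Minimal sketch -- package name, class name, and option shape are assumptions.
import { HeliconeAsyncLogger } from "@helicone/async";
import OpenAI from "openai";

// Point the logger at Helicone and tell it which provider SDKs to instrument.
const logger = new HeliconeAsyncLogger({
  apiKey: process.env.HELICONE_API_KEY,
  providers: {
    openAI: OpenAI,
  },
});
logger.init();

// Use the provider client as usual; traces are logged to Helicone
// asynchronously, outside the request's critical path.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
```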
3. Properties
You can set properties on the logger using the `withProperties` method; these properties are then available in Helicone for features such as Sessions, User Metrics, and more, as sketched below.
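The following sketch continues from the `logger` and `openai` client created in step 2. The exact signature of `withProperties` (a property map plus a callback whose requests inherit those properties) and the property names used here are assumptions, not a confirmed API.

```typescript
// Sketch only -- the (properties, callback) signature and the property names
// shown here are assumptions.
logger.withProperties(
  {
    "Helicone-Session-Id": "session-123",
    "Helicone-User-Id": "user-456",
  },
  async () => {
    // Requests made inside this callback are tagged with the properties above.
    await openai.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: "Hello from the async logger!" }],
    });
  }
);
```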
Supported Providers
- OpenAI
- Anthropic
- Azure OpenAI
- Cohere
- Bedrock
- Google AI Platform