Manual Logger with Streaming
Learn how to use Helicone’s Manual Logger to track streaming LLM responses
Manual Logger with Streaming Support
Helicone’s Manual Logger provides powerful capabilities for tracking LLM requests and responses, including streaming responses. This guide will show you how to use the @helicone/helpers package to log streaming responses from various LLM providers.
Installation
First, install the @helicone/helpers package:
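Using npm (or the equivalent command for your package manager):

```shell
npm install @helicone/helpers
```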
Basic Setup
Initialize the HeliconeManualLogger with your API key:
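A minimal setup looks like the following. The constructor options shown here (apiKey plus optional default headers) follow the package README; check the installed version's types for the exact signature:

```typescript
import { HeliconeManualLogger } from "@helicone/helpers";

// Create one logger instance and reuse it across requests.
const helicone = new HeliconeManualLogger({
  apiKey: process.env.HELICONE_API_KEY!,
  // Optional: headers applied to every logged request,
  // e.g. Helicone custom properties.
  headers: {},
});
```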
Streaming Methods
The HeliconeManualLogger provides several methods for working with streams:
1. logBuilder (New)
The recommended method for handling streaming responses with improved error handling:
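A compact sketch of the pattern (the model name and prompt are illustrative; the logBuilder, toReadableStream, setError, and sendLog names follow the API described below):

```typescript
import { HeliconeManualLogger } from "@helicone/helpers";
import OpenAI from "openai";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI();

export async function handleChat(question: string): Promise<Response> {
  const body = {
    model: "gpt-4o-mini",
    messages: [{ role: "user" as const, content: question }],
    stream: true as const,
  };

  // Create a log builder tied to this request body.
  const logBuilder = helicone.logBuilder(body);

  try {
    const completion = await openai.chat.completions.create(body);
    // toReadableStream() lets you return the stream to the caller
    // while Helicone records a copy of it.
    return new Response(logBuilder.toReadableStream(completion));
  } catch (err) {
    logBuilder.setError(err); // records the error and status code
    throw err;
  } finally {
    await logBuilder.sendLog(); // flush the log to Helicone
  }
}
```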
2. logStream
A flexible method that gives you full control over stream handling:
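A sketch of the logStream pattern, assuming the resultRecorder callback shape from the package README (tee the provider stream, attach one branch for logging, and keep the other for yourself); verify the exact types against your installed version:

```typescript
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI();

const body = {
  model: "gpt-4o-mini",
  messages: [{ role: "user" as const, content: "Tell me a story." }],
  stream: true as const,
};

// logStream hands you a resultRecorder: attach the copy of the stream
// you want logged, and return the copy you want to keep consuming.
const clientStream = await helicone.logStream(body, async (resultRecorder) => {
  const response = await openai.chat.completions.create(body);
  const [forClient, forLogging] = response.tee();
  resultRecorder.attachStream(forLogging.toReadableStream());
  return forClient;
});

for await (const chunk of clientStream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```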
3. logSingleStream
A simplified method for logging a single ReadableStream:
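A sketch of the fire-and-forget pattern; the argument order (request body first, then stream) is taken from the package README, so double-check it against the published types:

```typescript
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI();

const body = {
  model: "gpt-4o-mini",
  messages: [{ role: "user" as const, content: "Summarize this article." }],
  stream: true as const,
};

const response = await openai.chat.completions.create(body);
const [forClient, forLogging] = response.tee();

// Fire-and-forget: Helicone consumes its copy of the stream and writes
// the log once the stream completes.
helicone.logSingleStream(body, forLogging.toReadableStream());

for await (const chunk of forClient) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```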
4. logSingleRequest
For logging a single request with a response body:
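For non-streaming calls, the whole request/response pair can be logged in one shot. Passing the response body as a JSON string is an assumption based on the package README:

```typescript
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI();

const body = {
  model: "gpt-4o-mini",
  messages: [{ role: "user" as const, content: "What is 2 + 2?" }],
};

// Non-streaming call: log the finished request/response pair in one call.
const response = await openai.chat.completions.create(body);
await helicone.logSingleRequest(body, JSON.stringify(response));
```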
Next.js App Router with LogBuilder (Recommended)
The new logBuilder method provides better error handling and simplified stream management:
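A sketch of a full App Router route handler using logBuilder together with Vercel's after function, so the log is sent only after the response has gone out. The route path, model, and custom property header are illustrative:

```typescript
// app/api/chat/route.ts
import { after } from "next/server";
import { HeliconeManualLogger } from "@helicone/helpers";
import OpenAI from "openai";

const helicone = new HeliconeManualLogger({
  apiKey: process.env.HELICONE_API_KEY!,
});
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function POST(request: Request) {
  const { question } = await request.json();
  const body = {
    model: "gpt-4o-mini",
    messages: [{ role: "user" as const, content: question }],
    stream: true as const,
  };

  // Tie a log builder to this request; custom properties are optional.
  const logBuilder = helicone.logBuilder(body, {
    "Helicone-Property-Environment": "dev",
  });

  try {
    const stream = await openai.chat.completions.create(body);
    // Stream to the client while Helicone records a copy.
    return new Response(logBuilder.toReadableStream(stream));
  } catch (error) {
    logBuilder.setError(error);
    return Response.json({ error: "Failed to generate response" }, { status: 500 });
  } finally {
    // Send the log after the response has been sent, without blocking it.
    after(() => logBuilder.sendLog());
  }
}
```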
The logBuilder approach offers several advantages:
- Better error handling with the setError method
- Simplified stream handling with toReadableStream
- More flexible async/await patterns with sendLog
- Proper error status code tracking
Examples with Different LLM Providers
OpenAI
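With the OpenAI SDK, the returned stream supports tee() and toReadableStream(), which pairs naturally with logSingleStream. A sketch (model and prompt are illustrative):

```typescript
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const openai = new OpenAI(); // reads OPENAI_API_KEY
const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });

const body = {
  model: "gpt-4o-mini",
  messages: [{ role: "user" as const, content: "Write a haiku about logging." }],
  stream: true as const,
};

const response = await openai.chat.completions.create(body);
const [forClient, forLogging] = response.tee();

// Log one branch; stream the other to the user.
helicone.logSingleStream(body, forLogging.toReadableStream());

for await (const chunk of forClient) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```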
Together AI
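The together-ai SDK exposes an OpenAI-compatible chat completions interface, so the same tee-and-log pattern applies. This sketch assumes the SDK's stream also supports tee() and toReadableStream(); the model name is illustrative:

```typescript
import Together from "together-ai";
import { HeliconeManualLogger } from "@helicone/helpers";

const together = new Together(); // reads TOGETHER_API_KEY
const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });

const body = {
  model: "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
  messages: [{ role: "user" as const, content: "Hello!" }],
  stream: true as const,
};

const response = await together.chat.completions.create(body);
const [forClient, forLogging] = response.tee();

helicone.logSingleStream(body, forLogging.toReadableStream());

for await (const chunk of forClient) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```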
Anthropic
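The same pattern with the Anthropic SDK, whose streams also support tee() and toReadableStream(). The model name is illustrative, and note that Anthropic requires max_tokens:

```typescript
import Anthropic from "@anthropic-ai/sdk";
import { HeliconeManualLogger } from "@helicone/helpers";

const anthropic = new Anthropic(); // reads ANTHROPIC_API_KEY
const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });

const body = {
  model: "claude-3-5-sonnet-20241022",
  max_tokens: 1024,
  messages: [{ role: "user" as const, content: "Hello!" }],
  stream: true as const,
};

const response = await anthropic.messages.create(body);
const [forClient, forLogging] = response.tee();

helicone.logSingleStream(body, forLogging.toReadableStream());

for await (const event of forClient) {
  if (event.type === "content_block_delta" && event.delta.type === "text_delta") {
    process.stdout.write(event.delta.text);
  }
}
```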
Next.js API Route Example
Here’s how to use the manual logger in a Next.js API route:
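A sketch of a Pages Router API route (file path, model, and request shape are illustrative):

```typescript
// pages/api/chat.ts
import type { NextApiRequest, NextApiResponse } from "next";
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI();

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const body = {
    model: "gpt-4o-mini",
    messages: [{ role: "user" as const, content: req.body.question }],
    stream: true as const,
  };

  const stream = await openai.chat.completions.create(body);
  const [forClient, forLogging] = stream.tee();

  // Kick off logging; it completes when the teed stream ends.
  helicone.logSingleStream(body, forLogging.toReadableStream());

  res.setHeader("Content-Type", "text/event-stream");
  for await (const chunk of forClient) {
    res.write(chunk.choices[0]?.delta?.content ?? "");
  }
  res.end();
}
```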
Next.js App Router with Vercel’s after Function
For Next.js App Router, you can use Vercel’s after function to log requests without blocking the response:
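A sketch combining logSingleStream with after, so logging is scheduled once the response has finished streaming (route path and model are illustrative; the raw toReadableStream output emits JSON chunk lines):

```typescript
// app/api/chat/route.ts
import { after } from "next/server";
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI();

export async function POST(request: Request) {
  const { question } = await request.json();
  const body = {
    model: "gpt-4o-mini",
    messages: [{ role: "user" as const, content: question }],
    stream: true as const,
  };

  const response = await openai.chat.completions.create(body);
  const [forClient, forLogging] = response.tee();

  // Schedule logging to run after the response has been sent.
  after(async () => {
    await helicone.logSingleStream(body, forLogging.toReadableStream());
  });

  // Note: this returns the SDK's raw JSON chunk stream to the client.
  return new Response(forClient.toReadableStream());
}
```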
Logging Custom Events
You can also use the manual logger to log custom events:
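Custom events reuse logSingleRequest with a special request body. The _type values and field names below follow Helicone's custom-event conventions for tool and vector DB logging, but verify them against the current docs:

```typescript
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });

// Log a custom tool call.
await helicone.logSingleRequest(
  {
    _type: "tool",
    toolName: "weather-api",
    input: { city: "Tokyo" },
  },
  JSON.stringify({ temperature: 22, conditions: "cloudy" })
);

// Log a vector DB operation.
await helicone.logSingleRequest(
  {
    _type: "vector_db",
    operation: "search",
    text: "user query text",
  },
  JSON.stringify({ matches: 3 })
);
```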
Advanced Usage: Tracking Time to First Token
The logStream, logSingleStream, and logBuilder methods automatically track the time to first token, which is a valuable metric for understanding LLM response latency:
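Conceptually, time to first token is just the delay between starting the request and the arrival of the first streamed chunk. The self-contained sketch below (a mock stream, not Helicone code) illustrates the measurement Helicone performs for you:

```typescript
// Measure time to first token (TTFT) over any web ReadableStream.
async function measureTimeToFirstToken(
  stream: ReadableStream<string>
): Promise<{ ttftMs: number; tokens: string[] }> {
  const reader = stream.getReader();
  const start = Date.now();
  let ttftMs = -1;
  const tokens: string[] = [];

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    if (ttftMs < 0) ttftMs = Date.now() - start; // first chunk arrived
    tokens.push(value);
  }
  return { ttftMs, tokens };
}

// Demo with a mock stream that emits three tokens with small delays.
const mockStream = new ReadableStream<string>({
  async start(controller) {
    for (const token of ["Hello", ", ", "world"]) {
      await new Promise((resolve) => setTimeout(resolve, 20));
      controller.enqueue(token);
    }
    controller.close();
  },
});

const result = await measureTimeToFirstToken(mockStream);
console.log(`TTFT: ${result.ttftMs}ms, text: ${result.tokens.join("")}`);
```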
This timing information will be available in your Helicone dashboard, allowing you to monitor and optimize your LLM response times.
Conclusion
The HeliconeManualLogger provides powerful capabilities for tracking streaming LLM responses across different providers. By using the appropriate method for your use case, you can gain valuable insights into your LLM usage while maintaining the benefits of streaming responses.