Together AI Integration
Connect Helicone with Together AI, a platform for running open-source language models. Monitor and optimize your AI applications using Together AI’s powerful models through a simple base_url configuration.
You can seamlessly integrate Helicone with OpenAI-compatible models deployed on Together AI.
The integration process closely mirrors the proxy approach; the only difference is that the base_url is changed to point to the dedicated Together AI endpoint https://together.helicone.ai/v1.
Make sure the base_url is set correctly, otherwise requests will not be routed through Helicone.
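For example, with the OpenAI Node.js SDK the change is a one-line swap of the base URL plus a Helicone-Auth header. This is a minimal sketch: the model name and environment variable names are placeholders, not requirements.

```typescript
import OpenAI from "openai";

// Route Together AI traffic through Helicone's proxy endpoint.
// TOGETHER_API_KEY and HELICONE_API_KEY are placeholder env var names.
const client = new OpenAI({
  apiKey: process.env.TOGETHER_API_KEY,
  baseURL: "https://together.helicone.ai/v1",
  defaultHeaders: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
  },
});

const response = await client.chat.completions.create({
  model: "meta-llama/Llama-3.3-70B-Instruct-Turbo", // example model name
  messages: [{ role: "user", content: "Hello from Helicone + Together AI" }],
});

console.log(response.choices[0].message.content);
```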
Streaming with Together AI
Helicone now provides enhanced support for streaming with Together AI through our improved asynchronous stream parser. This allows for more efficient and reliable handling of streamed responses.
Example: Manual Logging with Streaming
Here’s an example of how to use Helicone’s manual logging with Together AI’s streaming functionality:
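The sketch below assumes the HeliconeManualLogger helper from the @helicone/helpers package and the together-ai Node SDK; the helper and method names (HeliconeManualLogger, logStream, attachStream) and the model name are assumptions here, so check the manual logger documentation for the exact interface.

```typescript
import Together from "together-ai";
import { HeliconeManualLogger } from "@helicone/helpers";

// Placeholder env var names for the Together AI and Helicone API keys.
const together = new Together({ apiKey: process.env.TOGETHER_API_KEY });
const heliconeLogger = new HeliconeManualLogger({
  apiKey: process.env.HELICONE_API_KEY!,
});

const requestBody = {
  model: "meta-llama/Llama-3.3-70B-Instruct-Turbo", // example model name
  messages: [{ role: "user", content: "Write a haiku about observability." }],
  stream: true as const,
};

const response = await together.chat.completions.create(requestBody);

// Split the stream: one copy for the application, one for Helicone,
// so logging never blocks your own consumption of the response.
const [appStream, heliconeStream] = response.tee();

// Hand one copy to Helicone's async stream parser in the background.
heliconeLogger.logStream(requestBody, async (resultRecorder) => {
  resultRecorder.attachStream(heliconeStream.toReadableStream());
});

// Process the other copy in your application as usual.
for await (const chunk of appStream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```

Tee'ing the response gives the logger its own copy of the stream, so the application loop and Helicone's async stream parser run independently.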
This approach allows you to:
- Log all your Together AI streaming requests to Helicone
- Process the stream in your application simultaneously
- Benefit from Helicone’s improved async stream parser for better performance
For more information on streaming with Helicone, see our streaming documentation.