Automatically version and manage your prompts from your codebase, or create and edit them in the UI.
Example: A prompt template designed for a course generation application.
Your prompt and its inputs are tagged with a `helicone-prompt-id`, allowing you to:

- automatically version your prompts as they change in your codebase
- track the input variables sent with each request
- experiment with new prompt versions without affecting production
Install the package
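The prompt formatting helpers ship as an npm package (assumed here to be `@helicone/prompts`):

```shell
npm install @helicone/prompts
```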
Import hpf
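Assuming the helpers live in the `@helicone/prompts` package, the import looks like:

```typescript
import { hpf } from "@helicone/prompts";
```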
Add `hpf` and identify input variables
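A sketch of the tagged-template usage, assuming the `@helicone/prompts` API where each input variable is passed as a single-key object inside `${{ ... }}`:

```typescript
import { hpf } from "@helicone/prompts";

// `character` is detected as an input variable;
// the surrounding text is the prompt template.
const character = "a secret agent";
const prompt = hpf`Write a story about ${{ character }}`;
```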
By prefixing your prompt with `hpf` and enclosing your input variables in an additional `{}`, you allow Helicone to easily detect your prompt and its inputs. We designed this to require minimal code changes, keeping Prompts as easy as possible to use.
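To illustrate why the extra `{}` matters: `hpf` receives each input as a single-key object, so it can wrap the value in a tag a proxy like Helicone can recognize. A hypothetical, simplified sketch of that idea (not the real `@helicone/prompts` implementation; it assumes Helicone's `<helicone-prompt-input>` tag format):

```typescript
// Simplified sketch: a tagged template that wraps each single-key
// input object's value in a <helicone-prompt-input> tag, so the
// variable parts of the prompt can be detected downstream.
function hpfSketch(
  strings: TemplateStringsArray,
  ...inputs: Record<string, string>[]
): string {
  return strings.reduce((out, part, i) => {
    if (i >= inputs.length) return out + part; // trailing literal chunk
    const [key, value] = Object.entries(inputs[i])[0];
    return `${out}${part}<helicone-prompt-input key="${key}">${value}</helicone-prompt-input>`;
  }, "");
}

const prompt = hpfSketch`Write a story about ${{ character: "a secret agent" }}`;
```

Sending the tagged string lets the proxy recover both the template and the concrete input values from a single request body.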
In addition to `hpf`, Helicone provides `hpstatic` for creating static prompts that don't change between requests. This is useful for system prompts or other constant text that you don't want to be treated as variable input.

To use `hpstatic`, import it along with `hpf`:
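A sketch of the combined import and usage, assuming the same `@helicone/prompts` package:

```typescript
import { hpf, hpstatic } from "@helicone/prompts";

// Static text: never treated as variable input
const systemPrompt = hpstatic`You are a helpful assistant.`;

// Dynamic text: `character` is tracked as an input variable
const character = "a secret agent";
const userPrompt = hpf`Write a story about ${{ character }}`;
```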
The `hpstatic` function wraps the entire text in `<helicone-prompt-static>` tags, indicating to Helicone that this part of the prompt should not be treated as variable input.

Assign an id to your prompt
Add the `Helicone-Prompt-Id` header to your LLM request. Assigning an id allows us to associate your prompt with future versions of your prompt, and automatically manage versions on your behalf.

Each input variable is identified by its key, such as `character`. For example, for the prompt “Write a story about a secret agent”, the `character` input is “a secret agent”.
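One way to attach the header, assuming the OpenAI Node SDK routed through Helicone's gateway; the id `story_prompt` is a made-up example:

```typescript
import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://oai.helicone.ai/v1", // Helicone gateway
  defaultHeaders: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    "Helicone-Prompt-Id": "story_prompt", // hypothetical prompt id
  },
});
```

Every request sent through this client is then associated with the `story_prompt` prompt, and new template versions are detected automatically.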
Set the `Helicone-Prompt-Mode` header to `testing` in your LLM request. This will prevent Helicone from tracking new prompt versions.

How can I improve my LLM app's performance?
Does Helicone own my prompts?
How does the version indexing work?
Some prompts are labeled `V2.0`, while others are labeled `V2.1`. Major versions (i.e., `V1.0`, `V2.0`, `V3.0`) are your prompts in production. Minor versions (`V1.1`, `V1.2`, `V1.3`) are experiment versions created in Helicone, forked off of your production version (`V1.0` in this example).

Need more help?