This feature is being deprecated; we are looking to support similar functionality in the future.

What are Templated Prompts?

Prompt templating is a way to separate the parameters from the prompt text (the “template”). The parameters can come from end-user inputs or from other variables in your application, such as the output of a previous step in a prompt chain.

In a poetry-writing app, this might look like:

{
    "prompt": "Write an {{adjective}} poem about {{subject}}",
    "values": {
        "adjective": "amazing",
        "subject": "Helicone",
    }
}

Benefits of using Templated Prompts with Helicone

  • Clearly visualize all the values used in your prompts, separate from the prompt text.
  • Organize all the requests made in your application by prompt.
  • Analyze prompt metrics, such as how expensive the template text is compared to the values, and identify when fine-tuning might reduce costs.

How to Use Templated Prompts

Toggle Prompt Templating on

Add the header Helicone-Prompt-Format (with any value) to the request to notify Helicone that the prompt you are sending is a serialized template-and-values string.
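
With the openai Python library used in the example below, this is just an extra entry in the headers dictionary passed with the request:

headers = {
    'Helicone-Prompt-Format': 'on',  # any value works
}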

Serialize the prompt template and values

Serialize a dictionary with two keys:

  • prompt is a string containing your prompt template. Each parameter should be wrapped in double curly braces, e.g. {{param_name}}.
  • values is a dictionary mapping each param_name to the value supplied for that parameter.

Helicone then formats the string, replacing each parameter with its supplied value before passing the formatted prompt to OpenAI.

On OpenAI’s end, it’s as if you had made the request with the already-formatted string, without serializing a dictionary.

Here is an example:

import json
import openai

template = {
    "prompt": "Write an {{adjective}} poem about {{subject}}",
    "values": {
        "adjective": "amazing",
        "subject": "Helicone",
    }
}

# Serialize the template and values into a single string
serialized_template = json.dumps(template)

# Send the serialized template as the prompt, with the
# Helicone-Prompt-Format header set so Helicone formats it
openai.Completion.create(
    model="text-davinci-003",
    prompt=serialized_template,
    headers={
        'Helicone-Prompt-Format': 'on',
    }
)

This is equivalent to sending the prompt “Write an amazing poem about Helicone” directly to OpenAI.
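
For illustration only, the substitution Helicone performs is roughly equivalent to the following sketch (this is not Helicone’s actual implementation):

import json

def format_serialized_prompt(serialized_template):
    # Replace each {{param_name}} in the prompt with its supplied value
    template = json.loads(serialized_template)
    prompt = template["prompt"]
    for name, value in template["values"].items():
        prompt = prompt.replace("{{" + name + "}}", value)
    return prompt

# format_serialized_prompt(serialized_template)
# -> "Write an amazing poem about Helicone"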

You can use a parameter multiple times in the prompt.

To avoid errors, make sure every parameter in prompt has a matching entry in values and every key in values appears in prompt, with no typos in the parameter names.
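
If you want to catch mismatches yourself before serializing, a check like the following can help (check_template is a hypothetical helper, not part of Helicone or the openai library):

import re

def check_template(template):
    # Compare the {{param_name}} placeholders in the prompt with the keys in values
    params = set(re.findall(r"\{\{(\w+)\}\}", template["prompt"]))
    keys = set(template["values"])
    if params != keys:
        raise ValueError(
            f"Missing values: {params - keys}, unused values: {keys - params}"
        )

check_template(template)  # passes for the example template above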

(Optional) Assign a name to the prompt

You can name the prompt by passing the header Helicone-Prompt-Name with your prompt’s name as the value. Otherwise, Helicone assigns a default name.
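
For example, extending the request from above, both headers can be sent together (the name poem-writer is just an illustrative value):

openai.Completion.create(
    model="text-davinci-003",
    prompt=serialized_template,
    headers={
        'Helicone-Prompt-Format': 'on',
        'Helicone-Prompt-Name': 'poem-writer',  # illustrative prompt name
    }
)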