This feature is currently in Beta. Use it at your own discretion.

We strongly recommend using our proxy integration for prompts, as it is significantly simpler to set up!

This guide assumes that you are using the custom integration method or the asynchronous version of Helicone. To learn more, please read Proxy vs Async.

Setting up the asynchronous libraries can be challenging. If you would like us to enhance support for this, please email us at engineering@helicone.ai.

Getting Started with Asynchronous Integration

Step 1: Obtain the request ID from your logged request

Option 1: Predefine the request ID

You need to obtain the Helicone request ID for the request that you are logging.

The simplest way to do this is to define it beforehand.

import { v4 as uuidv4 } from "uuid";
const requestId = uuidv4(); // must be a valid UUID

When you make your request, attach the request ID via providerRequest.meta as follows.

meta: {
  "Helicone-Request-Id": requestId,
},
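
For context, here is a minimal sketch of where that meta entry might live in the payload you log; apart from meta, the field names and values below are placeholders rather than part of the Helicone API.

// Sketch only: url and json are placeholders for your own provider request
// data. Helicone only needs the "Helicone-Request-Id" entry under meta.
const providerRequest = {
  url: "https://api.openai.com/v1/chat/completions",
  json: {
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Hello" }],
  },
  meta: {
    "Helicone-Request-Id": requestId,
  },
};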

Option 2: Retrieve it post-request

You can also retrieve the request ID after the request completes via the callbacks exposed by the asynchronous SDKs, as follows.

onLog: async (response: Response) => {
  // The "helicone-id" response header contains the Helicone request ID
  const heliconeId = response.headers.get("helicone-id");
},
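
If you go this route, capture the ID somewhere you can reach once the call has completed, so you can pass it to the inputs endpoint in the next step. A minimal sketch, assuming your SDK accepts the onLog callback in a configuration object (heliconeConfig is an illustrative name, not a Helicone API):

// Sketch: store the Helicone request ID for the inputs call in Step 2.
let heliconeId: string | undefined;

const heliconeConfig = {
  onLog: async (response: Response) => {
    heliconeId = response.headers.get("helicone-id") ?? undefined;
  },
};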

Step 2: Automatically detect your template inputs

const promptId = "my_prompt_id";

// requestId is the Helicone request ID from Step 1 (or heliconeId from the
// onLog callback), and heliconeApiKey is your Helicone API key.
const res = await fetch(
  `https://api.hconeai.com/v1/request/${requestId}/prompt/${promptId}/inputs`,
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Helicone-Auth": `Bearer ${heliconeApiKey}`,
    },
    body: JSON.stringify({
      inputs: inputs, // a map of input keys to the values used in this request
    }),
  }
);
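
This is a plain fetch call, so you can check the response as usual; for example (illustrative only):

// Illustrative: surface failures from the inputs request above.
if (!res.ok) {
  console.error("Failed to record prompt inputs:", res.status, await res.text());
}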

Step 3 (Optional): Explicitly define your template

const promptId = "my_prompt_id";

// Same call as Step 2, but inputTemplate explicitly defines the prompt
// template instead of letting Helicone detect it automatically.
const res = await fetch(
  `https://api.hconeai.com/v1/request/${requestId}/prompt/${promptId}/inputs`,
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Helicone-Auth": `Bearer ${heliconeApiKey}`,
    },
    body: JSON.stringify({
      inputs: inputs,
      inputTemplate: {
        model: "gpt-3.5-turbo",
        messages: [
          {
            role: "user",
            content: `The content you were already sending to OpenAI <helicone-input-prompt key="my_input"/>`,
          },
        ],
      },
    }),
  }
);
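
For reference, the inputs object used in both calls is a plain mapping from input keys to string values, and the keys must match those in your template. The values below are purely illustrative:

// Illustrative inputs: "my_input" corresponds to
// <helicone-input-prompt key="my_input"/> in the template above.
const inputs = {
  my_input: "the user-specific text substituted into the template",
};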