AI Proxy Advanced

AI License Required

Batches route with Azure OpenAI Service

Configure a batch route using Azure OpenAI Service.

To connect to Azure OpenAI, you need three values from your Azure OpenAI resource:

Make sure to use a globalbatch or datazonebatch deployment for file and batch operations. Standard deployments (such as GlobalStandard) cannot process batch and file operations.

  1. Deployment ID — The unique name of your deployed model.
    • In the Azure AI Foundry Portal sidebar, select a resource and go to: Shared Resources > Deployments > Model deployments, then click the deployment name.
    • You can also see the deployment ID in the Azure OpenAI URL when calling the API, for example: https://{AZURE_INSTANCE_NAME}.openai.azure.com/openai/deployments/{AZURE_DEPLOYMENT_ID}/...
  2. Instance name — The name of your Azure OpenAI resource.
    • This is the prefix in your API endpoint URL, for example: https://{AZURE_INSTANCE_NAME}.openai.azure.com
  3. API Key — The key used to authenticate requests to your Azure OpenAI deployment in Azure AI Foundry.
    • In the Azure AI Foundry Portal sidebar, select a resource and go to: Shared Resources > Deployments > Model deployments, then click the deployment name.
    • The API key is visible in the Endpoint tile.
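To see how the instance name and deployment ID fit together, the snippet below assembles the Azure OpenAI endpoint URL from the two values described above. The variable values are placeholders; substitute your own resource and deployment names:

```shell
# Placeholder values -- replace with your own Azure OpenAI resource
# and deployment names.
AZURE_INSTANCE_NAME="my-azure-resource"
AZURE_DEPLOYMENT_ID="my-batch-deployment"

# The instance name is the host prefix; the deployment ID appears in the path.
echo "https://${AZURE_INSTANCE_NAME}.openai.azure.com/openai/deployments/${AZURE_DEPLOYMENT_ID}/..."
```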

See the Send batch requests to Azure OpenAI LLMs tutorial for a step-by-step walkthrough.

Environment variables

  • AZURE_OPENAI_API_KEY — the API key for your Azure OpenAI deployment
  • AZURE_INSTANCE_NAME — the name of your Azure OpenAI resource
  • AZURE_DEPLOYMENT_ID — the unique name of your deployed model

Set up the plugin
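As a rough orientation, a declarative configuration for this route might look like the sketch below. This is a minimal illustration only: the route_type value and option names are assumptions based on the general AI Proxy Advanced target schema, so verify every field against the plugin reference for your Kong Gateway version before using it. The angle-bracket values stand in for the three environment variables listed above.

```yaml
# Hypothetical sketch -- verify field names and route_type values against
# the AI Proxy Advanced schema for your Kong Gateway version.
plugins:
  - name: ai-proxy-advanced
    config:
      targets:
        - route_type: llm/v1/batches            # assumed route type for batch operations
          auth:
            header_name: api-key
            header_value: "<AZURE_OPENAI_API_KEY>"   # your Azure OpenAI API key
          model:
            provider: azure
            name: gpt-4o                        # placeholder model name
            options:
              azure_instance: "<AZURE_INSTANCE_NAME>"       # your resource name
              azure_deployment_id: "<AZURE_DEPLOYMENT_ID>"  # your deployment name
```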
