OpenAI SDK: One chat route with dynamic Azure OpenAI deployments (v3.8+)

Configure a dynamic route to target multiple Azure OpenAI model deployments.

In this configuration, if your SDK sends requests to http://localhost:8000/openai/deployments/my-gpt-3-5/chat/completions, AI Proxy Advanced automatically uses my-gpt-3-5 as the Azure deployment ID.

This allows a single Route to support multiple Azure model deployments dynamically.
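
For example, this is roughly how such a request is produced with the OpenAI Python SDK (openai 1.x): pointing the Azure client at Kong makes the SDK build the /openai/deployments/<deployment>/chat/completions path that AI Proxy Advanced inspects. The endpoint, API version, deployment name, and key handling below are illustrative assumptions, not values required by the plugin.

import os

from openai import AzureOpenAI  # requires the openai 1.x package

# Point the SDK at the Kong proxy instead of *.openai.azure.com.
# AZURE_API_KEY is described under Environment variables below; if the
# plugin injects the Azure key for you, a placeholder value also works here.
client = AzureOpenAI(
    azure_endpoint="http://localhost:8000",
    api_key=os.environ["AZURE_API_KEY"],
    api_version="2024-02-01",  # assumed Azure OpenAI API version
)

# The SDK sends this request to
# http://localhost:8000/openai/deployments/my-gpt-3-5/chat/completions,
# so my-gpt-3-5 is the deployment ID that AI Proxy Advanced picks up.
# Use a deployment name that the Route you configure actually matches.
response = client.chat.completions.create(
    model="my-gpt-3-5",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)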

For this configuration to work properly, you need a Route with the following configuration:

routes:
 - name: azure-chat-model-from-path
   # The leading ~ marks this path as a regex; the $ anchors the match at
   # the end of the path, so only the Azure-style deployment path matches.
   paths:
     - "~/openai/deployments/azure-gpt-3-5/chat/completions$"
   methods:
     - POST

Prerequisites

  • Azure OpenAI Service account

Environment variables

  • AZURE_API_KEY: The API key used to authenticate requests to Azure OpenAI Service.
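
You can also verify the Route by sending a request to the mapped path directly. This is a minimal sketch using the requests library; the deployment name (azure-gpt-3-5, matching the example Route above), the api-version value, and whether you need to send the api-key header yourself (rather than letting the plugin inject it) all depend on your configuration.

import os

import requests

# Matches the example Route above:
# ~/openai/deployments/azure-gpt-3-5/chat/completions$
url = "http://localhost:8000/openai/deployments/azure-gpt-3-5/chat/completions"

response = requests.post(
    url,
    params={"api-version": "2024-02-01"},              # assumed Azure OpenAI API version
    headers={"api-key": os.environ["AZURE_API_KEY"]},  # omit if the plugin injects the key
    json={"messages": [{"role": "user", "content": "Say hello."}]},
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])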

Set up the plugin
