Validate Gen AI tool calls with Jaeger and OpenTelemetry

| Deployment Platform | Minimum Version |
|---------------------|-----------------|
| Kong Gateway        | 3.13            |
TL;DR

Configure the AI Proxy plugin with logging.log_statistics and logging.log_payloads enabled. Enable the OpenTelemetry plugin pointing to your Jaeger endpoint. Send requests with tool definitions to your AI provider. Jaeger traces will include gen_ai.tool.* attributes such as gen_ai.tool.name, gen_ai.tool.type, and gen_ai.tool.call.id when the LLM responds with tool calls.

Prerequisites

Set the following Jaeger tracing variables before you configure the Data Plane:

export KONG_TRACING_INSTRUMENTATIONS=all
export KONG_TRACING_SAMPLING_RATE=1.0

This is a Konnect tutorial and requires a Konnect personal access token.

  1. Create a new personal access token by opening the Konnect PAT page and selecting Generate Token.

  2. Export your token to an environment variable:

     export KONNECT_TOKEN='YOUR_KONNECT_PAT'
    
  3. Run the quickstart script to automatically provision a Control Plane and Data Plane, and configure your environment:

     curl -Ls https://get.konghq.com/quickstart | bash -s -- -k $KONNECT_TOKEN -e KONG_TRACING_INSTRUMENTATIONS -e KONG_TRACING_SAMPLING_RATE --deck-output
    

    This sets up a Konnect Control Plane named quickstart, provisions a local Data Plane, and prints out the following environment variable exports:

     export DECK_KONNECT_TOKEN=$KONNECT_TOKEN
     export DECK_KONNECT_CONTROL_PLANE_NAME=quickstart
     export KONNECT_CONTROL_PLANE_URL=https://us.api.konghq.com
     export KONNECT_PROXY_URL='http://localhost:8000'
    

    Copy and paste these into your terminal to configure your session.

This tutorial requires Kong Gateway Enterprise. If you don’t have Kong Gateway set up yet, you can use the quickstart script with an enterprise license to get an instance of Kong Gateway running almost instantly.

  1. Export your license to an environment variable:

     export KONG_LICENSE_DATA='LICENSE-CONTENTS-GO-HERE'
    
  2. Run the quickstart script:

    curl -Ls https://get.konghq.com/quickstart | bash -s -- -e KONG_LICENSE_DATA \
      -e KONG_TRACING_INSTRUMENTATIONS \
      -e KONG_TRACING_SAMPLING_RATE
    

    Once Kong Gateway is ready, you will see the following message:

     Kong Gateway Ready
    

decK is a CLI tool for managing Kong Gateway declaratively with state files. To complete this tutorial, install decK version 1.43 or later.

This guide uses deck gateway apply, which directly applies entity configuration to your Gateway instance. We recommend upgrading your decK installation to take advantage of this tool.

You can check your current decK version with deck version.

For this tutorial, you’ll need Kong Gateway entities, like Gateway Services and Routes, pre-configured. These entities are essential for Kong Gateway to function but installing them isn’t the focus of this guide. Follow these steps to pre-configure them:

  1. Run the following command:

    echo '
    _format_version: "3.0"
    services:
      - name: example-service
        url: http://httpbin.konghq.com/anything
    routes:
      - name: example-route
        paths:
        - "/anything"
        service:
          name: example-service
    ' | deck gateway apply -
    

To learn more about entities, you can read our entities documentation.

This tutorial uses OpenAI:

  1. Create an OpenAI account.
  2. Get an API key.
  3. Create a decK variable with the API key:
     export DECK_OPENAI_API_KEY="YOUR OPENAI API KEY"

This tutorial requires you to install Jaeger.

In a new terminal window, deploy a Jaeger instance with Docker in all-in-one mode:

docker run --rm --name jaeger \
-e COLLECTOR_OTLP_ENABLED=true \
-p 16686:16686 \
-p 4317:4317 \
-p 4318:4318 \
-p 5778:5778 \
-p 9411:9411 \
jaegertracing/jaeger:2.5.0

The COLLECTOR_OTLP_ENABLED environment variable must be set to true to enable the OpenTelemetry Collector.

In this tutorial, we use host.docker.internal as the host instead of localhost because Kong Gateway runs in a Docker container, where localhost refers to the container itself rather than to your machine, which is where Jaeger is listening. Export the host as an environment variable in the terminal window you used to set the other Kong Gateway environment variables:

export DECK_JAEGER_HOST=host.docker.internal

Configure the AI Proxy plugin

The AI Proxy plugin routes LLM requests to external providers like OpenAI. To observe tool call interactions in detail, enable the plugin’s logging capabilities, which instrument requests and responses as OpenTelemetry spans.

Configure AI Proxy to route traffic to OpenAI and enable trace logging:

echo '
_format_version: "3.0"
plugins:
  - name: ai-proxy
    config:
      route_type: llm/v1/chat
      auth:
        header_name: Authorization
        header_value: Bearer ${{ env "DECK_OPENAI_API_KEY" }}
      model:
        provider: openai
        name: gpt-5-mini
        options:
          max_tokens: 512
          temperature: 1.0
      logging:
        log_statistics: true
        log_payloads: true
' | deck gateway apply -

The logging configuration controls what the AI Proxy plugin records:

  • log_statistics: Captures token usage, latency, and model metadata
  • log_payloads: Records the complete request prompts and LLM responses

These logs become OpenTelemetry span attributes when the OpenTelemetry plugin is enabled.
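To make the mapping concrete, here is a hedged sketch (not Kong's actual implementation) of how an OpenAI-style chat completion response containing a tool call translates into the gen_ai.* span attributes you'll validate later in this tutorial. All values in the sample response are made up:

```python
# Illustrative sketch only: derive gen_ai.* span attributes from an
# OpenAI-style chat completion response. Attribute names match the ones
# this tutorial validates in Jaeger; the mapping logic is a simplification.

def gen_ai_attributes(response: dict) -> dict:
    """Map a chat completion response to gen_ai.* span attributes."""
    choice = response["choices"][0]
    attrs = {
        "gen_ai.response.id": response["id"],
        "gen_ai.response.model": response["model"],
        "gen_ai.response.finish_reasons": [choice["finish_reason"]],
        "gen_ai.usage.input_tokens": response["usage"]["prompt_tokens"],
        "gen_ai.usage.output_tokens": response["usage"]["completion_tokens"],
    }
    # When the model answers with a tool call, record the tool attributes.
    for call in choice["message"].get("tool_calls", []):
        attrs["gen_ai.tool.call.id"] = call["id"]
        attrs["gen_ai.tool.name"] = call["function"]["name"]
        attrs["gen_ai.tool.type"] = call["type"]
    return attrs

# Minimal sample response with a tool call (all values made up):
sample = {
    "id": "chatcmpl-123",
    "model": "gpt-5-mini-2025-08-07",
    "usage": {"prompt_tokens": 58, "completion_tokens": 17},
    "choices": [{
        "finish_reason": "tool_calls",
        "message": {"tool_calls": [{
            "id": "call_abc",
            "type": "function",
            "function": {"name": "get_temperature",
                         "arguments": '{"city": "New York"}'},
        }]},
    }],
}

attrs = gen_ai_attributes(sample)
print(attrs["gen_ai.tool.name"])  # -> get_temperature
```

With log_payloads disabled, the prompt and response bodies would not be recorded, but the statistics-derived attributes (token usage, model, finish reasons) would still appear.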

Enable the OpenTelemetry plugin

The OpenTelemetry plugin instruments Kong Gateway to export distributed traces. This allows you to observe request flows, measure latency, and inspect AI proxy operations including tool call requests and responses.

Configure the plugin to send traces to your Jaeger collector:

echo '
_format_version: "3.0"
plugins:
  - name: opentelemetry
    config:
      traces_endpoint: http://${{ env "DECK_JAEGER_HOST" }}:4318/v1/traces
      resource_attributes:
        service.name: kong-dev
' | deck gateway apply -

The traces_endpoint points to Jaeger’s OTLP HTTP receiver on port 4318. The service.name attribute identifies this Kong Gateway instance in the Jaeger UI, allowing you to filter traces by service.

For more information about the ports Jaeger uses, see API Ports in the Jaeger documentation.
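For a sense of what the plugin sends to that endpoint, here is a minimal, hand-built trace payload in the OTLP/JSON encoding that the 4318 receiver accepts. This is illustrative only; Kong builds and exports these payloads for you, and the IDs and timestamps below are made up:

```python
import json

# Illustrative only: a minimal OTLP/JSON trace payload of the kind the
# OpenTelemetry plugin POSTs to http://<jaeger-host>:4318/v1/traces.
# All IDs and timestamps are made up.
payload = {
    "resourceSpans": [{
        "resource": {
            "attributes": [
                # Matches resource_attributes in the plugin config above.
                {"key": "service.name", "value": {"stringValue": "kong-dev"}}
            ]
        },
        "scopeSpans": [{
            "spans": [{
                "traceId": "5b8aa5a2d2c872e8321cf37308d69df2",
                "spanId": "051581bf3cb55c13",
                "name": "kong.access.plugin.ai-proxy",
                "kind": 1,
                "startTimeUnixNano": "1700000000000000000",
                "endTimeUnixNano": "1700000001000000000",
                "attributes": [
                    {"key": "gen_ai.tool.name",
                     "value": {"stringValue": "get_temperature"}}
                ],
            }]
        }],
    }]
}

# The payload serializes cleanly for an HTTP POST with
# Content-Type: application/json.
body = json.dumps(payload)
print(len(body) > 0)
```

Note that the service.name resource attribute is what populates Jaeger's Service dropdown, which is why the validation steps below filter on kong-dev.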

Validate

Send a request that includes a tool definition. The LLM will respond with a tool call if it determines the user’s query requires function execution.

curl -X POST "$KONNECT_PROXY_URL/anything" \
     --no-progress-meter --fail-with-body  \
     -H "Accept: application/json"\
     -H "Content-Type: application/json" \
     --json '{
       "model": "gpt-5-mini",
       "stream": false,
       "tools": [
         {
           "type": "function",
           "function": {
             "name": "get_temperature",
             "description": "Get the current temperature for a city",
             "parameters": {
               "type": "object",
               "required": [
                 "city"
               ],
               "properties": {
                 "city": {
                   "type": "string",
                   "description": "The name of the city"
                 }
               }
             }
           }
         }
       ],
       "messages": [
         {
           "role": "user",
           "content": "What is the temperature in New York?"
         }
       ]
     }'
curl -X POST "http://localhost:8000/anything" \
     --no-progress-meter --fail-with-body  \
     -H "Accept: application/json"\
     -H "Content-Type: application/json" \
     --json '{
       "model": "gpt-5-mini",
       "stream": false,
       "tools": [
         {
           "type": "function",
           "function": {
             "name": "get_temperature",
             "description": "Get the current temperature for a city",
             "parameters": {
               "type": "object",
               "required": [
                 "city"
               ],
               "properties": {
                 "city": {
                   "type": "string",
                   "description": "The name of the city"
                 }
               }
             }
           }
         }
       ],
       "messages": [
         {
           "role": "user",
           "content": "What is the temperature in New York?"
         }
       ]
     }'

Validate gen_ai.tool attributes in Jaeger

Verify that the trace includes the expected span attributes for LLM tool call operations.

  1. Open the Jaeger UI at http://localhost:16686/.
  2. In the Service dropdown, select kong-dev.
  3. Click Find Traces.
  4. Click a trace result for the kong-dev service.
  5. In the trace detail view, locate and expand the span labeled kong.access.plugin.ai-proxy.
  6. Locate and expand the child span labeled kong.gen_ai.
  7. Verify the following span attributes are present:
    • gen_ai.operation.name: Set to chat
    • gen_ai.provider.name: Set to openai
    • gen_ai.request.model: The model identifier (for example, gpt-5-mini)
    • gen_ai.request.max_tokens: Maximum token limit (for example, 512)
    • gen_ai.request.temperature: Sampling temperature (for example, 1)
    • gen_ai.response.finish_reasons: Array containing ["tool_calls"] when the LLM responds with a tool call
    • gen_ai.response.id: Unique identifier for the API response
    • gen_ai.response.model: Actual model version used (for example, gpt-5-mini-2025-08-07)
    • gen_ai.tool.call.id: Unique identifier for the specific tool call (for example, call_KsEYAR17QngwYlWmNY5Q3K7D)
    • gen_ai.tool.name: Name of the function the LLM wants to call (for example, get_temperature)
    • gen_ai.tool.type: Set to function
    • gen_ai.usage.input_tokens: Token count for the request
    • gen_ai.usage.output_tokens: Token count for the response
    • gen_ai.output.type: Set to json

The presence of gen_ai.tool.* attributes indicates the LLM determined a tool call was needed to answer the user’s query. The gen_ai.response.finish_reasons array will contain tool_calls instead of stop when function calling is triggered.
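If you prefer to script this check, Jaeger exposes the same data through its HTTP query API (GET http://localhost:16686/api/traces?service=kong-dev). The filtering logic can be sketched as below, run here against a made-up trace fragment in the shape Jaeger returns (each span carries its attributes as a tags list of key/value pairs):

```python
# Sketch: find spans carrying gen_ai.tool.* attributes in a trace from
# Jaeger's query API. The trace below is a made-up fragment; in practice
# you would fetch real ones with:
#   curl "http://localhost:16686/api/traces?service=kong-dev"

def tool_call_spans(trace: dict) -> list:
    """Return the gen_ai.tool.* tags for each span that has any."""
    results = []
    for span in trace["spans"]:
        tags = {t["key"]: t["value"] for t in span["tags"]}
        tool_tags = {k: v for k, v in tags.items()
                     if k.startswith("gen_ai.tool.")}
        if tool_tags:
            results.append(tool_tags)
    return results

sample_trace = {
    "spans": [
        {"operationName": "kong",
         "tags": [{"key": "http.method", "value": "POST"}]},
        {"operationName": "kong.gen_ai",
         "tags": [
             {"key": "gen_ai.tool.name", "value": "get_temperature"},
             {"key": "gen_ai.tool.type", "value": "function"},
             {"key": "gen_ai.tool.call.id",
              "value": "call_KsEYAR17QngwYlWmNY5Q3K7D"},
         ]},
    ]
}

for span_tags in tool_call_spans(sample_trace):
    print(span_tags["gen_ai.tool.name"], span_tags["gen_ai.tool.type"])
# -> get_temperature function
```

An empty result from this filter over a real trace would mean the LLM answered directly (finish reason stop) rather than requesting a tool call.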

Cleanup

If you created a new control plane and want to conserve your free trial credits or avoid unnecessary charges, delete the new control plane used in this tutorial.

curl -Ls https://get.konghq.com/quickstart | bash -s -- -d