Route Claude Code CLI traffic through Kong AI Gateway to OpenAI
Install the Claude Code CLI, configure its API key helper, create a Gateway Service and Route, attach the AI Proxy plugin to forward requests to OpenAI, enable the File Log plugin to inspect traffic, and point Claude Code at the local proxy endpoint so that all LLM requests pass through the AI Gateway for monitoring and control.
Prerequisites
Kong Konnect
This is a Konnect tutorial and requires a Konnect personal access token.
- Create a new personal access token by opening the Konnect PAT page and selecting Generate Token.
- Export your token to an environment variable:
  export KONNECT_TOKEN='YOUR_KONNECT_PAT'
- Run the quickstart script to automatically provision a Control Plane and Data Plane, and configure your environment:
  curl -Ls https://get.konghq.com/quickstart | bash -s -- -k $KONNECT_TOKEN --deck-output
  This sets up a Konnect Control Plane named quickstart, provisions a local Data Plane, and prints out the following environment variable exports:
  export DECK_KONNECT_TOKEN=$KONNECT_TOKEN
  export DECK_KONNECT_CONTROL_PLANE_NAME=quickstart
  export KONNECT_CONTROL_PLANE_URL=https://us.api.konghq.com
  export KONNECT_PROXY_URL='http://localhost:8000'
  Copy and paste these into your terminal to configure your session.
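Optionally, confirm that the local Data Plane is reachable before moving on. The quickstart exposes the proxy on localhost:8000; since no Routes exist yet, Kong should answer with a 404:

# With no Routes configured yet, Kong returns a 404 "no Route matched" response
curl -i http://localhost:8000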
Kong Gateway running
This tutorial requires Kong Gateway Enterprise. If you don’t have Kong Gateway set up yet, you can use the quickstart script with an enterprise license to get an instance of Kong Gateway running almost instantly.
- Export your license to an environment variable:
  export KONG_LICENSE_DATA='LICENSE-CONTENTS-GO-HERE'
- Run the quickstart script:
  curl -Ls https://get.konghq.com/quickstart | bash -s -- -e KONG_LICENSE_DATA
  Once Kong Gateway is ready, you will see the following message:
  Kong Gateway Ready
decK v1.43+
decK is a CLI tool for managing Kong Gateway declaratively with state files. To complete this tutorial, install decK version 1.43 or later.
This guide uses deck gateway apply, which directly applies entity configuration to your Gateway instance.
We recommend upgrading your decK installation to take advantage of this tool.
You can check your current decK version with deck version.
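For example, the following checks confirm the installed version and that decK can reach your Gateway (deck gateway ping assumes the environment variables from the quickstart are already exported):

# Print the installed decK version; it should report v1.43 or later
deck version

# Verify decK can reach the Control Plane or Admin API configured in your environment
deck gateway ping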
Required entities
For this tutorial, you’ll need Kong Gateway entities, like Gateway Services and Routes, pre-configured. These entities are essential for Kong Gateway to function but installing them isn’t the focus of this guide. Follow these steps to pre-configure them:
- Run the following command:
  echo '
  _format_version: "3.0"
  services:
    - name: example-service
      url: http://httpbin.konghq.com/anything
  routes:
    - name: example-route
      paths:
      - "/anything"
      service:
        name: example-service
  ' | deck gateway apply -
To learn more about entities, you can read our entities documentation.
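To confirm the Service and Route are in place, you can send a test request through the proxy. This assumes the quickstart's default proxy address of localhost:8000; httpbin echoes the request back as JSON:

# A JSON echo of the request confirms the /anything Route proxies to httpbin
curl -s http://localhost:8000/anything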
OpenAI
This tutorial uses OpenAI:
- Create an OpenAI account.
- Get an API key.
- Create a decK variable with the API key:
export DECK_OPENAI_API_KEY="YOUR OPENAI API KEY"
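Optionally, check that the key works before wiring it into the Gateway. This sketch calls OpenAI's models endpoint directly; a valid key returns a list of model IDs, while an invalid key returns a 401 error:

# List a few model IDs to confirm the key is accepted by OpenAI
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $DECK_OPENAI_API_KEY" | jq -r '.data[].id' | head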
Claude Code CLI
- Install Claude Code:
  curl -fsSL https://claude.ai/install.sh | bash
- Create or edit the Claude settings file:
  mkdir -p ~/.claude
  nano ~/.claude/settings.json
  Put this exact content in the file:
  {
    "apiKeyHelper": "~/.claude/anthropic_key.sh"
  }
- Create the API key helper script:
  nano ~/.claude/anthropic_key.sh
  Inside, put a dummy API key:
  echo "x"
- Make the script executable:
  chmod +x ~/.claude/anthropic_key.sh
- Verify it works by running the script:
  ~/.claude/anthropic_key.sh
  You should see only the dummy key (x) printed; the real OpenAI key is injected later by the AI Proxy plugin.
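If you prefer to script this instead of editing files interactively, the following is an equivalent, non-interactive version of the steps above. The shebang line is an addition for portability; the helper still prints only the dummy key, because the real OpenAI key is supplied by the AI Proxy plugin configured in the next step:

# Create the settings file that points Claude Code at the key helper script
mkdir -p ~/.claude
cat > ~/.claude/settings.json <<'EOF'
{
  "apiKeyHelper": "~/.claude/anthropic_key.sh"
}
EOF

# The helper only needs to print a placeholder value
cat > ~/.claude/anthropic_key.sh <<'EOF'
#!/bin/sh
echo "x"
EOF
chmod +x ~/.claude/anthropic_key.sh

# Should print: x
~/.claude/anthropic_key.sh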
Configure the AI Proxy plugin
First, configure the AI Proxy plugin for the OpenAI provider:
- This setup uses the default llm/v1/chat route type. Claude Code sends its requests to this route.
- The configuration also raises the maximum request body size to 512 KB to support larger prompts.
The llm_format: anthropic parameter tells Kong AI Gateway to expect request and response payloads in Claude’s native API format. Without this setting, the Gateway would default to OpenAI’s format, and the Anthropic-style requests sent by Claude Code would fail.
echo '
_format_version: "3.0"
plugins:
- name: ai-proxy
config:
llm_format: anthropic
route_type: llm/v1/chat
logging:
log_statistics: true
log_payloads: false
auth:
header_name: Authorization
header_value: Bearer ${{ env "DECK_OPENAI_API_KEY" }}
allow_override: false
model:
provider: openai
name: gpt-5-mini
max_request_body_size: 524288
' | deck gateway apply -
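To confirm the plugin configuration was applied, you can dump the current Gateway state with decK and look for the ai-proxy entry (dump writes the full configuration; -o - sends it to stdout):

# Print the live configuration and show the AI Proxy plugin settings
deck gateway dump -o - | grep -A 10 "ai-proxy"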
Configure the File Log plugin
Now, let’s enable the File Log plugin to inspect the LLM traffic between Claude Code and the AI Gateway. This configuration writes a claude.json file to /tmp inside the Kong Gateway container. The file records each request and response so you can review what Claude Code sends through the AI Gateway.
echo '
_format_version: "3.0"
plugins:
- name: file-log
config:
path: "/tmp/claude.json"
' | deck gateway apply -
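The path above is inside the Kong Gateway container created by the quickstart (named kong-quickstart-gateway, as used later in this guide), so you can follow the log live from your host while Claude Code is running:

# Stream new log entries as they are written; each line is a single JSON record
docker exec kong-quickstart-gateway tail -f /tmp/claude.json | jq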
Verify traffic through AI Gateway
Now, start a Claude Code session that points to the local AI Gateway endpoint:
ANTHROPIC_BASE_URL=http://localhost:8000/anything \
ANTHROPIC_MODEL=gpt-5-mini \
claude
Claude Code asks for permission before it runs tools or interacts with files:
I'll need permission to work with your files.
This means I can:
- Read any file in this folder
- Create, edit, or delete files
- Run commands (like npm, git, tests, ls, rm)
- Use tools defined in .mcp.json
Learn more ( https://docs.claude.com/s/claude-code-security )
❯ 1. Yes, continue
2. No, exit
Select Yes, continue. The session starts. Ask a simple question to confirm that requests reach Kong AI Gateway.
Tell me about Procopius' Secret History.
Claude Code might prompt you to approve a web search before answering the question. When you select Yes, Claude produces a full-length response to your request:
Procopius’ Secret History (Greek: Ἀνέκδοτα, Anekdota) is a fascinating and
notorious work of Byzantine literature written in the 6th century by the
court historian Procopius of Caesarea. Unlike his official histories
(“Wars” and “Buildings”), which paint the Byzantine Emperor Justinian I
and his wife Theodora in a generally positive and conventional manner, the
Secret History offers a scandalous, behind-the-scenes account that
sharply criticizes and even vilifies the emperor, the empress, and other
key figures of the time.
Next, inspect the Kong AI Gateway logs to verify that the traffic was proxied through it:
docker exec kong-quickstart-gateway cat /tmp/claude.json | jq
You should find an entry that shows the upstream request made by Claude Code. A typical log record looks like this:
{
...
"method": "POST",
"headers": {
"user-agent": "claude-cli/2.0.37 (external, cli)",
"content-type": "application/json"
},
"ai": {
"meta": {
"request_model": "gpt-5-mini",
"request_mode": "oneshot",
"response_model": "gpt-5-mini-2025-08-07",
"provider_name": "openai",
"llm_latency": 6786,
"plugin_id": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
},
"usage": {
"completion_tokens": 456,
"completion_tokens_details": {
"accepted_prediction_tokens": 0,
"audio_tokens": 0,
"rejected_prediction_tokens": 0,
"reasoning_tokens": 256
},
"total_tokens": 481,
"cost": 0,
"time_per_token": 14.881578947368,
"time_to_first_token": 6785,
"prompt_tokens": 25,
"prompt_tokens_details": {
"cached_tokens": 0,
"audio_tokens": 0
}
}
}
...
}
This output confirms that Claude Code routed the request through Kong AI Gateway using the gpt-5-mini model we selected while starting the Claude Code session.
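Because every record carries the ai.usage block, you can also pull token counts straight from the log to spot-check usage per request. A small jq sketch based on the fields shown above:

# Summarize the model and token usage for each logged request
docker exec kong-quickstart-gateway cat /tmp/claude.json | \
  jq '{model: .ai.meta.response_model, prompt: .ai.usage.prompt_tokens, completion: .ai.usage.completion_tokens, total: .ai.usage.total_tokens}'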
Cleanup
Clean up Konnect environment
If you created a new control plane and want to conserve your free trial credits or avoid unnecessary charges, delete the new control plane used in this tutorial.
Destroy the Kong Gateway container
curl -Ls https://get.konghq.com/quickstart | bash -s -- -d