Route Claude CLI traffic through Kong AI Gateway and Gemini

Tags: #ai
Minimum version: Kong Gateway 3.13

TL;DR

Install Claude CLI, configure its API key helper, create a Gateway Service and Route, attach the AI Proxy Advanced plugin to forward Claude's requests to Gemini, enable the File Log plugin to inspect traffic, and point Claude CLI to the local proxy endpoint so all LLM requests pass through the AI Gateway for monitoring and control.

Prerequisites

This is a Konnect tutorial and requires a Konnect personal access token.

  1. Create a new personal access token by opening the Konnect PAT page and selecting Generate Token.

  2. Export your token to an environment variable:

     export KONNECT_TOKEN='YOUR_KONNECT_PAT'
    
  3. Run the quickstart script to automatically provision a Control Plane and Data Plane, and configure your environment:

     curl -Ls https://get.konghq.com/quickstart | bash -s -- -k $KONNECT_TOKEN --deck-output
    

    This sets up a Konnect Control Plane named quickstart, provisions a local Data Plane, and prints out the following environment variable exports:

     export DECK_KONNECT_TOKEN=$KONNECT_TOKEN
     export DECK_KONNECT_CONTROL_PLANE_NAME=quickstart
     export KONNECT_CONTROL_PLANE_URL=https://us.api.konghq.com
     export KONNECT_PROXY_URL='http://localhost:8000'
    

    Copy and paste these into your terminal to configure your session.
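To confirm the Data Plane is reachable before continuing, send a request to the proxy URL. With no Routes configured yet, Kong should answer with an HTTP 404 and a "no Route matched" message:

curl -i "$KONNECT_PROXY_URL"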

Alternatively, you can run this tutorial against a self-managed Kong Gateway Enterprise instance. If you don’t have Kong Gateway set up yet, you can use the quickstart script with an enterprise license to get an instance of Kong Gateway running almost instantly.

  1. Export your license to an environment variable:

     export KONG_LICENSE_DATA='LICENSE-CONTENTS-GO-HERE'
    
  2. Run the quickstart script:

    curl -Ls https://get.konghq.com/quickstart | bash -s -- -e KONG_LICENSE_DATA 
    

    Once Kong Gateway is ready, you will see the following message:

     Kong Gateway Ready
    
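You can optionally confirm the Gateway is responding by querying the Admin API, which the quickstart exposes on localhost:8001 by default:

curl -s http://localhost:8001 | jq '.version'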

decK is a CLI tool for managing Kong Gateway declaratively with state files. To complete this tutorial, install decK version 1.43 or later.

This guide uses deck gateway apply, which applies entity configuration directly to your Gateway instance. We recommend upgrading your decK installation to take advantage of this command.

You can check your current decK version with deck version.
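For example, to check the installed version and confirm decK can reach your Gateway, you can run:

deck version
deck gateway ping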

For this tutorial, you’ll need Kong Gateway entities, like Gateway Services and Routes, pre-configured. These entities are essential for Kong Gateway to function but installing them isn’t the focus of this guide. Follow these steps to pre-configure them:

  1. Run the following command:

    echo '
    _format_version: "3.0"
    services:
      - name: example-service
        url: http://httpbin.konghq.com/anything
    routes:
      - name: example-route
        paths:
        - "/anything"
        service:
          name: example-service
    ' | deck gateway apply -
    

To learn more about entities, you can read our entities documentation.
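As a quick smoke test, you can confirm the Service and Route respond before layering on the AI plugins (this uses the KONNECT_PROXY_URL variable printed by the quickstart):

curl -i "$KONNECT_PROXY_URL/anything"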

Before you begin, you must get the following credentials from Google Cloud:

  • Service Account Key: A JSON key file for a service account with Vertex AI permissions
  • Project ID: Your Google Cloud project identifier
  • Location ID: The region where your Vertex AI endpoint is deployed (for example, us-central1)
  • API Endpoint: The Vertex AI API endpoint URL (typically https://{location}-aiplatform.googleapis.com)

Export these values as environment variables. The names must match the ${{ env "..." }} references in the plugin configuration you’ll apply below, and DECK_GEMINI_API_KEY must contain the contents of the service account JSON key file:

export DECK_GEMINI_API_KEY='<contents-of-your-service-account-key-json>'
export DECK_GCP_PROJECT_ID='<your-gcp-project-id>'
export DECK_GEMINI_LOCATION_ID='<your-gemini-location-id>'
export DECK_GEMINI_API_ENDPOINT='<your-gemini-api-endpoint>'
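
If you haven’t created a service account key yet, one way to do it is with the gcloud CLI. This is a sketch, not part of the tutorial proper; the service account name kong-ai-gateway is just an example:

# Example only: create a service account, grant it Vertex AI access,
# and download a JSON key for it.
gcloud iam service-accounts create kong-ai-gateway \
  --project "$DECK_GCP_PROJECT_ID"
gcloud projects add-iam-policy-binding "$DECK_GCP_PROJECT_ID" \
  --member "serviceAccount:kong-ai-gateway@${DECK_GCP_PROJECT_ID}.iam.gserviceaccount.com" \
  --role "roles/aiplatform.user"
gcloud iam service-accounts keys create vertex-key.json \
  --iam-account "kong-ai-gateway@${DECK_GCP_PROJECT_ID}.iam.gserviceaccount.com"
export DECK_GEMINI_API_KEY="$(cat vertex-key.json)"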

Next, install the Claude CLI and configure its API key helper:

  1. Install Claude CLI:

     curl -fsSL https://claude.ai/install.sh | bash
    
  2. Create or edit the Claude settings file:

     mkdir -p ~/.claude
     nano ~/.claude/settings.json
    

    Put this exact content in the file:

     {
         "apiKeyHelper": "~/.claude/anthropic_key.sh"
     }
    
  3. Create the API key helper script:

     nano ~/.claude/anthropic_key.sh
    

    Inside, add a command that prints a dummy API key. Kong AI Gateway handles the real authentication to Gemini, so Claude CLI only needs a placeholder:

     #!/bin/bash
     echo "x"
    
  4. Make the script executable:

     chmod +x ~/.claude/anthropic_key.sh
    
  5. Verify it works by running the script:

     ~/.claude/anthropic_key.sh
    

    You should see the dummy key (x) printed.
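
If you prefer a non-interactive setup, the same files can be written with heredocs; this is equivalent to the nano steps above:

mkdir -p ~/.claude
cat > ~/.claude/settings.json <<'EOF'
{
    "apiKeyHelper": "~/.claude/anthropic_key.sh"
}
EOF
cat > ~/.claude/anthropic_key.sh <<'EOF'
#!/bin/bash
# Dummy key: Kong AI Gateway handles the real authentication to Gemini.
echo "x"
EOF
chmod +x ~/.claude/anthropic_key.sh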

Configure the AI Proxy Advanced plugin

First, configure the AI Proxy Advanced plugin for the Gemini provider:

  • This setup uses the default llm/v1/chat route type, which handles the chat requests Claude Code sends.
  • The configuration also raises the maximum request body size to 512 KB to support larger prompts.

The llm_format: anthropic parameter tells Kong AI Gateway to expect request and response payloads that match Claude’s native API format. Without this setting, the Gateway would default to OpenAI’s format, which would cause request failures when Claude Code communicates with the Gemini endpoint.

echo '
_format_version: "3.0"
plugins:
  - name: ai-proxy-advanced
    config:
      llm_format: anthropic
      max_request_body_size: 524288
      targets:
      - route_type: llm/v1/chat
        logging:
          log_statistics: true
          log_payloads: false
        auth:
          allow_override: false
          gcp_use_service_account: true
          gcp_service_account_json: "${{ env "DECK_GEMINI_API_KEY" }}"
        model:
          provider: gemini
          name: gemini-2.0-flash
          options:
            gemini:
              api_endpoint: "${{ env "DECK_GEMINI_API_ENDPOINT" }}"
              project_id: "${{ env "DECK_GCP_PROJECT_ID" }}"
              location_id: "${{ env "DECK_GEMINI_LOCATION_ID" }}"
          max_tokens: 8192
' | deck gateway apply -
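
Optionally, you can exercise the route before wiring up Claude CLI by sending a hand-written request in Anthropic’s Messages format. The plugin should translate it into a Gemini generateContent call; the model field is required by the format, but the model configured in the plugin is what actually gets used:

curl -s -X POST "$KONNECT_PROXY_URL/anything" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-2.0-flash",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Say hello in one sentence."}]
  }' | jq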

Configure the File Log plugin

Now, let’s enable the File Log plugin to inspect the LLM traffic between Claude and the AI Gateway. The plugin writes each request and response to a claude.json file inside the Data Plane container, so you can review everything Claude sends through the AI Gateway.

echo '
_format_version: "3.0"
plugins:
  - name: file-log
    config:
      path: "/tmp/claude.json"
' | deck gateway apply -
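
To watch log entries arrive in real time during the next step, you can tail the file inside the Data Plane container (the container name assumes the quickstart defaults):

docker exec kong-quickstart-gateway tail -f /tmp/claude.json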

Verify traffic through AI Gateway

Now, we can start a Claude Code session pointed at the local AI Gateway endpoint. Ensure that ANTHROPIC_MODEL matches the Gemini model configured in the AI Proxy Advanced plugin (gemini-2.0-flash in this tutorial):

ANTHROPIC_BASE_URL=http://localhost:8000/anything \
ANTHROPIC_MODEL=YOUR_GEMINI_MODEL \
claude

Claude Code asks for permission before it runs tools or interacts with files:

I'll need permission to work with your files.

This means I can:
- Read any file in this folder
- Create, edit, or delete files
- Run commands (like npm, git, tests, ls, rm)
- Use tools defined in .mcp.json

Learn more ( https://docs.claude.com/s/claude-code-security )

❯ 1. Yes, continue
2. No, exit

Select Yes, continue. The session starts. Ask a simple question to confirm that requests reach Kong AI Gateway.

Tell me about Anna Komnene's Alexiad.

Claude Code might prompt you to approve a web search to answer the question. When you select Yes, Claude produces a full-length response to your request:

Anna Komnene (1083-1153?) was a Byzantine princess, scholar, physician,
hospital administrator, and historian. She is known for writing the
Alexiad, a historical account of the reign of her father, Emperor Alexios
I Komnenos (r. 1081-1118). The Alexiad is a valuable primary source for
understanding Byzantine history and the First Crusade.

Next, inspect the Kong AI Gateway logs to verify that the traffic was proxied through it:

docker exec kong-quickstart-gateway cat /tmp/claude.json | jq

You should find an entry that shows the upstream request made by Claude Code. A typical log record looks like this:

{
  ...
  "method": "POST",
  "headers": {
    "user-agent": "claude-cli/2.0.37 (external, cli)",
    "content-type": "application/json"
  },
  ...
  "ai": {
    "proxy": {
      "tried_targets": [
        {
          "provider": "gemini",
          "model": "gemini-2.0-flash",
          "port": 443,
          "upstream_scheme": "https",
          "host": "us-central1-aiplatform.googleapis.com",
          "upstream_uri": "/v1/projects/example-project-id/locations/us-central1/publishers/google/models/gemini-2.0-flash:generateContent",
          "route_type": "llm/v1/chat",
          "ip": "xxx.xxx.xxx.xxx"
        }
      ],
      "meta": {
        "request_model": "gemini-2.0-flash",
        "request_mode": "oneshot",
        "response_model": "gemini-2.0-flash",
        "provider_name": "gemini",
        "llm_latency": 1694,
        "plugin_id": "13f5c57a-77b2-4c1f-9492-9048566db7cf"
      },
      "usage": {
        "completion_tokens": 19,
        "completion_tokens_details": {},
        "total_tokens": 11203,
        "cost": 0,
        "time_per_token": 89.157894736842,
        "time_to_first_token": 1694,
        "prompt_tokens": 11184,
        "prompt_tokens_details": {}
      }
    }
  }
  ...
}

This output confirms that Claude Code routed the request through Kong AI Gateway using the gemini-2.0-flash model configured in the AI Proxy Advanced plugin.
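
If you want just the routing and token-usage fields from each record, a jq filter along these lines works against the structure shown above:

docker exec kong-quickstart-gateway cat /tmp/claude.json | jq '{
  model: .ai.proxy.meta.response_model,
  latency_ms: .ai.proxy.meta.llm_latency,
  prompt_tokens: .ai.proxy.usage.prompt_tokens,
  completion_tokens: .ai.proxy.usage.completion_tokens
}'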

Cleanup

If you created a new Control Plane and want to conserve your free trial credits or avoid unnecessary charges, delete the Control Plane used in this tutorial.

curl -Ls https://get.konghq.com/quickstart | bash -s -- -d