AI MCP Proxy

Made by: Kong Inc.
Supported Gateway topologies: hybrid, db-less, traditional
Supported Konnect deployments: hybrid, cloud-gateways, serverless
Compatible protocols: grpc, grpcs, http, https
Minimum version: Kong Gateway 3.12

AI Gateway Enterprise: This plugin is only available as part of our AI Gateway Enterprise offering.

The AI MCP Proxy plugin lets you connect any Kong-managed Service to the Model Context Protocol (MCP). It acts as a protocol bridge, translating between MCP and HTTP so that MCP-compatible clients can either call existing APIs or interact with upstream MCP servers through Kong.

The plugin’s mode parameter controls whether it proxies MCP requests, converts RESTful APIs into MCP tools, or exposes grouped tools as an MCP server. This flexibility allows you to integrate existing HTTP APIs into MCP workflows, front third-party MCP servers with Kong’s policies, or expose multiple tool sets as a managed MCP server.

Because the plugin runs directly on Kong AI Gateway, MCP endpoints are provisioned dynamically on demand. You don’t need to host or scale them separately, and the Kong AI Gateway applies its authentication, traffic control, and observability features to MCP traffic at the same scale it delivers for traditional APIs.

Note: Unlike other AI plugins, the AI MCP Proxy plugin is not invoked as part of an LLM request flow. Instead, it's part of an MCP request flow. It's registered and executed as a regular plugin, sitting between the MCP client and the MCP server, which allows it to capture MCP traffic independently of any LLM request flow.

Do not configure the AI MCP Proxy plugin together with other AI plugins on the same Service or Route.

Why use the AI MCP Proxy plugin

The AI MCP Proxy bridges the Kong plugin ecosystem with the MCP world, enabling you to bring all of Kong’s traffic management, security, and observability capabilities to MCP endpoints:

  • Authentication: Apply OpenID Connect or the Key Auth plugin to an MCP server.
  • Rate limiting: Use the Rate Limiting or Rate Limiting Advanced plugin to control MCP request volume.
  • Observability: Add logging and tracing plugins for full request and response visibility.
  • Traffic control: Apply request/response transformation plugins or ACL policies.

How it works

The AI MCP Proxy plugin handles MCP requests by converting them into standard HTTP calls and returning the responses in MCP format. The flow works as follows:

  1. Accepts MCP protocol requests from a client.
  2. Parses the MCP tool call and matches it to an OpenAPI operation.
  3. Converts the operation into a standard HTTP request.
  4. Sends the request to the upstream Service.
  5. Wraps the HTTP response in MCP-compatible format and returns it.
 
sequenceDiagram
    participant C as MCP Client
    participant K as Kong (AI MCP Proxy plugin)
    participant U as Upstream Service

    C->>K: MCP request (tool invocation)
    activate K
    K->>K: Parse MCP payload
    K->>K: Map to HTTP endpoint (OpenAPI schema)
    K->>U: HTTP request
    deactivate K
    activate U
    U-->>K: HTTP response
    deactivate U
    activate K
    K->>K: Convert to MCP format
    K-->>C: MCP response
    deactivate K
  

Note: Pings from your MCP client count toward the total request count for your Kong AI Gateway instance, in addition to the requests made to the MCP server.
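The conversion flow described above can be sketched in plain Python. This is an illustrative model only, not the plugin's implementation: the operation registry, tool name, and response shape are assumptions based on the MCP `tools/call` JSON-RPC message format and a hypothetical `getUserById` OpenAPI operation.

```python
import json

# Hypothetical registry derived from the upstream Service's OpenAPI schema:
# each operationId becomes an MCP tool mapped to an HTTP method and path.
OPERATIONS = {
    "getUserById": {"method": "GET", "path": "/users/{id}"},
}

def mcp_tool_call_to_http(mcp_request):
    """Translate an MCP 'tools/call' JSON-RPC request into an HTTP request."""
    params = mcp_request["params"]
    op = OPERATIONS[params["name"]]  # match the tool name to an OpenAPI operation
    path = op["path"]
    for key, value in params.get("arguments", {}).items():
        path = path.replace("{" + key + "}", str(value))  # fill path parameters
    return {"method": op["method"], "path": path}

def http_response_to_mcp(mcp_request, status, body):
    """Wrap an upstream HTTP response in an MCP-compatible JSON-RPC result."""
    return {
        "jsonrpc": "2.0",
        "id": mcp_request["id"],
        "result": {
            "content": [{"type": "text", "text": json.dumps(body)}],
            "isError": status >= 400,
        },
    }

# Example round trip: MCP tool call in, MCP result out.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "getUserById", "arguments": {"id": 42}},
}
http_req = mcp_tool_call_to_http(request)
print(http_req)  # {'method': 'GET', 'path': '/users/42'}
response = http_response_to_mcp(request, 200, {"id": 42, "name": "Ada"})
print(response["result"]["isError"])  # False
```

The real plugin also validates arguments against the OpenAPI parameter schemas and handles query, header, and body parameters; this sketch only shows the path-parameter case.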

Prerequisites

Before using the AI MCP Proxy plugin, ensure your setup meets these requirements:

  • The upstream Service exposes a valid OpenAPI schema.
  • That Service is configured and accessible in Kong.
  • An MCP-compatible client (such as Claude, Cursor, or LM Studio) is available to connect to Kong.
  • Your Kong AI Gateway instance is version 3.12 or later, which supports the AI MCP Proxy plugin.
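The first prerequisite deserves a concrete shape. Below is a minimal, hypothetical OpenAPI 3 document for an upstream Service; the path, `operationId`, and parameter names are illustrative only, but each operation carrying an `operationId` is the kind of schema the tool mapping can work from.

```python
# A minimal, hypothetical OpenAPI 3 document for an upstream Service.
openapi_doc = {
    "openapi": "3.0.3",
    "info": {"title": "Users API", "version": "1.0.0"},
    "paths": {
        "/users/{id}": {
            "get": {
                "operationId": "getUserById",
                "parameters": [
                    {
                        "name": "id",
                        "in": "path",
                        "required": True,
                        "schema": {"type": "integer"},
                    }
                ],
                "responses": {"200": {"description": "A single user"}},
            }
        }
    },
}

# Sanity checks worth running before pointing the plugin at a schema:
# every operation should carry an operationId for the tool mapping to use.
assert openapi_doc["openapi"].startswith("3.")
assert all(
    "operationId" in operation
    for path_item in openapi_doc["paths"].values()
    for operation in path_item.values()
)
```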

Configuration modes

The AI MCP Proxy plugin operates in four modes, controlled by the config.mode parameter. Each mode determines how Kong handles MCP requests and whether it converts RESTful APIs into MCP tools.

  • passthrough-listener: Listens for incoming MCP requests and proxies them to the upstream_url of the Gateway Service. Generates MCP observability metrics for the traffic, making it suitable for fronting third-party MCP servers hosted by users.
  • conversion-listener: Converts RESTful API paths into MCP tools and accepts incoming MCP requests on the Route path. You can define tools directly in the plugin configuration and optionally set a server block.
  • conversion-only: Converts RESTful API paths into MCP tools but does not accept incoming MCP requests. This mode requires config.server.tag in the plugin configuration, but does not define a server.
  • listener: Similar to conversion-listener, but instead of defining its own tools, it binds tools from multiple conversion-only instances through the config.server.tag property: conversion-only plugins define tags at the plugin level, and the listener connects to them to expose their tools on a Route for incoming MCP requests.
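As an illustrative sketch, a conversion-listener instance could be enabled on a Route through the Admin API. The plugin name, the fields inside the tools entry, the Route name, and the Admin API address are assumptions here; only config.mode and config.server.tag are named in this page, so check the plugin's configuration reference for the exact schema.

```shell
# Hypothetical sketch: enable the AI MCP Proxy in conversion-listener mode
# on an existing Route named "my-route". Field names inside "tools" are
# placeholders, not a verified schema.
curl -i -X POST http://localhost:8001/routes/my-route/plugins \
  --header "Content-Type: application/json" \
  --data '{
    "name": "ai-mcp-proxy",
    "config": {
      "mode": "conversion-listener",
      "tools": [
        {
          "description": "Get a user by ID",
          "method": "GET",
          "path": "/users/{id}"
        }
      ]
    }
  }'
```

For the split setup, each conversion-only instance would carry a config.server.tag value, and a single listener instance configured with the same tag would expose all of those tools on one Route.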

Scope of support

The AI MCP Proxy plugin provides support for key MCP operations and upstream interactions, while certain advanced features and non-HTTP protocols are not currently supported. The table below summarizes what is fully supported and what is outside the current scope.

Supported:

  • Protocol: Handling the latest streamable HTTP transport, with HTTP and HTTPS upstreams.
  • OpenAPI operations: Mapping MCP calls to upstream HTTP operations based on the OpenAPI schema.
  • JSON format: Handling standard JSON request and response bodies.
  • Form-encoded data: Handling application/x-www-form-urlencoded request bodies.

Not supported:

  • SNI routing: Converting SNI-only routes.
  • Form and XML data: Handling formats such as multipart/form-data or XML.
  • Advanced MCP features: Handling structured output, active notifications on tool changes, and session sharing between instances.
  • Non-HTTP protocols: Handling WebSocket and gRPC upstreams.
  • AI Guardrails: Applying AI guardrails to MCP requests and responses.