The AI MCP Proxy plugin lets you connect any Kong-managed Service to the Model Context Protocol (MCP). It acts as a protocol bridge, translating between MCP and HTTP so that MCP-compatible clients can either call existing APIs or interact with upstream MCP servers through Kong.
The plugin’s `mode` parameter controls whether it proxies MCP requests, converts RESTful APIs into MCP tools, or exposes grouped tools as an MCP server. This flexibility allows you to integrate existing HTTP APIs into MCP workflows, front third-party MCP servers with Kong’s policies, or expose multiple tool sets as a managed MCP server.
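As a rough sketch, attaching the plugin to a Service in Kong’s declarative configuration might look like the following. The plugin name `ai-mcp-proxy` matches this document’s subject, but the Service name and the `mode` value shown here are illustrative assumptions; consult the plugin’s configuration reference for the exact schema and the accepted `mode` values.

```yaml
# Hypothetical sketch of a declarative (decK-style) configuration.
# Field names under `config` and the `mode` value are assumptions,
# not a verified schema.
services:
  - name: my-api-service          # assumed example Service
    url: http://upstream.example.com
    routes:
      - name: my-api-route
        paths:
          - /mcp
    plugins:
      - name: ai-mcp-proxy
        config:
          mode: conversion-listener   # assumed value: convert REST endpoints into MCP tools
```

Because the plugin is attached like any other Kong plugin, the usual scoping rules apply: it can be bound to a Service or a Route, and Kong’s authentication and traffic-control plugins configured on the same entity would apply to the MCP traffic it handles (subject to the restriction noted below about combining it with other AI plugins).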
Because the plugin runs directly on Kong AI Gateway, MCP endpoints are provisioned dynamically on demand. You don’t need to host or scale them separately, and the Kong AI Gateway applies its authentication, traffic control, and observability features to MCP traffic at the same scale it delivers for traditional APIs.
Note: Unlike other AI plugins, the AI MCP Proxy plugin is not invoked as part of an LLM request flow; it operates on the MCP request flow instead. It is registered and executed as a regular plugin between the MCP client and the MCP server, which lets it capture MCP traffic independently of any LLM request flow.
Do not configure the AI MCP Proxy plugin together with other AI plugins on the same Service or Route.