The AI Request Transformer plugin uses a configured LLM service to transform a client request body before proxying the request upstream.
This plugin supports the same `llm/v1/chat` requests and providers as the AI Proxy plugin. It also uses the same configuration and tuning parameters as the AI Proxy plugin, under the `config.llm` block.
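As a rough illustration, a declarative configuration for the plugin might look like the sketch below. The field layout of `config.llm` mirrors the AI Proxy plugin's schema; the prompt text, model name, and API key placeholder are purely illustrative, so check the plugin's configuration reference for the exact fields your Gateway version supports.

```yaml
plugins:
  - name: ai-request-transformer
    config:
      # Instruction sent to the LLM describing how to transform the request body
      prompt: "Redact any personal data from this JSON request body and return only valid JSON."
      llm:
        route_type: "llm/v1/chat"
        auth:
          header_name: "Authorization"
          header_value: "Bearer <YOUR_API_KEY>"   # placeholder
        model:
          provider: "openai"      # illustrative provider
          name: "gpt-4o"          # illustrative model
```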
The AI Request Transformer plugin runs before all of the AI prompt plugins and the AI Proxy plugin, allowing it to also transform requests before sending them to a different LLM.
Known failure mode: AI Request Transformer with AI Proxy
Chaining AI Request Transformer with AI Proxy or AI Proxy Advanced may fail for some providers, even though the same setup works with others.
The reason is that the AI Request Transformer plugin forwards the model's raw output as the new request body. If the model wraps its answer in prose or markdown fences instead of returning strict JSON, the downstream plugin cannot parse the request and the proxy chain fails. This is not a bug in Kong AI Gateway but a limitation of LLM behavior.
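One way to reduce this failure mode, assuming your provider follows instructions reliably, is to make the transformation prompt demand raw JSON and nothing else. The wording below is an illustrative sketch, not a guaranteed fix, since instruction-following varies by model:

```yaml
config:
  prompt: >
    You are a request transformer. Return ONLY the transformed JSON
    request body. Do not add explanations, markdown code fences, or
    any text before or after the JSON.
```

Even with such a prompt, some models occasionally emit non-JSON output, so testing the chain against your specific provider and model is advisable before relying on it in production.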