AI Semantic Response Guard

AI License Required

Allow and deny responses using Valkey as a vector database (v3.14+)

The AI Semantic Response Guard plugin analyzes the full response from an LLM service and filters it based on semantic similarity to configured allow or deny patterns.

Deny rules take precedence over allow rules: a response that matches a deny pattern is blocked even if it also matches an allow pattern. If any allow rules are configured, responses that don't match at least one allow pattern are also blocked.
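The precedence logic above can be sketched as a small decision function. This is an illustrative simplification, not the plugin's implementation: in practice the "matches" are computed by comparing embedding similarity against a configured threshold, and all names here are hypothetical.

```python
def should_block(deny_match: bool, allow_match: bool, has_allow_rules: bool) -> bool:
    """Illustrative sketch of the allow/deny precedence rules.

    deny_match:      the response is semantically similar to some deny pattern
    allow_match:     the response is semantically similar to some allow pattern
    has_allow_rules: at least one allow pattern is configured
    """
    if deny_match:
        # Deny always wins, even if an allow pattern also matched.
        return True
    if has_allow_rules and not allow_match:
        # Allow-list mode: responses matching no allow pattern are blocked.
        return True
    # Otherwise the response passes through unchanged.
    return False
```

Note that with no rules configured at all, nothing is blocked; adding a single allow rule switches the plugin into allow-list behavior.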

Valkey is automatically detected when using the redis vectordb strategy. Kong Gateway queries the server and uses the Valkey-specific driver when it detects a Valkey backend.

Prerequisites

Environment variables

  • OPENAI_API_KEY: Your OpenAI API key

  • VALKEY_HOST: The host where your Valkey instance runs
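For a local test, the prerequisite variables can be exported in your shell before configuring the plugin (placeholder values shown; substitute your own key and host):

```shell
# Placeholder values; replace with your actual OpenAI key and Valkey host.
export OPENAI_API_KEY='sk-example-placeholder'
export VALKEY_HOST='localhost'
```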

Set up the plugin
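A minimal declarative configuration sketch follows. The plugin name and field names here are assumptions modeled on Kong's related AI semantic guard plugins and are not guaranteed to match your Kong Gateway version; check the plugin's configuration reference before using it.

```yaml
# Hypothetical sketch only: plugin name and config fields are assumptions
# and may differ in your Kong Gateway version.
plugins:
  - name: ai-semantic-response-guard
    config:
      embeddings:
        model:
          provider: openai
          name: text-embedding-3-small
      vectordb:
        strategy: redis        # Valkey is auto-detected behind the redis strategy
        distance_metric: cosine
        threshold: 0.5
        dimensions: 1536
        redis:
          host: localhost      # e.g. the value of VALKEY_HOST
          port: 6379
      rules:
        allow_responses:
          - General product and support information
        deny_responses:
          - Content revealing personal or financial data
```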
