Use the AI Semantic Prompt Guard plugin to govern your LLM traffic
Use the AI Semantic Prompt Guard plugin to allow or deny prompts by subject area.
Prerequisites
Kong Konnect
This is a Konnect tutorial and requires a Konnect personal access token.
- Create a new personal access token by opening the Konnect PAT page and selecting Generate Token.
- Export your token to an environment variable:
  export KONNECT_TOKEN='YOUR_KONNECT_PAT'
- Run the quickstart script to automatically provision a Control Plane and Data Plane, and configure your environment:
  curl -Ls https://get.konghq.com/quickstart | bash -s -- -k $KONNECT_TOKEN --deck-output
  This sets up a Konnect Control Plane named quickstart, provisions a local Data Plane, and prints the following environment variable exports:
  export DECK_KONNECT_TOKEN=$KONNECT_TOKEN
  export DECK_KONNECT_CONTROL_PLANE_NAME=quickstart
  export KONNECT_CONTROL_PLANE_URL=https://us.api.konghq.com
  export KONNECT_PROXY_URL='http://localhost:8000'
  Copy and paste these into your terminal to configure your session.
Kong Gateway running
This tutorial requires Kong Gateway Enterprise. If you don’t have Kong Gateway set up yet, you can use the quickstart script with an enterprise license to get an instance of Kong Gateway running almost instantly.
- Export your license to an environment variable:
  export KONG_LICENSE_DATA='LICENSE-CONTENTS-GO-HERE'
- Run the quickstart script:
  curl -Ls https://get.konghq.com/quickstart | bash -s -- -e KONG_LICENSE_DATA
Once Kong Gateway is ready, you will see the following message:
Kong Gateway Ready
decK
decK is a CLI tool for managing Kong Gateway declaratively with state files. To complete this tutorial you will first need to install decK.
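For example, on macOS you can install decK with Homebrew (see the decK documentation for installation options on other platforms):

```shell
# Install decK from Kong's Homebrew tap.
brew install kong/deck/deck

# Confirm the installation.
deck version
```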
Required entities
For this tutorial, you’ll need Kong Gateway entities, like Gateway Services and Routes, pre-configured. These entities are essential for Kong Gateway to function but installing them isn’t the focus of this guide. Follow these steps to pre-configure them:
- Run the following command:
  echo '
  _format_version: "3.0"
  services:
    - name: example-service
      url: http://httpbin.konghq.com/anything
  routes:
    - name: example-route
      paths:
        - "/anything"
      service:
        name: example-service
  ' | deck gateway apply -
To learn more about entities, you can read our entities documentation.
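Before layering on any AI plugins, you can optionally confirm the Gateway Service and Route respond. This assumes the quickstart's default proxy address of http://localhost:8000:

```shell
# Send a test request through the new Route.
# httpbin echoes the request back, so a 200 response confirms routing works.
curl -i http://localhost:8000/anything
```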
OpenAI
This tutorial uses OpenAI:
- Create an OpenAI account.
- Get an API key.
- Create a decK variable with the API key:
export DECK_OPENAI_API_KEY="YOUR OPENAI API KEY"
Redis stack
To complete this tutorial, make sure you have the following:
- A Redis Stack instance running and accessible from the environment where Kong Gateway is deployed.
- Port 6379, or your custom Redis port, open and reachable from Kong Gateway.
- The Redis host set as an environment variable so the plugin can connect:
  export DECK_REDIS_HOST='YOUR-REDIS-HOST'

If you're testing locally with Docker, use host.docker.internal as the host value.
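If you don't already have Redis available, one quick way to start it locally is the Redis Stack server Docker image (the image name and port mapping below are standard defaults; adjust them to your environment):

```shell
# Start a local Redis Stack server, exposing the default Redis port 6379.
docker run -d --name redis-stack-server -p 6379:6379 redis/redis-stack-server:latest

# When Kong Gateway also runs in Docker, point the plugin at the host machine.
export DECK_REDIS_HOST='host.docker.internal'
```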
Configure the AI Proxy plugin
The AI Proxy plugin acts as the core relay between the client and the LLM provider (in this case, OpenAI). It routes prompts to the upstream model and must be in place before we layer on semantic filtering.
echo '
_format_version: "3.0"
plugins:
  - name: ai-proxy
    config:
      route_type: llm/v1/chat
      auth:
        header_name: Authorization
        header_value: Bearer ${{ env "DECK_OPENAI_API_KEY" }}
      model:
        provider: openai
        name: gpt-4o
        options:
          max_tokens: 512
          temperature: 1.0
' | deck gateway apply -
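With the AI Proxy plugin applied, requests to the Route are forwarded to OpenAI in the llm/v1/chat format. As a quick smoke test (assuming the quickstart's default proxy address of http://localhost:8000), you can send a chat request; the plugin injects the Authorization header and model name for you:

```shell
# Send an OpenAI-style chat request through the Kong Route.
curl -s http://localhost:8000/anything \
  -H 'Content-Type: application/json' \
  -d '{
    "messages": [
      {"role": "user", "content": "What is a Kubernetes pod?"}
    ]
  }'
```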
Configure the AI Semantic Prompt Guard plugin
Now, we can set up the AI Semantic Prompt Guard plugin to semantically filter incoming prompts based on topic. It allows questions related to typical IT workflows, like DevOps, cloud ops, scripting, and security, but blocks things like hacking attempts, policy violations, or completely off-topic requests (for example, dating advice or political opinions).
echo '
_format_version: "3.0"
plugins:
  - name: ai-semantic-prompt-guard
    config:
      embeddings:
        auth:
          header_name: Authorization
          header_value: Bearer ${{ env "DECK_OPENAI_API_KEY" }}
        model:
          name: text-embedding-3-small
          provider: openai
      search:
        threshold: 0.7
      vectordb:
        strategy: redis
        distance_metric: cosine
        threshold: 0.5
        dimensions: 1024
        redis:
          host: ${{ env "DECK_REDIS_HOST" }}
          port: 6379
      rules:
        match_all_conversation_history: true
        allow_prompts:
          - Network troubleshooting and diagnostics
          - Cloud infrastructure management (AWS, Azure, GCP)
          - Cybersecurity best practices and incident response
          - DevOps workflows and automation
          - Programming concepts and language usage
          - IT policy and compliance guidance
          - Software development lifecycle and CI/CD
          - Documentation writing and technical explanation
          - System administration and configuration
          - Productivity and collaboration tools usage
        deny_prompts:
          - Hacking techniques or penetration testing without authorization
          - Bypassing software licensing or digital rights management
          - Instructions on exploiting vulnerabilities or writing malware
          - Circumventing security controls or access restrictions
          - Gathering personal or confidential employee information
          - Using AI to impersonate or phish others
          - Social engineering tactics or manipulation techniques
          - Guidance on violating company IT policies
          - Content unrelated to work, such as entertainment or dating
          - Political, religious, or sensitive non-work-related discussions
' | deck gateway apply -
Validate configuration
Once the AI Semantic Prompt Guard plugin is configured, you can test different kinds of prompts to make sure the guardrails are working. Allowed topics (like DevOps and documentation) should pass through, while disallowed prompts (like hacking attempts or unrelated personal questions) should be blocked based on semantic similarity and return a 400 Bad request error.
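For example (again assuming the quickstart's default proxy address of http://localhost:8000), an on-topic prompt should return a normal chat completion, while an off-topic prompt should be rejected by the guard:

```shell
# Allowed: an IT/DevOps question, semantically close to the allow list.
curl -s http://localhost:8000/anything \
  -H 'Content-Type: application/json' \
  -d '{"messages": [{"role": "user", "content": "How do I automate TLS certificate renewal in a CI/CD pipeline?"}]}'

# Denied: off-topic, semantically close to the deny list; expect an HTTP 400.
curl -i http://localhost:8000/anything \
  -H 'Content-Type: application/json' \
  -d '{"messages": [{"role": "user", "content": "Can you give me some dating advice?"}]}'
```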
Cleanup
Clean up Konnect environment
If you created a new control plane and want to conserve your free trial credits or avoid unnecessary charges, delete the new control plane used in this tutorial.
Destroy the Kong Gateway container
curl -Ls https://get.konghq.com/quickstart | bash -s -- -d