You can proxy requests to Mistral AI models through AI Gateway using the AI Proxy and AI Proxy Advanced plugins. This reference documents all supported AI capabilities, configuration requirements, and provider-specific details needed for proper integration.
## Upstream paths
AI Gateway automatically routes requests to the appropriate Mistral API endpoints. The following table shows the upstream paths used for each capability.
| Capability | Upstream path or API |
|---|---|
| Chat completions | /v1/chat/completions or user-defined |
| Embeddings | /v1/embeddings or user-defined |
| Function calling | /v1/chat/completions or user-defined |
## Supported capabilities

The following tables show the AI capabilities supported by the Mistral provider when used with the AI Proxy or AI Proxy Advanced plugin. Set the plugin's `route_type` based on the capability you want to use. See the tables below for supported route types.
### Text generation

Support for Mistral's basic text generation capabilities, including chat completions and embeddings:

| Capability | Route type | Streaming | Model example | Min version |
|---|---|---|---|---|
| Chat completions | `llm/v1/chat` | ✅ | `mistral-large-latest` | 3.6 |
| Embeddings | `llm/v1/embeddings` | ❌ | `mistral-embed` | 3.11 |
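The `llm/v1/chat` route type accepts the OpenAI-compatible chat completions request format, so clients send a standard chat body to the Kong route. A minimal sketch of such a request body (the model name can usually be omitted when it's fixed in the plugin configuration):

```json
{
  "model": "mistral-large-latest",
  "messages": [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Summarize what an API gateway does." }
  ]
}
```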
### Advanced text generation

Support for Mistral function calling, which allows Mistral models to use external tools and APIs:

| Capability | Route type | Model example | Min version |
|---|---|---|---|
| Function calling | `llm/v1/chat` | `mistral-large-latest` | 3.6 |
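Function calling uses the same `llm/v1/chat` route type with an OpenAI-style `tools` array in the request body. A hedged sketch, in which the function name and parameter schema are illustrative:

```json
{
  "model": "mistral-large-latest",
  "messages": [
    { "role": "user", "content": "What's the weather in Paris?" }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": { "type": "string", "description": "City name" }
          },
          "required": ["city"]
        }
      }
    }
  ]
}
```

The model's tool-call response is returned to the client unchanged, so existing OpenAI-compatible client code can parse it as usual.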
## Mistral base URL

The base URL follows the pattern `$UPSTREAM_URL/{route_type_path}`, where `{route_type_path}` is determined by the capability in use.

AI Gateway uses this URL automatically. You only need to configure a URL if you're using a self-hosted or Mistral-compatible endpoint; in that case, set the `upstream_url` plugin option.
## Configure Mistral with AI Proxy

To use Mistral with AI Gateway, configure the AI Proxy or AI Proxy Advanced plugin.

Here's a minimal configuration for chat completions:
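A minimal sketch of an AI Proxy plugin entry in declarative (decK) format; the route name, model, and upstream URL below are illustrative and should be replaced with your own values:

```yaml
plugins:
  - name: ai-proxy
    route: mistral-chat          # illustrative route name
    config:
      route_type: llm/v1/chat
      auth:
        header_name: Authorization
        header_value: Bearer <MISTRAL_API_KEY>
      model:
        provider: mistral
        name: mistral-large-latest
        options:
          mistral_format: openai
          upstream_url: https://api.mistral.ai/v1/chat/completions
```

With this in place, requests to the `mistral-chat` route are proxied to the Mistral chat completions endpoint using the configured credentials.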
For more configuration options and examples, see:
## Tutorials
- Configure dynamic authentication to LLM providers using HashiCorp vault
- Use AI PII Sanitizer plugin to protect sensitive data in responses
- Store and rotate Mistral API keys as secrets in Google Cloud
- Store a Mistral API key as a secret in Konnect Config Store
- Use AI Prompt Guard plugin to govern your LLM traffic
- Provide AI prompt templates for end users with the AI Prompt Template plugin and Mistral
- Visualize LLM traffic with Prometheus and Grafana