AI in Insomnia
Explore the AI features available in Insomnia and learn how they enhance automation, collaboration, and productivity. As of Insomnia v12, AI features are free to use, though this may change in future releases.
Insomnia 12 introduces a suite of AI-driven capabilities that make API development faster, smarter, and more collaborative. These include:
- MCP Servers and Clients, which connect to external AI-ready tools and resources.
- AI-assisted mock server generation, which transforms prompts or API definitions into mock APIs.
- AI commit message suggestions, which help maintain clear and atomic commit histories.

Activate these features by enabling a Large Language Model (LLM) in Preferences > AI Settings.
Choose from one of the following providers:
- Local
- Claude
- OpenAI
- Gemini

Local models keep processing fully on your machine for privacy and control, but may run slower and produce less-refined responses than hosted models.
AI features overview
Once an LLM is activated, Insomnia unlocks multiple AI features that are designed to automate repetitive workflows, generate content dynamically, and enhance collaboration.
| Feature | Description | Product context |
|---|---|---|
| Auto-generate Mock Servers from natural language | Creates a mock server from a prompt, OpenAPI definition, or live URL response. Automatically scaffolds routes, responses, and configurations. | Available when creating Self-hosted mock servers. See Mock Servers. | 
| Suggest comments and grouping for Commits | Analyzes staged Git changes and suggests logical commit groupings and draft messages. | Available in the Git Sync interface. See Version control in Insomnia. | 
| MCP Client operations | Connect to MCP Servers that expose callable tools, prompts, and structured resources via JSON-RPC. | Manage connections under MCP Servers in Insomnia. See MCP clients in Insomnia. | 
Get started
Activate features by choosing and configuring an LLM in Preferences > AI Settings:
- Click Preferences.
- Select the AI Settings tab.
- In the provider list, choose an LLM type.
- Enter your credentials or select a local model.
- Click Activate.

After activation, you can toggle Auto-generate Mock Servers and Suggest commit comments from the AI Features panel.
Note: Local LLMs require a `.gguf` file placed in the `/Insomnia/llms/` directory.
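As a quick sanity check that a model file is where Insomnia expects it, a short script like the following can list candidate files. This is an illustrative sketch, not part of Insomnia: the home-relative location of the `/Insomnia/llms/` directory is an assumption and may differ per platform.

```python
from pathlib import Path

def find_local_models(llm_dir: Path) -> list[str]:
    """Return the names of .gguf model files in the given directory."""
    if not llm_dir.is_dir():
        return []
    return sorted(p.name for p in llm_dir.glob("*.gguf"))

# The home-relative location is an assumption; adjust it to where your
# Insomnia data directory actually lives on your system.
print(find_local_models(Path.home() / "Insomnia" / "llms"))
```

If the printed list is empty, the model file is missing, misnamed, or in the wrong directory.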
Credentials for hosted LLM providers are stored securely on your local system by the Insomnia app and are never synced across accounts or devices.
MCP Servers and Clients
Model Context Protocol (MCP) Servers expose domain-specific operations through a JSON-RPC interface. For example:
- `tools/call`
- `resources/read`
- `prompts/get`
When you connect Insomnia to an MCP Server, Insomnia creates an MCP Client that acts like a synchronized request group. The client stays updated with the tools, prompts, and resources published by the server. When offline, you see cached data until you resync.
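To make the wire format concrete, here is a minimal sketch of what a `tools/call` request looks like as a JSON-RPC 2.0 message. This is not Insomnia's internal code, and the tool name and arguments are hypothetical:

```python
import json

def make_jsonrpc_request(method: str, params: dict, req_id: int) -> str:
    """Serialize a JSON-RPC 2.0 request of the kind MCP servers accept."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# Hypothetical tool invocation: "search_docs" and its arguments are
# illustrative, not a real Insomnia or MCP built-in.
body = make_jsonrpc_request(
    "tools/call",
    {"name": "search_docs", "arguments": {"query": "rate limits"}},
    req_id=1,
)
print(body)
```

The server replies with a JSON-RPC response carrying the same `id`, which is how the client matches results to requests.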
AI-driven Git commits
Insomnia’s Suggest comments and grouping for Commits feature analyzes staged changes and helps maintain consistent, meaningful Git histories. For Git concepts and workflows, go to Version control in Insomnia.
- Open the Git Sync interface.
- Click Suggest comments and grouping for Commits.
- Review the suggested commit groups and messages.
- (Optional) To edit a message inline, double-click the message.
- Drag and drop files between commit groups, or exclude files.
- Click Commit or Commit & Push.

Frequently asked questions
Can I disable AI features for myself?
Yes. Go to Preferences > AI Settings and deactivate the toggles for Auto-generate Mock Servers and Suggest commit comments.
To stop using an LLM entirely, click Deactivate under the provider configuration.
Why don’t I see AI features in the app?
You must first configure and activate an LLM under Preferences > AI Settings.
If AI is deactivated at the instance level, the feature toggles will remain unavailable in the UI.
How do I manage AI across my organization?
Enterprise administrators can activate or deactivate AI features at the instance level from Insomnia Admin > AI Settings.
When deactivated, the desktop app shows an explanatory message.
When activated, each user must still activate a model before toggles become available.
This setting is available only for Enterprise plans.
Which transports are supported for MCP Clients?
Both HTTP and STDIO transports are supported for connecting to MCP Servers.
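As a sketch of how the same JSON-RPC message travels over each transport (the newline-delimited framing shown for STDIO follows the common MCP convention; the HTTP details are generic assumptions, not Insomnia specifics):

```python
import json

request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}

# STDIO transport: each message is one newline-delimited JSON line
# written to the server process's stdin.
stdio_frame = json.dumps(request) + "\n"

# HTTP transport: the same payload becomes the body of a POST request
# to the server's endpoint, with a JSON content type.
http_body = json.dumps(request)
http_headers = {"Content-Type": "application/json"}

print(stdio_frame, end="")
```

Either way, the payload is identical; only the framing and delivery mechanism differ.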
Can MCP Clients use Git Sync or Import/Export?
Not yet. MCP Clients are stored locally and currently do not support Git Sync or import/export.
Why are commit suggestions less accurate when using a local LLM?
Local LLMs with fewer than 10 billion parameters may produce less accurate or inconsistent commit suggestions.
Smaller models have limited context understanding and token capacity compared to hosted providers.