# AI Providers

Magic Shell supports multiple AI providers, including OpenCode Zen and OpenRouter, as well as custom models for local or remote OpenAI-compatible endpoints.
## OpenCode Zen (Recommended)

OpenCode Zen provides curated models optimized for coding tasks. It’s the default provider and offers several free models.
### Free Models

| Model | Description |
|---|---|
| `big-pickle` | Stealth model - experimental (default) |
| `glm-4.7` | GLM 4.7 - good general purpose |
| `minimax-m2.1` | MiniMax’s capable model |
| `kimi-k2.5` | Moonshot’s latest model |
### Premium Models

| Model | Description |
|---|---|
| `claude-sonnet-4-5` | Anthropic’s hybrid reasoning model |
| `claude-opus-4-5` | Anthropic’s most capable model |
| `claude-haiku-4-5` | Anthropic’s latest fast model |
| `kimi-k2` | Moonshot’s Kimi K2 |
| `kimi-k2-thinking` | Kimi K2 with extended reasoning |
| `qwen3-coder` | Alibaba’s massive coding model |
| `glm-4.6` | Zhipu AI’s capable model |
| `gemini-3-pro` | Google’s high-end Gemini model |
| `gemini-3-flash` | Google’s fast Gemini model |
| `gpt-5.2` | OpenAI’s flagship GPT model |
| `gpt-5.2-codex` | OpenAI’s coding-focused GPT model |
### Get an API Key

1. Visit opencode.ai/auth
2. Sign up or log in
3. Copy your API key
4. Run `msh --setup` and paste it
## OpenRouter

OpenRouter provides access to models from many providers through a single API.
### Free Models

| Model | Description |
|---|---|
| `mimo-v2-flash` | MiMo V2 Flash |
| `deepseek-v3` | DeepSeek V3 |
### Premium Models

| Model | Description |
|---|---|
| `claude-sonnet-4.5` | Anthropic Claude Sonnet 4.5 |
| `claude-opus-4.5` | Anthropic Claude Opus 4.5 |
| `claude-haiku-4.5` | Anthropic’s fast and efficient model |
| `deepseek-r1` | DeepSeek R1 reasoning model |
| `glm-4.7` | Zhipu AI’s capable model |
| `gemini-2.5-pro` | Google’s Gemini 2.5 Pro (stable until June 2026) |
| `gemini-2.5-flash` | Google’s fast Gemini 2.5 (stable until June 2026) |
### Get an API Key

1. Visit openrouter.ai/keys
2. Create an account
3. Generate a new API key
4. Run `msh --setup` and select OpenRouter
## Switching Providers

### Via CLI

```bash
# Switch to OpenRouter
msh --provider openrouter

# Switch back to OpenCode Zen
msh --provider opencode-zen
```

### Via TUI

Press `Ctrl+X S` to open the provider switcher.
## Switching Models

### Via CLI

```bash
# List available models
msh --models

# Set default model
msh --model big-pickle
```

### Via TUI

Press `Ctrl+X M` to open the model picker.
## Model Recommendations

| Use Case | Recommended Model |
|---|---|
| Quick commands | `big-pickle` (free) |
| Complex git operations | `claude-sonnet-4.5` |
| System administration | `glm-4.7` or `minimax-m2.1` |
| Multi-step tasks | `claude-opus-4.5` or `kimi-k2-thinking` |
| Learning/experimenting | Any free model |
## Environment Variables

Set your API keys via environment variables:

```bash
# OpenCode Zen
export OPENCODE_ZEN_API_KEY="your-key"

# OpenRouter
export OPENROUTER_API_KEY="your-key"
```

Keys stored in the system keychain (via `msh --setup`) take precedence over environment variables.
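Because a `VAR=value command` prefix scopes the variable to that single command, you can also supply a key for one run without exporting it globally. A quick demonstration of the scoping (this is generic shell behavior, not specific to Magic Shell):

```bash
# The prefix form sets the variable only in the child's environment:
OPENROUTER_API_KEY="your-key" sh -c 'echo "key is ${OPENROUTER_API_KEY:+set}"'
# → key is set

# The parent shell never sees it:
echo "key is ${OPENROUTER_API_KEY:-unset}"
# → key is unset
```

The same prefix works in front of an `msh` invocation; note that a key stored in the system keychain would still take precedence.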
## Custom Models

Magic Shell supports custom models for local or remote OpenAI-compatible endpoints. This is perfect for:

- **LM Studio** - Run models locally on your machine
- **Ollama** - Local model management
- **Self-hosted APIs** - Your own OpenAI-compatible endpoints
- **Third-party services** - Any OpenAI-compatible API
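Before registering an endpoint, it can help to confirm it is actually reachable. A minimal sketch using `curl` against the `/v1/models` route, which OpenAI-compatible servers such as LM Studio and Ollama expose (the `check_endpoint` helper is ours for illustration, not part of Magic Shell):

```bash
# Print "reachable" if an OpenAI-compatible base URL answers,
# "unreachable" otherwise. check_endpoint is an illustrative helper.
check_endpoint() {
  # -s: silent, -f: fail on HTTP errors; $1/models lists available models
  if curl -sf "$1/models" > /dev/null; then
    echo "reachable"
  else
    echo "unreachable"
  fi
}

check_endpoint "http://localhost:1234/v1"
```

If this prints `unreachable`, start the local server (or fix the URL) before running the setup wizard.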
### Adding a Custom Model

```bash
# Interactive setup wizard
msh --add-model
```

You’ll be prompted for:

- **Model ID**: a unique identifier (e.g., `my-local-llama`)
- **Display name**: a human-readable name (e.g., `Local Llama 3.2`)
- **API model ID**: the model identifier sent to the API (e.g., `llama-3.2-3b`)
- **Base URL**: the API endpoint (e.g., `http://localhost:1234/v1`)
- **API key**: optional, stored securely in your system keychain
- **Category**: `fast`, `smart`, or `reasoning`
### Managing Custom Models

```bash
# List all custom models
msh --list-custom

# Set a custom model as default
msh --model my-local-llama

# Remove a custom model
msh --remove-model my-local-llama
```

### Example: LM Studio

```bash
# Start LM Studio and load a model
# Then add it to Magic Shell:
msh --add-model

# Model ID: llama-3.2-local
# Display name: Local Llama 3.2
# API model ID: llama-3.2-3b
# Base URL: http://localhost:1234/v1
# API key: (leave empty for LM Studio)
# Category: smart

# Use it
msh --model llama-3.2-local "find all large files"
```

Custom model API keys are securely stored in your system keychain, just like provider API keys.
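Ollama works the same way: it serves an OpenAI-compatible API on `http://localhost:11434/v1` by default. A commented sketch of the equivalent setup (the model IDs below are illustrative, not prescribed by Magic Shell):

```bash
# Pull a model with Ollama, then register it with the wizard:
#   ollama pull llama3.2
#   msh --add-model

# Wizard answers:
#   Model ID: llama-3.2-ollama
#   Display name: Ollama Llama 3.2
#   API model ID: llama3.2
#   Base URL: http://localhost:11434/v1
#   API key: (leave empty for Ollama)
#   Category: smart
```

The API model ID must match the name Ollama reports (e.g., from `ollama list`), since it is sent verbatim to the endpoint.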