# Chat Providers

`vai chat` supports three LLM providers, each with different strengths, pricing, and setup requirements.

## Anthropic (Claude)

| Feature | Details |
| --- | --- |
| Models | Claude Opus, Claude Sonnet, Claude Haiku |
| Streaming | ✅ Supported |
| Tool calling | ✅ (required for agent mode) |
| Setup | API key from console.anthropic.com |

```shell
vai config set llm-provider anthropic
vai config set llm-api-key sk-ant-...
vai config set llm-model claude-sonnet-4-20250514
```

Best for: High-quality responses, agent mode, long context.

## OpenAI (GPT)

| Feature | Details |
| --- | --- |
| Models | GPT-4, GPT-4 Turbo, GPT-3.5 Turbo |
| Streaming | ✅ Supported |
| Tool calling | ✅ Supported |
| Setup | API key from platform.openai.com |

```shell
vai config set llm-provider openai
vai config set llm-api-key sk-...
vai config set llm-model gpt-4
```

Best for: Wide model selection, OpenAI-compatible endpoints.
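Because the OpenAI provider also works with OpenAI-compatible endpoints, you can point it at a compatible server by overriding the base URL. A sketch, assuming the `llm-base-url` key documented in the Ollama section below is honored for this provider too; the gateway URL here is a placeholder:

```shell
# Sketch: use the OpenAI provider against an OpenAI-compatible endpoint.
# Assumption: llm-base-url applies to this provider as it does for Ollama.
# The URL below is a placeholder for your own gateway or proxy.
vai config set llm-provider openai
vai config set llm-base-url https://my-gateway.example.com/v1
vai config set llm-api-key sk-...
vai config set llm-model gpt-4
```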

## Ollama (Local)

| Feature | Details |
| --- | --- |
| Models | Llama 3, Mistral, Phi, Gemma, etc. |
| Streaming | ✅ Supported |
| Tool calling | Depends on model |
| Setup | Install Ollama, pull a model |
| Cost | Free (runs locally) |

```shell
ollama pull llama3
vai config set llm-provider ollama
vai config set llm-base-url http://localhost:11434
vai config set llm-model llama3
```

Best for: Privacy, no API costs, offline usage.
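Before pointing `vai` at Ollama, you can confirm the server is running and the model is pulled. A quick check against Ollama's local HTTP API (`/api/tags` lists the models installed locally):

```shell
# Setup check: is the Ollama server up and llama3 pulled?
# /api/tags is Ollama's endpoint for listing local models.
curl -s http://localhost:11434/api/tags | grep -q '"name":"llama3' \
  && echo "llama3 is ready" \
  || echo "start Ollama (ollama serve) and run: ollama pull llama3"
```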

## Switching Providers

Change provider at any time:

```shell
vai config set llm-provider openai
```

Or override per session:

```shell
vai chat --llm-provider ollama --llm-model mistral
```
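If you switch back and forth often, the per-session flags compose well with a shell alias. A small convenience sketch using only the flags documented above (the alias name is arbitrary):

```shell
# Hypothetical alias: start a private, offline session on a local model
# without touching the persisted provider config.
alias vai-local='vai chat --llm-provider ollama --llm-model mistral'
```

Persisted config (`vai config set`) then stays on your primary provider, while `vai-local` gives a one-off local session.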