
Docker

vai ships with full Docker support, so you can run any vai command without installing Node.js or managing dependencies. The Docker image supports three modes: CLI commands, the web playground, and the MCP server.

Quick Start

Build the image and run a command:

docker build -t vai .
docker run --rm -e VOYAGE_API_KEY="your-key" vai embed "hello world"

Three Run Modes

CLI (one-shot commands)

Run any vai command and exit:

docker run --rm -e VOYAGE_API_KEY="your-key" vai models
docker run --rm -e VOYAGE_API_KEY="your-key" vai embed "semantic search" --json
docker run --rm -e VOYAGE_API_KEY="your-key" vai explain embeddings

Web Playground

Start the interactive web playground:

docker run --rm -p 3333:3333 -e VOYAGE_API_KEY="your-key" vai playground --no-open

Then open http://localhost:3333 in your browser.
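If you run the playground regularly, a Compose service saves retyping the flags. A minimal sketch (the service name and file layout are illustrative, not part of vai; since the image's entrypoint is vai, the command field supplies only the subcommand and flags):

```yaml
# docker-compose.yml -- hypothetical service definition for the playground
services:
  vai-playground:
    build: .
    command: playground --no-open   # appended to the vai entrypoint
    ports:
      - "3333:3333"
    env_file:
      - .env                        # supplies VOYAGE_API_KEY
```

Start it with docker compose up and open http://localhost:3333 as before.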

MCP Server

Start the MCP server for AI tool integration:

docker run --rm -p 3100:3100 -e VOYAGE_API_KEY="your-key" vai mcp-server --transport http --host 0.0.0.0 --port 3100

AI clients connect to http://localhost:3100/mcp.

Image Details

Property           Value
Base image         node:22-slim
Size               ~180 MB
Entrypoint         vai
Working directory  /data
Exposed ports      3333 (playground), 3100 (MCP server)
Healthcheck        vai ping --json

The image uses a multi-stage build: the first stage installs vai from npm, and the second stage copies only the installed package into a clean slim runtime.
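A build along those lines might look roughly like this. This is a sketch consistent with the image details above, not the actual Dockerfile shipped with vai; the stage name and install prefix are illustrative:

```dockerfile
# Stage 1: install vai from npm into an isolated prefix
FROM node:22-slim AS build
ARG VAI_VERSION=latest
RUN npm install -g "vai@${VAI_VERSION}" --prefix /opt/vai

# Stage 2: copy only the installed package into a clean slim runtime
FROM node:22-slim
COPY --from=build /opt/vai /opt/vai
ENV PATH="/opt/vai/bin:${PATH}"
WORKDIR /data
EXPOSE 3333 3100
HEALTHCHECK CMD vai ping --json || exit 1
ENTRYPOINT ["vai"]
```

Keeping npm and its build cache out of the final stage is what holds the image to roughly 180 MB.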

Credentials

Pass credentials as environment variables at runtime. Never bake secrets into the image.

Variable            Required          Description
VOYAGE_API_KEY      Yes               Voyage AI API key
MONGODB_URI         For search/store  MongoDB Atlas connection string
VAI_LLM_PROVIDER    For chat          LLM provider: anthropic, openai, ollama
VAI_LLM_API_KEY     For chat          LLM provider API key
VAI_LLM_MODEL       For chat          Model name override
VAI_LLM_BASE_URL    For Ollama        Ollama API base URL
VAI_MCP_SERVER_KEY  For MCP auth      Bearer token for HTTP transport

You can pass these individually with -e flags or use an env file:

# Individual flags
docker run --rm -e VOYAGE_API_KEY="..." -e MONGODB_URI="..." vai search --query "hello" --db mydb --collection docs

# Env file
docker run --rm --env-file .env vai search --query "hello" --db mydb --collection docs
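An .env file for the command above might look like this (the values are placeholders, not real credentials):

```shell
# .env -- pass with --env-file; never commit this file
VOYAGE_API_KEY=your-voyage-key
MONGODB_URI=mongodb+srv://user:pass@cluster.example.mongodb.net/
```

Docker reads the file as plain KEY=value lines, so no quoting or export keywords are needed.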

Mounting Local Files

Commands that read files (like pipeline, ingest, store) need access to your local filesystem. Mount your data directory into the container's /data path:

docker run --rm --env-file .env -v "$(pwd):/data:ro" vai pipeline ./docs/ --db myapp --collection knowledge

The :ro suffix mounts the directory read-only, which is recommended for ingestion workloads: vai only needs to read your files, and the container cannot modify them.

Pinning a Version

By default, the Dockerfile installs the latest vai release. To pin a specific version:

docker build -t vai --build-arg VAI_VERSION=1.27.0 .

Next Steps