
Model Configuration

Cody CLI supports multiple AI models via the AINative gateway. You can select a model per session or set a default.

Available Models

Model ID                     Parameters  Best For                      Plan
qwen-coder-32b               32B         Default; best code quality    Pro+
qwen-coder-7b                7B          Fast iteration, simple tasks  Free
nouscoder-14b                14B         Balanced code model           Pro+
gemma-9b                     9B          General text and reasoning    Free
deepseek-r1-distill-qwen-7b  7B          Step-by-step reasoning        Free

Selecting a Model

Per-session flag

cody --model qwen-coder-7b

Interactive model picker

Run /model inside an interactive session to switch models on the fly.

Set a default in settings

Add to ~/.cody/settings.json:

{
  "model": "qwen-coder-32b"
}

Context Limits

Each model has a maximum context window. Cody automatically manages context to stay within limits, summarizing older turns when needed.
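The trimming behavior described above can be sketched as follows. This is an illustrative assumption, not Cody's actual implementation: the token heuristic, budget split, and summary placeholder are all made up for the example.

```python
# Illustrative sketch of context-window management (assumed behavior,
# NOT Cody's actual implementation). Once the estimated token count
# exceeds the model's limit, older turns collapse into a summary.

def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token.
    return max(1, len(text) // 4)

def fit_context(turns: list[str], max_tokens: int) -> list[str]:
    total = sum(estimate_tokens(t) for t in turns)
    if total <= max_tokens:
        return turns
    # Keep the most recent turns within half the budget, leaving
    # room for a summary of everything older.
    budget = max_tokens // 2
    kept: list[str] = []
    for turn in reversed(turns):
        cost = estimate_tokens(turn)
        if budget - cost < 0:
            break
        budget -= cost
        kept.append(turn)
    kept.reverse()
    older = turns[: len(turns) - len(kept)]
    # Stand-in for a real model-generated summary.
    summary = f"[summary of {len(older)} earlier turn(s)]"
    return [summary] + kept
```

Under this sketch, a long-running session keeps recent turns verbatim and replaces the oldest ones with a single compact summary entry.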

Tool Support

Not all models support tool use (file read/write, bash, search). Reasoning models (deepseek-r1-*) do not support tools — use qwen-coder-* or nouscoder-* for agentic tasks.
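A minimal sketch of the gating you might apply in your own scripts before launching an agentic task. The prefix list mirrors the guidance above; the helper functions are hypothetical and not part of the Cody CLI.

```python
# Illustrative guard (not part of Cody): reject tool-requiring tasks
# on models that lack tool support. Per the guidance above, only
# qwen-coder-* and nouscoder-* models are treated as tool-capable.
TOOL_CAPABLE_PREFIXES = ("qwen-coder-", "nouscoder-")

def supports_tools(model_id: str) -> bool:
    return model_id.startswith(TOOL_CAPABLE_PREFIXES)

def check_model(model_id: str, needs_tools: bool) -> None:
    if needs_tools and not supports_tools(model_id):
        raise ValueError(
            f"{model_id} does not support tool use; "
            "choose a qwen-coder-* or nouscoder-* model"
        )
```

For example, `check_model("deepseek-r1-distill-qwen-7b", needs_tools=True)` raises, while any qwen-coder-* model passes.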

Fast Mode

Toggle fast mode with /fast in an interactive session, or:

cody --fast

Fast mode uses the fast model (qwen-coder-7b by default) for quicker responses.
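If the fast model is configurable, it would presumably live alongside "model" in ~/.cody/settings.json. The "fastModel" key name below is an assumption for illustration; this page does not confirm it.

```json
{
  "model": "qwen-coder-32b",
  "fastModel": "qwen-coder-7b"
}
```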