Supported Large Language Models
Chat and Commands
Cody supports a variety of cutting-edge large language models for Chat and Commands, allowing you to select the best model for your use case. Free users default to Anthropic's Claude 3 Sonnet, while Pro users can select any supported model.
Provider | Model | Free | Pro | Enterprise |
---|---|---|---|---|
OpenAI | GPT-3.5 Turbo | - | ✅ | ✅ |
OpenAI | GPT-4 | - | - | ✅ |
OpenAI | GPT-4 Turbo | - | ✅ | ✅ |
OpenAI | GPT-4o | - | ✅ | ✅ |
Anthropic | Claude 3 Haiku | - | ✅ | ✅ |
Anthropic | Claude 3 Sonnet | ✅ | ✅ | ✅ |
Anthropic | Claude 3 Opus | - | ✅ | ✅ |
Mistral | Mixtral 8x7B | - | ✅ | - |
Mistral | Mixtral 8x22B | - | ✅ | - |
Ollama | variety of local models | experimental | experimental | - |
Google | Gemini 1.5 Pro | - | ✅ | ✅ |
Google | Gemini 1.5 Flash | - | ✅ | ✅ |
To use the Claude 3 models (Opus and Sonnet) with Cody Enterprise, make sure your Sourcegraph instance is upgraded to the latest version.
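On Cody Enterprise, the chat and autocomplete models are chosen by a site administrator in the Sourcegraph site configuration rather than per user. The snippet below is a minimal sketch of that `completions` block, assuming the instance routes requests through Cody Gateway; the provider value and model identifiers shown are illustrative and may differ depending on your Sourcegraph version, so check the configuration reference for the exact names your instance accepts.

```jsonc
{
  // Site configuration (Site admin > Configuration).
  // Illustrative values only -- verify the provider and model identifiers
  // supported by your Sourcegraph version before applying.
  "cody.enabled": true,
  "completions": {
    "provider": "sourcegraph",                         // route requests via Cody Gateway
    "chatModel": "anthropic/claude-3-opus-20240229",   // model used for Chat and Commands
    "fastChatModel": "anthropic/claude-3-haiku-20240307",
    "completionModel": "anthropic/claude-instant-1.2"  // low-latency autocomplete model
  }
}
```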
Autocomplete
Cody uses a separate set of models for autocomplete, chosen for the low-latency requirements of inline suggestions.
Provider | Model | Free | Pro | Enterprise |
---|---|---|---|---|
Fireworks.ai | StarCoder | ✅ | ✅ | ✅ |
Anthropic | Claude Instant | - | - | ✅ |
Ollama | variety of local models (see note below) | experimental | experimental | - |
For information on context token limits, see our token limits documentation.
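The experimental Ollama rows above refer to running models locally and pointing the Cody VS Code extension at your own Ollama server. The snippet below is a minimal sketch of the VS Code settings involved; because this support is experimental, the setting names and the example model tag are assumptions that may change between releases.

```jsonc
{
  // VS Code settings.json -- experimental local autocomplete via Ollama.
  // Setting names and the model tag are illustrative and may change,
  // since Ollama support is experimental.
  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    "url": "http://localhost:11434",            // default local Ollama endpoint
    "model": "deepseek-coder:6.7b-base-q4_K_M"  // any code model you have pulled locally
  }
}
```

Pull the model first (for example with `ollama pull`) and keep the Ollama server running while you use autocomplete.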