Installing Cody for JetBrains
Learn how to use Cody and its features with the JetBrains IntelliJ editor.
The Cody extension by Sourcegraph enhances your coding experience in your IDE by providing intelligent code suggestions, context-aware completions, and advanced code analysis. This guide will walk you through the steps to install and set up Cody within your JetBrains environment.
Prerequisites
- You have the latest version of a JetBrains IDE installed
- You have a Free or Pro account via Sourcegraph.com or a Sourcegraph Enterprise account
- Cody is compatible with a range of JetBrains IDEs; this guide focuses on IntelliJ IDEA
Install the JetBrains IntelliJ Cody extension
Follow these steps to install the Cody plugin:
- Open the JetBrains IntelliJ editor on your local machine
- Open Settings (⌘+, on macOS or Ctrl+Alt+S on Windows) and select Plugins
- Search for the Cody: AI Coding Assistant with Autocomplete & Chat extension and click Install
Alternatively, you can download and install the extension from the JetBrains Marketplace.
Connect the extension to Sourcegraph
After a successful installation, Cody's icon appears in the sidebar. Clicking it prompts you to sign in with a code host account (GitHub or GitLab) or your Google login. This allows Cody to access your Sourcegraph.com account.
Alternatively, you can click Sign in with an Enterprise Instance to connect to your Sourcegraph Enterprise instance.
For Sourcegraph Enterprise users
To connect the extension with your Enterprise instance:
- Click Sign in with an Enterprise Instance
- Enter the URL of your enterprise instance (e.g., https://<your-instance>.sourcegraph.com)
- Select Generate new token. You'll be directed to the Access tokens page on your instance in the browser
- Generate a new token, copy it, and paste it into the Token field in your editor
- Click Sign in
For Sourcegraph.com users
Cody Free and Cody Pro users can select Log in through SSO to authenticate the IntelliJ extension with their Sourcegraph.com account.
Verifying the installation
Once Cody is successfully connected, you'll see that the sign-in panel has been replaced by a welcome message from Cody. Let's create an autocomplete suggestion to verify that the Cody extension has been successfully installed and is working as expected.
Cody provides intelligent code suggestions and context-aware autocompletions for numerous programming languages, such as JavaScript, Python, TypeScript, and Go.
- Create a new file in IntelliJ, for example, code.js
- Next, start typing the following function to sort an array of numbers: function bubbleSort(array)
- As you type, Cody automatically provides suggestions and context-aware completions based on your coding patterns and the code context
- These autocomplete suggestions appear as grayed-out text. To accept a suggestion, press the Tab key; a completed example is sketched below
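Assuming Cody completes the function body once you accept its suggestions, you might end up with something like the sketch below. The exact completion varies with your code and context; this is only an illustration of the kind of output to expect.

```javascript
// Illustrative sketch of a completed bubbleSort; Cody's actual suggestion may differ.
function bubbleSort(array) {
  for (let i = 0; i < array.length - 1; i++) {
    for (let j = 0; j < array.length - i - 1; j++) {
      if (array[j] > array[j + 1]) {
        // Swap adjacent elements that are out of order
        [array[j], array[j + 1]] = [array[j + 1], array[j]];
      }
    }
  }
  return array;
}

console.log(bubbleSort([5, 3, 8, 1])); // [1, 3, 5, 8]
```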
Autocomplete
Cody provides multi-line autocomplete as you type. Autocomplete suggestions appear as inlay suggestions and are enabled by default in your JetBrains IntelliJ editor, with a default set of supported programming languages already enabled.
To manually configure the Autocomplete feature:
- Go to the Cody Settings... from the Cody icon in the sidebar
- Next, click the Sourcegraph & Cody dropdown and select Cody
- The Autocomplete settings will appear with the list of Enabled Languages
Autocomplete suggestions use the same color as inline parameter hints, according to your configured editor theme. However, you can enable the Custom color for completions checkbox to pick a color of your choice.
In addition, you can use the following keyboard shortcuts to interact with Cody's autocomplete suggestions:
- Tab to accept a suggestion
- Alt + [ (Windows) or Opt + [ (macOS) to cycle through suggestions
- Alt + \ (Windows) or Opt + \ (macOS) to manually trigger autocomplete if no suggestions have been returned
Commands
Cody with JetBrains offers quick, ready-to-use Commands for common actions to write, describe, fix, and smell code. These allow you to run predefined actions with smart context-fetching anywhere in the editor.
These are broadly categorized into the Edit Commands and Chat Commands.
- Edit Code: Makes inline code edits. You also get the option to select the LLM used for edit suggestions
- Document Code: Creates inline documentation for your code
- Generate Unit Test: Creates inline unit tests for your code
- Smell Code: Finds code smells in your file
- Explain Code: Explains code in your file
Let's learn how to use some of these commands:
Inline code edits
You can make edits to your code directly in your file without opening the chat window. The Edit Code command makes direct code edits, refactors, or bug fixes.
You can run the inline edit command on a selected code snippet or an entire file, or use it to generate code on a new line. To do so, use the Edit Code command in the Cody sidebar or context menu, or press the Shift + Ctrl + Enter shortcut. This opens a floating editor where you can describe the change you want to make.
Once you enter your prompt, Cody will perform inline edits that you can Accept, Undo, or Show diff for the change. You can also click Edit & Retry to iterate your prompt and get alternate suggestions.
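For example, selecting the bubbleSort function from the verification step and entering a prompt such as "stop early when a pass makes no swaps" might produce an edit along the lines of the sketch below. This is only an illustration; Cody's actual suggestions depend on your code, prompt, and selected LLM.

```javascript
// Illustrative sketch of an inline edit Cody might propose; actual output varies.
function bubbleSort(array) {
  for (let i = 0; i < array.length - 1; i++) {
    let swapped = false;
    for (let j = 0; j < array.length - i - 1; j++) {
      if (array[j] > array[j + 1]) {
        [array[j], array[j + 1]] = [array[j + 1], array[j]];
        swapped = true;
      }
    }
    // Early exit: the array is already sorted if no swaps occurred in this pass
    if (!swapped) break;
  }
  return array;
}
```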
Inline unit tests
The Generate Unit Test command uses the code edit functionality to create unit tests. To generate unit tests for your code, press Shift + Ctrl + T or click the Generate Unit Test command in the Cody sidebar or from the context menu.
When you run this command, it performs the following actions:
- Checks if you're using a test framework
- Checks if you already have a test file created
- Inserts new unit tests directly into the existing test file, or creates a new file if none exists
Here, you can Accept, Undo, or Show diff for the change.
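For the bubbleSort example above, the generated tests might look something like the sketch below. It assumes a Jest-style test framework and that bubbleSort is exported from code.js; the actual tests Cody produces depend on your project setup.

```javascript
// Illustrative sketch of generated unit tests (Jest-style syntax assumed).
// Assumes code.js exports bubbleSort, e.g. module.exports = { bubbleSort };
const { bubbleSort } = require('./code');

describe('bubbleSort', () => {
  test('sorts an unsorted array of numbers', () => {
    expect(bubbleSort([5, 3, 8, 1])).toEqual([1, 3, 5, 8]);
  });

  test('returns an empty array unchanged', () => {
    expect(bubbleSort([])).toEqual([]);
  });

  test('leaves an already sorted array in order', () => {
    expect(bubbleSort([1, 2, 3])).toEqual([1, 2, 3]);
  });
});
```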
Document your code
Like the Edit Code and Generate Unit Test commands, you can generate inline documentation for your code without opening the chat window. The Document Code command is available via hotkey, the context menu, and the Cody sidebar. Press Shift + Ctrl + H, and Cody will start generating documentation for your selected code snippet or the entire file.
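Running the command on the bubbleSort function, for instance, might produce documentation along these lines. The comment below is only an illustrative sketch; the wording and format of Cody's output will vary.

```javascript
// Illustrative sketch of the kind of documentation the Document Code command can generate.
/**
 * Sorts an array of numbers in ascending order using the bubble sort algorithm.
 *
 * @param {number[]} array - The array of numbers to sort.
 * @returns {number[]} The same array, sorted in place.
 */
function bubbleSort(array) {
  // ...existing implementation...
}
```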
Supported LLM models
For Cody Free users, Claude 3 Sonnet is the default LLM model for Chat and Commands. Cody Pro users can choose from a list of supported LLM models for Chat and Commands: Claude Instant, Claude 2.0, Claude 2.1, Claude 3 (Opus and Sonnet), GPT-3.5 Turbo, GPT-4 Turbo Preview, GPT-4o, and Mixtral.
Enterprise users get Claude 3 (Opus and Sonnet) as the default LLM models at no extra cost. You also get additional capabilities like BYOLLM (Bring Your Own LLM), supporting Single-Tenant and Self-Hosted setups for flexible coding environments. Your site administrator determines the LLM, and it cannot be changed within the editor. However, Cody Enterprise users using Cody Gateway can configure custom models from Anthropic (such as Claude 2.0 and Claude Instant) and OpenAI (GPT-3.5 and GPT-4).
Chat history
Next to the Chat tab is Chat History, which displays Cody's previous chat interactions. You can revisit any previous chat interactions to get the context of the conversation. You can export your chat history as JSON files for later use.
Context fetching mechanism
JetBrains users on the Free or Pro plan can leverage local search as the primary context source for Cody chat. Local or remote embeddings will no longer be produced or used as a context source.
Enterprise users can leverage the full power of the Sourcegraph search engine as the primary context provider to Cody.
Context scope
JetBrains users on the Free or Pro plan get single-repo support in chat and can use one repo for context fetching. Enterprise users get multi-repo support in chat and can explicitly specify up to 9 additional repos (10 total, including the default project) they would like Cody to use for context.
Context selection
Cody automatically understands the context of your codebase for all Cody Free, Pro, and Enterprise users based on the project opened in your workspace. Enterprise users can add up to 9 additional repos (10 total, including the default project) to use as context. The multi-repo context for Enterprise is powered by Sourcegraph code search and allows Cody to use the selected codebase to answer your questions.
Moreover, Cody's chat allows you to add files as context in your messages. Type @-file in the Cody chat window and then a filename to include a file as context.
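For example, a chat message using the bubbleSort file from earlier might look like this (the filename and question are illustrative):

```
@-file code.js How can I make bubbleSort in this file more efficient?
```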
Chat with multi-repo context
The chat sidebar has an interface for selecting chat context. For Cody Free and Cody Pro users, you can turn the local chat context on or off. When the local context is turned off, Cody doesn't search your local project for context and sends your prompt to the selected LLM.
Cody Enterprise users can add remote repositories from their Sourcegraph instance. Type the names of your repositories into this interface and select up to 9 additional repos (10 total, including the default project). Cody will then search those repositories and retrieve relevant files to answer your chat prompts.
Cody Context Filters
Cody Context Filters are available in version 6.0.0 and above. Admins on a Sourcegraph Enterprise instance can use Cody Context Filters to determine which repositories Cody can use as context in its requests to third-party LLMs. Inside your site configuration, you can define a set of include and exclude rules that will be used to filter the list of repositories Cody can access.
For repos mentioned in the exclude field, Cody's commands are disabled, and you cannot use them for context fetching. If you try running any of these commands, you'll be prompted with an error message. However, Cody chat will still work, and you can use it to ask questions.
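For illustration, a pair of include and exclude rules in the site configuration might look like the sketch below. The property names, keys, and repository patterns shown here are assumptions for illustration only; consult your Sourcegraph instance's site configuration reference for the exact schema.

```json
{
  "cody.contextFilters": {
    "include": [
      { "repoNamePattern": "^github\\.com/your-org/.*" }
    ],
    "exclude": [
      { "repoNamePattern": "^github\\.com/your-org/secret-repo$" }
    ]
  }
}
```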
Read more about the Cody Context Filters here →
Add or remove account
To add or remove an account, do the following:
- Open your IntelliJ settings by selecting IntelliJ IDEA | Settings on macOS or File | Settings on Windows and Linux from the main menu
- Go to the Cody settings by navigating to Tools -> Sourcegraph & Cody
- Under Authentication, see the accounts that are currently logged in
- To remove an account, select it and click -. To add an account, click + and choose the appropriate login method