Cody for VS Code v1.22: Introducing Gemini 1.5 Flash and Pro support

Cody for VS Code v1.22 is now available. This update adds support for two of Google’s latest Gemini 1.5 models, Flash and Pro. It also adds buttons for quickly adding your current repository or file as context to a chat, plus UI tweaks that more clearly show which files Cody is using as context.

Support for Gemini 1.5 Flash + Gemini 1.5 Pro

Google recently announced new models in the Gemini family, including Gemini 1.5 Flash and Gemini 1.5 Pro. Both models are now available to Cody Pro users in VS Code.

Gemini 1.5 Flash is a lightweight model built for speed and efficiency. Gemini 1.5 Pro is a larger model optimized for high performance across many tasks. In the LLM menu, Flash is listed under “Optimized for Speed” and Pro under “Optimized for Accuracy.”

Both Gemini models use expanded context windows (the same ones we introduced for the Claude 3 models in v1.14.0):

  • 30,000 tokens of user-defined context
  • 15,000 tokens of input context

Try them out, and let us know what you think!

Cody chat with Gemini 1.5 Flash

Quickly add @repository and @file to chat

While using chat in Cody, you can prompt it to use specific context by typing @<repository> or @<file>. When you start a new chat with Cody, your current repository and file are pre-populated in the chat window.

We’re making it easier to add these @-mentions to the chat window for follow-up messages (or in case you accidentally delete them).

When you type @ in the chat window, you’ll see “Repository” and “Current File” in the dropdown. Clicking either one adds the corresponding @-mention to the chat.

Cody's current codebase and current file context buttons

See the exact context used for follow-up messages

When you start a new chat in Cody, you’ll see a message showing the context used to respond to the first question. Previously, for follow-up messages after that point, only net-new context was shown as a line item in the chat. Cody actually uses all prior context plus any new context for follow-ups, but this wasn’t clear.

Now, Cody shows the net-new context and notes that prior messages are also used as context, making it clearer that context from earlier in a thread is preserved for follow-ups.

Cody showing that prior context was used in chat

Changelog

See the changelog and GitHub releases for a complete list of changes.

Thank you

Cody wouldn’t be what it is without our amazing contributors 💖 A big thank you to everyone who contributed, filed issues, and sent us feedback.

We value your feedback in our support forum, Discord, and GitHub. Happy Codying!


To get started with Cody, install it from the VS Code Marketplace.
