Windsurf Features

Autocomplete

Overview

Codeium Autocomplete is powered by our best-in-class proprietary model, trained from scratch to optimize for speed and accuracy.

Our autocomplete makes in-line and multi-line suggestions based on the context of your code.

Suggestions appear in grey text as you type. You can press esc to cancel a suggestion. Suggestions will also disappear if you continue typing or navigating without accepting them.

Keyboard Shortcuts

Here are the shortcuts for macOS. Replace ⌘ with Ctrl and ⌥ with Alt to get the corresponding shortcuts on Windows/Linux.

  • Accept suggestion: tab
  • Cancel suggestion: esc
  • Accept suggestion word-by-word: ⌘+→ (VS Code), ⌥+⇧+\ (JetBrains)
  • Next/previous suggestion: ⌥+]/⌥+[
  • Trigger suggestion: ⌥+\

Autocomplete Speeds

You can set the speed of the Autocomplete in your settings.

Fast Autocomplete is currently only available to our Pro, Teams, and Enterprise Users.

Tips

Inline Comments

You can instruct autocomplete with the use of comments in your code. Codeium will read these comments and suggest the code to bring the comment to life.

This method can get you good mileage, but if you’re finding value in writing natural-language instructions and having the AI execute them, consider using Codeium Command.
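
For example (illustrative only; the actual suggestion depends on your surrounding code), a descriptive comment above an empty function is often enough to elicit a complete body:

    # Return the n-th Fibonacci number iteratively
    def fib(n):
        # A completion like the following may then appear as grey text:
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a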

Fill In The Middle (FIM)

Codeium’s Autocomplete can Fill In The Middle (FIM).

Read more about in-line FIM on our blog.
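
As a rough sketch (the snippet and suggestion are hypothetical), FIM means the model can complete code between an existing prefix and suffix rather than only at the end of what you have typed:

    def clamp(value, low, high):
        """Constrain value to the inclusive range [low, high]."""
        if value < low:
            return low
        # With the cursor here, FIM uses the `return value` line below as
        # suffix context and can suggest: if value > high: return high
        return value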

Snooze

Click the Codeium widget in the status bar towards the bottom right of your editor to see the option to switch Autocomplete off, either temporarily or until you reenable it.

Chat

Overview

Converse with a codebase-aware AI

Chat and its related features are only supported in: VS Code, JetBrains IDEs, Eclipse, Xcode, and Visual Studio.

Chat in Windsurf is integrated within Cascade. Set to “Chat” mode to replicate the original experience.

Codeium Chat enables you to talk to your codebase from within your editor. Chat is powered by our industry-leading context awareness engine. It combines thoughtfully-designed, built-in context retrieval with optional user guidance to provide accurate and grounded answers.

  • VS Code
  • JetBrains

In VS Code, Codeium Chat can be found by default on the left sidebar. If you wish to move it elsewhere, you can click and drag the Codeium icon and relocate it as desired.

You can use ⌘+⇧+A on Mac or Ctrl+⇧+A on Windows/Linux to open the chat panel and toggle focus between it and the editor. You can also pop the chat window out of the IDE entirely by clicking the page icon at the top of the chat panel.

@-Mentions

An @-mention is a deterministic way of bringing in context, and is guaranteed to be part of the context used to respond to a chat.

In any given chat message you send, you can explicitly refer to context items from within the chat input by prefixing a word with @.

Context items available to be @-mentioned:

  • Functions & classes
    • Only functions and classes in the local index are available
    • Only available for languages we have built AST parsers for (Python, TypeScript, JavaScript, Go, Java, C, C++, PHP, Ruby, C#, Perl, Kotlin, Dart, Bash, COBOL, and more)
  • Directories and files in your codebase
  • Remote repositories
  • The contents of your in-IDE terminal (VS Code only).

You can also try @diff, which lets you chat about your repository’s current git diff state. The @diff feature is currently in beta.

If you want to pull a section of code into the chat and you don’t have @-Mentions available, you can highlight the code, right-click, and select ‘Codeium: Explain Selected Code Block’.
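
For example, a message with @-mentions might look like the following (the function and directory names here are placeholders):

    Why does @validate_order reject empty carts, and where is it called from within @checkout/?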

Persistent Context

You can instruct the chat model to use certain context throughout a conversation and across different conversations by configuring the Context tab in the chat panel.

Chat shows you the context it is considering.

In this tab, you can see:

  • Custom Chat Instructions: a short prompt guideline like “Respond in Kotlin and assume I have little familiarity with it” to orient the model towards a certain type of response.
  • Pinned Contexts: items from your codebase like files, directories, and code snippets that you would like explicitly for the model to take into account. See also Context Pinning.
  • Active Document: a marker for your currently active file, which receives special focus.
  • Local Indexes: a list of local repositories that the Codeium context engine has indexed.

Slash Commands

You can prefix a message with /explain to ask the model to explain something of your choice. Currently, /explain is the only supported slash command. Let us know if there are other common workflows you want wrapped in a slash command.
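
For example (the mentioned function name is a placeholder):

    /explain what the retry logic in @fetch_with_backoff does and when it gives up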

Copy and Insert

Sometimes, Chat responses will contain code blocks. You can copy a code block to your clipboard or insert it directly into the editor at your cursor position by clicking the appropriate button atop the code block.

If you would like the AI to enact a change directly in your editor based on an instruction, consider using Codeium Command.

Inline Citations

Chat is aware of code context items, and its responses often contain linked references to snippets of code in your files.

Regenerate with Context

By default, Codeium makes a judgment call whether any given question is general or if it requires codebase context.

You can force the model to use codebase context by submitting your question with ⌘⏎. For a question that has already received a response, you can rerun it with context by clicking the sparkle icon.

Stats for Nerds

Lots of things happen under the hood for every chat message. You can click the stats icon to see these statistics for yourself.

Chat History

To revisit past conversations, click the history icon at the top of the chat panel. You can click the + to create a new conversation, and you can click the export button to export a conversation.

Settings

Click on the Settings tab to update your theme preferences (light or dark) and font size. The settings panel also gives you an option to download diagnostics, which are debug logs that can be helpful for the Codeium team to debug an issue should you encounter one.

Telemetry

You may encounter issues with Chat if Telemetry is not enabled.

  • VS Code
  • JetBrains

To enable telemetry, open your VS Code settings and navigate to User > Application > Telemetry. In the following dropdown, select “all”.
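
If you prefer editing settings.json directly, this corresponds to VS Code’s built-in telemetry setting (a standard VS Code setting, not a Codeium-specific one):

    "telemetry.telemetryLevel": "all"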

Models

While we provide and train our own dedicated models for Chat, we also give you the flexibility to choose your favorites.

It’s worth noting that the Codeium models are tightly integrated with our reasoning stack, leading to better quality suggestions than external models for coding-specific tasks.

Due to our industry-leading infrastructure, we are able to offer them for free (or at very low cost) to our users.

Model selection can be found directly under the chat.

Base Model ⚡

Access: All users

Available for unlimited use to all users is a fast, high-quality Codeium Chat model based on Meta’s Llama 3.1 70B.

This model is optimized for speed and is the fastest model available in Codeium Chat, while remaining highly accurate.

Codeium Premier 🚀

Access: All paying users (Pro, Teams, Enterprise, etc.)

Available in our paid tier is unlimited usage of our premier Codeium Chat model based on Meta’s Llama 3.1 405B.

This is the highest-performing model available for use in Codeium, due to its size and integration with Codeium’s reasoning engine and native workflows.

Other Models (GPT-4o, Claude 3.5 Sonnet)

Access: All paying users (Pro, Teams, Enterprise, etc.)

Codeium provides access to OpenAI’s and Anthropic’s flagship models, available for use in any of our paid tiers.

Command

Overview

AI-powered in-line edits

Command is currently only available in VS Code and JetBrains IDEs.

Codeium Command allows you to generate new code or edit existing code via natural language inputs, directly in the editor window.

  • Windsurf
  • VS Code
  • JetBrains

To invoke Command, press ⌘+I on Mac or Ctrl+I on Windows/Linux. From there, you can enter a prompt in natural language and hit the Submit button (or ⌘+⏎/Ctrl+⏎) to forward the instruction to the AI. Codeium will then provide a multiline suggestion that you can accept or reject.

If you highlight a section of code before invoking Command, then the AI will edit the selection spanned by the highlighted lines. Otherwise, it will generate code at your cursor’s location.

You can accept, reject, or follow up on a generation by clicking the corresponding code lens above the generated diff, or by using the appropriate shortcuts (⌘+⏎ to accept, ⌘+⌫ to reject).
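
As a sketch (both the code and the prompt are illustrative), highlighting a function and invoking Command with a prompt like “add input validation” might produce an edit along these lines:

    # Highlighted selection, sent to Command with the prompt "add input validation"
    def divide(a, b):
        return a / b

    # A possible suggested edit, presented as a diff you can accept or reject
    def divide(a, b):
        if b == 0:
            raise ValueError("b must be non-zero")
        return a / b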

In Windsurf, you can select your desired model to use for Command from the dropdown.

Codeium Fast is the fastest, most accurate model available.

Terminal Command

You can also open Command in the terminal in case you don’t remember the exact syntax of what you want to run.

Best Practices

Command is great for file-scoped, in-line changes that you can describe as an instruction in natural language. Here are some pointers to keep in mind:

  • The model that powers Command is larger than the one powering autocomplete. It is slower but more capable, and it is trained to be especially good at instruction-following.
  • If you highlight a block of code before invoking Command, it will edit the selection. Otherwise, it will do a pure generation.
  • Using Command effectively can be an art. Simple prompts like “Fix this” or “Refactor” will likely work thanks to Codeium’s context awareness. A specific prompt like “Write a function that takes two inputs of type Diffable and implements the Myers diff algorithm” that contains a clear objective and references to relevant context may help the model even more.

Refactors, Docstrings, and More

Features powered by Command

Command enables streamlined experiences for a few common operations.

Function Refactors and Docstring Generation

Above functions and classes, Codeium renders code lenses, which are small, clickable text labels that invoke Codeium’s AI capabilities on the labeled item.

You can disable code lenses by clicking the icon to the right of the code lens text.

The Refactor and Docstring code lenses in particular will invoke Command.

  • If you click Refactor, Codeium will prompt you with a dropdown of selectable, pre-populated instructions that you can choose from. You can also write your own. This is equivalent to highlighting the function and invoking Command.
  • If you click Docstring, Codeium will generate a docstring for you above the function header. (In Python, the docstring will be correctly generated underneath the function header.)

Encouraging readable and maintainable code, one docstring at a time.
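
For instance (a hypothetical function; the generated text depends on your code), clicking Docstring above a Python function might yield something like:

    def moving_average(values, window):
        """Return the simple moving averages of `values`.

        Args:
            values: Sequence of numbers to average.
            window: Number of trailing elements in each average.

        Returns:
            A list of averages, one per full window.
        """
        return [sum(values[i - window:i]) / window
                for i in range(window, len(values) + 1)]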

Smart Paste

This feature allows you to copy code written in one programming language and paste it into a file in your IDE written in a different one. Use ⌘+⌥+V (Mac) or Ctrl+Alt+V (Windows/Linux) to invoke Smart Paste. Behind the scenes, Codeium will detect the language of the destination file and use Command to translate the code in your clipboard. Codeium’s context awareness will try to make the pasted code fit your codebase, for example by referencing the proper variable names.

Some possible use cases:

  • Migrating code: you’re rewriting JavaScript into TypeScript, or Java into Kotlin.
  • Pasting from Stack Overflow: you found a utility function online written in Go, but you’re using Rust.
  • Learning a new language: you’re curious about Haskell and want to see what your code would look like if written in it.
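
For example (a hypothetical snippet), if your clipboard holds a small JavaScript helper and you Smart Paste into a Python file, the translated result might look something like this:

    # Clipboard contents (JavaScript):
    #   function isBlank(s) {
    #     return s.trim().length === 0;
    #   }
    #
    # A Smart Paste into a Python file might produce:
    def is_blank(s: str) -> bool:
        return len(s.strip()) == 0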