
Thor 雷神

Originally published at thor.bio

Local and offline AI code assistant for VS Code with Ollama and Sourcegraph

I recently learned that Sourcegraph's AI coding assistant Cody can be used offline by connecting it to a locally running Ollama server.

Now, unfortunately, my little old MacBook Air doesn't have enough VRAM to run Mistral's 22B Codestral model, but fear not: I found that the Llama 3 8B model works quite well at powering both code completion and code chat workloads!

Let's have a look at how we can set this up with VS Code for absolute offline / in-flight coding bliss:

Install Ollama and pull Llama 3 8B

  1. Install Ollama
  2. Run ollama pull llama3:8b
  3. Once the download has completed, run ollama serve to start the Ollama server.
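
If you want to double-check that everything is in place before wiring up Cody, you can query the Ollama server's local HTTP API directly. The snippet below is just a quick sanity check, assuming the default port 11434 (the same address used in the Cody config further down):

# List the models Ollama has available locally;
# llama3:8b should show up in the JSON response.
curl http://127.0.0.1:11434/api/tags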

Configure Sourcegraph Cody in VS Code

  1. Install the Sourcegraph Cody VS Code extension.
  2. Add the following to your VS Code settings (settings.json):
{
  //...
  // Cody autocomplete configuration:
  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    "url": "http://127.0.0.1:11434",
    "model": "llama3:8b"
  },
  // Enable Ollama for Cody Chat:
  "cody.experimental.ollamaChat": true,
  // optional but useful to see detailed logs in the OUTPUT tab
  // (make sure to select "Cody by Sourcegraph" from the dropdown)
  "cody.debug.verbose": true
  //...
}
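Before (or instead of) debugging inside VS Code, it can help to confirm that the model itself responds on the endpoint Cody is pointed at. Here's a quick smoke test against Ollama's generate API; the prompt is just an arbitrary example:

# Ask llama3:8b for a completion directly via the Ollama API.
# "stream": false returns a single JSON object with the output in "response".
curl http://127.0.0.1:11434/api/generate -d '{
  "model": "llama3:8b",
  "prompt": "Write a one-line hello world in Python.",
  "stream": false
}'

If this returns a sensible completion but Cody still doesn't, the problem is on the extension side, and the verbose logs enabled above (OUTPUT tab, "Cody by Sourcegraph") are the next place to look.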

Start Cody and enjoy your local, offline AI code assistant

That's it: as long as Ollama is running in the background, you should now have a fully functional offline AI code assistant for VS Code with Cody. This setup allows you to use both code completion and code chat features without relying on any external services or an internet connection. In fact, most of this last paragraph was written by Llama 3 8B itself.

For Cody Chat, make sure to select the llama3:8b Experimental option from the dropdown, and you're good to go! Happy Cod(y)ing \o/

Top comments (1)

Carlos Magno Abreu

This worked in my home environment but doesn't work on the computer at my work. The only difference is that I'm using a local Ollama server at home and a hosted Ollama server (a dedicated server here) at work. I can't see the "Experimental" models option in the model selection combo box.