Running DeepSeek AI Locally and Chatting from VS Code

Daniel Kelly
Updated: February 11th, 2025

DeepSeek is an open-source LLM that rivals OpenAI’s o1 model in quality, without the hefty price tag. You can chat with it directly via the official web app, but if you’re concerned about data privacy, you can also download the model to your local machine and run it with confidence that your data isn’t going anywhere you don’t want it to. Once it’s available locally, you can interact with it in all kinds of ways.

Install Ollama

The first step to running DeepSeek locally is to download Ollama. You can think of it as npm, but for LLMs. You can install it from the Ollama website for Mac, Windows, or Linux and follow the instructions it gives to get set up.

Once it’s installed, you can verify it’s set up correctly by running the following command in your terminal:

ollama -v

If all is well, you’ll see the version of Ollama that was installed.
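For example, the output looks something like this (the exact version number will depend on when you install):

ollama version is 0.5.7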

Install Deepseek Locally with Ollama

Next, you can view the variations of the DeepSeek model available for download on this page. There are multiple distilled models to choose from; pick the one that best fits the amount of disk space you’re willing/able to use.

The DeepSeek page on Ollama’s website, showing the variations available for download.

Once you’ve chosen your variation, install it by copying and running the command to the right of the select dropdown. For example, for the 7b model (the one I chose), you’d run:

ollama run deepseek-r1:7b

And that’s it! Now you can start chatting with the model directly from your terminal.

![Screenshot of DeepSeek running in a Mac terminal via the ollama run command](https://i.imgur.com/Wa5DzDS.png)

If you’d like to quit running DeepSeek, you can press Ctrl+D.
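A couple of other Ollama commands are handy here: if you’d rather download a model without immediately starting a chat session, ollama pull does just that, and ollama list shows which models are installed locally.

ollama pull deepseek-r1:7b
ollama list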

Chat with DeepSeek from VS Code (Visual Studio Code)

Chatting via the command line is fine, but it would be much nicer to access all that knowledge directly in VS Code. No problem! There are several approaches to doing this; I liked the DeepSeek for GitHub Copilot extension best.

It adds a @deepseek command to the chat panel that directs your prompt to the locally running DeepSeek instance instead of the remote models Copilot supports out of the box.
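For instance, you might type something like this into the Copilot chat panel (a hypothetical prompt, just for illustration):

@deepseek Explain what this function does and suggest a refactor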

Chatting with DeepSeek in VS Code via the DeepSeek for GitHub Copilot extension

Alternatively, you could build your own VS Code extension in under 7 minutes that does something similar, or try out the Continue VS Code extension.

Alternative Ways of Interacting with DeepSeek Installed on Your Local Machine

Besides chatting via the terminal and VS Code, there are other ways of interacting with this locally running model.

Use Local DeepSeek Model via the npm Ollama Package

Use the ollama npm package to talk to any model running on Ollama from JavaScript or TypeScript code. It’s as easy as running the model (as above), installing the dependency, and calling a chat function.

npm install ollama

Then, in your script:

import ollama from 'ollama'

// Chat with the locally installed DeepSeek model via the Ollama server
const response = await ollama.chat({
  model: 'deepseek-r1:7b', // use whichever variation you installed
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
console.log(response.message.content)
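The package also supports streaming, which is nice for the long, reasoning-heavy answers DeepSeek tends to produce. Here’s a minimal sketch, assuming the same deepseek-r1:7b model is installed:

import ollama from 'ollama'

// Setting stream: true returns an async iterable of partial responses
const stream = await ollama.chat({
  model: 'deepseek-r1:7b',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
  stream: true,
})

// Print each chunk as it arrives instead of waiting for the full reply
for await (const part of stream) {
  process.stdout.write(part.message.content)
}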

Use OpenWebUI for an In-Browser Chat App

OpenWebUI provides an interface much like the one you’re accustomed to with ChatGPT, but it runs locally in a Docker container and uses your local Ollama models as the brains.

Chatting with the local DeepSeek model with OpenWebUI from the browser

With Docker installed and running, use the following command to install and run OpenWebUI:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

That’s it! The -p 3000:8080 flag maps port 3000 on your machine to port 8080 inside the container, so you can now visit http://localhost:3000 to view the chat app.
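Because the container was started with --restart always, it will come back up whenever Docker restarts. If you want to stop it manually, use the container name from the command above:

docker stop open-webui
docker start open-webui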

Conclusion

DeepSeek is a huge win for developers who need more affordable access to world-class models. The fact that it’s open source and can easily be run privately on your own hardware makes it even better!

Daniel Kelly
Daniel is the lead instructor at Vue School and enjoys helping other developers reach their full potential. He has 10+ years of developer experience using technologies including Vue.js, Nuxt.js, and Laravel.
