Running DeepSeek AI Locally and Chatting from VS Code

Daniel Kelly
Updated: February 12th 2025

DeepSeek is an open-source LLM that rivals OpenAI’s o1 model in quality, but without the hefty price tag. You can chat with it directly via the official web app, but if you’re concerned about data privacy, you can also download the model to your local machine and run it with the confidence that your data isn’t going anywhere you don’t want it to. Once it’s available locally, you can interact with it in all kinds of ways.

Install Ollama

The first step to running DeepSeek locally is to download Ollama. You can think of it as npm, but for LLMs. You can install it from the Ollama website for Mac, Windows, or Linux and follow the instructions it gives to get set up.

Once it’s installed, you can verify it’s set up correctly by running the following command in your terminal:

ollama -v

If all is well, you’ll see the version of Ollama that was installed.
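
For example, the output looks something like this (the exact version number will differ depending on when you installed it):

ollama version is 0.5.7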

Install DeepSeek Locally with Ollama

Next, you can view which variations of the DeepSeek model are available for download on this page. There are multiple distilled models available; choose the one that best fits the amount of disk space you’re willing and able to use.

The DeepSeek page on Ollama’s website, showing the model variations available for download.

Once you’ve chosen your variation, install it by copying and running the command to the right of the select dropdown. For example, for 7b (the variant I chose), you’d run:

ollama run deepseek-r1:7b

And that’s it! The first run downloads the model weights, so it may take a while, but afterwards you can start chatting with the model directly from your terminal.

Screenshot of DeepSeek running in a Mac terminal via the ollama run command

If you’d like to quit running DeepSeek, you can press Ctrl+D (or type /bye).
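
Ollama also ships a list command that shows every model you’ve downloaded, along with its size:

ollama list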

Chat with DeepSeek from VS Code (Visual Studio Code)

Chatting via the command line is fine, but it would be much nicer if we could access all that knowledge directly in VS Code. No problem! There are several approaches to doing this, and I liked the DeepSeek for GitHub Copilot extension best.

It adds a @deepseek command to the chat panel that directs your prompt to the locally running DeepSeek instance instead of the remote models Copilot supports out of the box.
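
For instance, you might type something like this into the Copilot chat panel (the prompt itself is just an illustration):

@deepseek Why is the sky blue?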

Chatting with DeepSeek in VS Code via the DeepSeek for GitHub Copilot extension

Alternatively, you could build your own VS Code extension in under 7 minutes that does something similar, or try out the Continue VS Code extension.

Alternative Ways of Interacting with DeepSeek Installed on Your Local Machine

Besides chatting via the terminal and VS Code, there are other ways of interacting with this locally running model.

Use Local DeepSeek Model via the npm Ollama Package

Use the npm ollama package to talk to any model running on Ollama from JavaScript or TypeScript code. It’s as easy as running the model (as above), installing the dependency, and calling the chat function.

npm install ollama

import ollama from 'ollama'

// Ask the locally running DeepSeek model a question
const response = await ollama.chat({
  model: 'deepseek-r1:7b', // the variant installed earlier
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
console.log(response.message.content)
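
If you’d prefer to stream the reply token by token, the ollama package supports a stream option. Here’s a minimal sketch using the same model and question as above:

import ollama from 'ollama'

// With stream: true, chat() returns an async iterable of partial responses
const stream = await ollama.chat({
  model: 'deepseek-r1:7b',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
  stream: true,
})

// Print each chunk as it arrives
for await (const part of stream) {
  process.stdout.write(part.message.content)
}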

Use OpenWebUI for an In-Browser Chat App

OpenWebUI provides an interface much like the one you’re accustomed to with ChatGPT, but it runs locally in a Docker container and uses your local Ollama models as the brains.

Chatting with the local DeepSeek model with OpenWebUI from the browser

With Docker installed and running, use the following command to install and run OpenWebUI:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

That’s it! Now you can visit localhost:3000 (the host port mapped to the container’s port 8080 in the command above) to view the chat app.
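
Because OpenWebUI is just a Docker container (named open-webui by the --name flag above), you can stop and restart it with the standard Docker commands:

docker stop open-webui
docker start open-webui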

Conclusion

DeepSeek is a huge win for developers who need more affordable access to world-class models. The fact that it’s open source and can easily be run privately on your own hardware makes it even better!


Daniel Kelly
Daniel is the lead instructor at Vue School and enjoys helping other developers reach their full potential. He has 10+ years of developer experience using technologies including Vue.js, Nuxt.js, and Laravel.
