Run AI Locally in VS Code (FREE Copilot Alternative) | Ollama + DeepSeek Setup
By Kamlesh Bhor · 20 Apr 2026
Introduction
Artificial Intelligence is transforming how developers write code. Tools like GitHub Copilot and ChatGPT have made coding faster, but they come with subscription costs, internet dependency, and privacy concerns.
What if you could run AI directly on your laptop, inside VS Code, without any API key or subscription?
In this guide, you'll learn how to build a free, offline AI coding assistant using:
- Ollama (local AI runtime)
- DeepSeek models (coding + reasoning)
- VS Code + Continue extension
What You Will Build
By the end of this tutorial, you will have:
- ✅ An AI assistant inside VS Code
- ✅ No API key or internet connection required
- ✅ Multiple AI models working together
- ✅ Full control over your data
Prerequisites
Before starting, ensure you have:
- Windows / Mac / Linux system
- 16 GB RAM or more recommended
- VS Code installed
- Basic understanding of development (.NET preferred)
Step 1: Install Ollama (Local AI Engine)
Ollama allows you to run large language models locally with a simple command.
Installation:
- Visit: https://ollama.com
- Download and install
- Restart your system
Verify Installation:

```bash
ollama --version
```

If a version number appears, the installation succeeded.
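Besides the CLI, Ollama also exposes a local REST API (by default on port 11434). As a minimal sketch, assuming the default endpoint and Ollama's `/api/tags` route, you can list the models installed on your machine from Python:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address


def parse_model_names(tags_json: str) -> list[str]:
    """Extract model names from the JSON body returned by GET /api/tags."""
    data = json.loads(tags_json)
    return [m["name"] for m in data.get("models", [])]


def list_local_models() -> list[str]:
    """Ask the running Ollama server which models are installed."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        return parse_model_names(resp.read().decode("utf-8"))
```

With Ollama running, `list_local_models()` should return the tags you pull in the next step.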
Step 2: Install AI Models
Now install the required models:

```bash
ollama pull deepseek-coder:6.7b
ollama pull deepseek-r1:8b
ollama pull llama3:8b
```
Model Roles Explained
| Model | Purpose |
|---|---|
| DeepSeek-Coder | Code generation |
| DeepSeek-R1 | Planning & reasoning |
| Llama 3 | Code review & explanation |
Tip: using multiple models gives better results than using just one.
Step 3: Test AI Locally
Run:

```bash
ollama run deepseek-coder:6.7b
```

Try a prompt:

Create a .NET Web API controller for product CRUD

You will see full controller code generated locally.
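The same test can be scripted against Ollama's REST API instead of the interactive CLI. A sketch, assuming the default endpoint and the documented `/api/generate` route with `stream` set to `False` so the whole answer arrives as one JSON object:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for POST /api/generate (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a prompt to a local model and return the generated text."""
    payload = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage: `generate("deepseek-coder:6.7b", "Create a .NET Web API controller for product CRUD")`.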
Step 4: Install the VS Code Extension
- Open VS Code
- Go to Extensions
- Install Continue
This extension connects your local AI models to VS Code.
Step 5: Configure Models in Continue
Open the config file: press Ctrl + Shift + P → Continue: Open Config
Paste this:

```yaml
name: Local Config
version: 1.0.0
schema: v1
models:
  - name: DeepSeek Coder
    provider: ollama
    model: deepseek-coder:6.7b
  - name: DeepSeek R1
    provider: ollama
    model: deepseek-r1:8b
  - name: Llama
    provider: ollama
    model: llama3:8b
```
Save and reload VS Code.
Step 6: Use AI in VS Code
Open the Continue panel and start using AI.
Example 1: Generate Code
Create a .NET Web API controller for Product CRUD operations
Example 2: Improve Architecture
Switch to DeepSeek-R1:
How can I improve scalability and security of this API?
Example 3: Code Review
Switch to Llama:
Review this code and suggest improvements
Multi-Model AI Workflow (Advanced)
Instead of relying on a single model, use a team-based workflow:
- DeepSeek-R1 → plan the architecture
- DeepSeek-Coder → generate the code
- Llama 3 → review and optimize
This division of labor significantly improves output quality.
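The three-step workflow above can be sketched as a small pipeline. The `ask` callable is injected so the same logic works with any backend (for example, a function that calls Ollama's local API); the model tags match the ones pulled in Step 2:

```python
from typing import Callable

# Model tags from Step 2; swap in whatever you have pulled locally.
PLANNER = "deepseek-r1:8b"
CODER = "deepseek-coder:6.7b"
REVIEWER = "llama3:8b"


def multi_model_workflow(task: str, ask: Callable[[str, str], str]) -> dict:
    """Run plan -> code -> review, each step feeding its output to the next."""
    plan = ask(PLANNER, f"Plan the architecture for: {task}")
    code = ask(CODER, f"Implement this plan:\n{plan}")
    review = ask(REVIEWER, f"Review this code and suggest improvements:\n{code}")
    return {"plan": plan, "code": code, "review": review}
```

Passing `ask` in as a parameter also makes the workflow trivial to unit-test with a stub before pointing it at real models.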
Limitations of Local AI
While powerful, local AI has some limitations:
- ❌ No full automation (agent mode is limited)
- ❌ Slightly slower than cloud models
- ❌ Limited context handling
Still, it is highly useful for everyday development.
Best Practices
- Use smaller models (7B–8B) for speed
- Avoid large unnecessary context
- Provide clear and structured prompts
- Use file-specific context instead of full project
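The last two practices can be combined in a small helper that builds a structured prompt from a single file rather than the whole project (the template wording here is just an illustration, not part of any tool's API):

```python
def build_focused_prompt(instruction: str, filename: str, file_text: str,
                         max_chars: int = 4000) -> str:
    """Build a structured, single-file prompt, truncating oversized context."""
    context = file_text[:max_chars]  # keep context small for 7B-8B models
    return (
        f"Task: {instruction}\n"
        f"File: {filename}\n"
        f"Code:\n{context}\n"
        "Answer with only the changes needed."
    )
```

Usage: `build_focused_prompt("Add input validation", "ProductController.cs", source_text)`.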
Who Should Use This Setup?
This setup is ideal for:
- .NET developers
- Students learning AI
- Developers who want free tools
- Privacy-conscious engineers
Benefits of Local AI
- Zero cost
- Full privacy
- Works offline
- Better understanding of AI workflows
Common Questions
Q: Can I use AI in VS Code for free?
Yes, using Ollama and local models, you can run AI without any cost.
Q: Is this better than GitHub Copilot?
It depends:
- Copilot → faster, cloud-based
- Local AI → free, private, offline
Q: Do I need internet after setup?
No. Once models are downloaded, everything works offline.
Future of Local AI
Local AI is evolving rapidly. We can expect:
- Better agent workflows
- Faster models
- Full automation locally
Learning this today gives you a strong advantage.
Conclusion
You don't need expensive tools to be productive.
With this setup, you can:
- Run AI locally
- Write better code
- Learn faster
And most importantly, you stay independent of cloud tools.
Article by Kamlesh Bhor
Feel free to comment below about this article.