Running Local LLMs for Cursor IDE: A Quick Guide to Ollama Integration
Running Large Language Models (LLMs) locally is becoming increasingly accessible, and integrating them directly into your IDE workflow can dramatically boost productivity if you already have capable hardware. This guide demonstrates how to run LLMs locally using Ollama and connect them to Cursor IDE.
1. Setting Up Ollama
(Image: Ollama local LLM deployment)
- Install Ollama: Follow the installation instructions for your operating system: https://ollama.com/docs/install
- Set these environment variables so Ollama accepts requests from other origins and listens on all interfaces (if Ollama runs as a systemd service, see the sketch after this list):
export OLLAMA_ORIGINS=*
export OLLAMA_HOST=0.0.0.0:11434
- Start the Server: Run
ollama serve &
to launch the LLM server in the background.
- Pull a Model: Let's start with llama3.1:8b. Run
ollama pull llama3.1:8b
This downloads the model – you can explore other models on the Ollama website (https://ollama.com/library). You'll see a list of the available models when you run
ollama list
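If you installed Ollama on Linux via the install script, it usually runs as a systemd service, and variables exported in your shell won't reach it. A minimal sketch of setting them on the service instead (assuming the service is named ollama):
sudo systemctl edit ollama
# In the editor that opens, add under [Service]:
#   Environment="OLLAMA_ORIGINS=*"
#   Environment="OLLAMA_HOST=0.0.0.0:11434"
sudo systemctl daemon-reload
sudo systemctl restart ollama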
Check the available models:
❯ ollama list
NAME             ID              SIZE     MODIFIED
llama3:latest    365c0bd3c000    4.7 GB   10 hours ago
llama3.1:8b      46e0c10c039e    4.9 GB   10 hours ago
gemma3:4b        a2af6cc3eb7f    3.3 GB   10 hours ago
Confirm using curl:
curl -X POST http://localhost:11434/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "llama3.1:8b",
"messages": [{"role": "user", "content": "Hello we are trying you from cursor ide!"}]
}'
curl -v http://127.0.0.1:11434/v1/models
curl -X POST http://127.0.0.1:11434/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "gemma3:4b",
"messages": [
{"role": "user", "content": "Hello!"},
{"role": "assistant", "content": "Hi! Ready to help you learn JavaScript."},
{"role": "user", "content": "Teach me TypeScript basics"}
]
}'
A few points to note:
- llama.cpp with --jinja didn't work for me; it returned "code":500,"message":"Conversation roles must alternate user/assistant/user/assistant/...". I'm not sure whether that's a limitation of the model's chat template or something else (the native-API check below can help narrow it down).
- Ollama worked, but only for selected models.
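When the OpenAI-compatible endpoint fails for a particular model, hitting Ollama's native chat API can help tell whether the problem is the model's chat template or the compatibility layer. A quick check, assuming the default port and the llama3.1:8b model pulled above:
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1:8b",
  "messages": [{"role": "user", "content": "Hello from the native API"}],
  "stream": false
}'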
2. Connecting Ollama with Ngrok (for External Access)
To make the LLM accessible from Cursor, we’ll use Ngrok to create a secure tunnel, because Cursor doesn't allow using localhost as the API endpoint. (I tried editing /etc/hosts, but that didn't work.)
- Create an Ngrok Account: Sign up for a free account at https://ngrok.com/.
- Obtain an Authtoken: Follow the instructions on the Ngrok website to obtain your authtoken.
- Configure Ngrok: In your terminal, run
ngrok config add-authtoken <YOUR_AUTHTOKEN>
- Expose the Port: Run
ngrok http 11434
This will provide a public URL for your local LLM.
Test through ngrok using curl:
(replace xxx with your own ngrok URL)
curl -v https://xxx.ngrok-free.dev/v1/models
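If requests through the tunnel are rejected while local ones work, the Host header is a likely culprit; rewriting it when starting the tunnel may help. This is an optional tweak, not a required step:
ngrok http 11434 --host-header="localhost:11434"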
3. Configuring Cursor for Local LLM Access
This is the critical step to integrate the LLM into your Cursor IDE.
- Open Cursor Settings: Navigate to Settings > Models within your Cursor IDE.
- Add a New Model: Click “Add New Model.”
- Enter Model Name: Provide the exact model name you used when pulling it with Ollama (e.g., llama3.1:8b).
- API URL: Paste the Ngrok URL (e.g., https://xxx.ngrok-free.dev/v1) into the API URL field.
- Enable OpenAI API & Base URL: Ensure both the OpenAI API and Base URL are enabled and set to the Ngrok URL.
Adding the model with its exact name (llama3.1:8b), enabling the OpenAI API, and setting the base URL to the ngrok URL (https://xxx.ngrok-free.dev/v1) worked for me as of 20 Oct 2025.
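Before testing inside the IDE, you can reproduce the kind of request Cursor will send by calling the chat completions endpoint through the ngrok URL (again, substitute your own subdomain, and use a model name that ollama list shows):
curl -X POST https://xxx.ngrok-free.dev/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.1:8b",
    "messages": [{"role": "user", "content": "Reply with OK if you can read this."}]
  }'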
4. Testing the Integration
- Verify Model in Cursor: Check that the model appears in the list of available models within Cursor.
- Start Using the LLM: Select the llama3.1:8b model within Cursor and begin interacting with it directly within your code.
Troubleshooting Tips
- Conversation Formatting: LLMs are sensitive to the structure of your prompts. Ensure you're using user and assistant roles correctly.
- Ngrok URL: Double-check that the Ngrok URL is correct and active.
- Ollama Server: Verify that the ollama serve command is running without errors (see the quick check after this list).
- Cursor Updates: Cursor's LLM integration may evolve with updates. Keep your Cursor version current.
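A quick way to confirm the server is up and see which models it exposes, assuming the default port:
curl -s http://localhost:11434/api/tags    # Ollama's native model listing
curl -s http://localhost:11434/v1/models   # OpenAI-compatible model listing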
My Cursor version:
Version: 1.7.52
VSCode Version: 1.99.3
Commit: 9675251a06b1314d50ff34b0cbe5109b78f848c0
Date: 2025-10-17T01:41:03.967Z
Electron: 34.5.8
Chromium: 132.0.6834.210
Node.js: 20.19.1
V8: 13.2.152.41-electron.0
OS: Linux x64 6.14.0-33-generic