Run Open Source Local AI Models in Excel with Ollama


This article explains how to use open-source AI models in Excel using Ollama. This keeps your data private, since it never leaves your local machine, and the models are completely free (no paid API required).

We have built an Excel add-in for Ollama that connects the Ollama server with Microsoft Excel. It offers the flexibility to connect to either a local or a remote server.

How to Run Open Source Local AI Models in Excel

Prerequisites

Step 1: Install the Ollama app on your machine
  • Windows: Ollama
  • macOS: Ollama

Install on Windows

On Windows, search for Command Prompt, then right-click it and choose "Run as administrator".


# recommended install (run as admin)
winget install --id Ollama.Ollama -e

# (optional) start server if the installer didn't already
ollama serve

# download a model
ollama pull gemma3:4b

Step 2: Download the Excel Add-In for Ollama

You can download the Excel add-in by clicking the link below.

When you download an add-in or macro file from the internet, Microsoft blocks it from running and flags the file as coming from an untrusted source. Follow the steps below to mark it as trusted.

  1. Go to the folder where the downloaded add-in file is located.
  2. Right-click the file and choose Properties from the menu.
  3. At the bottom of the General tab, select the Unblock checkbox under the Security section, then click OK.

Refer to the following steps to install the Ollama add-in in MS Excel.

  1. Open Excel and click on the File tab in the ribbon.
  2. Click on Options and then select Add-ins from the left-hand menu.
  3. In the Manage drop-down menu at the bottom of the screen, select Excel Add-ins and click on the Go button.
  4. Click on the Browse button and locate the add-in file that you downloaded.
  5. Select the add-in file and click on the OK button.
  6. You should see the name of the add-in file in the Add-Ins dialog box. Check the box next to the add-in name to activate it.
  7. Once you are done with the above steps, a new tab called Ollama should be visible in your Excel workbook.

How to use Ollama's Excel add-in

Assuming your prompt is in cell A2, run =Ollama(A2). To apply it to multiple cells, drag the function down as shown in the image below.

Run Ollama in Excel
 Parameter order: userMsg, model, systemMsg, temperature, baseUrl, maxTokens

' Simple usage
=Ollama(A2)

' Specify Model 
=Ollama(A2, "qwen3:4b")

' Model + system message + temperature
=Ollama(A2, "qwen3:4b", "Be concise", 0.5)
  
' With custom system message (leave model empty)
=Ollama(A2, "", "You are a helpful coding assistant")

' Temperature only (skip model & systemMsg)
=Ollama(A2, "", "", 0.5)
  
' Custom URL. Default URL is http://127.0.0.1:11434. (skip temperature by leaving it blank)
=Ollama(A2, "", "", ,"http://192.168.1.100:11434")

' Model + system message + temperature + baseUrl + max tokens (set maxTokens to 500)
=Ollama(A2, "qwen3:4b", "Be concise", 0.5, "http://192.168.1.100:11434", 500)
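
Under the hood, the add-in presumably sends these arguments to Ollama's REST API (the standard endpoint is POST /api/generate). The sketch below shows how the formula's parameters could map onto a request payload. The JSON field names follow Ollama's documented API, but the mapping itself is an assumption, not the add-in's actual code:

```python
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"  # Ollama's default port

def build_generate_payload(user_msg, model="gemma3:4b", system_msg="",
                           temperature=0.7, max_tokens=None):
    """Build a request body for Ollama's /api/generate endpoint.

    The argument order mirrors the =Ollama() formula; the mapping to
    Ollama's JSON fields is an illustration, not the add-in's source.
    """
    payload = {
        "model": model,
        "prompt": user_msg,
        "stream": False,  # ask for one complete response, not a token stream
        "options": {"temperature": temperature},
    }
    if system_msg:
        payload["system"] = system_msg
    if max_tokens is not None:
        payload["options"]["num_predict"] = max_tokens  # Ollama's max-token option
    return payload

def ask_ollama(user_msg, **kwargs):
    """Send the payload to a running Ollama server and return the response text."""
    body = json.dumps(build_generate_payload(user_msg, **kwargs)).encode()
    req = urllib.request.Request(OLLAMA_URL + "/api/generate", data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

For example, ask_ollama("Capital of USA", model="qwen3:4b", system_msg="Be concise", temperature=0.5) corresponds to =Ollama(A2, "qwen3:4b", "Be concise", 0.5) with the prompt in A2.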


Check Configuration

In the Ollama tab, click the Check Status button to see whether the server is running, view the models you have downloaded and see the current model in use.

=== OLLAMA STATUS (Windows) ===
Status: RUNNING
URL: http://127.0.0.1:11434

Available models:
- gemma3:4b (3.1 GB) [2025-08-10]
- hf.co/unsloth/Qwen3-4B-Instruct-2507-GGUF:latest (2.3 GB) [2025-08-10]
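
The same information can be fetched directly from Ollama's /api/tags endpoint, which returns the installed models as JSON. Here is a small sketch that formats that response like the status report above (the formatting helper is hypothetical; /api/tags is Ollama's documented model-list API):

```python
import json
import urllib.request

def format_models(tags_json):
    """Render Ollama's /api/tags response as '- name (size GB)' lines."""
    lines = []
    for m in tags_json.get("models", []):
        size_gb = m.get("size", 0) / 1e9  # size is reported in bytes
        lines.append(f"- {m['name']} ({size_gb:.1f} GB)")
    return "\n".join(lines)

def list_ollama_models(base_url="http://127.0.0.1:11434"):
    """Fetch and format the installed-model list from a running Ollama server."""
    with urllib.request.urlopen(base_url + "/api/tags") as resp:
        return format_models(json.load(resp))
```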

Start Ollama Server from Excel

In the Ollama tab, click the Start Ollama button to start the Ollama server. Alternatively, you can use the =StartOllama() function to start the server. If you are using a port other than 11434, you can specify it in the function, like =StartOllama("portNumber").
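
Before starting or stopping the server, it can be handy to check whether anything is already listening on the Ollama port, similar to what =IsOllamaRunning() reports. A minimal sketch using a plain TCP connection test (note this is a heuristic: it only shows the port is open, not that the listener is actually Ollama):

```python
import socket

def is_port_open(host="127.0.0.1", port=11434, timeout=1.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```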

Stop Ollama Server from Excel

In the Ollama tab, click the Stop Ollama button to stop the Ollama server. Alternatively, you can use the =StopOllama() function to stop the server.

Change Model and URL

In the Ollama tab, click the Change Model button to change the model. Similarly, click the Change URL button to change the server URL.

How to use add-in functions in your macro

Refer to the following program, which uses Application.Run to call the add-in's function. In this way, you can use it in your own macros.


Sub Test()

' Call the function and get the result
Dim result As String
result = Application.Run("Ollama", "Capital of USA", "gemma3:4b")

' Show the result in a message box
MsgBox result

End Sub

Ollama Excel Add-in - Functions List
  1. =Ollama(userMsg, [model], [systemMsg], [temperature], [baseUrl], [maxTokens])
    • Purpose: Main AI function to get responses from Ollama models
    • Parameters:
      • userMsg (Required) - Your question or prompt
      • model (Optional) - Specific model name (uses global setting if empty)
      • systemMsg (Optional) - Instructions for the AI's behavior
      • temperature (Optional) - Creativity level 0.1-2.0 (default: 0.7)
      • baseUrl (Optional) - Custom server URL (uses global setting if empty)
      • maxTokens (Optional) - Maximum response length (default: 32768)
  2. =TestOllama([baseUrl])
    • Purpose: Test connection to Ollama server
    • Parameters:
      • baseUrl (Optional) - Server URL to test (uses global setting if empty)
    • Returns: "SUCCESS: Connected..." or error message
    • Example: =TestOllama()
  3. =IsOllamaRunning([baseUrl])
    • Purpose: Check if Ollama server is running
    • Parameters:
      • baseUrl (Optional) - Server URL to check (uses global setting if empty)
    • Returns: TRUE or FALSE
    • Example: =IsOllamaRunning()
  4. =ListOllamaModels([baseUrl])
    • Purpose: Get list of available AI models
    • Parameters:
      • baseUrl (Optional) - Server URL (uses global setting if empty)
    • Returns: Formatted list of models with sizes and dates
    • Example: =ListOllamaModels()
  5. =GetSelectedModelInfo([baseUrl])
    • Purpose: Show which model will be used by default
    • Parameters:
      • baseUrl (Optional) - Server URL (uses global setting if empty)
    • Returns: Information about current default model selection
    • Example: =GetSelectedModelInfo()
  6. =StartOllama([customPort])
    • Purpose: Start Ollama server programmatically
    • Parameters:
      • customPort (Optional) - Custom port number (default: 11434)
    • Returns: Success/failure message
    • Example: =StartOllama()
  7. =StopOllama()
    • Purpose: Stop Ollama server programmatically
    • Returns: Success/failure message
    • Example: =StopOllama()
  8. =RestartOllama()
    • Purpose: Restart Ollama server (stop + start)
    • Returns: Combined stop and start results
    • Example: =RestartOllama()
  9. =GetOllamaStatus()
    • Purpose: Get comprehensive server and model status
    • Returns: Detailed status report including running state, models, and selected model
    • Example: =GetOllamaStatus()
  10. =PullModel()
    • Purpose: Get instructions for downloading models manually
    • Returns: Step-by-step download instructions
    • Example: =PullModel()
  11. =GetGlobalTemperature()
    • Purpose: Get current global temperature setting
    • Returns: Current temperature value (0.1-2.0)
    • Example: =GetGlobalTemperature()
  12. =GetGlobalBaseURL()
    • Purpose: Get current global base URL setting
    • Returns: Current server URL
    • Example: =GetGlobalBaseURL()
  13. =GetGlobalModel()
    • Purpose: Get current global model setting
    • Returns: Current default model name
    • Example: =GetGlobalModel()

Notes
  1. Global Settings: Use the add-in buttons to change temperature, model or server URL globally
  2. Performance: Model cache refreshes automatically when server starts/stops
  3. Flexibility: Can override any global setting in individual formulas
  4. Auto-Selection: If your preferred model isn't available, the system auto-selects the best alternative
About Author:
Deepanshu Bhalla

Deepanshu founded ListenData with a simple objective - Make analytics easy to understand and follow. He has over 10 years of experience in data science. During his tenure, he worked with global clients in various domains like Banking, Insurance, Private Equity, Telecom and HR.
