🧠 Ollama Setup Guide

Complete guide to set up Ollama backend for the Chrome Extension

1. Install Ollama

Download Ollama from the official website: https://ollama.com/download

💡 Keep Ollama Running

Ollama must be running while using the extension. The helper script in the next section will start it automatically for you.
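
After installing, you can confirm the CLI landed on your PATH before going further. This is a quick sketch; the version-check branch assumes a standard install:

```shell
# Confirm the ollama CLI is installed and on PATH
if command -v ollama >/dev/null 2>&1; then
  OLLAMA_FOUND=yes
  ollama --version
else
  OLLAMA_FOUND=no
  echo "ollama not found on PATH — re-run the installer from the website"
fi
```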

2. Quick Start: Use Helper Script (Recommended)

The easiest way to set up Ollama with automatic CORS configuration and LAN access! This script works on macOS, Linux, and Windows.

🎯 What This Script Does

  • ✅ Automatically detects your OS
  • ✅ Configures Ollama for LAN and Localhost access
  • ✅ Sets up CORS for browser extensions (Firefox only)
  • ✅ Shows your local IP address
  • ✅ Starts Ollama in the background

Step 1: Create the script file ollama-env.sh:

#!/bin/bash

# Cross-platform Ollama environment setup script
# Starts Ollama with proper CORS configuration for browser extensions
# Works on macOS, Linux, and Windows (with Git Bash/WSL)

MODE=$1

# Detect OS
OS="$(uname -s)"
case "${OS}" in
  Linux*)     OS_TYPE="linux" ;;
  Darwin*)    OS_TYPE="macos" ;;
  MINGW*|MSYS*|CYGWIN*) OS_TYPE="windows" ;;
  *)          OS_TYPE="unknown" ;;
esac

# Kill existing Ollama processes
if [ "$OS_TYPE" = "windows" ]; then
  taskkill //F //IM ollama.exe 2>/dev/null || true
else
  pkill -f "ollama serve" 2>/dev/null || true
fi

sleep 1

# Set host to 0.0.0.0 so LAN devices can access it
export OLLAMA_HOST="0.0.0.0"

# Get local IP address (cross-platform)
get_local_ip() {
  if [ "$OS_TYPE" = "macos" ]; then
    ipconfig getifaddr en0 2>/dev/null || ipconfig getifaddr en1 2>/dev/null || echo ""
  elif [ "$OS_TYPE" = "linux" ]; then
    # hostname -I can succeed but print nothing, so check the result
    # explicitly instead of relying on || (awk exits 0 even on empty input)
    local ip_addr
    ip_addr=$(hostname -I 2>/dev/null | awk '{print $1}')
    if [ -n "$ip_addr" ]; then
      echo "$ip_addr"
    else
      ip -4 addr show 2>/dev/null | grep -oP '(?<=inet\s)\d+(\.\d+){3}' | head -1 || echo ""
    fi
  elif [ "$OS_TYPE" = "windows" ]; then
    ipconfig 2>/dev/null | grep -i "IPv4" | head -1 | grep -oE '[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}' | head -1 || echo ""
  else
    echo ""
  fi
}

# Start Ollama based on mode
if [ "$MODE" = "firefox" ]; then
  export OLLAMA_ORIGINS="chrome-extension://*,moz-extension://*"
  nohup ollama serve > ~/.ollama-firefox.log 2>&1 &
  echo "✅ Ollama started with Firefox CORS + LAN access"
else
  nohup ollama serve > ~/.ollama-chrome.log 2>&1 &
  echo "✅ Ollama started with LAN access"
fi

sleep 2

LOCAL_IP=$(get_local_ip)

echo ""
echo "🌍 Access URLs:"
echo "   • http://localhost:11434"
if [ -n "$LOCAL_IP" ]; then
  echo "   • http://$LOCAL_IP:11434"
fi
echo ""

if [ "$OS_TYPE" = "windows" ]; then
  echo "💡 Tip: Ollama is running. To stop it, run:"
  echo "   taskkill //F //IM ollama.exe"
else
  echo "💡 Tip: Ollama is running in the background. To stop it, run:"
  echo "   pkill -f \"ollama serve\""
fi
echo ""

Step 2: Make it executable and run:

# Make executable
chmod +x ollama-env.sh

# Run (macOS/Linux/Windows Git Bash/WSL)
./ollama-env.sh firefox   # Firefox with CORS + LAN access
./ollama-env.sh chrome    # Chrome with LAN access

# Windows PowerShell
bash ollama-env.sh firefox
✨ Done! Ollama is now configured and running. You can skip to section 4 to verify it's working, or continue to section 3 to pull a model.

🤖 3. Pull a Model (e.g. Gemma 3 1B)

After installation, run a model of your choice:

ollama run gemma3:1b

Once downloaded, you're ready to chat!

💡 Replace gemma3:1b with other models like llama3, mistral, etc.
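
To see which models are already downloaded, `ollama list` is the usual check. A guarded sketch (the guard just avoids an error if ollama isn't installed yet):

```shell
# List locally downloaded models, guarded in case ollama isn't on PATH
if command -v ollama >/dev/null 2>&1; then
  MODELS=$(ollama list 2>/dev/null) || MODELS=""
  printf '%s\n' "${MODELS:-no models pulled yet — run 'ollama pull <model>'}"
else
  MODELS=""
  echo "ollama not found on PATH — install it first (section 1)"
fi
```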

🔍 4. Verify Ollama

Open this in your browser:

http://localhost:11434

Or use curl:

curl http://localhost:11434/api/tags
You should see a JSON response with available models.
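
The same check can be scripted with a friendlier failure message (a sketch, assuming the default port 11434):

```shell
# Hit the tags endpoint; -sf makes curl fail silently on HTTP errors
RESP=$(curl -sf http://localhost:11434/api/tags 2>/dev/null) || true
if [ -n "$RESP" ]; then
  echo "Server responded with:"
  printf '%s\n' "$RESP"
else
  echo "No response — is 'ollama serve' running?"
fi
```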

⚙️ 5. Configure Extension

1. Click the ⚙️ Settings icon in the extension popup.

2. You can configure:

  • Base URL (http://localhost:11434)
  • Default model (gemma3:1b, llama3, etc.)
  • Theme and other preferences
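
After saving the settings, one way to confirm the Base URL and model name work together is a single non-streaming generate request. The values below are examples; substitute whatever you entered in the settings:

```shell
# Hypothetical settings — match these to your extension configuration
BASE_URL="http://localhost:11434"
MODEL="gemma3:1b"

# Ask for just the HTTP status code; 200 means URL and model are both valid
CODE=$(curl -s -o /dev/null -w "%{http_code}" "$BASE_URL/api/generate" \
  -d "{\"model\": \"$MODEL\", \"prompt\": \"Hi\", \"stream\": false}" 2>/dev/null) || true
echo "HTTP ${CODE:-000}"
```

A 404 here usually means the model name is wrong or not pulled; 000 means the server isn't reachable at that URL.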

🔐 Understanding CORS (Optional Reading)

Skip this section if the helper script worked! This section explains when and why CORS configuration is needed. Only read if you're troubleshooting or prefer manual setup.

Do you need to set up CORS?

Whether you need to configure CORS depends on your browser:

✅ Chrome / Chromium (e.g., Brave, Edge)

If you're using Chrome-based browsers and extension version 0.1.3 or later, you likely do not need to set any CORS headers.

Ollama Client uses Chrome's Declarative Net Request (DNR) API to rewrite Origin headers in requests to localhost, which lets it bypass CORS errors without backend changes.

🦊 Firefox

Firefox does not support Chrome's DNR API, so manual configuration is required. This is why the helper script sets OLLAMA_ORIGINS automatically.

When should you manually set OLLAMA_ORIGINS?

  • You're using Firefox
  • You're on extension version < 0.1.3
  • You're calling from localhost:3000 or another frontend
  • You're still getting ❌ 403 Forbidden errors

To cover both Chrome and Firefox, you can combine origins:

OLLAMA_ORIGINS=chrome-extension://*,moz-extension://*
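
For a quick manual test, the variable can be exported in the shell that launches the server; any `ollama serve` started from that shell inherits it (this is essentially what the helper script does in firefox mode):

```shell
# Export once, then start the server from the same shell
export OLLAMA_ORIGINS="chrome-extension://*,moz-extension://*"
echo "OLLAMA_ORIGINS=$OLLAMA_ORIGINS"
# ollama serve   # uncomment to launch with these origins
```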

🔧 Manual CORS Setup (Advanced)

Only use this if the helper script didn't work for you. Follow these platform-specific instructions to manually set OLLAMA_ORIGINS:

macOS (Launch Agent)

1. Edit the plist file:

nano ~/Library/LaunchAgents/com.ollama.server.plist

2. Add inside <key>EnvironmentVariables</key>:

<key>OLLAMA_ORIGINS</key>
<string>chrome-extension://*,moz-extension://*</string>

3. Reload the agent:

# Restart the LaunchAgent (modern macOS)
launchctl kickstart -k gui/$(id -u)/com.ollama.server

# Or if not loaded, bootstrap first:
launchctl bootstrap gui/$(id -u) ~/Library/LaunchAgents/com.ollama.server.plist
launchctl kickstart -k gui/$(id -u)/com.ollama.server

Linux (systemd)

1. Edit the service file:

sudo systemctl edit --full ollama.service

2. Add under [Service]:

Environment="OLLAMA_ORIGINS=chrome-extension://*,moz-extension://*"

3. Reload and restart:

sudo systemctl daemon-reload
sudo systemctl restart ollama

Windows

1. Press Win + R, type sysdm.cpl, press Enter.

2. Go to Advanced → Environment Variables.

3. Add a new User variable:

  • Name: OLLAMA_ORIGINS
  • Value: chrome-extension://*,moz-extension://*

4. Restart Ollama.
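
If you prefer the command line to the GUI, `setx` persists a user environment variable on Windows. A guarded sketch (the guard keeps it harmless on other platforms):

```shell
# Windows-only alternative to the GUI (cmd or Git Bash)
if command -v setx >/dev/null 2>&1; then
  setx OLLAMA_ORIGINS "chrome-extension://*,moz-extension://*"
  SETX_RAN=yes
else
  SETX_RAN=no
  echo "setx unavailable — not on Windows; use the GUI steps above"
fi
```

Note that `setx` affects new processes only, so Ollama still needs a restart afterwards.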

💡 Multiple Origins Support

You can also allow local web apps like this:

OLLAMA_ORIGINS=chrome-extension://*,moz-extension://*,http://localhost:3000
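
To check whether the origins actually took effect, you can simulate an extension request and look for the CORS response header (a sketch, assuming the server runs on the default port; `moz-extension://test` is a made-up origin for testing):

```shell
# Send a request with an extension-style Origin and capture response headers
HDRS=$(curl -s -D - -o /dev/null -H "Origin: moz-extension://test" \
  http://localhost:11434/api/tags 2>/dev/null) || true
if printf '%s' "$HDRS" | grep -qi "access-control-allow-origin"; then
  RESULT=present
  echo "✅ CORS header present"
else
  RESULT=missing
  echo "❌ No CORS header — check OLLAMA_ORIGINS and restart Ollama"
fi
```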

🧯 Troubleshooting

Common Issues:

  • Ensure ollama serve is running
  • Make sure your model is pulled: ollama pull <model>
  • Avoid firewalls/VPNs that block localhost

Still Having Issues?

  • Verify your OLLAMA_ORIGINS is correctly set
  • Check browser console for error messages
  • Restart Ollama service after config changes
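
The checks above can be bundled into one quick diagnostic pass (a sketch; it assumes the default port and only reads state, changing nothing):

```shell
# One-pass diagnostic: process, environment variable, API reachability
echo "— process —"
pgrep -fl ollama || echo "  no ollama process found"

echo "— environment —"
echo "  OLLAMA_ORIGINS=${OLLAMA_ORIGINS:-<not set>}"

echo "— API —"
STATUS=$(curl -s -o /dev/null -w "%{http_code}" \
  http://localhost:11434/api/tags 2>/dev/null) || true
echo "  /api/tags returned HTTP ${STATUS:-000}"
DIAG_DONE=1
```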

📎 Helpful Links

🌐 Documentation

Ollama Docs

🧩 Extension

Chrome Store

💻 Source Code

GitHub