Chat with Local LLMs
Right in Your Browser

Privacy-first Ollama Chrome extension and Ollama UI alternative that brings any local AI model directly into your browser tabs. Choose from hundreds of models, including LLaMA, Mistral, Gemma, Qwen, and GPT-OSS. Features enhanced content extraction with lazy-loading support, advanced text-to-speech, site-specific overrides, YouTube transcript automation, and more. Enjoy offline AI chat with no cloud APIs and zero data sharing.

100% Privacy · Hundreds of AI Models · 0 Cloud APIs
🌐 Chrome · 🦁 Brave · 📘 Edge · 🎭 Opera · 🦊 Firefox

Powerful Features

Everything you need for productive AI conversations, all running locally on your machine

🤖 Any Ollama Model

Use any AI model available in the Ollama library - from lightweight models to the most powerful ones. Your choice, your hardware.

  • Complete Ollama model library
  • Model selection menu
  • Streaming responses
  • Stop generation control
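
Under the hood, chatting with a model maps onto Ollama's local HTTP API. Here is a minimal TypeScript sketch of streaming with stop control; the /api/chat endpoint and port 11434 are Ollama's documented defaults, while the helper itself is only illustrative and not the extension's actual code:

```typescript
// Sketch: stream a chat completion from a local Ollama server (not the extension's real code).
const controller = new AbortController(); // "Stop generation" maps to controller.abort()

async function streamChat(model: string, prompt: string, onToken: (t: string) => void) {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    signal: controller.signal,
    body: JSON.stringify({ model, stream: true, messages: [{ role: "user", content: prompt }] }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  // Ollama streams newline-delimited JSON, one object per generated chunk.
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any partial line for the next read
    for (const line of lines.filter(Boolean)) {
      onToken(JSON.parse(line).message?.content ?? "");
    }
  }
}

streamChat("llama3.2", "Summarize this page in two sentences.", (t) => console.log(t));
```

Calling controller.abort() cancels the streaming request, which is the usual way to implement a stop button over a streaming fetch.
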
💬 Multi-Chat Sessions

Organize your conversations with multiple chat sessions, all saved locally using IndexedDB.

  • Multiple concurrent chats
  • Session persistence
  • Tab refresh awareness
  • Conversation history
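
For a rough idea of what local persistence looks like with Dexie.js (the IndexedDB wrapper listed in the tech stack below), here is a small sketch; the table and field names are assumptions, not the extension's real schema:

```typescript
import Dexie, { type Table } from "dexie";

// Hypothetical schema for chat sessions stored entirely in the browser's IndexedDB.
interface ChatSession { id?: number; title: string; createdAt: number }
interface ChatMessage { id?: number; sessionId: number; role: "user" | "assistant"; content: string }

class ChatDB extends Dexie {
  sessions!: Table<ChatSession, number>;
  messages!: Table<ChatMessage, number>;
  constructor() {
    super("ollama-client-example");
    this.version(1).stores({
      sessions: "++id, createdAt", // auto-increment key, index on createdAt
      messages: "++id, sessionId", // index on sessionId for per-chat lookups
    });
  }
}

const db = new ChatDB();

// Create a session, append a message, and reload the history later.
const sessionId = await db.sessions.add({ title: "New chat", createdAt: Date.now() });
await db.messages.add({ sessionId, role: "user", content: "Hello!" });
const history = await db.messages.where("sessionId").equals(sessionId).toArray();
console.log(history.length); // 1; all of it stays on your machine
```
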
🔧 Enhanced Content Extraction

Advanced content extraction with lazy loading support, site-specific configuration, and intelligent fallback strategies for modern web pages.

  • Defuddle integration for better parsing
  • Lazy loading & infinite scroll support
  • Site-specific extraction overrides
  • Automated YouTube transcript extraction
  • GitHub repository & profile support
  • Smart scroll strategies (none, instant, gradual, smart)
  • Network idle detection
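
To give a sense of what a scroll strategy plus network-idle detection involves, here is a simplified content-script sketch; the function names, thresholds, and the idle heuristic are illustrative assumptions rather than the extension's implementation:

```typescript
// Sketch of a "gradual" scroll strategy: nudge lazy-loaded content into the DOM,
// then wait for the network to settle before running the extractor.
async function gradualScroll(stepPx = 800, pauseMs = 250, maxSteps = 50): Promise<void> {
  let last = -1;
  for (let i = 0; i < maxSteps && window.scrollY !== last; i++) {
    last = window.scrollY;
    window.scrollBy(0, stepPx);
    await new Promise((r) => setTimeout(r, pauseMs)); // give lazy images/posts time to load
  }
  window.scrollTo(0, 0); // restore position before extraction
}

// Naive "network idle" heuristic: done once no new resource entries appear for quietMs.
async function waitForNetworkIdle(quietMs = 1000, timeoutMs = 10000): Promise<void> {
  const start = Date.now();
  let lastCount = performance.getEntriesByType("resource").length;
  let lastChange = Date.now();
  while (Date.now() - lastChange < quietMs && Date.now() - start < timeoutMs) {
    await new Promise((r) => setTimeout(r, 200));
    const count = performance.getEntriesByType("resource").length;
    if (count !== lastCount) { lastCount = count; lastChange = Date.now(); }
  }
}

await gradualScroll();
await waitForNetworkIdle();
const rawText = document.body.innerText; // real code would hand the DOM to Defuddle/Readability instead
```
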
📝 Prompt Templates

Pre-built templates for common tasks to boost your productivity.

  • Summarize content
  • Translate text
  • Explain code
  • Professional email drafting
  • Custom templates
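
Conceptually, a template is just a labeled prompt with a placeholder that gets filled with the page content or your selection before it is sent to the model. A hypothetical shape follows (the ids, labels, and {{content}} placeholder are assumptions, not the extension's schema):

```typescript
// Hypothetical prompt-template shape; not the extension's actual schema.
interface PromptTemplate {
  id: string;
  label: string;
  prompt: string; // {{content}} is replaced with the extracted page text or selection
}

const templates: PromptTemplate[] = [
  { id: "summarize", label: "Summarize content", prompt: "Summarize the following:\n\n{{content}}" },
  { id: "translate", label: "Translate text", prompt: "Translate this into English:\n\n{{content}}" },
  { id: "explain-code", label: "Explain code", prompt: "Explain what this code does:\n\n{{content}}" },
  { id: "email-professional", label: "Professional email", prompt: "Rewrite this as a professional email:\n\n{{content}}" },
];

// Applying a template is plain string substitution before the prompt reaches the model.
const applyTemplate = (t: PromptTemplate, content: string): string =>
  t.prompt.replace("{{content}}", content);
```
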
🔊 Advanced Text-to-Speech

Built-in speech synthesis with searchable voice selection, customizable rate and pitch controls, and seamless cross-browser compatibility for an accessible AI chat experience.

  • Searchable voice selector with language grouping
  • Adjustable speech rate (0.5x - 2.0x)
  • Voice pitch control (0.0 - 2.0)
  • Custom text testing & preview
  • Cross-browser voice loading optimization
  • Per-message TTS controls
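
Browser text-to-speech of this kind is typically driven by the standard Web Speech API (speechSynthesis). A minimal sketch of per-message playback with rate and pitch control; the helper name and defaults are illustrative:

```typescript
// Sketch: speak one message with the standard Web Speech API.
function speak(text: string, voiceName?: string, rate = 1.0, pitch = 1.0): void {
  const utterance = new SpeechSynthesisUtterance(text);
  utterance.rate = rate;   // the extension exposes 0.5x - 2.0x
  utterance.pitch = pitch; // the extension exposes 0.0 - 2.0

  // Voices load asynchronously in some browsers; getVoices() may be empty
  // until the `voiceschanged` event fires, which is why voice loading needs care.
  const voice = speechSynthesis.getVoices().find((v) => v.name === voiceName);
  if (voice) utterance.voice = voice;

  speechSynthesis.cancel(); // stop any message that is already playing
  speechSynthesis.speak(utterance);
}

speak("Hello from your local model.", undefined, 1.2, 1.0);
```
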
⚙️ Advanced Configuration

Full control over AI behavior with comprehensive settings and options.

  • Ollama API parameter exposure
  • Excluded URLs (regex support)
  • Model-specific settings
  • Debug logging
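
As one example, the excluded-URLs option can be modeled as a list of regular expressions matched against the current tab's URL; the pattern list and function below are assumptions for illustration, not the extension's real settings keys:

```typescript
// Illustrative check for "Excluded URLs (regex support)".
const excludedPatterns: string[] = [
  "^https://mail\\.google\\.com/.*",
  ".*\\.mybank\\.example/.*",
];

function isExcluded(url: string): boolean {
  return excludedPatterns.some((pattern) => {
    try {
      return new RegExp(pattern).test(url);
    } catch {
      return false; // ignore invalid user-supplied patterns
    }
  });
}

console.log(isExcluded("https://mail.google.com/mail/u/0/")); // true -> extension stays off here
```
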
🔒 Privacy First

Your conversations never leave your machine. Complete privacy and data ownership.

  • No cloud API calls
  • Local data storage
  • No telemetry
  • Open source code

Built with Modern Technologies

Leveraging the latest tools and frameworks for optimal performance and developer experience

⚛️ React · 🔧 Plasmo · 📘 TypeScript · 🎨 shadcn/ui · 🗃️ Dexie.js · 📖 Readability

Get Started in 3 Easy Steps

Simple setup process to get you chatting with AI models locally

1. Install Ollama

Download and install Ollama on your system, then pull any AI model from the Ollama library, from efficient 7B models to powerful 70B+ ones.
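
Once Ollama is installed, you can confirm the server is running and see which models you have pulled by querying its documented /api/tags endpoint (shown in TypeScript to match the other examples; the model names in the comment are just examples):

```typescript
// List locally available models from the Ollama server's documented /api/tags endpoint.
const res = await fetch("http://localhost:11434/api/tags");
const { models } = await res.json();
console.log(models.map((m: { name: string }) => m.name));
// e.g. ["llama3.2:latest", "mistral:latest"]; pull more with `ollama pull <model>`
```
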

2. Add Extension

Install the Ollama Client extension from the Chrome Web Store with a single click.

3. Start Chatting

Open the side panel, select a model, and start chatting on any website.
⚠️ CORS issue? Follow the setup guide.

🚀 Install Now - It's Free