1. Scope
This policy applies to the Ollama Client browser extension and related documentation pages.
2. Data We Collect
The extension does not run first-party analytics, ad trackers, or telemetry collection pipelines. The data it handles locally includes:
- Chat prompts and model responses generated during use
- Session metadata (titles, timestamps, branch state)
- Optional uploaded file content for local retrieval workflows
- Settings and provider configuration values
3. Where Data Is Stored
- Local browser storage (IndexedDB/Dexie and extension storage)
- Optional local SQLite artifacts used by migration/auxiliary features
- No default cloud sync from the extension itself
4. Network Communication
The extension sends prompts to provider endpoints you configure in the app.
- Common local endpoints: Ollama, LM Studio, and llama.cpp servers
- LAN endpoints are supported when configured by the user
- If you manually point to a remote endpoint, your prompt data will be sent to that endpoint
The privacy outcome therefore depends on the endpoint you configure, not only on the extension's code.
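To illustrate the point above, the distinction between local, LAN, and remote endpoints comes down to the host in the URL you configure. The sketch below is illustrative only (it is not extension code, and the port numbers shown are common defaults, not requirements):

```shell
#!/usr/bin/env bash
# Sketch: classify a configured provider endpoint by its host.
# The address ranges below are the standard loopback and RFC 1918 ranges.
classify_endpoint() {
  local url="$1"
  # Strip the scheme, then any path, then any port, leaving the bare host.
  local host="${url#*://}"
  host="${host%%/*}"
  host="${host%%:*}"
  case "$host" in
    localhost|127.*|::1) echo "local" ;;
    10.*|192.168.*|172.1[6-9].*|172.2[0-9].*|172.3[0-1].*) echo "lan" ;;
    *) echo "remote" ;;
  esac
}

classify_endpoint "http://localhost:11434"       # Ollama's default port -> local
classify_endpoint "http://192.168.1.20:1234/v1"  # e.g. LM Studio on a LAN -> lan
classify_endpoint "https://api.example.com/v1"   # remote: prompts leave the machine
```

Anything classified as "remote" means your prompt data travels over the network to a third party, so that provider's privacy policy applies.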
5. Permissions
The extension requests browser permissions for chat UX, optional page extraction, and provider connectivity.
- storage, for settings and local state
- tabs and related content-extraction permissions
- Network/host permissions needed to reach configured provider endpoints
6. Retention and User Control
- You can delete chats and associated local artifacts in the UI
- You can disable RAG and related indexing features
- You can reset storage/configuration from settings
7. Security Notes
- Do not expose provider APIs publicly without access controls
- Treat LAN endpoints as sensitive infrastructure
- Review provider-specific privacy policies if you use non-local endpoints
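As a concrete illustration of the first two notes, a local Ollama server can be kept on the loopback interface (its default bind) rather than exposed to the network. The fragment below is a sketch using Ollama's documented OLLAMA_HOST variable; it is an operational example, not part of the extension:

```shell
# Default: reachable only from this machine (loopback bind).
OLLAMA_HOST=127.0.0.1:11434 ollama serve

# Binding to 0.0.0.0 exposes the API to the whole network segment.
# Do this only behind a firewall or an authenticating reverse proxy:
# OLLAMA_HOST=0.0.0.0:11434 ollama serve
```

The same caution applies to LM Studio or llama.cpp servers: an unauthenticated endpoint on a shared network lets anyone on that network read and submit prompts.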
8. Policy Changes
This policy may be updated as the architecture evolves. The latest version is published at this URL.
9. Contact
Questions: shishirchaurasiya435@gmail.com