Privacy Policy
Ollama Client is designed as a local-first extension. This policy explains what data is processed, where it is stored, and when it leaves your device.
Effective date: February 7, 2026
1. Scope
This policy applies to the Ollama Client browser extension and to the data it stores on your device or sends to the provider endpoints you configure.
2. Data We Collect
- Chat prompts and model responses generated during use
- Session metadata (titles, timestamps, branch state)
- Optional uploaded file content for local retrieval workflows
- Settings and provider configuration values
3. Where Data Is Stored
- Local browser storage backed by SQLite compiled to WebAssembly (sql.js)
- Optional ZIP backups stored locally when you export
- No default cloud sync from the extension itself
4. Network Communication
- Common local endpoints: Ollama, LM Studio, and llama.cpp servers
- LAN endpoints are supported when configured by the user
- If you manually point to a remote endpoint, your prompt data will be sent to that endpoint
Your privacy outcome depends on the provider endpoint you configure, not only on the extension's code.
5. Permissions
- `storage` for settings and local state
- `tabs` and content-extraction related permissions
- Network/host permissions to reach provider endpoints
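Permissions of this kind are declared in a browser extension's manifest. The snippet below is an illustrative sketch only, not the extension's actual manifest; the host pattern assumes Ollama's default local port:

```json
{
  "permissions": ["storage", "tabs"],
  "host_permissions": [
    "http://localhost:11434/*"
  ]
}
```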
6. Retention and User Control
- You can delete chats and associated local artifacts in the UI
- You can disable RAG and related indexing features
- You can reset storage/configuration from settings
7. Security Notes
- Do not expose provider APIs publicly without access controls
- Treat LAN endpoints as sensitive infrastructure
- Review provider privacy policies for non-local endpoints
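As one way to follow these notes, an Ollama server can be bound explicitly to the loopback interface so it is unreachable from the LAN. This is a minimal sketch using Ollama's `OLLAMA_HOST` environment variable and its default port 11434; it configures the server, not the extension:

```shell
# Listen on loopback only; other machines on the LAN cannot reach the API.
export OLLAMA_HOST=127.0.0.1:11434
ollama serve
```

If you do expose the server more widely (e.g. binding to `0.0.0.0`), pair it with a reverse proxy, authentication, or firewall rules.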
8. Policy Changes
This policy may be updated over time; the effective date above reflects the most recent revision.
9. Contact
© 2026 Ollama Client. Local-first by design, with endpoint behavior controlled by the user.