Off Grid - Private AI Chat

3.6 ★ (131 reviews)
10K+ downloads
Content rating: Everyone

About this app

Your phone is powerful enough to run AI. Why send your data to someone else's server?

Off Grid runs AI entirely on your device. No internet required. No accounts. No subscriptions. No data ever leaves your phone.

WHAT YOU CAN DO
- Chat with AI - Run Llama, Qwen, Gemma, Phi and other open models
- Generate images - Stable Diffusion running on your phone's NPU
- Vision AI - Point your camera at anything and ask questions
- Voice input - Whisper transcription, completely offline
- Attach documents - Analyze PDFs, code, text files privately

WHY OFF GRID
- 100% offline after downloading models
- Your conversations never leave your device
- No cloud costs, no API limits, no subscriptions
- Works on airplane mode
- Open source (MIT licensed)

PERFORMANCE
- 15-30 tokens/sec text generation on flagship devices
- 5-10 second image generation with NPU acceleration
- Supports GPU offloading for faster inference
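Throughput figures like those above are just generated tokens divided by wall-clock time. A minimal sketch of measuring it, with a toy stand-in for whatever inference backend you actually use (the `generate` callable here is a placeholder, not part of the app):

```python
import time

def tokens_per_second(generate, prompt: str) -> float:
    """Time a generation call and return tokens/sec.
    `generate` is any callable returning a list of tokens --
    a stand-in for the real inference backend."""
    start = time.perf_counter()
    tokens = generate(prompt)
    elapsed = time.perf_counter() - start
    return len(tokens) / elapsed

# Toy backend: pretends to emit 30 tokens in about 0.1 s.
def fake_generate(prompt):
    time.sleep(0.1)
    return ["tok"] * 30

rate = tokens_per_second(fake_generate, "hello")  # roughly 300 tokens/sec here
```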

PRIVACY FIRST
Use AI for sensitive tasks you'd never trust to the cloud - journal entries, health questions, work notes, personal photos. Your data stays yours.

Download a model, go offline, and never look back.

Updated on Apr 18, 2026

Data safety

Safety starts with understanding how developers collect and share your data. Data privacy and security practices may vary based on your use, region, and age. The developer provided this information and may update it over time.
No data shared with third parties
No data collected

Ratings and reviews

3.5 ★ (125 reviews)
Conall O'Shannessy
April 4, 2026
App is actually awesome. The only reason I gave it 3 stars is because it throws an error for tools on almost every model I use on my server. For vanilla qwen or using small models on phones its fine. But it needs a disable tools toggle so we can use any server model without this limitation. Happy to update to 5 stars if fixed, or if I missed the setting. cheers
Wednesday Solutions
April 7, 2026
I'll work on this and get it out soon
B Sheets
April 14, 2026
very nice. easy to use. the graphic AIs seem to be useless but that is probably my phone. i appreciate the ability to access my server hosting llms. however the fact i cannot save and export chats is a MAJOR let down. if i am working on a project i want to have backups. that and it will not run in the background. come on off grid, earn that 5 stars!
Wednesday Solutions
April 15, 2026
Working on this, will ship it in the next set of releases.
Rik The Prick
April 9, 2026
crashed on all versions of Gemma 4 e4b on my snapdragon 8 elite gen 5 except one I imported locally but then wouldn't let me repair it to add an mmproj. also it took over 5 minutes to add the model itself because of how long the import loading, not to be mistaken for model loading. also within minutes of trying different versions I was suddenly told my phone had no compatible models to download even after a clean reinstall of the app. I'm going back to pocketpal.
Wednesday Solutions
April 10, 2026
Hey, sorry you hit this, and thanks for sharing. A few things to check: did you have any heavy models or apps running in the background when the crashes happened? Import speed has improved, but I can make it faster. The "no compatible models" message after a reinstall sounds like a bug; I haven't seen it reported elsewhere. Could you join the Slack so we can debug some of these? https://tinyurl.com/48ffcc2z

What’s new

- **Android Download Manager (WorkManager)** — Model downloads now run as a managed background job. Survives app kills, handles network drops, and resumes automatically.
- **Standalone image generation** — Run image models without loading a text model first. Lower RAM footprint and faster cold starts for image-only workflows.
- **Heuristic tool calling**
- **Remote server API auth** — Connect to authenticated remote inference endpoints. Bring your own API key for OpenAI-compatible servers.
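
The remote-server feature targets OpenAI-compatible endpoints, so a request the app sends would have the shape sketched below: the API key goes in a Bearer `Authorization` header on a POST to `/v1/chat/completions`. The server address, key, and model name here are placeholders, and the exact route can vary by server.

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build an authenticated request for an OpenAI-compatible
    /v1/chat/completions endpoint (key sent as a Bearer token)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Placeholders: swap in your own server address, key, and model name.
req = build_chat_request("http://192.168.1.50:8080", "sk-example", "qwen2.5-7b", "Hello")
```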

App support

About the developer
Mohammed Ali Chherawalla
mohammed.ali.chherawalla@gmail.com
India