LocalAI – Offline AI Chat LLM

2.1 (53 reviews)
Content rating: Everyone
5K+ downloads

About this app

LocalAI: Your 100% Offline, Private AI Assistant

Transform your Android device into a powerful AI workstation. LocalAI runs Large Language Models entirely on-device using the Llama.cpp engine. No cloud, no subscriptions, zero data collection. Your prompts, documents, and photos never leave your phone.

Real-time AI chat on an airplane, off the grid, or anywhere you need total privacy.

🚀 Key Features

🔒 Absolute Privacy
All processing happens on your hardware via the Llama.cpp engine. Nothing is uploaded. No telemetry, no analytics, no data harvesting.

🧠 Run the World's Best Open-Source AI Models
Be your own AI server: browse, download, and manage thousands of trending GGUF models directly from the built-in Hugging Face Model Hub. Supported architectures include:
• Meta: LLaMA 4 (Scout, Maverick) & LLaMA 3.x
• DeepSeek: DeepSeek-V3.1 & DeepSeek-R1
• Nvidia: Nemotron (Nemotron-4, Mini, and variants)
• Alibaba: Qwen 3.5 & Qwen 2.5
• Google: Gemma 3n & Gemma 3
• Microsoft: Phi-4
• Mistral: Mistral Large 2.1, Mistral Nemo, Mixtral MoE
• Plus any compatible custom GGUF file!

🖼️ Vision and Multimodal AI
Load a vision-capable model (LLaVA, Qwen-VL, Moondream, SmolVLM, Gemma 3 Vision, or any model with an mmproj projector) and chat about your photos. Snap a picture and ask the AI to analyze, describe, or extract text — all processed 100% offline on your device.

📄 Chat with Your Documents
Attach PDFs, Word (.docx), Excel (.xlsx), CSV, or text files into the chat. LocalAI parses them on-device to answer questions, summarize reports, or extract data — all offline.
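Offline document Q&A of this kind typically works by extracting plain text from the attachment and prepending it to the prompt as context. A minimal sketch for CSV and plain-text attachments; the names `extract_text` and `build_prompt` are hypothetical, and the app's real parsing pipeline is not published:

```python
import csv
import io

def extract_text(filename, data):
    """Turn an attached CSV or text file into plain text usable as prompt context."""
    if filename.lower().endswith(".csv"):
        rows = csv.reader(io.StringIO(data))
        return "\n".join(", ".join(row) for row in rows)
    return data  # .txt and friends are already plain text

def build_prompt(question, filename, data, max_chars=4000):
    # truncate the context so it fits inside the model's context window
    context = extract_text(filename, data)[:max_chars]
    return (
        "Use the following document to answer.\n---\n"
        f"{context}\n---\nQuestion: {question}"
    )
```

Binary formats like PDF and .docx need a real extraction library on top of this, but the context-stuffing step is the same.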

🎨 7 Beautiful Premium Themes
Customize your experience with professionally crafted themes:
• System Default (follows your device)
• Light Mode
• AMOLED Dark Mode
• Monokai (developer favorite)
• Emerald (nature-inspired green)
• Graphite (sophisticated neutral grays)
• Colorblind (deuteranopia-safe blue and orange)

🔤 10 Font Families
Comfortaa, Poppins, Quicksand, Raleway, Inter, Roboto, Josefin Sans, Courier Prime, Caveat, Fira Code — Regular, Medium, and Bold weights.

🌐 15 Languages
Full UI localization in English, Spanish, French, German, Italian, Portuguese, Russian, Chinese, Japanese, Korean, Arabic, Hindi, Bengali, Turkish, and Vietnamese.

🖼️ 8+ Professional Chat Backgrounds
Personalize your AI workspace with high-grade tiling patterns: Notebook (ruled paper), Blueprint, Topography, Circuit, Waves, Grid, Dots, and more.

⚙️ Advanced Inference Controls
Fine-tune the AI to your liking with expert sliders:
• Temperature (control creativity and randomness)
• Top K and Top P (restrict sampling to the most likely tokens, trading diversity for focus)
• Max Tokens (set response length limits)
• Live hardware monitoring (see your RAM usage, CPU architecture, and available storage in real-time)
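Temperature, Top K, and Top P are the standard LLM sampling knobs. A minimal, illustrative sketch of how they combine at each generation step, in pure Python over raw logits; this is not the app's actual Llama.cpp code path:

```python
import math
import random

def sample_next_token(logits, temperature=0.8, top_k=40, top_p=0.95, rng=None):
    """Pick a token id from raw logits using the three classic controls:
    temperature rescales the distribution, top-k keeps only the k most likely
    tokens, and top-p (nucleus) trims the tail beyond cumulative probability p."""
    rng = rng or random.Random()
    # temperature: divide logits before softmax (lower = more deterministic)
    scaled = [l / max(temperature, 1e-6) for l in logits]
    # numerically stable softmax
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = sorted(((e / total, i) for i, e in enumerate(exps)), reverse=True)
    # top-k: keep only the k most probable tokens
    probs = probs[:top_k]
    # top-p: keep the smallest prefix whose cumulative probability reaches p
    kept, cum = [], 0.0
    for p_i, i in probs:
        kept.append((p_i, i))
        cum += p_i
        if cum >= top_p:
            break
    # renormalize over the survivors and draw one token
    norm = sum(p_i for p_i, _ in kept)
    r = rng.random() * norm
    for p_i, i in kept:
        r -= p_i
        if r <= 0:
            return i
    return kept[-1][1]
```

With temperature near zero (or top_k=1) this collapses to always picking the single most likely token, which is why low temperature feels "focused" and high temperature feels "creative".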

📁 Built-in Model Manager
Browse trending models, download to local storage, and manage disk space. Pause, resume, or cancel downloads. Bookmark favorites.

💬 Persistent Chat History
Conversations stored locally via SQLite. Full Markdown rendering with syntax-highlighted code blocks, LaTeX math, and one-tap copy.
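A minimal sketch of what local SQLite chat persistence can look like. The two-table schema below (`conversations` plus `messages`) is an assumption for illustration, not the app's published schema:

```python
import sqlite3

def open_store(path=":memory:"):
    """Open (or create) the local chat database with an assumed two-table schema."""
    db = sqlite3.connect(path)
    db.executescript("""
        CREATE TABLE IF NOT EXISTS conversations (
            id INTEGER PRIMARY KEY,
            title TEXT,
            created_at TEXT DEFAULT CURRENT_TIMESTAMP);
        CREATE TABLE IF NOT EXISTS messages (
            id INTEGER PRIMARY KEY,
            conversation_id INTEGER REFERENCES conversations(id),
            role TEXT CHECK (role IN ('user', 'assistant')),
            content TEXT);
    """)
    return db

def add_message(db, conv_id, role, content):
    db.execute(
        "INSERT INTO messages (conversation_id, role, content) VALUES (?, ?, ?)",
        (conv_id, role, content))
    db.commit()

def history(db, conv_id):
    # messages in insertion order, ready to replay into the model's context
    return db.execute(
        "SELECT role, content FROM messages WHERE conversation_id = ? ORDER BY id",
        (conv_id,)).fetchall()
```

Storing history this way keeps everything in a single local file, which is exactly what an offline-only app needs: no sync service, and deleting the file deletes the data.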

💡 Who Is LocalAI For?
• Privacy advocates — get the power of ChatGPT without giving your data to anyone.
• Professionals and students — summarize confidential PDFs and documents securely, completely offline.
• Travelers and digital nomads — draft emails, brainstorm ideas, and get answers without Wi-Fi or cellular data.
• AI enthusiasts and developers — experiment with temperature, analyze GGUF weights, test the latest open-source language models on your own hardware.

Note: Performance depends on hardware. Devices with 8GB+ RAM and modern chipsets (Snapdragon 8 Gen 3+, Dimensity 9300+) deliver significantly faster generation.

Download LocalAI today and take full control of your AI — privately, offline, and on your terms.
Updated on
May 5, 2026

Data safety

Safety starts with understanding how developers collect and share your data. Data privacy and security practices may vary based on your use, region, and age. The developer provided this information and may update it over time.
No data shared with third parties
No data collected

Ratings and reviews

2.0 (51 reviews)
Islam Abukoush
April 25, 2026
Not bad in general, with a good collection of models (though some of them don't work). You can't really use images either: even in models that supposedly support them, uploading a picture just gives an error.
Apex Creators
April 26, 2026
We have forwarded the issue to our technical team. It will be resolved soon. Please reach out to us at info@apexcreators.co.in if you have any other issues.
Marrket IQ (Jagadeesh)
April 12, 2026
The best app for local LLM usage; easily beats AnythingLLM and Ollama for Android. Looking forward to more updates, like tool support and more attachment options. Have a look at the MSTY app for Windows; upgrades along those lines would be great.
Apex Creators
April 12, 2026
Thank you for your feedback. We are working on such a feature.
Jim Hoath
April 13, 2026
Looks good and very responsive on models of the right size. Would give it 5 stars, but when it tries to analyse an image it keeps saying: "Something went wrong: Multimodal support not enabled. Call initMultimodal first. Tap retry to try again." This happens on every capable model I've tried so far.

What’s new

• We update the app regularly to fix bugs, optimize performance, and improve the experience.

Thanks for using LocalAI – Offline AI Chat LLM.