Llamatik — Local AI Chatbot Powered by On-Device Models
Llamatik is an offline AI chatbot designed to showcase the capabilities of the Llamatik Kotlin Multiplatform library. All AI text generation happens directly on your device, using lightweight, efficient open-source models. No internet connection is required for chat, and no conversations are ever sent to any server.
Whether you want to summarize text, brainstorm ideas, draft content, or simply explore what a fully on-device LLM can do, Llamatik makes it fast, private, and reliable.
⸻
🔒 100% On-Device AI
All text generation runs locally.
Your prompts, responses, and model interactions never leave your device.
⸻
⚡ Fast and Lightweight
Llamatik supports small and efficient models such as:
• Gemma 3 270M
• SmolVLM 256M / 500M
• Phi-1.5
• Qwen 2.5 0.5B (quantized)
• Llama 3.2 1B (quantized)
These models run smoothly on modern Android devices without requiring cloud compute.
⸻
📦 Built-In Model Downloader
Easily download, install, and switch between supported models.
Choose the model that best fits your device performance and your use case.
⸻
🎨 Clean & Modern Interface
A friendly interface makes chatting intuitive:
• Smart suggestions
• Quick prompts
• Model selector
• Beautiful onboarding
• Minimalist design
⸻
🔐 Privacy First
Llamatik does not collect, store, or transmit chat content.
Your conversations remain on your device at all times.
⸻
📱 Experimental Showcase
Llamatik is an experimental demo created to help developers and curious users experience the capabilities of the Llamatik library on mobile devices.
⸻
💡 Notes
• Requires downloading at least one model to begin chatting.
• An internet connection is used only to download models and for analytics/crash reporting (Firebase).
• Chat content is fully offline.