3.6
80 reviews
5K+
Downloads
Content rating
Everyone

About this app

Maid is a free, open-source, cross-platform application for interfacing with llama.cpp models locally, and with Ollama, Mistral, and OpenAI models remotely.
Updated on
Apr 18, 2025

Data safety

Safety starts with understanding how developers collect and share your data. Data privacy and security practices may vary based on your use, region, and age. The developer provided this information and may update it over time.
No data shared with third parties
No data collected

Ratings and reviews

SAB
October 13, 2025
It's a good application and I appreciate the developer for the hard work. I wish it had an API server to use on a local network, and support for connecting to a local LLM server like LMSA. I hope it stays open and free for all, without those annoying ads. Again, excellent work.
Paul R (PaultheNerd)
July 13, 2024
A great way to run LLMs locally, either ones you've downloaded yourself or ones available from HuggingFace! Issues I've had: 1. HuggingFace downloads are slow and don't resume if the app loses focus. 2. Maid replies, and then there is an <|endoftext|> and it continues writing odd stuff after that. 3. If you change to desktop mode, you can't change back because the drop-down is off screen. Kinda my bad, but a bit unfair!
E Marr
August 9, 2025
Great app, simple and easy to use. Works great with local models; I'm using LM Studio for the backend. I appreciate the hard work put into this app. If I could suggest a feature, it would be MCP server support over streamable HTTP. It's not that difficult to implement and would bring your app to life.

What’s new

Added Supabase login.