Chat with AI: one app for every major AI model, with no provider lock-in and no monthly fee.
Unlike apps that make you pay for their API access or lock you into one ecosystem, Chat with AI lets you connect your own API keys. You choose the model. You control the cost. You own your data.
▸ No subscriptions. No ads. No data collection.
━━━━━━━━━━━━━━━━━━━━━━━━
✓ One app for every major AI provider
Switch between OpenAI, Google Gemini, Anthropic Claude, and 200+ models via OpenRouter — all from the same clean interface. One tap to change models mid-conversation.
✓ Bring your own API keys
Don't pay twice. If you're already paying for OpenAI, Gemini, or any OpenAI-compatible API, just add your key and chat. No markups, no middleman.
✓ 200+ free and paid models via OpenRouter
OpenRouter gives you access to every major model, including free tiers, through a single API key. No account juggling. Set it up once.
✓ 100% private conversations with local AI
Connect to Ollama or LM Studio running on your device or local network and chat without any data reaching the cloud. No internet connection required for local models.
✓ Advanced features for power users
• MCP server support — extend AI with custom tools
• Tavily Search — give your AI real-time web access
• Text-to-speech — hear responses read aloud
• Multiple chat sessions — organize by topic or project
• Custom API endpoints — connect any OpenAI-compatible service
━━━━━━━━━━━━━━━━━━━━━━━━
HOW IT WORKS
1. Download and open the app
2. Add your API key (or tap "Use OpenRouter Free Tier" to start immediately)
3. Choose your model from the dropdown
4. Start chatting — switch models anytime
That's it. No account creation. No data stored on our servers.
━━━━━━━━━━━━━━━━━━━━━━━━
WHO IS THIS FOR?
• Developers comparing AI outputs across models
• Researchers needing flexible, multi-provider access
• Privacy-conscious users who want local, offline AI
• Professionals running self-hosted AI on their network
• Power users tired of paying monthly fees for one provider
• Anyone who wants to compare AI responses side by side
━━━━━━━━━━━━━━━━━━━━━━━━
SUPPORTED PROVIDERS & MODELS
• OpenAI: GPT-4o, GPT-4.5, GPT-4 Turbo, GPT-3.5 Turbo
• Google: Gemini 2.5 Pro, Gemini 2.0 Flash, Gemini 1.5 Pro
• Anthropic: Claude Opus 4.5, Claude Sonnet 4.5, Claude 3.5 Sonnet
• OpenRouter: 200+ models including Llama 4, DeepSeek V3, Qwen 3, Mistral, Gemma, Phi-4, and free models
• Ollama: Any model running on your local Ollama server
• LM Studio: Any model served via LM Studio
• Any OpenAI-compatible API endpoint
━━━━━━━━━━━━━━━━━━━━━━━━
PRIVACY
• No account required
• API keys stored securely on your device only
• Chat history stays on your device
• We never see, log, or store your conversations
• Requests go directly from your device to the provider you choose
With local models (Ollama/LM Studio): your conversations never leave your device or local network.
━━━━━━━━━━━━━━━━━━━━━━━━