Memoria
your AI, your memory, on your own devices
A local AI chat that quietly remembers your conversations — on Mac and on iPhone. Nothing leaves your device.
Still being built. We'll tell you the moment it's out.
Memoria is a local AI chat that runs entirely on your own Mac and iPhone. It remembers context across sessions so you don't have to re-explain yourself every time, and it never sends a single byte to a developer's server.
The Mac version runs fully offline with Ollama, or plugs into your own OpenAI, Anthropic, or Gemini API key when you want a bigger model.

The iPhone version runs an on-device LLM (Gemma 3 / 4), so it works on a plane, in the mountains, or anywhere without a signal.

Build your own slash-commands by describing them in plain language: `/english` to translate, `/grammar` to break down a sentence, `/cal` for proofreading. No coding, just intent.
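A custom command definition might look something like this. This is a hypothetical sketch to show the idea, not Memoria's actual format; the `/tone` command and the field names are invented for illustration:

```
Name:        /tone
Description: Rewrite my message so it sounds polite but firm,
             and keep it under three sentences.
```

The point is that the description itself is the "program": you state what you want in plain language, and the command applies it to whatever text you give it.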