Local AI

We stumbled upon running AI models locally while brainstorming ways to cut costs as a small startup. Running models on high-powered cloud GPUs consumes enormous amounts of energy and money. After testing local models, we were so impressed with the quality of the output that we decided never to go back.

Using local AI reduces costs not only for us, but also for you as a user. Almost all modern personal computers come with enough CPU/GPU power to run small AI models. If you are paying $10/month for an audio transcription service, another $15 for a PDF summarizer, and one more $20 for an AI writing assistant, that adds up to $45 a month. You can stop all of those recurring bills right now with the local AI provided by Memotron.

And the best part: you can use local AI even in the free offline version.

On a side note, we do not mean to undermine, in any way, the use cases that call for larger models and more processing power.

Go to AI Settings from the Settings menu and toggle the AI models you want to use in the app. If you are running short on disk space, you can turn off any model to remove it from your device.
