Did I just create the world's smallest AI server?

10 points by willmccollum 19 hours ago

I successfully installed Ollama in Termux on my degoogled Unihertz Jelly Star, reputedly the world's smallest smartphone, with a 3-inch screen. The Jelly Star packs 8 GB of RAM plus 7 GB of extended (virtual) RAM. I then downloaded and ran the distilled DeepSeek-R1:7b model locally on the device. Is it slow? Yes. But it steadily outputs text word by word, doesn't crash, and takes no more than a couple of minutes to produce a full response. Anyone have other examples of micro AI workstations?
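
For anyone who wants to reproduce it, here's roughly the sequence I used. This assumes the ollama package is available in your Termux repo; if not, you'd have to build it yourself or run it under proot-distro.

    # Install Ollama inside Termux (package availability may vary)
    pkg update && pkg upgrade
    pkg install ollama

    # Start the server in one session...
    ollama serve &

    # ...then pull and chat with the distilled R1 model
    ollama pull deepseek-r1:7b
    ollama run deepseek-r1:7b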

ActorNightly 9 hours ago

>Unihertz

Man, I had the Titan Pocket. I got it for the physical keyboard and loved it. What a disappointment otherwise, though: the Wi-Fi would just cut out, and there was no fix available.

bigyabai 19 hours ago

I'm running Ollama + SmolLM on this thing: https://pine64.org/devices/quartz64_model_b/

  • willmccollum 19 hours ago

    Very cool to learn about that device. I want to see how small I can go. The main surprise for me was being able to run the 7B-parameter version at all.
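
    For anyone wondering how it fits in RAM, a rough back-of-envelope (assuming Ollama's default ~4-bit quantization for that tag):

        7e9 params x ~0.5 bytes/param ≈ 3.5 GB of weights
        + KV cache and runtime overhead ≈ 4-5 GB total

    So 8 GB (with the virtual RAM as headroom) is enough, if only just.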