Ashley@lemmy.ca to Linux@lemmy.ml • Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine • 1 day ago
Alpaca is great; I can even run it on my OnePlus 6T, albeit slowly, and the largest model I got running was Llama 7B.
Ashley@lemmy.ca to Linux@lemmy.ml • there is just something about FLOSS updates • 12 days ago
The difference is user consent.