
Many models are available; I've been playing with this locally for a few days. It will take a fair amount of computing power to run the largest of the models.

Most distros already have a package for it in their repos.

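The post doesn't name the runner, but something like Ollama fits the description (packaged by several distros, serves models locally over HTTP on port 11434). A minimal sketch of talking to such a server from Python, assuming Ollama's /api/generate endpoint and an already-pulled model tag (both are assumptions on my part, not from the post):

# Minimal sketch, assuming a local Ollama server on its default port 11434
# and a model tag that has already been pulled (e.g. "llama3:8b") -- both
# assumptions, since the post doesn't name the runner or a model.
import json
import urllib.request

def generate(prompt: str, model: str = "llama3:8b") -> str:
    """Send one non-streaming generate request to the local server."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,   # return a single JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("In one sentence, what does quantization do to model size?"))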


[–] 1 pt

The smallest models can run on a Pi 5. I can run the 32B and smaller models pretty damn fast with a Ryzen 9950X / 4080 Super / 96 GB / NVMe SSD. I can get the 70B to run, but I wouldn't call it fast.

Watching something like nvtop, I can see it pull my GPU to around 50% utilization pretty easily on the 32B model I'm using right now.
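nvtop is interactive; if you want that same number from a script, here's a minimal sketch using NVIDIA's NVML Python bindings (the nvidia-ml-py package) -- my own alternative, not something the comment uses:

# Minimal sketch: sample GPU utilization and VRAM use via NVML
# (pip install nvidia-ml-py). Reports roughly what nvtop shows.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

try:
    for _ in range(10):                      # take ten one-second samples
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu)
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        print(f"GPU {util.gpu:3d}%   VRAM {mem.used / 2**30:.1f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()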

I have no hope of running the full 670B models anytime soon. You'd need server-level hardware to get that done easily.
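Rough arithmetic backs that up: the weights alone for a ~670B-parameter model dwarf desktop RAM, even heavily quantized. A back-of-the-envelope sketch (the quantization widths are illustrative):

# Back-of-the-envelope: weight storage for a ~670B-parameter model at a few
# common quantization widths, ignoring KV cache and other runtime overhead.
PARAMS = 670e9  # parameter count from the comment above

for name, bits in [("FP16", 16), ("8-bit", 8), ("4-bit", 4)]:
    gib = PARAMS * bits / 8 / 2**30
    print(f"{name:>5}: ~{gib:,.0f} GiB of weights")

# Roughly 1,248 GiB at FP16, 624 GiB at 8-bit and 312 GiB at 4-bit --
# all well past a 96 GB desktop, hence the need for server-class hardware.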