
©2025 Poal.co


I was on a ham radio net recently and the topic of AI came up. The other operator I was talking with convinced me to download Ollama (which runs Meta's Llama models, yuck) to at least play around and see how it works.

Supposedly there are fine-tuned versions of Meta's Llama models that aren't woke, or at least are more politically incorrect, but I didn't have the 500GB of space on the VM I was working with, yet...

But before I move on to building and deploying some massive AI in a VM, are there any recommendations for a better non-Facebook AI?



[–] 1 pt

Excellent, thanks!

[–] 1 pt

Depending on how much RAM you have, choose wisely.

8B needs roughly 8–16 GB of RAM.

70B needs 64 GB or more.
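For a rough feel for where those RAM figures come from, here's a back-of-the-envelope sketch. The bytes-per-parameter figure is an illustrative rule of thumb for 4-bit quantized models (the format Ollama pulls by default), not an exact number, and real usage adds context/KV-cache overhead on top:

```python
def estimate_ram_gb(params_billions, bytes_per_param=0.6, overhead_gb=1.0):
    """Rough RAM needed just to load a quantized model.

    bytes_per_param ~0.6 approximates 4-bit weights (0.5 bytes)
    plus quantization metadata; these are rules of thumb, not
    figures from Ollama's documentation.
    """
    return params_billions * bytes_per_param + overhead_gb

print(f"8B:  ~{estimate_ram_gb(8):.1f} GB to load")
print(f"70B: ~{estimate_ram_gb(70):.1f} GB to load")
```

The gap between the ~6 GB load size for an 8B model and the 8–16 GB recommendation above is headroom for the context window and the rest of the system.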

[–] 1 pt

Is it all CPU-based, or does it offload to the GPU? There are a couple of older workstations I could get my hands on with still-decent CPUs that take up to 128 GB of RAM, but sourcing a decent GPU for them would be expensive.

[–] 1 pt (edited )

It's both; the hardware is detected at install time.

I have two instances running at the same time on one computer.

The first runs natively on my GPU; the second in a VM with 8 shared cores.
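When a model doesn't fully fit in VRAM, llama.cpp-style backends (which Ollama builds on) put as many transformer layers as fit on the GPU and run the rest on the CPU. A rough sketch of that arithmetic, assuming roughly equal-sized layers (my simplification, not Ollama's exact scheduler):

```python
def layers_on_gpu(model_size_gb, n_layers, vram_gb, reserve_gb=1.0):
    """How many layers fit in VRAM, assuming an even split of the
    model's weights across layers and reserving some VRAM for
    scratch buffers. Illustrative only."""
    per_layer = model_size_gb / n_layers
    usable = max(vram_gb - reserve_gb, 0)
    return min(n_layers, int(usable / per_layer))

# e.g. a ~5 GB 8B model with 32 layers:
print(layers_on_gpu(5.0, 32, 8.0))  # fits entirely on an 8 GB card
print(layers_on_gpu(5.0, 32, 4.0))  # partial offload on a 4 GB card
```

The more layers land on the GPU, the faster generation runs; anything left over runs on the CPU at RAM speed.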

[–] 0 pt

How do I know which one was installed if I simply ran the "ollama run dolphin-llama3" command and it fetched the Dolphin model?

Am I even using the right vernacular? I need to skill up on all this, LOL.

[–] 1 pt

The 8B variant is selected by default.

You can check by running

ollama run dolphin-llama3:8b

It will either download the model or run it (if you already have it installed).
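You can also see everything already on disk with `ollama list`, which prints one `name:tag` per row. A small sketch of reading that output; the sample text below is illustrative, not captured from a real run:

```python
# Illustrative `ollama list`-style output (not from a real machine).
sample = """NAME                 ID            SIZE    MODIFIED
dolphin-llama3:8b    abc123def456  4.7 GB  2 days ago
dolphin-llama3:70b   789aaa000bbb  39 GB   5 weeks ago
"""

def installed_tags(listing):
    """Pull the name:tag column out of an `ollama list`-style table."""
    tags = []
    for line in listing.splitlines()[1:]:   # skip the header row
        if line.strip():
            tags.append(line.split()[0])    # first column is name:tag
    return tags

print(installed_tags(sample))  # ['dolphin-llama3:8b', 'dolphin-llama3:70b']
```

If a name shows no explicit tag, it's the default (here, 8b).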