
There's an open reproduction of Alpaca if you have a GPU and want to run it yourself. In theory it can also run on a CPU, but inference (i.e., generating words) will be a lot slower.

Check here: https://github.com/tloen/alpaca-lora
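
Roughly, loading that LoRA adapter on top of the base LLaMA weights with the transformers and peft libraries looks something like the sketch below. The model/adapter names here are just examples I'm assuming for illustration; the repo's README has the exact setup it expects.

```python
# Rough sketch of GPU inference with a LoRA adapter on top of LLaMA.
# The base/adapter names are example placeholders; check the alpaca-lora
# README for the weights it actually expects.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base = "decapoda-research/llama-7b-hf"  # example base LLaMA weights
adapter = "tloen/alpaca-lora-7b"        # example LoRA adapter

tokenizer = LlamaTokenizer.from_pretrained(base)
model = LlamaForCausalLM.from_pretrained(
    base,
    torch_dtype=torch.float16,  # half precision to fit on consumer GPUs
    device_map="auto",          # needs the accelerate package installed
)
model = PeftModel.from_pretrained(model, adapter)

# Alpaca-style instruction prompt
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nName three primary colors.\n\n### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```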

And to run it on low-powered devices, here: https://github.com/rupeshs/alpaca.cpp

And here's a web UI for running text-generation chatbots and the like: https://github.com/oobabooga/text-generation-webui