How to run a chatGPT model locally for uncensored AI (youtu.be) posted 1 year ago by IlhansBrother +4 / -3 9 comments
It's easy to run the smaller models locally with a GPU; no one needs a stupid video to teach them when the info is easily found. Good luck running GPT-3 or anything bigger, or even getting hold of the weights to deploy on a compute node.
Alpaca and LLaMA will use your normal system RAM; the GPU isn't leveraged.
Sure, you can run these on a CPU; they're just usually painfully slow.
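For anyone wondering whether a given model will even fit in RAM before trying: a rough back-of-envelope sketch is parameter count times bytes per parameter, plus some headroom for activations and the KV cache. The 20% overhead factor below is an assumption, not a measured figure, and the function name is just for illustration:

```python
def ram_needed_gb(n_params_billion: float, bytes_per_param: float) -> float:
    """Rough RAM estimate in GB for local inference.

    n_params_billion: model size in billions of parameters.
    bytes_per_param:  2.0 for fp16, ~0.5 for 4-bit quantization.
    The 1.2 factor is an assumed ~20% overhead for activations/KV cache.
    """
    return n_params_billion * bytes_per_param * 1.2

# A 7B model in fp16 vs 4-bit quantized:
print(round(ram_needed_gb(7, 2.0), 1))  # fp16
print(round(ram_needed_gb(7, 0.5), 1))  # 4-bit
```

By this estimate a 7B model needs roughly 17 GB in fp16 but only about 4 GB once 4-bit quantized, which is why quantized CPU inference on ordinary desktop RAM is feasible at all, just slow.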