Stable Diffusion inference > Training a 65B-parameter language model?!
(media.kotakuinaction2.win)
It is fundamentally censored, whether he likes it or not, because he did not train the model himself: training one costs more than the entire market cap of Gab.
Unlike large language models, you can run a Stable Diffusion instance on your personal machine. The trained weights are available free of charge.
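For anyone curious what "run it on your personal machine" actually looks like, here is a minimal sketch using the Hugging Face diffusers library; the library choice, model ID, and prompt are my own assumptions, not something from the thread:

```python
# Minimal local Stable Diffusion sketch (assumes a CUDA GPU and the
# diffusers/torch packages installed; model ID and prompt are illustrative).
import torch
from diffusers import StableDiffusionPipeline

# Download the freely available weights and move the pipeline to the GPU.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,  # half precision keeps VRAM use modest
)
pipe = pipe.to("cuda")

# Generate one image from a text prompt and save it to disk.
image = pipe("a minimalist logo, flat vector style").images[0]
image.save("logo.png")
```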
Isn't Stable Diffusion inferior to Midjourney?
So does a bottle of Coke.
The only reason Gab is still running is that it has some shady backing from someone who is likely on the opposite side of politics from most of its users.
Maybe one day it will generate me a new anti-feminist logo. I spent over an hour trying to get a nice one for my new Twitter account and couldn't get anything.
I'd ask Bing AI, but it doesn't do pictures.