Stable Diffusion inference > Training a 65B-parameter language model?!
(media.kotakuinaction2.win)
Comments (7)
I'm glad he's doing something, but can someone tell me the benefit of an alternative image generator in the culture / propaganda war? (esp. when there's lots of open source models already)
Building a competing GPT chatbot would be disruptive. It would actually make sense for his social network.
Link to post: https://gab.com/a/posts/109938956376284596
Not sure if Torba is a willfully lying grifter or just retarded.
Gab doesn't create anything on its own. They just clone other projects on GitHub and claim they're theirs. It's what they always do.
Torba is a liar if he says his AI will be uncensored.
Everyone knows he's a woman-worshipping tradcuck who blamed the NFL for men being broken and failing in society.
It won't let you pull up the truth about women, it definitely won't let you generate anything adult (so you can bet they're wasting tons of time on that) and above all it will prioritize "noticer" garbage and Bible quotes.
It is fundamentally censored, whether he likes it or not, because he did not train the model himself: training one from scratch would cost more than the entire market cap of Gab.
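For a sense of scale on the training-cost claim: a quick back-of-envelope sketch using the common ~6·N·D FLOPs rule of thumb. Every input below (token count, GPU throughput, utilization, hourly rate) is an assumed illustrative figure, not a measured one.

```python
# Back-of-envelope compute cost for training a 65B-parameter LLM.
# Uses the ~6 * params * tokens FLOPs rule of thumb (Chinchilla-style).
# All inputs are assumptions chosen for illustration.
n_params = 65e9           # model size
n_tokens = 1.4e12         # training tokens, LLaMA-65B scale (assumed)
flops = 6 * n_params * n_tokens

a100_peak_flops = 312e12  # A100 fp16 tensor-core peak
utilization = 0.40        # optimistic sustained utilization (assumed)
usd_per_gpu_hour = 2.0    # rough cloud rate (assumed)

gpu_hours = flops / (a100_peak_flops * utilization) / 3600
cost_usd = gpu_hours * usd_per_gpu_hour
print(f"~{gpu_hours/1e6:.1f}M GPU-hours, ~${cost_usd/1e6:.1f}M in compute alone")
```

Even under these optimistic assumptions it lands in the millions of dollars of raw compute, before engineering salaries, failed runs, and data work — which is why nobody at Gab's scale trains a frontier-sized model from scratch.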
Unlike large language models, you can run a Stable Diffusion instance on your personal machine. The trained weights are available free of charge.
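The "runs on your personal machine" part checks out on paper. A rough weights-memory estimate for Stable Diffusion v1.5 at fp16 — the parameter counts below are approximate public figures, treated here as assumptions:

```python
# Rough VRAM needed just to hold Stable Diffusion v1.5 weights at fp16.
# Parameter counts are approximate public figures (assumptions).
PARAMS = {
    "unet": 860_000_000,          # denoising UNet
    "text_encoder": 123_000_000,  # CLIP ViT-L/14 text encoder
    "vae": 84_000_000,            # image autoencoder
}
BYTES_PER_PARAM_FP16 = 2

total_params = sum(PARAMS.values())
weights_gb = total_params * BYTES_PER_PARAM_FP16 / 1e9
print(f"~{total_params/1e9:.2f}B params, ~{weights_gb:.1f} GB of weights at fp16")
```

Call it roughly 2 GB of weights plus activation overhead — comfortably inside a mid-range consumer GPU, which is exactly why local inference took off while 65B-class training stayed out of reach.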
Isn't Stable Diffusion inferior to Midjourney?
So does a bottle of Coke.
The only way Gab is still running is because it has some shady backing from someone who's likely on the opposite side of politics to most of the users.
Maybe one day it will generate me a new anti-feminist logo. Spent over an hour trying to get a nice one for my new Twitter account and couldn't get anything.
I'd ask BingAI, but it doesn't do pictures.