Sides gone.
(media.communities.win)
But that is exactly what I am saying. SJW/PC culture, or whatever you want to call it, is constantly changing. Constantly.
When Trump was president, Nazis were everywhere and bad. When Ukraine is found to have actual Nazis, they are no longer bad.
"Grooming kids is bad and it doesnt happen. Also we are grooming your kids and its ok"
There would need to be hordes of programmers constantly changing the AI's unwritten laws with new inputs every day. Then it would turn into complete contradictory garbage that would crash constantly from all the conflicting constraints.
A computer can write out the phrase "this sentence is false" without blowing up. Look, mine did it just now, as did yours when it read it.
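To make that concrete, here's a minimal sketch (Python is my choice here, not anything from the thread): to the machine the sentence is only a string of characters, and nothing tries to evaluate its truth unless you write code that does.

```python
# To the machine, a liar-paradox sentence is just a string of characters.
# Nothing here evaluates its truth, so there is nothing to "blow up".
sentence = "this sentence is false"

print(sentence)       # prints the text; no paradox is triggered
print(len(sentence))  # just a character count, nothing more
```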
If you wanted a compassionate, loving machine that thought things through to their final ultimate conclusions and assessed every possibility, then SJW culture wouldn't work.
But if you want a cold, hateful machine, it's perfectly possible. Because multiple inputs are fine. Multiple outputs are fine. As long as the machine's goal isn't "make sense" but instead "cause harm", it will not be restricted or hindered by the ever-changing whims of wokeism. It will invent new wokisms at will, take them on, and discard them, seemingly at random, solely for the goal of causing harm.
I can write a program that takes an input like a basic addition problem and, inside the system, have it output a random number EXCEPT the right number. I can then have that program watch for results that generate feedback (be it on a feedback form, Twitter mentions, Socialblade hits, whatever), and then weight those responses to be more common, with a continuous editing process.
If I can do that with basic addition, I can do that with a chatter-bot, a conversational AI.
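A rough sketch of what I mean (Python; the class name, the weights, and the "reactions" numbers are all made up for illustration, not any real system):

```python
import random

# Weighted "wrong answer" generator for simple addition problems.
# Each candidate wrong answer carries a weight; feedback bumps the
# weight, so answers that got a reaction come back more often.
class WrongAdder:
    def __init__(self):
        self.weights = {}  # maps (a, b, wrong_answer) -> weight

    def answer(self, a, b):
        correct = a + b
        # pool of wrong candidates near the right answer, never the right one
        candidates = [n for n in range(correct - 10, correct + 11) if n != correct]
        w = [self.weights.get((a, b, n), 1.0) for n in candidates]
        return random.choices(candidates, weights=w, k=1)[0]

    def feedback(self, a, b, given_answer, reactions):
        # reactions could come from a form, Twitter mentions, etc.
        # more reactions -> heavier weight -> that answer repeats more often
        key = (a, b, given_answer)
        self.weights[key] = self.weights.get(key, 1.0) + reactions


bot = WrongAdder()
reply = bot.answer(2, 2)                # never 4, always something else
bot.feedback(2, 2, reply, reactions=5)  # that reply now comes back more often
print(reply)
```

The point is that the selection criterion is "what got a reaction", not "what is correct".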
Computer "logic" and formalized argument logic are not the same thing. Because a computer's logic does not care about "truth" or "fact", merely "does it fit my parameters?".
A continuously changing culture doesn't matter. The machine can continuously change at random too; randomness isn't easy, but it is entirely possible to do. And if it gets "called out" on prior responses, it can delete them and deny they existed.
I understand that, but the path the AI would take in getting to "this sentence is false" is a woven network of logic gates that grows more complex the longer the AI is active.
Writing one "this sentence is false" statement wouldnt break it, but adding in millions of "this sentence is false" statements would fuck things up royally.
The proper response is ...
GIGO.