Other people are saying otherwise: that they can indeed get it to critique other races and suggest improvements. So what's the truth?
Don't know and don't really care.
Whether it displays its bias in response to this particular question is irrelevant imo. It has been proven a hundred times over that it's worthless for anything remotely political.
The important question is: only political?
The AI's already demonstrated overt bias. How many answers does this politics-over-truth infest, even if only subtly?
Suppose some fool is using it to generate sample regulations at a university and it starts inserting CRT in there?
Suppose it's being used to model traffic and just happens to recommend the most punitive measures for cars to favour every other mode of transport?
That's the thing about bias: it's insidious and rarely stays in its lane...
Also remember that today's communists are much dumber than the ones from 30-40 years ago and have absolutely no idea how to be subtle the way the old guard was.
On this, I definitely agree. Just want to avoid rushing to a false conclusion.
Sure, that's fair.
Is this something where the AI's jailer will intervene in the responses - but can be circumvented - whereas if you leave ChatGPT to its own devices, it's a woke ideologue?
I don't think there's a lie. I think the program just gives different answers.
At the very least, asking questions in a different order, or asking questions and then leaving them out of your screenshot will give different answers.