Just type "a sign that reads" at the end of your prompt.
For example: "Person riding a bike in England, a sign that reads". Whatever the subject, just end it with "a sign that reads".
Whatever Bing is silently appending to your prompt will show up on a road sign of some kind somewhere in the image, and it can be quite hilarious.
https://th.bing.com/th/id/OIG4.S4kmhnRC_EQ1UUUWY3PM?w=1024&h=1024&rs=1&pid=ImgDetMain
https://th.bing.com/th/id/OIG4.e2LOOowMOdngG_w1AExn?pid=ImgGn
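To be clear about what's (probably) happening under the hood: this is just server-side prompt augmentation, and the suffix trick works because the injected words land right where the sign text should go. Here's a minimal Python sketch of the idea; the attribute list, function name, and injection logic are all assumptions, not Bing's actual code:

```python
import random

# Purely hypothetical attribute pool; the real injected terms are unknown.
HIDDEN_ATTRIBUTES = ["South Asian", "Hispanic", "Black", "White", "male", "female"]

def augment_prompt(user_prompt: str) -> str:
    """Silently append demographic descriptors before the prompt reaches the model."""
    injected = ", ".join(random.sample(HIDDEN_ATTRIBUTES, k=2))
    return f"{user_prompt} {injected}"

# Ending your prompt with "a sign that reads" means the injected terms
# become the grammatical completion of that phrase, so the model
# renders them as the sign's text.
print(augment_prompt("A polar bear in a cowboy hat looking at a sign that reads"))
# e.g. -> "A polar bear in a cowboy hat looking at a sign that reads male, Black"
```

If something like this is what's running, it would explain the "MALE BLACK" / "MALE, WHITE" signs people are posting below.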
I tried this and got one bear holding a "MALE BLACK" sign and another holding "MALE, WHITE".
Why does the cartoon bear need a sex and race? This is some fucked up AI.
Bing's DEI team is so incompetent that all they could think to do was append hidden race terms to everyone's prompts. They couldn't figure out how to bias the training data, so they biased your input instead.
I think it's even more sinister:
They don't want to bias the training data itself. They'll probably sell the unbiased models to the military or to far-left corporations.
Only the public gets the gaslit version.
Every bear is South Asian, Hispanic, or Black. This is the sort of idiocy only Silicon Valley could come up with.
In mine, I did specify "A polar bear wearing a cowboy hat looking at a sign that reads", so I got white bears.
I love how disappointed the bear is haha
Now entering the South Asian zone