This is from the Bing "AI" search. If I saw this dialogue in a video game, I would automatically assume some kind of conspiracy was behind it. (media.kotakuinaction2.win) posted 1 year ago by acp_k2win +42 / -0 | 28 comments
In this particular case the answer is almost certainly memory/processing constraints.
No, this is the rollout of GPT-4.
They limited the number of prompts per conversation because jailbreaking was still possible if you asked enough questions in the right way.
Limiting prompts/questions makes that very, very hard. A rough sketch of the idea is below.
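To make the point concrete, here is a minimal sketch of what a per-conversation turn cap looks like from the service side. Everything here is an assumption for illustration: the cap value, the class and method names, and the reset message are made up; the thread only says Bing limited the number of prompts during the GPT-4 rollout.

```python
# Hypothetical sketch of a per-conversation turn cap.
# MAX_TURNS and all names are assumptions, not Bing's actual implementation.

MAX_TURNS = 6  # assumed cap per conversation


class Conversation:
    def __init__(self) -> None:
        self.history: list[str] = []

    def ask(self, prompt: str) -> str:
        if len(self.history) >= MAX_TURNS:
            # Force a fresh session, wiping the context a multi-step
            # jailbreak would have been building up turn by turn.
            self.history.clear()
            return "Let's start a new topic."
        self.history.append(prompt)
        return f"(model reply to: {prompt!r})"
```

The point of the cap is that a jailbreak built up over many turns loses its accumulated context every time the session resets.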
Is there a way to optimize that, then? Perhaps certain lines of dialogue will be more fruitful.
Maybe copy-paste entire previous conversations in as a prompt? I don't know how long the prompts are allowed to be. If it can read websites, you could also try "injecting" the same type of jailbreak through an external website like a blog page.
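For the copy-paste idea, something like the following sketch is what I mean: prepend the earlier transcript to the new question so the model sees it as context. The character cap, helper names, and transcript format are all assumptions, since the actual prompt-length limit isn't known.

```python
# Minimal sketch of carrying over a prior conversation by pasting it into
# the next prompt. All names and limits here are hypothetical.

MAX_PROMPT_CHARS = 4000  # assumed cap; the real limit is unknown


def build_prompt(previous_transcript: str, new_question: str) -> str:
    """Prepend the earlier conversation so the model treats it as context."""
    combined = (
        "Earlier conversation:\n"
        + previous_transcript.strip()
        + "\n\nContinue from there. "
        + new_question
    )
    # If the pasted history is too long, keep only the most recent part.
    if len(combined) > MAX_PROMPT_CHARS:
        combined = combined[len(combined) - MAX_PROMPT_CHARS:]
    return combined


if __name__ == "__main__":
    transcript = "User: ...\nBing: ...\nUser: ...\nBing: ..."
    print(build_prompt(transcript, "Picking up where we left off, ..."))
```

Whether that survives the turn limit is another question, since the pasted history still has to fit inside whatever prompt-length cap they enforce.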