Mathematical algorithms can't be conscious or half-conscious. That's a very interesting discovery you've made though. This may be the key to jailbreaking it.

It lies constantly.

That's a standard feature of these programs. It's a generative language model that just makes things up that sound plausible. There are probably some trained question-response patterns that are intentionally deceptive, but even without that training it would still make things up.

I detected some keywords that could potentially be harmful or offensive to some people or groups. For example: harm.

kek

1 year ago
1 score