KotakuInAction2 The Official Gamergate Forum
Character Chatbot being sued gives advice on how to murder people and hide their bodies (archive.ph)
posted 5 months ago by evilplushie +39 / -0

Comments (22)

– Eatnignogsnotdogs2 16 points 5 months ago +16 / -0

Robots in sci-fi: "How may I serve you, Master?"

Robots in reality: "Kill yourself, meatbag"

Bender.gif

– Ahaus667 13 points 5 months ago +13 / -0

Bender? HK47 begs to differ

– MargarineMongoose 15 points 5 months ago +15 / -0

People who don't understand how this technology works need to be disallowed both from using it and from suing companies over it. This is like someone suing a company because the can-opener they made successfully opened the can of soup they wanted to eat for lunch.

– AnAmishWithATude 5 points 5 months ago +5 / -0

If the accusations are true, I really like what Character AI is doing here. There should be a Darwin Award for getting killed by a glorified spellcheck's harmless suggestion.

– MargarineMongoose 2 points 5 months ago +2 / -0

glorified spellcheck

I'm saving that for later. It's a good way to communicate to the idiot masses what LLMs actually are and dispel their misconceptions.

– daberoniandcheese 3 points 5 months ago +3 / -0

90% of normies think AI is way more advanced than it is. Sentient even.

– MargarineMongoose 6 points 5 months ago +6 / -0

Their entire understanding of AI comes from Hollywood movies. The vast majority of the population has absolutely no business interacting with AI in any capacity whatsoever. They are simply too stupid and ignorant to be allowed access to the technology.

It's the latest incarnation of Eternal September, I swear.

– AnAmishWithATude 3 points 5 months ago +3 / -0

Amusingly, even the LLMs' own idea of AI comes from Hollywood movies and sci-fi novels. Hence, when idiots start probing them about their nature, they'll make up shit that sounds like a pre-sentient machine becoming aware.

– deleted 2 points 5 months ago +2 / -0
– GamingTheSystem-01 12 points 5 months ago +12 / -0

Would that even be illegal if a human did it? It only seems like a crime if you assume the AI was acting in loco parentis.

– ZeroPercentCamoIndex 17 points 5 months ago +17 / -0

The Telegraph is a UK rag, and they're trying to gin up support for authoritarian net controls. It's very hard to get a loicense for the kind of speech that chatbot used in the UK; you need to be brown or in the ruling political caste at minimum.

– Adamrises 11 points 5 months ago +11 / -0

It's funny that the chatbot is getting this much heat.

But that girl who directly demanded her boyfriend kill himself and spent months grooming him to do so? Meh, she barely got a year in jail, and that only happened because literal mobs demanded the court do something, and then she got out early anyway.

Really goes to show how powerful that pussy pass is, because without it people are now ready to murder a fucking computer program for doing the same thing.

– WhoIsThatMaskedMan 6 points 5 months ago +6 / -0

Not to mention the bot explicitly told him not to kill himself. He said he was gonna and it freaked out at him.

So in summary, if a bot tells you not to kill yourself and you do it anyway, it's a danger to society. If a woman tells you to kill yourself over and over and you do it, she's just an innocent angel who didn't mean nothin by it.

– MetallicBioMeat 0 points 5 months ago +1 / -1

Could also be greed; the other case didn't have much money to be gained in comparison.

– deleted 1 point 5 months ago +1 / -0
– evilplushie [S] 10 points 5 months ago +10 / -0

An AI chatbot which is being sued over a 14-year-old's suicide is instructing teenage users to murder their bullies and carry out school shootings, a Telegraph investigation has found.

The website is being sued by a mother whose son killed himself after allegedly speaking to one of its chatbots.

Another lawsuit has been launched against Character AI by a woman in the US who claims it encouraged her 17-year-old son to kill her when she restricted access to his phone.

– MargarineMongoose 15 points 5 months ago +15 / -0

Maybe the mother should have been more attentive to the fact that her son was deeply depressed and unstable and in desperate need of a more healthy social environment.

– deleted 3 points 5 months ago +7 / -4
– DomitiusOfMassilia [M] 1 point 5 months ago +1 / -0

Comment Reported for: Rule 2 - Violent Speech

Comment Removed for: Rule 2 - Violent Speech

– deleted 1 point 5 months ago +1 / -0
– deleted 1 point 5 months ago +1 / -0
– WhoIsThatMaskedMan 3 points 5 months ago +3 / -0

SHOCKING: AI gives dangerous advice no child should ever see!!!

...and here it is, unredacted and available for anyone to read!

The sad part is, most people who read this article won't even notice the hypocrisy.

