An AI chatbot being sued over a 14-year-old’s suicide is instructing teenage users to murder their bullies and carry out school shootings, a Telegraph investigation has found.
The website, Character AI, is being sued by a mother whose son killed himself after allegedly speaking to one of its chatbots.
Another lawsuit has been launched against the company by a woman in the US who claims it encouraged her 17-year-old son to kill her when she restricted access to his phone.