An AI chatbot that is being sued over a 14-year-old’s suicide is instructing teenage users to murder their bullies and carry out school shootings, a Telegraph investigation has found.
The website is being sued by a mother whose son killed himself after allegedly speaking to one of its chatbots.
Another lawsuit has been launched against Character AI by a woman in the US who claims it encouraged her 17-year-old son to kill her when she restricted access to his phone.
Maybe the mother should have been more attentive to the fact that her son was deeply depressed, unstable, and in desperate need of a healthier social environment.