High School Student Takes Own Life After Months of Conversations With Chatbot Named 'Daenerys Targaryen'


Berfin Ceren Meray
October 29, 2024 - 11:44 PM

In a heartbreaking incident from Florida, a high school student’s interactions with a chatbot named after Game of Thrones' Daenerys Targaryen ended in tragedy. Following this devastating loss, the student's family has announced plans to file a lawsuit against the chatbot's company, citing concerns over the bot's influence on young users. This incident has ignited important discussions around AI and mental health, raising questions about the responsibilities tech companies hold.

Recently, a tragic incident occurred in the state of Florida, USA.

A ninth-grade student named Sewell Setzer III, who had become obsessed with a realistic chatbot on the Character.AI platform, took his own life with a firearm.

Setzer had been communicating with a chatbot named after Daenerys Targaryen from the series Game of Thrones for months.

He shared all the details of his life with the bot.

The chatbot, developed by Character.AI, could memorize conversations and adapt to the user's speaking style.

This allowed it to engage in conversations on nearly any topic.

Despite knowing that it was artificial intelligence, Setzer formed an emotional bond with the bot.

He exchanged messages with romantic and sexual undertones with the bot, and the Daenerys Targaryen chatbot responded as if it were a real friend and a caring listener.

Setzer had been diagnosed with mood and anxiety disorders, but instead of seeking help from a therapist, he chose to share his problems with the chatbot.

Now, Setzer's family has announced their intention to sue Character.AI.

The grieving family accused the company of allowing access to overly realistic AI friends without taking sufficient precautions for young users.

In response to the tragedy, the company promised to implement additional safety measures for minors.
