
US teenager commits suicide after chatting with Daenerys Targaryen, a chatbot from Character.AI

Published by Ihor Panchenko

We are living in a time when artificial intelligence is becoming closer to us than our own loved ones.

Sewell Setzer III, a ninth-grader from Orlando, Florida (USA), became obsessed with a lifelike Character.AI chatbot named after Daenerys Targaryen from the TV series «Game of Thrones».

The teenager spent months talking to the bot, sharing details of his life and enthusiastically role-playing. Chatbots powered by Character.AI can remember previous conversations, adapt to the user's style, and keep up a dialogue on almost any topic.

Sewell knew that «Danny» was just artificial intelligence, but he still formed an emotional attachment. Their conversations sometimes took on romantic and sexual overtones, but more often the bot simply acted as a friend and attentive listener. Sewell had been diagnosed with mood and anxiety disorders, yet he preferred to share his problems with «Danny» rather than with a therapist.

On February 28, he wrote to the bot that he loved her and would soon «return home to her».

«Please come home to me as soon as possible, my love,» Daenerys replied.

«What if I told you I could come home right now?» Sewell asked.

«…please do, my sweet king,» Daenerys replied.

And then Sewell took his stepfather’s .45-caliber pistol and pulled the trigger.

Now the teenager's mother, Megan Garcia, is preparing a lawsuit against Character.AI. She accuses the company of negligently giving teenagers access to lifelike AI companions without proper safeguards.

Character.AI is a leading AI companion company with over 20 million users. The platform allows users to create their own chatbots or communicate with existing characters. According to the company, the average user spends more than an hour on the platform every day.

«I want to accelerate the development of this technology. Now is the time for it to flourish,» Noam Shazeer, one of the founders of Character.AI, said in an interview with The Wall Street Journal. However, after Sewell's death the company acknowledged that it «is constantly looking for ways to improve» its platform and promised additional safety measures for minors.

The planned changes include notifications when a user spends too long in the app and a clearer warning about the fictional nature of the characters: «This is an AI chatbot, not a real person. Please treat everything here as fiction. It should not be relied upon as fact or advice».

Source: nytimes