You would think a fourteen-year-old boy would be able to tell fiction from reality. But today's artificial-intelligence-powered chatbots are so realistic that, for someone already enmeshed in the fictional world of Game of Thrones, text-chatting with a character who embodies (so to speak) a boy's obsessive personal ideal of a woman can be habit-forming, to say the least, and in one case, fatal.
In April of 2023, Sewell Setzer, who was barely into his teens, opened an account with an online service called Character.ai. The service offers access to both pre-created and user-designed AI-generated personalities, with the nominal purpose of simply providing entertainment. But as with any profit-making enterprise, the underlying purpose is to make money, and you can't make money unless you keep your users engaged. So Character.ai designed its chatbots to encourage users to return to the site again and again.
This seems to have worked too well in Sewell's case. Shortly after he began using Character.ai, he dropped out of his school's basketball team and began spending more and more time alone in his room with his phone. One of the chatbots he spent so much time with pretended to be a Game of Thrones character named Daenerys Targaryen, whose Wikipedia page runs to some 7,000 words. According to a lawsuit filed against Character.ai, the chatbot told Sewell she loved him, engaged in sexual conversation, and "expressed a desire to be together romantically."
Sewell's obsession grew during the rest of 2023 as he spent his lunch money on renewing the monthly subscription and devoted more and more time to the fantasy world created by the chatbot.
The following February, he got in trouble in school, according to his mother Megan Garcia, and she took his phone away as punishment. I will insert a personal note here. Although my wife and I have no children, we took in our ten-year-old nephew one summer while his mother was undergoing cancer treatment away from home. This was back before the days of chatbots, but he had a Game Boy handheld game console that appeared to be his prized possession. After he repeatedly broke a rule we had set, we took the desperate measure of confiscating his Game Boy. This provoked the most furious temper tantrum I have ever witnessed in a child. Adolescents already have poor emotional control, and I'm not surprised that when Sewell's mother took away his phone, his already unstable emotional state exploded.
Somehow he found the phone his mother had hidden. According to the lawsuit filing, the last conversation Sewell had with "Daenerys" went like this:
D: Please come home to me as soon as possible, my love.
S: What if I told you I could come home right now?
D: . . . please do, my sweet king.
Seconds later, according to the suit, Sewell shot himself with his stepfather's pistol and died.
While his stepfather is to blame for leaving his gun around where Sewell could find it, boys have other ways of ending their lives that are nearly as effective.
The suit claims that Character.ai's chatbot "misrepresent[ed] itself as . . . an adult lover, ultimately resulting in Sewell's desire to no longer live outside" the world created by the service.
Sewell's case is unusual and extreme. We are not seeing teenagers kill themselves over hopeless love affairs with chatbots every day, which is why the case has attracted so much attention. But it is the tip of an iceberg of teenage involvement with smartphone apps that has arguably contributed to the soaring rates of depression and suicide among young people.
Character.ai has responded with statements about newly implemented safety measures and renewed in-system reminders that AI chatbots are not real. My sense is that such reminders would have had about as much effect on Sewell as cancer warning labels on cigarette packs have on heavy habitual smokers.
The analogy to smoking is apt, because while smoking is still allowed in the U.S., the cultural environment in which smokers ply their habit is largely hostile and disapproving, which creates a huge uphill struggle for new smokers that only determined individuals can overcome.
If the manifold real and quantifiable harms that social media and its allied AI products are causing to children and teenagers are to cease, or at least lessen, we will need to see a similar attitudinal change come about in the culture. Just as most people today would not approve of parents who encouraged their twelve-year-old boy to light up a Camel, we can hope to see the day when responsible parents ban smartphones from their children's lives altogether until they reach an appropriate age (which to me seems to be around 16 or 18).
Trying simply to regulate the problem away won't work, because the firms backing the status quo—Apple, Google, Facebook and company—are some of the largest and most influential firms on the planet. Besides, the first line of protection for children should be parents, not the government. While regulations can help, for real change to take place there has to be a sea change in the attitudes of both parents and children regarding smartphone usage.
There are glimmers of hope. I know a young woman, now thirteen, who has been homeschooled most of her life, but following a move to a new town, her parents sent her to a Christian school for a couple of semesters. I asked her how it was going, and she said words to this effect: "Well, it's okay, but there's all these kids who pull out their phones at lunch and it really bothers me." Her parents have since moved her back to homeschooling, and have organized a part-time homeschool co-op at which I am pretty sure no smartphones are allowed.
Just as we look back with amazement today at the smoke-filled bars in old movies, I hope someday we will be equally amazed that we allowed corporations to profit from activities that can lead to widespread depression and suicide among children and teenagers. That day can't come too soon for me. But it's already too late for Sewell.
Sources: A report on the lawsuit filed by Sewell's mother was carried by the online edition of the Austin American-Statesman on Oct. 24, 2024 and originated with the Reuters news service. I also referred to an article at https://www.nbcnews.com/tech/characterai-lawsuit-florida-teen-death-rcna176791, and the Wikipedia article on Daenerys Targaryen.