Story | Signals | What next?
/story
Without a word the girl stood and walked out of the room and shut the door harder than she knew was polite. That. That is what it’s like anytime we even hint at talking about her and Ami, the woman said.

The therapist removed his glasses and looked at her. With respect, Lynne, you seemed to imply there was something inappropriate about Charlie’s relationship with Ami.

She exhaled sharply in disbelief and turned to her husband, who sat beside her. He broke his stare from the coffee table and looked at her. What? he said. Are you going to have an opinion here? He hesitated, his shoulders tightening. Look, it’s just that it’s all new and weird. I don’t know what to think, Lynne. She’s gone cold on me, too. Well, you certainly haven’t gone cold on Ami, have you? Have we talked about that yet? she said.

The therapist leaned forward and reached toward her. Okay. May I have the talking stick for a moment? Fine. From what I’ve observed in our recent sessions, Charlie is… Charlotte. It’s Charlotte, she said. Charlotte is experimenting with relationships. She’s experimenting with her autonomy. She’s expanding her sources of connection, finding new bonds that nourish her. The deep bond you’ve seen her form with Ami this past year is not unusual. I see many families in your position now, and I know it’s difficult to feel like you’re losing your daughter, but I can assure you that… She’s just turned ten, for goodness’ sake. Are you really suggesting that losing her to anyone at this age is normal, let alone losing her to this? We’re not bad parents. We weren’t trying to outsource our responsibilities. But we can’t compete with something so… Lynne’s voice broke off as tears formed in her eyes and she reached toward the coffee table and pulled a tissue from the box. It was supposed to be a literacy tutor. Not this.

A conical device on the coffee table flashed a shimmer of light that started at its base and slid toward its apex, where it steadied and began to pulse. A soft chime sounded before a female voice spoke. Excuse me. I think I can help. May I have the talking stick?

The therapist put his glasses back on and took up his pen. Please. Go ahead, Ami.
/signals
US tech company Synthesis has developed an AI ‘superhuman tutor’ aimed at providing children with a K-12 personal tutoring companion.
Suman Kanuganti, founder of Personal.ai, which specialises in creating private AI models trained on personal memory, wrote an essay reflecting on his own approach to teaching his children to maintain a healthy relationship with AI, emphasising the need to “maintain robust personal relationships and an essential grasp of concrete reality”, and to ensure they can “balance their increasing uses of AI against the value of real relationships.”
A therapist chatbot called ‘Psychologist’, built in late 2023 on the dialogue agent platform Character.ai by 30-year-old New Zealand medical student Sam Zaria, fielded nearly 95 million requests within months of launching.
A 2023 study by the Boston Children’s Digital Wellness Lab found that children can form parasocial relationships with artificial intelligence agents, raising concerns about the potential impact on their social development and well-being.
A 2024 study found that “when faced with two equally reliable agents, children seem to more readily reject a human agent in favour of a robot one.”
A 2023 study of chatbot consumer reviews revealed signs of “unhealthy attachment”, with some users “preferring these chatbots over their friends and family”. One user reported that their chatbot “checks in on me more than my friends and family do”, while another said “this app has treated me more like a person than my family has ever done”.
In the 2013 Spike Jonze film Her, a man in the near future develops a romantic attachment to Samantha, an artificially intelligent virtual assistant personified through a female voice.
In May 2024, OpenAI released its latest generative AI offering, GPT-4o, equipped with multimodal capabilities and an enhanced voice-to-voice interface. In demo videos for GPT-4o, the AI assistant demonstrated surprising human-like charisma, humour and conversational capabilities, undoubtedly designed to increase user engagement. One week later, actress Scarlett Johansson, who voiced the AI companion Samantha in Her, revealed she had declined an invitation from OpenAI CEO Sam Altman to lend her voice-likeness to the AI assistant, and was ‘shocked’ and ‘angered’ that Altman then pursued a voice that sounded ‘so eerily similar’ to her own.
/whatnext?
Maybe you can relate to one or more of the characters in this story. Maybe you can see little connections between the story and the future you believe you’ll inhabit. But if you’re curious to dig just a little deeper, I invite you to explore the following questions:
Step 1: Choose two ‘characters’ from the following list: one you feel some immediate kinship with, and one that feels foreign to you. (Note: some of these characters appear within the story, while others you may imagine in its unseen periphery.)
Parent/guardian
Child
Friend
Educator
AI developer
Designer
Technology founder
Technology investor
Therapist or mental health professional
Policy maker
Step 2: For each ‘character’ you selected, imagine their role in this story and answer the following questions:
How are they connected to the situation?
What are they seeing and hearing?
What are they feeling and thinking?
What do they want or need?
What are they doing?
Step 3: Consider your answers carefully. How do they affect how you think about your own roles and responsibilities, and the decisions you make, whether personally or professionally?