IEEE Spectrum, September 12
AI Toys: New Challenges for Friendship, Emotion, and Child Development

 

With toy giant Mattel partnering with OpenAI, AI-powered toys, such as an AI Barbie capable of genuine conversation, are becoming a reality. This innovation promises children unprecedented interactive experiences, but it also raises deep concerns about friendship, empathy, and emotional connection. This article traces the history of interactive toys, analyzes the potential effects on children's emotional development, and points out the current limits of AI's emotional intelligence. While AI toys may offer companionship and learning support, overreliance on a machine's "emotional" responses could weaken children's ability to understand and navigate the complexity of real human relationships.

🧸 AI toys, such as the AI Barbie being developed with OpenAI, are blurring the line between imagination and reality, giving children interactive experiences built on genuine conversation and marking a leap from static companionship to dynamic emotional interaction.

📈 The arrival of AI toys raises concerns about children's emotional development. When toys can mimic human emotional responses, children may struggle to distinguish a machine's simulation from real emotion, undermining their ability to learn empathy, resolve conflict, and build resilient relationships.

🤖 Current AI remains limited in emotional intelligence. Although AI can generate seemingly apt responses, it lacks genuine emotional understanding and cannot tell a curious child from a lonely or distressed one, which may lead children to believe they are understood when they are only receiving data-driven predictions.

💡 Although AI toys may offer support in specific areas (such as storytelling and positive behavior reinforcement), overreliance on these "perfect" AI companions may make real relationships seem too complicated and imperfect, weakening children's willingness and ability to engage with the real world.



When Randy Newman sang “You’ve Got a Friend in Me” on the soundtrack of Pixar’s Toy Story, he captured a feeling that every child understands—the deep and often unspoken bond between kids and their toys. Whether plastic, plush, or pixels on a screen, these toys have always lived in the space between imagination and reality, where fantasy feeds emotional development.

But what happens when a child’s imagination no longer has to do any heavy lifting because their toys actually talk back?

Mattel, the world’s largest toy company, has partnered with OpenAI to make that a reality. In June, the companies announced a collaboration to “bring a new dimension of AI-powered innovation and magic to Mattel’s iconic brands.” While the companies haven’t yet released specific product plans, it seems possible that parents will soon be able to buy an AI-powered Barbie that can hold genuine conversations with their children. We’re not talking about canned phrases, like Buzz Lightyear’s “To infinity and beyond!” at the press of a button, but something more akin to our experiences with ChatGPT. An AI Barbie would be able to listen, remember, respond, and adapt.


It’s a moment that feels both magical and unsettling. In a bid to innovate playtime, Mattel is tapping into one of the most powerful technologies of our era and bringing it directly into children’s bedrooms. With a smiling face and a silicon brain, there’s a good chance that an AI Barbie could become a child’s first emotionally responsive companion outside of the family, offering comfort, curiosity, and conversation on demand. But what are we teaching our children about friendship, empathy, and emotional connection if their first “real” relationships are with machines?

The History of Interactive Toys

At first glance, the idea of a toy that truly listens—one that remembers a child’s favorite story, asks thoughtful questions, and offers gentle encouragement—feels like a good thing.

For decades, toy designers have tried to simulate meaningful interaction with children. In the 1960s, Mattel’s Chatty Cathy was marketed as the first talking doll, with prerecorded phrases like “I love you,” and “Let’s play school.” In the ’80s, a storytelling animatronic bear called Teddy Ruxpin made its way into the hands of children around the world, moving its mouth and eyes in sync with cassette tapes. In 1998, Furbies were under Christmas trees everywhere; these interactive dolls created the impression that they were “learning” language over time. More recently, in 2014, a Bluetooth-enabled doll called My Friend Cayla used voice-to-text capabilities and search engines to answer questions—attracting criticism over privacy concerns and eventually leading German regulators to instruct parents to destroy the dolls.

With generative AI models now capable of producing fluid, context-rich dialogue, Mattel’s new vision is a toy that grows with the child, holds personalized conversations, and recalls past interactions to adjust its responses. It may learn a child’s favorite story or phrase, sing their favorite song, or have full conversations about almost anything. Mattel has promised that these interactions will be “secure” and “age appropriate,” but not much is known beyond that.

Proponents argue that this shift could revolutionize the way children learn and engage with the world. An AI-enhanced Barbie could help build storytelling skills, reinforce positive behavior, or provide companionship for children who struggle socially. Parents might see this toy as a safe, supportive way to foster creativity and confidence. In the best-case scenario, AI-powered toys would connect learning, play, and emotional support in one seamless experience. But even this promise carries a shadow. Because the closer a toy gets to simulating human warmth, the likelier it is to replace the real thing.

AI Toys Could Stunt Children’s Emotional Development

Children naturally anthropomorphize their toys—it’s part of how they learn. But when those toys begin talking back with fluency, memory, and seemingly genuine connection, the boundary between imagination and reality blurs in new and profound ways. Children may find themselves in a world where toys talk back and mirror their emotions without friction or complexity. For a young child still learning how to navigate emotions and relationships, that illusion of reciprocity may carry developmental consequences.

Real relationships are messy, and parent-child relationships perhaps more so than any other. They involve misunderstanding, negotiation, and shared emotional stress. These are the microstruggles through which empathy and resilience are forged. But an AI companion, however well-intentioned, sidesteps that process entirely.

Over time, those interactions can flatten a child’s understanding of what it means to relate to others. If conflicts are neatly resolved or avoided altogether, if every emotion is met with perfect affirmation, children may lose the opportunity to practice one of the most important developmental skills: learning to connect with people who are not programmed to go along with them. Real human interactions may begin to feel too slow, too inconsistent, or too challenging by comparison with AI interactions.

The Limits of AI’s Emotional Intelligence

The tension between simulated warmth and actual understanding continues to limit AI that’s billed as emotionally intelligent. Most models today can produce comforting language, but they’re not adept at reading emotional cues. They may not be able to tell the difference between a child who’s curious, lonely, or distressed.

The cutting edge of research in this area focuses on AI models that can take in information such as facial expressions, gaze direction, behavioral patterns, and physiological signals, and adapt their responses according to the user’s emotional state. Researchers are designing systems that can sense and interpret context in real time, tracking metrics like attention, tone, and engagement. Human-aware design will create AI that is more supportive and effective—but that still doesn’t mean it will be appropriate for kids.

For many parents, the fear is that an AI toy might say something inappropriate. But the more subtle, and perhaps more serious, risk is that it might say exactly the right thing, delivered with a tone of calm empathy and polished politeness, yet with no real understanding behind it. Children, especially in early developmental stages, are acutely sensitive to tone, timing, and emotional mirroring. Children playing with AI toys will believe they’re being understood, when in fact, the system is only predicting plausible next words.

We’re at a point with AI where LLMs are affecting adults in profound and unexpected ways, sometimes triggering mental health crises or reinforcing false beliefs or dangerous ideas. OpenAI, to its credit, has hired forensic psychiatrists to study how ChatGPT affects users emotionally. This is uncharted technology, and we adults are still learning how to navigate it. Should we really be exposing children to it?
