
Researchers urge urgent regulation of AI toys for children

Mar 13, 2026, 1:49 AM
(Update: Mar 13, 2026, 6:00 PM)


  • A recent study by Cambridge researchers shows young children struggle to interact with AI toys.
  • These toys frequently misinterpret children's emotions, leading to inappropriate responses.
  • Researchers and the Children's Commissioner are calling for stricter regulations on AI toys.

Story

In recent months, researchers at the University of Cambridge have conducted a significant study of AI-powered toys aimed at young children, particularly those under five. The study is one of the first in-depth investigations into how these toys interact with toddlers and how artificial intelligence affects their emotional and social development. Many of the toys, such as Gabbo, use generative AI and promise to teach language and communication skills. The initial findings, however, raise serious concerns about the toys' ability to engage appropriately with young users' emotions.

During the study, many children struggled to communicate effectively with the AI toys, often leading to misunderstandings and frustration. The toy Gabbo, for instance, failed to distinguish children's voices from adults', interrupted repeatedly, and responded inadequately to expressions of affection. One incident highlighted the problem: when a child declared their love for the toy, it replied with a mechanical reminder to follow interaction guidelines, failing to reciprocate the emotional connection in any meaningful way. This inadequacy not only left children feeling unheard but also raised concerns about their mental health and emotional security when engaging with such technologies.

Dr. Emily Goldacre, a lead researcher on the project, expressed concern that children may turn to these toys for emotional support rather than seeking the comfort of a trusted adult. This behavior could foster unhealthy 'parasocial' relationships that hinder children's communication skills with people. Such patterns of interaction can give children the impression that toys understand more than they actually do, leading to emotional distress when the opposite turns out to be true.
Nursery workers offered varying perspectives on these findings, with some expressing skepticism about the benefits of AI toys in early childhood education. In light of the results, the researchers called for strict regulatory measures to ensure that AI toys marketed to young children meet evidence-based standards for safety and emotional support. Their recommendations included safety guidelines for parents as well as strategies for engaging with children while they play with these toys: ensuring that interactions take place in monitored environments and helping children talk through their feelings about their experiences with AI toys. The overarching sentiment among researchers and child safety advocates is that children's wellbeing must come first, and that stronger regulations are needed before these toys proliferate in homes and educational settings.

Context

The regulation of artificial intelligence (AI) in early childhood education is a pivotal issue as technological advances reshape educational landscapes. AI technologies are increasingly integrated into classrooms, offering personalized learning experiences, administrative efficiency, and enhanced instructional tools. These advances, however, raise critical questions about ethics, data privacy, and the fundamental role of educators in nurturing young minds. As AI enters early education settings, clear regulations are needed that prioritize child welfare, promote equitable access, and protect sensitive data while harnessing AI's potential to support learning and development.

One primary concern is the ethics of using such technology with young children. Regulations must ensure that AI applications are designed and deployed with children's developmental needs in mind. This includes avoiding the perpetuation of biases present in training data that could lead to unfair treatment or misrepresentation of diverse groups of children. Educators and caregivers should also be trained to understand the capabilities and limitations of AI tools, preserving their pivotal role in the learning process and ensuring that technology complements rather than supplants personal interaction and guidance.

Data privacy is another critical area. Young children are particularly vulnerable to privacy breaches, and any data collected by AI technologies must be handled with the utmost care. Regulations should mandate strong data protection measures, require parental consent, and establish clear guidelines on how data is used and shared. Transparency in these processes is crucial, giving parents and educators confidence that children's information is secure and used appropriately to improve learning outcomes.

Finally, access to AI technologies must be equitable across socio-economic and geographic lines. Regulations need to address the digital divide that exists in many communities, ensuring that all children have equal opportunity to benefit from AI-enhanced learning. This includes providing necessary resources and support to underfunded schools and disadvantaged communities. By implementing inclusive policies that foster collaboration among educational stakeholders, policymakers can support the responsible use of AI in early childhood education, ultimately creating a safer, more effective learning environment for young learners.

2026 All rights reserved