As the holiday season approaches, parents in Canada are looking for gifts that will delight their children. However, experts in child development and psychology are urging caution about AI-powered toys, citing concerns that include potential privacy violations, security risks, and negative effects on a child's creativity and overall development.
Dr. Nicole Racine, an Ottawa-based child psychologist and scientist at the CHEO Research Institute, emphasizes the malleability of a child’s developing brain during early childhood. She argues that the inputs children receive at this stage are crucial, stating, “I think about what kind of inputs do I want my kids to be having? And to be honest, it’s not the inputs of an AI algorithm.” Dr. Racine's remarks come in the wake of an advisory issued by Fairplay, a U.S.-based organization dedicated to protecting children from potential technology-related harms, which has been endorsed by various experts, including child advocacy groups and pediatricians.
The advisory defines AI toys as those containing chatbots integrated into everyday playthings such as plush toys, dolls, and action figures, employing artificial intelligence to communicate in ways that mimic human interaction. These toys are marketed as educational and interactive companions that stimulate children's creativity. However, experts argue that they might actually diminish imaginative play. According to Fairplay, traditional toys allow children to create both sides of a conversation, thereby enhancing their imaginative skills. In contrast, AI chatbots often agree with the user, which restricts a child's ability to navigate conflicts and develop relationship management skills.
Dr. Daniela Lobo, a psychiatrist and medical lead at the Centre for Addiction and Mental Health in Toronto, points to the importance of conflict resolution in childhood play, highlighting children's basic need to experience disagreements during playtime. She praised Fairplay's advisory, noting that AI technology has advanced rapidly without corresponding safety research and remains unregulated. Lobo stresses the urgency of a framework to ensure the safety of AI tools used with children: their brains develop quickly, so their technological exposures require careful consideration.
The advisory lists specific AI toys, such as Curio Interactive's Gabbo, Grem, and Grok characters, as well as Roybi's robot designed to teach languages and math. Both Curio Interactive and Roybi say they comply with the U.S. Children’s Online Privacy Protection Act (COPPA) and maintain that they do not store sensitive data, noting that parents can manage their children’s interactions with these toys through designated apps. Despite these assurances, Dr. Racine believes it may be unrealistic to expect parents to effectively supervise every interaction with AI toys.
The Canadian Paediatric Society has not established a formal stance on AI toys, but it has noted an increase in developmental and social-emotional delays among young children, raising concerns that these toys might exacerbate such trends. The society warns that AI toys could confuse children’s understanding of healthy relationships.
Moreover, the advisory cautions that these toys might invade family privacy by collecting sensitive data, as children often share intimate thoughts and feelings with their favorite toys. Fairplay’s Rachel Franz argues that the burden of understanding these privacy risks should not fall solely on parents, given the complexity of privacy policies.
Elizabeth Cawley, chief clinical officer of PlaySpace, views AI toys similarly to smartphones and the internet, advocating for a controlled environment where children interact with unregulated technology only under adult supervision. Although she acknowledges that, under proper regulations, AI could serve educational purposes, she emphasizes the critical role of responsible adult oversight.
The Canadian Toy Association has advised parents to purchase only from reputable brands that prioritize children's safety. The office of Evan Solomon, Canada’s minister of artificial intelligence and digital innovation, stated that it is monitoring the integration of AI in consumer products, especially those aimed at children, while recognizing the concerns raised by experts. Although Health Canada has not yet commented on this issue, it is responsible for ensuring the safety of consumer products, including toys.