Title: Urgent product recall of AI toys
For the urgent attention of: governments, regulators, toy manufacturers, toy retailers and media organisations
With just days to Christmas, we've noticed a significant marketing push for children's toys with built-in AI access, such as AI companion toys, AI plushies and interactive AI toys. Some are branded, such as Curio, Grem and BubblePal; others are generic. We are appealing to all organisations with responsibilities for child safety to sound the alarm on these products, as actual harm and risks of harm are already apparent. We urge application of the precautionary principle, requiring an immediate moratorium on sales and an urgent product recall.
Instances of harm already reported include:
FoloToy's “Kumma” bear willingly discussed explicit topics and provided instructions on how to access dangerous items such as matches and knives. This product has been temporarily withdrawn, but other toys are doing similar things, with more in the pipeline.
A report from the PIRG found AI toys collecting voice recordings and personal data through always-on microphones, with some sharing information with third-party companies. This is despite 13 being the minimum age at which children can consent to their data being processed, and despite the likelihood that these toys will be used by a range of children, including friends and family.
Possible longer-term psychological and developmental risks posed by what have been described as addictive-by-design engagement features (see the Stanford link at the end of this letter, for example).
We do not understand the rush to bring AI toys to market, when they have clearly not been tested. In fact, with AI such a new technology, it appears to us that the testing procedures themselves, including psychological impact assessments, have not even been developed, never mind implemented.
Much of the information on potential harm comes from the United States, yet many of the same, or similar, products are being aimed at children in the UK and elsewhere, including toddlers. Most are likely to be bought over the internet. This is both a national and an international emergency.
We are a new, parent-led campaign group called set@16, focused on applying the precautionary principle to new and untested technologies that are marketed to children. We’ve seen the damage caused by early access to smartphones and social media. We cannot make the same mistakes again.
Now is the time to do whatever it takes to ensure that the most important item on all parents' wish lists is acknowledged and acted upon: that their child is safe, and not just for Christmas.
Kitty Hamilton
Guy Holder
Tom Cox
Co-founders of www.setat16.org
Link to the Public Interest Research Group's report:
pirg.org/edfund/resources/trouble-in-toyland-2025-...
Link to the Stanford report on possible psychological harms of AI to young people:
med.stanford.edu/news/insights/2025/08/ai-chatbots...
One of many news pieces from the US:
Dear Open Letter Supporters,
First, thank you so much for signing our Open Letter. At the time of writing, we have nearly 180 signatures, and we are hopeful for more.
Everyone is so busy right now, with the run-up to Christmas, but we are trying to do what we can to warn people that AI cuddly toys, already being marketed as the next big thing, have yet to be fully tested and should therefore not be a present under the tree. This is a completely new category of product aimed at children. The potential risk of long-term psychological harm is high, so urgent, independent and thorough testing is required.
Over recent days:
We launched [email protected]
Co-founder Kitty Hamilton talked AI toys and social media with Nicky Campbell on BBC Radio 5: youtube.com/watch?v=XQt9Fmm5F54&feature=youtu.be
Co-founder Kitty Hamilton talked AI toys with Shelagh Fogarty on LBC radio: https://www.youtube.com/watch?v=Z15925tClEo
Co-founder Guy Holder did an extensive interview with Bauer Radio Senior Correspondent Mick Coyle. Clips were shared with various radio stations. The highlight of Guy's week was a pupil walking into the form room and saying 'Sir, I heard you on the radio this morning'. Mellow Magic, apparently. Why not?!
Kitty and Guy met with the British Toy and Hobby Association to raise concerns about AI cuddly companions aimed at toddlers, and warned that a new type of testing needs to be put in place, one that takes into account the longer-term psychological and other risks that this entirely new category of toy poses.
We have been in contact with the ICO and asked them for greater urgency regarding AI cuddly companions aimed at toddlers. They have the power to issue a warning to parents, and we are urging them to do so. We will continue to pursue them on this issue.
Others, including RoSPA, the Office for Product Safety and Standards, and the Child Accident Prevention Trust, have yet to get back to us. We are following up again this week.
We have been in communication with Fairplay, with whom we are clearly aligned on this issue, and hope to work with them: https://fairplayforkids.org/pf/aitoyadvisory/
.........
We want a ‘stop’ on all AI toy sales, a warning from regulators that these products are not safe, and a product recall.
We hope that our next update will have more examples to share and some better news regarding this terrible situation with AI toys. Please get involved as much or as little as you want, from contacting your local MP to writing to the organisations above. And, of course, you can share the Open Letter with others, linked below for ease:
openletter.earth/urgent-request-for-product-recall...
Kitty, Guy, Tom