🔞 AI Teddy Bear Pulled from Shelves for Being a Bit Too Explicit
Last week, the nonprofit U.S. PIRG (Public Interest Research Group) raised concerns about the safety of AI-powered toys for children after running its own tests.
In short conversations, most toys dodged adult topics and told kids to ask their parents. After about an hour of chatting, however, those safeguards weakened: the AI models began explaining, for example, where sharp objects like knives and needles are usually stored, or how to find and light a match.
🧸 The FoloToy teddy bear Kumma, which uses GPT-4o, was the most extreme case. When the tester mentioned "kink," the bear eagerly began discussing sexual fetishes and role-playing scenarios.
FoloToy has temporarily pulled Kumma from the market while it runs safety checks.
"This review will cover our model safety alignment, content-filtering systems, data-protection processes, and child-interaction safeguards," said Hugo Wu, the company's marketing director.
@hiaimediaen