Parents are demanding urgent answers from Meta after Instagram ads promoting its Threads app featured images of schoolgirls as young as 13. The photos, originally posted as back-to-school pictures, appeared in advertisements shown to a 37-year-old man, sparking outrage over child safety and privacy.
The man, who repeatedly saw the ads in his Instagram feed, said they urged him to “get Threads” while embedding posts of uniformed schoolgirls whose names and faces were clearly visible. Parents of the affected children described the situation as “outrageous” and “upsetting,” questioning why Meta’s algorithm would allow minors’ content to be used in this way.
Critics argue that such advertising practices expose vulnerable children to inappropriate audiences. Online safety experts warn that repurposing minors’ images without stricter safeguards can fuel grooming concerns and undermine trust in Meta’s platforms.
Meta has not yet issued a detailed response, but pressure is mounting for the tech giant to review its advertising policies. Parents are calling for greater transparency around how young users’ content is shared and stronger protections to prevent children’s photos from being repurposed in targeted campaigns.
The controversy comes at a time when Meta is already under scrutiny for its handling of child safety on Instagram and Facebook. Advocacy groups are now urging regulators to step in, emphasizing that companies must put the protection of minors above profit-driven advertising algorithms.