
Unicef warns of rise in sexual deepfakes of children

By AFP
February 05, 2026
This picture shows an exterior view of a UNICEF building. — Unicef website/File

UNITED NATIONS, United States: The UN children's agency on Wednesday highlighted a rapid rise in the use of artificial intelligence to create sexually explicit images of children, warning of real harm to young victims caused by the deepfakes.

According to a Unicef-led investigation in 11 countries, at least 1.2 million children said their images had been manipulated into sexually explicit deepfakes; in some countries the rate was equivalent to "one child in a typical classroom" of 25 students.

The findings underscored the use of “nudification” tools, which digitally alter or remove clothing to create sexualised images. “We must be clear. Sexualised images of children generated or manipulated using AI tools are child sexual abuse material,” Unicef said in a statement.

“Deepfake abuse is abuse, and there is nothing fake about the harm it causes.” The agency criticised AI developers for creating tools without proper safeguards. “The risks can be compounded when generative AI tools are embedded directly into social media platforms where manipulated images spread rapidly,” Unicef said.