
UNICEF calls for criminalisation of AI content depicting child sex abuse


At least 1.2 million children disclosed having their images manipulated into sexually explicit deepfakes in 2025, UNICEF says

The United Nations children’s agency UNICEF on Wednesday called for countries to criminalise the creation of AI-generated child sexual abuse content, saying it was alarmed by reports of an increase in the number of artificial intelligence images sexualising children.

The agency also urged developers to implement safety-by-design approaches and guardrails to prevent misuse of AI models. It said digital companies should prevent the circulation of these images by strengthening content moderation with investment in detection technologies.

“The harm from deepfake abuse is real and urgent. Children cannot wait for the law to catch up,” UNICEF said in a statement. Deepfakes are AI-generated images, videos, and audio that convincingly impersonate real people.

UNICEF also raised concerns about what it called the “nudification” of children, using AI to strip or alter clothing in photos to create fabricated nude or sexualized images.

At least 1.2 million children across 11 countries disclosed having their images manipulated into sexually explicit deepfakes in the past year, according to UNICEF.

Britain said on Saturday it plans to make it illegal to use AI tools to create child sexual abuse images, making it the first country to do so.

Concerns have increased in recent years about the use of AI to generate child abuse content, particularly through chatbots such as xAI's Grok, owned by Elon Musk, which has come under scrutiny for producing sexualised images of women and minors.

A Reuters investigation found the chatbot continued to produce these images even when users explicitly warned that the subjects had not consented.

xAI said on January 14 it had restricted image editing for Grok AI users and blocked users, based on their location, from generating images of people in revealing clothing in “jurisdictions where it’s illegal.” It did not identify the countries.

It had earlier limited the use of Grok’s image generation and editing features only to paying subscribers.
