

Children are being abused and exploited through the misuse of generative AI. AI-generated child sexual abuse material (CSAM) refers to perpetrators' growing use of generative AI to produce sexually explicit synthetic images of real children. This includes wholly fabricated images as well as deepfake nudes: actual photographs of children that have been digitally altered to depict them in sexually explicit ways, without the children's knowledge or consent.
However, AI-powered tools can also detect and remove CSAM more effectively and precisely. AI algorithms can identify potentially harmful material by analyzing image and video content, and AI can help identify and track those who distribute and consume CSAM.
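One widely used building block for detection is hash matching: platforms compare the fingerprint of an uploaded file against a database of hashes of previously identified abuse material. The sketch below illustrates the idea with exact SHA-256 hashes; the hash set and its contents are placeholders, and production systems (e.g. those based on perceptual hashing such as PhotoDNA) use fingerprints that remain stable under resizing and re-encoding, which plain cryptographic hashes do not.

```python
import hashlib

# Hypothetical set of hashes of previously identified harmful images,
# of the kind maintained by child-protection organizations.
# The value below is a placeholder, not a real entry.
KNOWN_HARMFUL_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks
    so that large video files do not need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_harmful(path: str) -> bool:
    """Flag a file whose exact hash appears in the known-hash set."""
    return sha256_of_file(path) in KNOWN_HARMFUL_HASHES
```

Exact hashing only catches byte-identical copies; that limitation is precisely why perceptual hashing and learned classifiers are layered on top of it in real deployments.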
AI can identify suspicious behavior patterns involving children by monitoring online chats, messages, and activity, and it can automatically alert parents, guardians, or law enforcement agencies so that prompt action can be taken to safeguard the child.
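At its simplest, such monitoring can be pictured as scoring a conversation against known risk indicators and raising an alert past a threshold. The sketch below is a toy illustration only: the patterns, scoring, and threshold are all invented for this example, and real systems rely on trained models over far richer behavioral signals rather than keyword rules.

```python
import re

# Hypothetical grooming-risk indicators; real systems use trained
# classifiers, not a short hand-written pattern list like this one.
RISK_PATTERNS = [
    re.compile(r"\bkeep (this|it) (a )?secret\b", re.IGNORECASE),
    re.compile(r"\bdon'?t tell your parents\b", re.IGNORECASE),
]

def risk_score(messages: list) -> int:
    """Count how many messages match at least one risk pattern."""
    return sum(1 for m in messages if any(p.search(m) for p in RISK_PATTERNS))

def should_alert(messages: list, threshold: int = 2) -> bool:
    """Raise an alert once the number of flagged messages
    reaches the (hypothetical) threshold."""
    return risk_score(messages) >= threshold
```

Thresholding matters in practice: alerting on a single match would swamp guardians and investigators with false positives, so real systems trade off sensitivity against alert fatigue.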
The AI for Safer Children initiative was started in 2020 by the United Nations Interregional Crime and Justice Research Institute (UNICRI) Center for AI and Robotics and the Ministry of Interior of the United Arab Emirates (UAE) in response to the rapidly increasing number of reports of child sexual exploitation and abuse. The objective of this project is to strengthen law enforcement’s ability to use artificial intelligence (AI) and related technologies to combat child sexual exploitation and abuse.
The AI for Safer Children Global Hub is a centralised platform that helps law enforcement employ AI to combat online child sexual exploitation and abuse. The Global Hub seeks to build a community of practice among law enforcement users of AI in order to prevent, identify, and prosecute online child sexual exploitation and abuse.
The UK government has introduced new legislation aimed at preventing artificial intelligence (AI) from being used to produce child sexual abuse content. The new law will make it illegal to use AI to create abusive content by digitally superimposing children's faces onto other explicit photographs or by "nudeifying" real-life images of children. Offenders face a maximum sentence of five years in prison.



