New research has uncovered that thousands of AI-generated images depicting child abuse have been shared on a dark web forum.
In September, approximately 3,000 AI images depicting child abuse were circulated on the forum. Out of these, 564 images depicted the most severe forms of abuse, including rape, sexual torture, and bestiality.
According to the Internet Watch Foundation (IWF), 1,372 of these images featured children aged between seven and ten years old.
The charity warns that some of these AI-generated images are so convincing that even trained analysts struggle to distinguish them from real photographs. It also cautions that text-to-image technology will continue to improve, making it harder for law enforcement to protect children.
In some instances, the AI models were trained using the faces and bodies of real children. The charity has chosen not to disclose the names of the models used.
In other cases, the models were employed to generate explicit images of celebrities who were “de-aged” to appear as children in scenarios of sexual abuse.
‘This threat is here and now’
Ian Critchley, the National Police Chiefs’ Council lead for child protection in the UK, expressed concern that the proliferation of such images online normalizes child abuse in the real world.
“It is clear that this is no longer an emerging threat – it is here and now,” he stated. “We are seeing children being groomed, perpetrators creating their own customized imagery, and the production of AI-generated imagery for commercial gain. All of this contributes to the normalization of the rape and abuse of real children.”
What can be done about it?
The UK’s forthcoming Online Safety Bill aims to make social media companies more accountable for the content published on their platforms. However, it does not cover the AI companies whose models are modified and used to generate abusive imagery.
The UK government is hosting an AI safety summit next week to address the risks associated with AI and determine necessary actions.
Susie Hargreaves, chief executive of the IWF, argued that new EU laws on child sexual abuse should also encompass unknown imagery.
“We are witnessing criminals intentionally training their AI on images of real victims who have already endured abuse,” she explained. “Children who have been previously raped are now being incorporated into new scenarios simply