Pedophiles Taking Advantage Of AI For Their Nefarious Acts


Pedophiles are using artificial intelligence (AI) to create images of celebrities as children. The Internet Watch Foundation (IWF) said images of a well-known female singer reimagined as a child are being shared by predators.

On one dark web forum, the charity says, images of child actors are also being manipulated to make them sexual. Hundreds of images of real victims of child sexual abuse are also now being created using bespoke image generators. The details come from the IWF’s latest report into the growing problem, as it tries to raise awareness about the dangers of pedophiles using AI systems that can create images from simple text instructions.

Since these powerful image generation systems entered the public domain, researchers have warned that they have the potential to be misused to generate illicit images. In May, Home Secretary Suella Braverman and US Homeland Security Secretary Alejandro Mayorkas issued a joint statement committing to tackling the “alarming rise in despicable AI-generated images of children being sexually exploited by pedophiles.”

The IWF’s report details how researchers spent a month logging AI imagery on a single darknet child abuse website, and found nearly 3,000 synthetic images that they would consider illegal. Analysts said there is a new trend of predators taking single photos of well-known child abuse victims and generating many more images of them in different sexual abuse settings.

One folder they found contained 501 images of a real-world victim who was about 9-10 years old when she was subjected to sexual abuse. In the folder, predators also shared a fine-tuned AI model file to allow others to generate more images of her. The IWF says some of the imagery, including that of celebrities as children, is extremely realistic and would be indistinguishable from real photographs to untrained eyes.

Analysts saw images of mostly female singers and movie stars who had been de-aged using the imaging software to make them look like children. The report did not identify which celebrities had been targeted.

The charity said it was sharing the research to get the issue onto the agenda at the UK government’s AI Summit next week at Bletchley Park. In one month, the IWF investigated 11,108 AI images which had been shared on a dark web child abuse forum.

Of those, 564 images were classified as Category A, the most serious kind of imagery, and 1,372 depicted primary school-aged children (seven to 10 years old). In June, the IWF warned that predators were starting to explore the use of AI to make depraved images of children, but now the IWF says the fears are a reality.

“Our worst nightmares have come true,” said Susie Hargreaves OBE, the chief executive of the IWF. “Earlier this year, we warned AI imagery could soon become indistinguishable from real pictures of children suffering sexual abuse, and that we could start to see this imagery proliferating in much greater numbers. We have now passed that point.”

The IWF report reiterates the real-world harm of AI images. Although children are not harmed directly in the making of the content, the images normalise predatory behavior and can waste police resources as officers investigate children who do not exist.

In some scenarios new forms of offence are being explored too, throwing up new complexities for law enforcement agencies. For example, the IWF found hundreds of images of two girls whose pictures from a photoshoot at a non-nude modelling agency had been manipulated to put them in Category A sexual abuse scenes. The reality is that they are now victims of Category A offences that never happened.














