Police are investigating the spread of AI-generated photos of naked girls

A group of mothers from Almendralejo (Badajoz) has denounced the publication and dissemination of AI-generated fake nude images of their daughters.

Oliver Thansan
18 September 2023 Monday 11:14

The images show the faces of minors, aged between 12 and 17, on a naked body that is not their own, and it is not ruled out that, in addition to being spread on WhatsApp, they have also been shared on sites such as OnlyFans or other pornographic sites.

For now, the National Police has received several complaints and has already opened an investigation, according to police sources.

Among the minors affected is the daughter of the gynecologist Miriam al-Adib, who is very active on social networks and has shared a video denouncing this "barbarity", which is, after all, a crime. As she explained, the perpetrators allegedly took a photo of her daughter through a mobile application and added a naked body with artificial intelligence, a crime that "they committed against my daughter and dozens of other girls, here in Almendralejo and perhaps elsewhere as well".

The science communicator has appealed to the creators of these montages: "Stop now". She also addressed all those who are sharing the montages, reminding them that distributing the images is a crime even if they are not the ones who created them. "To the clever people who came up with this, if you are the least bit intelligent, help repair everything you have set in motion," al-Adib wrote in the text accompanying the video.

Miriam al-Adib also wanted to send a message of support to the victims, to lift any burden of guilt the girls may feel: "Girls, don't be afraid to report these acts; tell your mothers."

Many families have already gone to the police to file a report, and Miriam al-Adib said that some of those involved have already given statements at the police station. "This will not end here," she added.

In just 24 hours, the video has received dozens of messages of support for the victims and their relatives, as well as reports of similar cases involving simulated child sexual exploitation material.

The proliferation of artificial intelligence tools and applications has also reached photo retouching: a number of applications are advertised as being designed specifically to turn a photograph of a clothed person into one in which that same person appears naked.

Before the case made public yesterday by this group of mothers from Almendralejo, the influencer Laura Escanes had already denounced, on August 16, that some of her photos had been manipulated with one of these applications so that she appeared naked.

And in May it was the singer Rosalía who accused the singer JC Reyes of editing a photo of her with artificial intelligence so that she appeared bare-chested, an image that the reggaeton artist had shared on his Instagram account. Reyes initially claimed that Rosalía herself had provided him with the image, but he ultimately had to acknowledge that he had edited it with artificial intelligence.