Artificial porn without consent: what to do if your image is used without permission?

"A woman's body is not public property, it is not a commodity for your marketing strategy.

Oliver Thansan
Oliver Thansan
05 June 2023 Monday 10:22
15 Reads
Artificial porn without consent: what to do if your image is used without permission?

"A woman's body is not public property, it is not a commodity for your marketing strategy." This is how a tweet from the singer Rosalía started, which recently went viral after the singer JC Reyes uploaded a photo of her half-naked to her social networks. Rosalía's complaint is real, but the image is not: it was generated through artificial intelligence.

While false information has received enormous attention for its devastating effects on democracy, digital violence against women is still not treated as a social problem, even though experts say it is on the rise. In 2019, Deeptrace Labs, a company that investigates digital disinformation, found that more than 90% of the manipulated images and videos circulating on the deep web – that is, the hidden part of the internet that has not been indexed by search engines – are non-consensual fake pornography.

In Spain, 375,506 criminal offenses were recorded over the internet in 2022, 72% more than in 2019, according to data from the Ministry of the Interior. In fact, one in five crimes is now committed online. Among the most common cybercrimes, alongside phishing – emails that impersonate a company to extract personal data from the recipient – and ransomware attacks – malicious software that blocks access to a company's computer systems – is sextortion: blackmailing someone with the threat of spreading nude photographs or videos of them.

Last January, the popular content creator QTCinderella, 29, tearfully reported feeling violated after her image was used to generate a porn video. Her case joins those of other celebrities, such as Scarlett Johansson, who have been victims of the same crime: pornographic deepfakes, that is, videos of a sexual nature in which a person's identity is impersonated. "Everyone is a potential target," Johansson told The Washington Post.

"With the proliferation of generative artificial intelligence applications, fake nudes of any woman, famous or not, are becoming ever easier to create and disseminate," says artificial intelligence ethicist Patricia Ventura. "The problem does not lie in the technology itself, but in the use we make of it. The rise of this type of crime is due to the exponential multiplication of these tools, their democratization and their ease of use," she adds.

Deepfakes have been used from the very beginning to create pornographic depictions of women. In fact, deepfake technology was popularized in 2017 through a Reddit forum dedicated to sharing videos in which the faces of famous Hollywood actresses were superimposed onto the bodies of porn performers, as the digital disinformation expert Henry Ajder explains in his latest podcast for the BBC. In 2019 came DeepNude, an app created for this sole purpose. Its founder shut it down shortly after announcing its existence, citing an overloaded system and the potential for harm to women.

Although tech companies have tools to flag content that has been generated with artificial intelligence, the eradication of deepfakes will not come from Silicon Valley. "Specific legislation is necessary because technology companies govern themselves in their own interest," adds Ventura.

The consequences for the women targeted by this fake content can be devastating. Psychologically, these videos cause as much pain and suffering as revenge porn, that is, the dissemination of intimate images or videos published without consent. "This type of abuse, which toys with a woman's image and reputation, destroys the victim," explains Laia Serra, a lawyer specializing in gender-based violence.

On a social level, victims report difficulties rebuilding their lives, whether in finding a partner or a job, as well as being singled out and mocked. "Pornographic deepfakes are a way to denigrate women through their bodies and perpetuate patriarchy. If we generate disinformation about a male politician, we manipulate his speech; if we want to harm a female politician, we do it through her body," adds Serra.

This type of crime can also carry consequences for the person who spreads the images, as well as for everyone who contributes to making them go viral, as happened in the case of the Iveco worker who took her own life after an intimate video of her circulated through a WhatsApp group of coworkers.

"Since there is no specific crime for pornographic deepfakes, a generic crime against moral integrity is usually used," explains Serra. “Deep down, it doesn't matter if the photos are real or not, because the damage they cause is the same. In each case, the context and intention must be assessed and whether the asset being protected is the privacy or reputation of a person. Both through civil and criminal proceedings, there is a crime against moral integrity. And the Data Protection Agency has opened a channel to remove those videos created without consent, so you can also act through administrative channels ”, she adds.

On the legislative front, the European Commission is working on a proposed artificial intelligence law that addresses deepfakes and, among other things, would require videos generated by this technology to be labeled as such. Spain, for its part, is in the process of creating an artificial intelligence supervisory agency, announced by the Spanish government on December 28, 2021, to monitor compliance with European rules on this technology.