A law against online pedophilia threatens the privacy of Europeans

Let whoever has never used a messaging app to send a risqué message or image cast the first stone.

Oliver Thansan
Monday, 22 May 2023, 22:22

We send these messages believing ourselves to be in a space shielded from other people's eyes. What is said or done stays between us and the code. That could be about to change in the European Union, digital rights activists warn, in the name of the fight against child sexual abuse. The European Commission intends to approve, before next October, a regulation – a type of law directly applicable in all EU countries – that would force all messaging platforms, including those offering encrypted communication, to monitor communications in search of suspicious behavior. Nearly every digital rights organization, along with a large number of technology experts, warns of the loss of fundamental rights and the dangers to network security that its approval would entail. However, both the Commissioner for Home Affairs, Ylva Johansson, and the rapporteur for the project, the Spanish MEP Francisco Javier Zarzalejos, maintain that the Regulation to Prevent and Combat Child Sexual Abuse is not about privacy but about the rights of children, and they accuse activists of "distorting the meaning of the project".

“It is not about children. I have read the entire draft, and in 140 pages it does not talk about child sexual abuse, only about control of the internet,” denounces Sergio Salgado, an activist with the digital rights platform Xnet, which today publishes, together with Political Watch, The Commoners, Eticas, Inter_ferencias and Guifi.net, a report on the proposal.

The current draft calls for the creation of algorithms that scan communications, particularly sexually explicit material (including material exchanged between adults), to detect pedophilic content or behavior. There is talk of a "neutral technology": the algorithm in question would, so to speak, browse with its eyes closed, opening them only when it detects a suspicious case. Flagged material would then be sent to a data processing center, where human reviewers would perform a first screening; from there, if warranted, it would be forwarded to the authorities of the respective countries. The problem, critics of the law say, is that this technology does not exist today, and everything tried so far has been discarded because of the unaffordable number of false positives it generated. "The approach is not to say what technology each platform has to use, but to demand that certain standards be met," Zarzalejos argues in conversation with La Vanguardia. The People's Party MEP says that each service has its own particularities and must therefore apply different measures to control criminal use by its users, and that the aim of the regulation is "not to become obsolete".
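Purely as an illustration, the three-stage flow the draft describes (automated scan, human screening at a data processing center, referral to national authorities) might be sketched as follows. Every name, score and threshold here is hypothetical; the regulation deliberately prescribes no concrete technology.

```python
# Hypothetical sketch of the three-stage flow described in the draft:
# automated scan -> human screening at a data processing center ->
# referral to national authorities. All names, scores and thresholds
# are illustrative; the regulation prescribes no concrete technology.

from dataclasses import dataclass


@dataclass
class Message:
    sender: str
    country: str
    content: bytes


SUSPICION_THRESHOLD = 0.9  # arbitrary; the draft sets no such number


def classifier_score(msg: Message) -> float:
    """Stand-in for the 'neutral' detection algorithm: a suspicion
    score between 0 and 1. The critics' core objection is that no
    sufficiently accurate classifier exists today."""
    raise NotImplementedError


def human_review(msg: Message) -> bool:
    """First screening by human reviewers at the processing center,
    meant to weed out false positives."""
    raise NotImplementedError


def report_to_authority(msg: Message) -> None:
    """Referral to the competent authority of the relevant country."""
    print(f"referred to authorities of {msg.country}")


def process(msg: Message) -> None:
    # Stage 1: automated scan; the algorithm "keeps its eyes closed"
    # unless the score crosses the threshold.
    if classifier_score(msg) < SUSPICION_THRESHOLD:
        return  # never inspected further
    # Stage 2: human screening at the data processing center.
    if not human_review(msg):
        return  # discarded as a false positive
    # Stage 3: escalation to the national authority.
    report_to_authority(msg)
```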

“We know for a fact that the best technologies in the world for doing this have an error rate of between 10% and 20%. On WhatsApp alone, 10 billion messages are sent daily; an error rate of 10% to 20% puts the errors in the billions. The error rates are going to be huge,” says Ella Jakubowska, an expert on digital human rights at the European Digital Rights network (EDRi). Zarzalejos responds that "a success rate of 90%, bearing in mind that there would later be a human review, is not scandalous."
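Taken at face value, the arithmetic behind that claim is simple (a simplification, since applying a per-message error rate to the entire daily volume overstates how any real system would be deployed):

    10,000,000,000 messages/day × 10% error rate ≈ 1,000,000,000 misclassifications/day
    10,000,000,000 messages/day × 20% error rate ≈ 2,000,000,000 misclassifications/day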

False positives, moreover, are not an abstract concern: platforms such as Facebook or Gmail already use similar technologies to identify nudity. Salgado cites, for example, the case of a man in the United States who faced legal problems for a decade after Google classified photos of his son, sent for a medical consultation, as child sexual abuse material. Jakubowska goes further: Irish police, she says, falsely identified hundreds of people as possible abusers after they had shared images of their children on the beach, or even images of consenting adults.

On May 11, 30 leading European IT and cybersecurity experts published an open letter to European leaders warning that the Commission's proposal rests on highly imprecise technology and “would endanger the security of all, including some of the most vulnerable: children.” Zarzalejos disagrees, maintaining that the technology will have to “adapt and calibrate”. “Algorithms can be as specific as you want. It seems suspicious to me that there is technology for everything but not for this,” he says.

The report on Chat Control (as the project is known among activists) criticizes the fact that effort will be squandered on "unnecessary and even harmful" measures when those dedicated to combating these crimes complain that they cannot even keep up with removing the pedophilic images already detected on the network. "In fact, these types of criminals do not use mainstream messaging," Salgado points out, warning too that opening back doors into encrypted communication not only violates the right to privacy of communications but also weakens the security of the entire network. “There are no back doors just for the good guys,” he explains. “A less secure internet, under the pretext of protecting children, is a less secure internet for everyone. More open to interference by intelligence services, criminals, foreign powers and... pedophiles.”

“Commissioner Johansson has been saying for some time that they want to discourage the use of encryption because it prevents police and governments from getting into private citizens' conversations,” says Jakubowska. Asked about the matter, Zarzalejos admits that the encryption of communications is one of the thorniest issues in the project. Although he maintains that his proposal "does not contemplate any measure that breaks the encryption", he acknowledges that there is a "debate" about "technical solutions" such as scanning communications on the device before they are encrypted. "Before an image is sent, for example, it could be detected whether it is sexual abuse material," he explains. But he insists: "Encryption is not going to be touched, nor is anyone going to be required to access the content."
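One way to picture the kind of "technical solution" under debate, often called client-side scanning, is a check that runs on the sender's device before encryption: the image is compared against a list of fingerprints of known abuse material. The sketch below is an assumption-laden illustration, not anything the draft specifies; the hash list and function names are hypothetical, and real proposals rely on perceptual hashes rather than exact ones.

```python
# Hypothetical sketch of client-side scanning: the check runs on the
# sender's device before the message is encrypted, so the encryption
# itself is never "broken". All names are illustrative; real proposals
# use perceptual hashes (robust to re-encoding), not exact SHA-256.

import hashlib
from typing import Callable

# Illustrative stand-in for a database of fingerprints of known
# abuse material, distributed to every device.
KNOWN_BAD_HASHES: set[str] = set()


def matches_known_material(image_bytes: bytes) -> bool:
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES


def send_image(image_bytes: bytes,
               encrypt_and_send: Callable[[bytes], None]) -> None:
    if matches_known_material(image_bytes):
        # In the debated model, the match would be reported and the
        # message blocked before it is ever encrypted.
        return
    encrypt_and_send(image_bytes)  # normal end-to-end encrypted path
```

This is why critics argue that encryption need not be "broken" for privacy to be affected: in such a scheme the inspection simply happens before the encryption does.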