Brussels will force technology companies to review user communications when there is a risk of child sexual abuse
During the first months of the pandemic, reports of cases involving child sexual abuse material grew by 25% in several countries. These data, and many others that for the European Commission are “the tip of the iceberg” of the problem, have led it to propose a regulation that aims to involve providers of digital services and internet access in the fight against this crime. And if these companies do not collaborate, they will be forced to do so under the control of independent authorities or judges, according to the proposal approved by the Community Executive this Wednesday. The regulation, on which Brussels has been working for almost a year, has drawn complaints from some privacy advocacy groups, which consider that it calls the right to privacy into question.
“Many companies are doing nothing to detect [these situations] today,” said the European Commissioner for Home Affairs, Sweden’s Ylva Johansson, on Wednesday. Aware of the suspicions that a regulation of this kind arouses, she stressed that “no detection will be made without an order [from an independent authority or a judge],” although she then added that “once the order has been issued, the detection will be mandatory.”
The regulation that the Commission has put on the table, and which must now be amended and approved by the Council of the EU and the European Parliament, states that technology companies (providers of digital services and internet access, and app stores) are required to carry out a risk analysis of the “improper use [of their services] for the dissemination of child sexual abuse material or for the solicitation of children [grooming]”. When the authorities detect that such a risk exists, a detection order may be issued. The companies themselves will be responsible for implementing systems to detect this content. Those methods may be automated, but they will ultimately need human oversight.
“There is a process with many safeguards before this permission can be granted or another independent authority or a court asked to issue an order for a certain period. Before that, they have to consult the data protection authorities. And only when there is a detection order are companies authorized, but also obliged, to detect and report this content,” explained Commissioner Johansson in a meeting with various media, including EL PAÍS.
The Commission also foresees the creation of a European office against child sexual abuse (the EU Centre on Child Sexual Abuse), with a budget of 28 million euros, which will oversee the checks carried out by the companies themselves, develop research tools and issue orders to intercept communications in the event of inaction by the platforms.
One of the questions hanging over the final wording of the text was what would happen to end-to-end encrypted messages. Several messaging services, such as WhatsApp or Signal, use this encryption technique to ensure that no one other than the sender and the receiver, not even the platform itself, can see the content. “The text of the regulation does not prohibit encryption, but it discourages its use, so that the company can scan content and demonstrate that it is taking preventive measures to limit the dissemination of child sexual abuse material,” says Ella Jakubowska, privacy analyst at European Digital Rights (EDRi).
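To see why end-to-end encryption leaves the platform itself unable to scan anything, consider this minimal sketch in Python using the PyNaCl library. The key names and message are illustrative and do not reflect any real service’s protocol:

```python
# Minimal sketch of end-to-end encryption, using the PyNaCl library
# (libsodium bindings). Names and messages are hypothetical.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; only public keys ever leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# A server relaying `ciphertext` sees only opaque bytes; without Bob's
# (or Alice's) private key it cannot decrypt or scan the content.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```

Any scanning obligation therefore has to happen on the user’s device, before encryption, or not at all, which is precisely what privacy advocates object to.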
A year of negotiations
The European Parliament approved a temporary regulation last summer that allows technology companies to scan user communications (emails, text messages and attachments, such as photos and videos) in search of pedophile content. The regulation, dubbed Chatcontrol, temporarily suspends parts of Directive 2002/58/EC, known as ePrivacy, which regulates matters such as the confidentiality of information, the processing of personal data and third-party cookies. The suspension has a deadline, December 31, 2022, and the Commission’s objective was to have a permanent regulation ready before then to provide legal cover for the interception of communications.
Technology companies are already expected to ensure that pedophile content is not disseminated on their platforms. In the US, the National Center for Missing and Exploited Children (NCMEC) has long requested the cooperation of technology companies in this matter. That monitoring is mostly done in an automated way, with artificial intelligence tools that look for keywords. Google already scans documents uploaded to Google Drive; Apple announced last year its intention to do the same with photos uploaded from iPhones, but the wave of protests that the decision unleashed led the Cupertino company to freeze the measure until it had carefully studied its pros and cons.
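As a rough illustration of how such automated matching works, the sketch below flags files whose hash appears in a database of known material. Real deployments, such as Microsoft’s PhotoDNA, rely on perceptual hashes that survive resizing and re-encoding, so the exact SHA-256 match here is a deliberate simplification, and all names and values are hypothetical:

```python
# Simplified sketch of automated detection by hash matching.
# Real systems use perceptual hashing; exact SHA-256 matching is shown
# here only to illustrate the flagging pipeline.
import hashlib

# Hashes of known abuse material, as distributed by bodies like NCMEC
# (placeholder value, not real data).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def flag_for_human_review(file_bytes: bytes) -> bool:
    """Return True if the file matches a known hash and should be
    escalated to a human reviewer, as the draft regulation requires."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES

# Example: an uploaded attachment is checked before delivery.
if flag_for_human_review(b"example attachment bytes"):
    print("matched known material: queue for human review and report")
```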
According to sources familiar with the process, one of the issues that has motivated the EU’s interest in hastening a new regulation is Facebook’s intention to end-to-end encrypt messages sent via Facebook Messenger, its instant messaging application. Little used in Europe, Messenger is especially popular in the US and has some 1.3 billion active users, making it the second most used messaging tool, behind only WhatsApp (around two billion users).
Several civil society associations have tried to sit down with Commissioner Johansson to discuss the details of the regulation presented this Wednesday, without success. “Our great concern is that the regulation is a gateway to examining the communications of all citizens, not just those suspected of spreading pedophile material,” says Jakubowska.
Commissioner Johansson has defended her project by pointing to precedents in which communications are already monitored for purposes other than preventing a crime such as the sexual abuse of minors: “Today we have a directive on privacy [in digital communications] which says that companies can always scan all interpersonal communication if it is in order to protect themselves from malware and spam. In these cases they are allowed to do so, and they do. They do it in all communication, both encrypted and unencrypted. And they do it, of course, for profit, because if we had a lot of spam and malware we would not use that communication. So they do it for purely commercial purposes. [Right now] it is not legal to do this type of scanning for child sexual abuse material.” The Commission’s reasoning is clear: if scanning is justified in order to offer a better service and make money, why not to prevent a crime?
The European Data Protection Supervisor also has reservations about private communications being scrutinized by third parties. The body, charged with ensuring compliance with privacy regulations within the European institutions, published a non-binding opinion in 2020 that already questioned whether subjecting citizens’ communications to constant scrutiny is compatible with the right to privacy. The Supervisor’s office declines to comment on the new regulation, at least for the moment.