E-privacy: even what is said in video conferences could be searched in the future

In the fight against child pornography, the EU wants to have the Internet screened. Pirate politician Patrick Breyer warns that innocent people could come under suspicion - and that's not even the only problem.

Question: Mr. Breyer, what are the EU's plans for monitoring Internet communications?

Breyer: The EU Commission is planning to have all private electronic messages screened indiscriminately and without any specific cause, in order to search for child pornography. There is a current draft law on this, which the European Parliament is negotiating with the EU governments this Thursday. It provides that e-mails, messages and chats may be searched by the providers of these services for allegedly suspicious photos or texts. The big U.S. companies - Google, Facebook, Microsoft - already practice this kind of chat control, as became known last year.

Question: They wouldn't be allowed to do that at the moment?

Breyer: That's right. The General Data Protection Regulation doesn't allow it, and the amendment to the so-called e-privacy directive, in effect since December, additionally guarantees the digital privacy of telecommunications and correspondence. The planned new regulation on chat control is intended to undermine this and make the suspicionless, fully automatic screening of content possible - and if content appears conspicuous from the services' point of view, they are then supposed to file criminal charges. The EU Commission has explicitly confirmed that, on the basis of the planned regulation, even what is said in video conferences could be screened in the future.

Question: But the fight against child pornography is a noble goal.

Breyer: Yes, but the fight against child pornography will not be won by merely pushing it into other channels. The disadvantages for innocent citizens, including children and victims of abuse, are enormous.

Question: Those who have nothing to hide, it is often said, have nothing to fear.

Breyer: That is known to be nonsense. First, each of us has things we don't want others snooping around in. Second, it regularly happens that content - and thus people - with no connection whatsoever to child pornography lands on such a watch list and is then drawn into the mills of law enforcement.

Question: Can you prove that?

Breyer: When it comes to pictures and videos, such searches are based on databases maintained by private organizations and companies that list alleged cases of child pornography. However, it has been shown that a large proportion of this content is irrelevant under criminal law. Artificial intelligence is also being used in an attempt to identify entirely new child pornography content. But that, too, is massively error-prone.

Question: In what way?

Breyer: For example, even perfectly legal adult nude photos end up in the hands of Facebook's contractors, for instance in India, and ultimately also with the police. A recent statistic from the Swiss Federal Police shows that only 14 percent of the content reported to it by U.S. technology companies was criminally relevant - in the remaining cases, the companies' algorithms had flagged harmless forms of nudity, such as vacation photos. So for the most part, people were reported who had done nothing wrong.

Question: But that can be cleared up quickly?

Breyer: Not always. Even a false suspicion of child pornography, and an interrogation that may follow from it, can have devastating consequences for those affected - for their family, their marriage, their job. But the digital privacy of correspondence must also be protected for the sake of children and young people themselves.

Question: What do you mean by that?

Breyer: For one thing, young people also exchange self-recorded nude pictures with each other, which do not belong in the hands of unauthorized persons. For another, victims of abuse must be able to talk about their trauma with others - friends, as well as counseling, therapy and self-help services - without having to fear that what they write will be recorded, evaluated and passed on by algorithms.

Question: Would the Internet services be obliged to do this?

Breyer: Not at first, but that is planned in a second law in the coming months. Such a screening obligation would be unique in the world. It could even be extended to end-to-end encrypted messengers, meaning that providers could be required to build in backdoors. Such vulnerabilities would not only pose a threat to Internet security, because criminals could exploit them as well; they would also whet the appetite of law enforcement agencies and intelligence services.

Question: Which ones, for example?

Breyer: They could demand to use the backdoor themselves to extract and intercept such content. So far, such demands have failed because end-to-end encryption makes this technically impossible. But as soon as it is technically possible, it will become legally possible here as well.

Question: But law enforcement agencies and intelligence services also have legitimate interests and serve to protect and secure us all.

Breyer: Yes, but what we're talking about here is content protected by fundamental rights, countless legitimate private, business and state secrets. The former German judge at the European Court of Justice, Ninon Colneric, has just come to the conclusion in an expert opinion that a general and suspicionless search of electronic communications content violates citizens' rights under the EU Charter of Fundamental Rights to protection of their communications and privacy.

Question: An expert opinion that you commissioned.

Breyer: The word of a former ECJ judge carries weight, and the fact that I commissioned the opinion does not diminish the force of her findings. Ms. Colneric explicitly refers to a decision of the European Court of Justice on connection and location data issued as recently as October 2020. That case, too, concerned fully automated searches - and the ECJ accepted them only as a temporary exception in situations where national security is threatened. This standard, Ms. Colneric argues, must apply all the more when it is not just traffic data but content that is being screened.

Question: And the fight against child pornography is not enough?

Breyer: No, banned images do not threaten national security. And in any case, it is more than questionable whether blanket screening of all private correspondence is a suitable tool at all - the channels used by child porn rings are not Facebook and the like. Moreover, law enforcement agencies are completely overburdened and notice ongoing abuse far too late; prosecuting physical abuse and protecting children must have absolute priority.
