Europe plans to scan private messages for photos related to child porn, putting encryption at risk

With an intent to curb content and material associated with child abuse and pornography, collectively called CSAM, the European Commission wants tech companies to "detect, report, block, and remove" CSAM from their platforms.

Story highlights
  • The European Commission has proposed a new law for private messaging apps.
  • It wants them to scan users' private messages for content related to child porn and abuse.
  • Scanning messages on services such as WhatsApp would weaken their end-to-end encryption.

The European Commission has proposed a new regulation under which private messaging apps would be required to scan their users' chats for child sexual abuse material (CSAM), a controversial move that experts believe will put encryption in these apps at risk. The proposed obligations target platforms such as WhatsApp, iMessage, and Snapchat, all of which offer end-to-end encrypted messaging.

The commission wants tech companies to "detect, report, block, and remove" CSAM from their platforms. Doing so, however, would require these companies to scan the messages their users exchange, even though those messages are end-to-end encrypted.

The regulation brings several new obligations for "online service providers", a broad term that covers app stores, websites, and private messaging apps. When ordered, these companies would have to scan the chats of selected users for shared CSAM as well as for messages that may amount to "grooming" or the "solicitation of children." For such scanning to work, messaging apps would have to deploy artificial intelligence and machine vision tools that analyse text, photos, and videos for relevant material.
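To illustrate how detection tools of this kind typically match known images, here is a minimal sketch of hash-based matching; the function names, the 8x8 image size, and the threshold are invented for this example and do not describe any vendor's actual system. The idea is to reduce each image to a compact fingerprint and flag near-duplicates rather than exact file copies:

```python
def average_hash(pixels):
    """Hash an 8x8 grayscale image (a list of 8 rows of 8 ints, 0-255).

    Each bit records whether a pixel is brighter than the image's mean,
    so the hash survives small edits such as re-compression or resizing.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(candidate_hash, known_hashes, threshold=5):
    """True if the candidate is close to any hash in the database."""
    return any(hamming(candidate_hash, h) <= threshold for h in known_hashes)

# Toy example: a 'known' image and a slightly brightened copy of it.
known = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
variant = [[min(255, p + 3) for p in row] for row in known]

db = {average_hash(known)}
print(matches(average_hash(variant), db))        # near-duplicate is flagged
print(matches(average_hash([[0] * 8] * 8), db))  # unrelated image is not
```

Deployed systems such as Microsoft's PhotoDNA use far more robust perceptual hashes, but the matching principle is the same: compare fingerprints within a tolerance instead of exact bytes.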

The European Commission said these "detection orders" will be given from time to time for specific users so as to reduce privacy infringements. However, the commission did not say who would be targeted — an individual or a group or some other category of users.

While the new obligations nowhere state that private messaging apps must compromise end-to-end encryption, privacy experts warn that requiring the apps to scan messages with dedicated detection software would undermine end-to-end encryption in practice.
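One way to see the experts' concern: in a client-side scanning design, the detection step runs on the plaintext before encryption happens. The sketch below is a deliberately toy model, with hypothetical function names and a placeholder "cipher" that is not real cryptography, but it shows how encryption can remain technically intact while message content is still inspected and reported:

```python
reports = []  # plaintext leaked to a third party despite E2EE

def scan_for_flagged_content(plaintext, flagged_terms):
    """Stand-in for a detection model running on the user's device."""
    return any(term in plaintext for term in flagged_terms)

def encrypt(plaintext, key):
    """Toy stand-in for end-to-end encryption (XOR; not secure)."""
    return bytes(b ^ key for b in plaintext.encode())

def send_message(plaintext, key, flagged_terms):
    # The scanning hook sees the message while it is still readable...
    if scan_for_flagged_content(plaintext, flagged_terms):
        reports.append(plaintext)  # ...and can report it out.
    # ...and only then does "end-to-end" encryption happen as usual.
    return encrypt(plaintext, key)

ciphertext = send_message("meet at noon", key=0x42, flagged_terms=["noon"])
print(ciphertext != b"meet at noon")  # the wire still carries ciphertext
print(reports)                        # but the plaintext already leaked
```

The encryption function is never weakened here, which is why such proposals can truthfully claim not to "break" encryption while still exposing message content before it is protected.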

The proposal, a copy of which was leaked earlier this week, has raised eyebrows among privacy advocates, including cryptography professor Matthew Green, who called the European Commission's soon-to-be diktat "the most terrifying thing" he has ever seen. He added, "It describes the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR. Not an exaggeration."

Jan Penfrat of the digital advocacy group European Digital Rights (EDRi) said, "This looks like a shameful general #surveillance law entirely unfitting for any free democracy."

"There's no way to do what the EU proposal seeks to do, other than for governments to read and scan user messages on a massive scale," Joe Mullin, a senior policy analyst at the digital rights group Electronic Frontier Foundation, told CNBC. "If it becomes law, the proposal would be a disaster for user privacy not just in the EU but throughout the world."

Because the European Commission's internet rules often set the template for regulators in other countries, a requirement to scan encrypted messages could be copied, and misused, well beyond the EU, including by authoritarian states.