While child rights groups have welcomed Apple’s announcement, WhatsApp’s CEO has called it the wrong approach.
Last Thursday, Apple announced a controversial plan to proactively scan all iPhone users’ backed-up images for known child sexual abuse material (CSAM). The move is significant for the iPhone maker, which frequently touts its phones’ privacy as a selling point and has made end-to-end encryption integral to services like iMessage and FaceTime, making it impossible for anyone, including Apple, to intercept messages in transit. Apple also announced steps like notifying parents if minors view or send what its systems detect as sensitive content, and warning users who search for CSAM that interest in the subject is harmful.
With this scanner, Apple will essentially match hashes of iCloud photos and videos against hashes of known CSAM material, and will file a report once the number of matches crosses a certain threshold. While nothing has been announced about this software being deployed in India, these reports will go to the National Center for Missing and Exploited Children (NCMEC), a children’s safety advocacy group that works with law enforcement agencies in the United States.
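For a rough sense of the matching logic described above, and purely as an illustration, a threshold-based hash lookup could look like the sketch below. The hash values, names, and threshold are hypothetical placeholders; Apple’s actual system, per its technical summary, uses a perceptual hash (NeuralHash) and a private set intersection protocol so that matches below the threshold remain invisible, even to Apple.

```python
# Simplified, illustrative sketch of threshold-based hash matching.
# NOT Apple's implementation: the real system uses a perceptual hash
# (NeuralHash) and encrypted "safety vouchers", so individual match
# results are not directly visible like this. All values are placeholders.

KNOWN_CSAM_HASHES = {"hash_a", "hash_b", "hash_c"}  # hypothetical database
REPORTING_THRESHOLD = 30  # hypothetical; the exact threshold was not disclosed at announcement

def count_matches(uploaded_image_hashes):
    """Count uploaded images whose hash appears in the known database."""
    return sum(1 for h in uploaded_image_hashes if h in KNOWN_CSAM_HASHES)

def should_report(uploaded_image_hashes):
    """A report is generated only once the match count crosses the threshold."""
    return count_matches(uploaded_image_hashes) >= REPORTING_THRESHOLD
```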
How the tech industry is reacting
Will Cathcart, CEO, WhatsApp: The Facebook-owned messaging app’s CEO immediately expressed concern at Apple’s plans. “I think this is the wrong approach and a setback for people’s privacy all over the world,” Cathcart tweeted on Friday. “We reported more than 400,000 cases to NCMEC last year from WhatsApp, all without breaking encryption,” he added. (Read our coverage on how WhatsApp reports CSAM in India through NCMEC here). He described Apple’s approach as bringing “something very concerning” into the world, noted that computing devices have existed for decades without such mandates, and called the move a form of surveillance:
This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.
— Will Cathcart (@wcathcart) August 6, 2021
Tim Sweeney, CEO, Epic Games: Sweeney, who is locked in a lawsuit with Google and Apple over commissions on in-app purchases for Epic Games’ highly popular Fortnite franchise, called the scanner “government spyware” in a Twitter thread. “Apple’s dark patterns that turn iCloud uploads on by default, and flip it back on when moving to a new phone or switching accounts, exacerbate the problem. Further, in many contexts Apple has forced people to accumulate unwanted data, as with mandatory iCloud email accounts,” Sweeney argued. “The existential threat here is an unholy alliance between government and the monopolies who control online discourse and everyone’s devices, using the guise of private corporations to circumvent constitutional protections,” Sweeney said. He framed the scanner as a threat to democratic freedoms:
My view is: Yes, do think of the children. Think of the dystopia they'll grow up in if we tolerate the unchecked growth of private monopolies with unlimited surveillance power increasingly taking on governing roles, yet now unshackled from liberal democratic processes.
— Tim Sweeney (@TimSweeneyEpic) August 7, 2021
How child rights groups are reacting
Marita Rodriguez, Executive Director, Strategic Partnerships, NCMEC: In a letter to Apple employees, obtained by 9to5Mac, Rodriguez welcomed the scanner. “[E]veryone at NCMEC is SO PROUD of each of you and the incredible decisions you have made in the name of prioritizing child protection,” Rodriguez wrote to Apple. “We know that the days to come will be filled with the screeching voices of the minority. Our voices will be louder,” she added. “Thank you for finding a path forward for child protection while preserving privacy,” Rodriguez said.
Joanna Shields, Founder, WeProtect Global Alliance: “Apple’s new child safety features are proof that detecting CSAM and protecting users’ privacy can be compatible if we want them to be. A great example from one of the world’s major global technology leaders – I hope many others will now follow,” Shields tweeted. WeProtect is a public-private multistakeholder body whose mission is to “break down complex problems and develop policies and solutions to protect children from sexual abuse online.”
Ashton Kutcher, Co-Founder, Thorn: Digital Defenders of Children: The actor tweeted, “I believe in privacy – including for kids whose sexual abuse is documented and spread online without consent. These efforts announced by Apple are a major step forward in the fight to eliminate CSAM from the internet.”
How digital rights groups are reacting
Electronic Frontier Foundation: The US-based digital rights nonprofit said in a blog post written by Director of Federal Affairs India McKinney and Senior Staff Technologist Erica Portnoy that “even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” arguing that “All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts.”
We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), is troublingly without external oversight, despite calls from civil society. While it’s therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content as “terrorism,” including documentation of violence and repression, counterspeech, art, and satire. — Electronic Frontier Foundation
Greg Nojeim, Co-Director, Security & Surveillance Project, Center for Democracy & Technology: In a press release on CDT’s website, Nojeim said that “Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S., but around the world.” He pointed out, “In particular, LGBTQ youth and children in abusive homes are especially vulnerable to injury and reprisals, including from their parents or guardians, and may inadvertently expose sensitive information about themselves or their friends to adults, with disastrous consequences.”
How experts are reacting
Alex Stamos, Adjunct Professor, Center for International Security and Cooperation, Stanford University: Stamos, a computer scientist, said in a Twitter thread: “I am both happy to see Apple finally take some responsibility for the impacts of their massive communication platform, and frustrated with the way they went about it. They both moved the ball forward technically while hurting the overall effort to find policy balance.” He added: “I have friends at both the EFF and NCMEC, and I am disappointed with both NGOs at the moment. Their public/leaked statements [see above] leave very little room for conversation, and Apple’s public move has pushed them to advocate for their equities to the extreme,” chiding the phone maker for not consulting other stakeholders before making this decision:
“I also don’t understand why Apple is pushing the CSAM scanning for iCloud into the device, unless it is in preparation for real encryption of iCloud backups. A reasonable target should be scanning shared iCloud albums, which could be implemented server-side.
In any case, coming out of the gate with non-consensual scanning of local photos, and creating client-side ML that won’t provide a lot of real harm prevention, means that Apple might have just poisoned the well against any use of client-side classifiers to protect users.” — Alex Stamos
Technical analyses*: In a series of technical assessments, Benny Pinkas, Deputy Director and Head of Scientific Committee, Department of Computer Science, Bar Ilan University; David Forsyth, Fulton Watson Copp Chair in Computer Science at the University of Illinois Urbana-Champaign; and Mihir Bellare, Professor, Department of Computer Science and Engineering, University of California San Diego, all opined that Apple’s system is sound from a security standpoint. Pinkas wrote, “I believe that the Apple PSI [Private Set Intersection] system provides an excellent balance between privacy and utility, and will be extremely helpful in identifying CSAM content while maintaining a high level of user privacy and keeping false positives to a minimum.”
Forsyth wrote, “It is highly unlikely that harmless users will be inconvenienced or lose privacy because the false positive rate is low, and multiple matches are required to expose visual derivatives to Apple. Apple will review these potential reports and notify NCMEC if appropriate. Even if there is a false alert, this review will ensure that harmless users are not exposed to law enforcement actions.”
Bellare wrote, “Apple has found a way to detect and report CSAM offenders while respecting […] privacy constraints. When the number of user photos that are in the CSAM database exceeds the threshold, the system is able to detect and report this. Yet a user photo that is not in the CSAM database remains invisible to the system, and users do not learn the contents of the CSAM database.”
* Apple facilitated these technical assessments and has made copies available on its website.
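The threshold property these assessments describe, where nothing is learned until the number of matching photos crosses the threshold, is the kind of guarantee provided by threshold secret sharing, which Apple’s technical summary says it uses alongside private set intersection. The snippet below is a generic, textbook-style Shamir sharing sketch over a prime field, offered only to illustrate the idea; it is not Apple’s code, and the prime, threshold, and share count are arbitrary values chosen for the example.

```python
import random

# Generic Shamir-style threshold secret sharing, shown only to illustrate
# the "nothing is revealed below the threshold" property described above.
# This is a textbook sketch, not Apple's implementation.

PRIME = 2**127 - 1  # a large prime field, arbitrary choice for the example

def make_shares(secret, threshold, n_shares):
    """Split `secret` so that any `threshold` shares reconstruct it,
    while any smaller set of shares reveals nothing about it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Recover the secret via Lagrange interpolation at x = 0."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    shares = make_shares(secret=1234567, threshold=5, n_shares=12)
    assert reconstruct(shares[:5]) == 1234567  # five shares are enough
    assert reconstruct(shares[:4]) != 1234567  # four shares yield nothing useful
```

In the design Apple describes, each positive match contributes one share of the key needed to decrypt the associated match metadata, which is why no account information can be reviewed before the threshold is reached.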
Also read
- Why Are Apple’s Plans To Scan iCloud Photos For Child Sexual Abuse Material Concerning?
- How WhatsApp Deals With Child Sexual Abuse Material Without Breaking End To End Encryption
- Instagram Announces Three New Safety Measures For Young Users, Including Limiting Advertisers’ Reach
- India Leads In Generation Of Online Child Sexual Abuse Material