The Washington Post
Democracy Dies in Darkness

The Cybersecurity 202: Apple’s move against child pornography is shifting battle lines for law enforcement and technologists

with Aaron Schaffer

This post has been updated to remove a tweet that contained incorrect information. 

Apple’s latest move to fight the digital sharing of child pornography is opening up some fissures in a seven-year standoff between technologists and law enforcement over fighting the spread of criminal activity online. 

That fight has centered primarily on FBI and Justice Department demands for special police access to encrypted communications that would otherwise be shielded from everyone, including the platform where the conversation is happening. 

Justice Department and FBI officials say that access — with a warrant — is vital to stop terrorists, purveyors of child pornography and other criminals from “going dark” and acting with impunity online. Technologists nearly uniformly say creating such an encryption backdoor will make everyone more hackable and that the trade-offs aren’t worth it. The fight has flared periodically since 2014 with no significant give or take from either side. 

The Apple system aims to thread that needle as it relates to the most commonly shared child pornography images. 

The system is still drawing criticism from the vast majority of technologists and privacy experts, who say it’s overly invasive and warn that it could create a slippery slope toward more government surveillance.

But it’s also drawing praise from a few who call it a reasonable compromise to combat the spread of such images.

“We’re talking about young children who are being sexually abused and their material is being redistributed online,” Hany Farid, a computer science professor at the University of California at Berkeley, told me. “To me, these are incredibly modest technologies where the upside is considerably greater than what I see as largely hypothetical downsides.”

The Apple system doesn’t look at actual photos but at digital fingerprints associated with them called hashes. 

It scans photo hashes on phones and tablets before they’re uploaded to iCloud to see if they match hashes associated with known child pornography. Apple neither looks at the photos nor gathers information about photos that don’t match the digital signatures. 

If the number of matches crosses a certain threshold, Apple will forward information about the user to the National Center for Missing and Exploited Children, which will presumably share that information with law enforcement. 

Apple says in a technical paper that it has set the threshold high enough that there’s a roughly 1 in 1 trillion chance that a person will be flagged mistakenly for possessing child pornography. Other large tech companies perform similar scans for child pornography but do so on cloud servers rather than on devices, as Apple will. 
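The mechanism described above — fingerprinting photos, comparing the fingerprints against a list of known images, and reporting only past a threshold — can be sketched in simplified form. This is an illustrative sketch only: Apple’s real system uses a perceptual hash (“NeuralHash”) and cryptographic private set intersection so the device never learns the contents of the known-image list, and Apple learns nothing about a user until the threshold is crossed. Here a plain SHA-256 stands in for the perceptual hash, and the list, threshold value, and function names are hypothetical.

```python
import hashlib

# Hypothetical list of fingerprints of known images. In the real
# system this list is blinded so the device cannot read it.
KNOWN_HASHES = {"<hash of known image A>", "<hash of known image B>"}

# Hypothetical threshold; Apple has not published the real value,
# only the claimed 1-in-1-trillion false-flag rate it implies.
MATCH_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Stand-in fingerprint; a real system would use a perceptual hash
    that tolerates resizing and re-encoding, which SHA-256 does not."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(photo_library: list[bytes]) -> int:
    """Count how many photos match the known-image list."""
    return sum(1 for img in photo_library if image_hash(img) in KNOWN_HASHES)


def should_flag(photo_library: list[bytes]) -> bool:
    """Nothing is reported until the match count crosses the threshold."""
    return count_matches(photo_library) >= MATCH_THRESHOLD
```

Note that the threshold is the privacy-relevant design choice: below it, the described system reveals nothing, even about photos that do match.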

The company has said it will refuse any government requests to search for things other than child pornography and that its system offers more privacy protections than those from other tech platforms. It is part of a suite of changes that includes a new on-device feature to inform parents about explicit images shared by minors and a program that flags searches for child pornography on Siri and iPhone.

“This is Apple saying we do have a role in combating this material and here’s a way we can do that without breaking encryption,” Troy Hunt, creator of the website Have I Been Pwned and a Microsoft regional director, told me. “I think the fact Apple can do this in a way that, as it stands today, will not invade the privacy of anyone who is not sharing this material, that’s a positive thing.” 

Farid and Hunt were among a handful of technologists to praise the system. 

Matt Tait, chief operating officer of the cybersecurity firm Corellium, described it as the least invasive way of reducing the sharing of child pornography, which has increased dramatically during the digital age. 

There was also praise from Congress. 

Among them was Sen. Richard Blumenthal (D-Conn.), who co-sponsored a bill last year aimed at limiting the spread of online child pornography that experts feared would have effectively forced encryption backdoors. 

Technologists roundly opposed the bill. It was later modified to remove threats to encryption, but it did not become law. 

The majority of technologists and privacy advocates, however, are strongly opposed to the Apple system. 

The criticisms fall into two broad categories.

First, some worry it will be ineffective, essentially driving people who share child pornography to do it on platforms that are better shielded from law enforcement. 

“Bad actors can (and will) switch to other platforms, meaning that at the end of the day, Apple’s system could be at best minimally effective,” Chelsea Komlo, a researcher at the cryptography, security, and privacy lab at the University of Waterloo in Ontario, Canada, told me by email. 

She also argued that the system requires users to put too much trust in Apple to not abuse the technology or search for things beyond child pornography.

“This is not a backdoor, it is taking a wall off of a house, looking through a user’s items, and reporting to law enforcement if they match items on a list,” Komlo told me. “And there is no guarantee or transparency about what items are on that list, it could be anything.”

Others worry that it will be a slippery slope toward scanning phones for other restricted material — especially in China and other repressive nations. 

“Tim Cook infamously said when Apple removed VPN apps from the App Store in China that ‘we follow the law wherever we do business,’ ” Jonathan Mayer, a computer science professor at Princeton University, told me. “What happens if China or another undemocratic country demands that Apple use this system for illegitimate surveillance or censorship?”

Mayer designed a system similar to Apple’s as a research project with a graduate student and has plans to present it at an academic conference this week. 

The pair found that the technology works effectively, he said. But they cautioned against deploying such a system because of concerns about how tech firms and the countries where they operate would use it. 

“There are difficult follow-on questions, especially about ensuring these systems aren't misused. We couldn't answer those questions. And now Apple can't answer those questions,” Mayer told me. 

He added: “Apple made its announcement as a fait accompli, both undermining open dialogue and — frustratingly — doing little to address the problems that we highlighted.”

The keys

Phishing emails can be more effectively written by artificial intelligence than humans, researchers said at Def Con.

The researchers from Singapore's Government Technology Agency were able to tailor personal phishing emails that tricked their colleagues, Wired’s Lily Hay Newman reports. 

Also at the Def Con and Black Hat conferences:

  • Twitter handed out cash prizes to researchers who found biases in its image-cropping algorithm. The algorithm is biased against people with white or gray hair and favors people with slim, young faces, according to researchers. The revelations come after Twitter users had already determined the algorithm was more likely to show White faces than Black ones.  
  • An underground workforce of drifters is necessary for cybercriminals to function, Readme’s Shaun Waterman writes.
  • Researcher Cheng-Da Tsai (Orange Tsai) discovered a “totally new” attack surface in Microsoft Exchange.
  • A hacker was able to take over Amazon Kindle e-readers with malware-laced e-books.
Officials worry that hackers could hit schools hard when they reopen.

U.S. schools confronted a record number of hacks in 2020, Bloomberg CityLab’s Nic Querolo and Shruti Singh report. Officials fear the trend could get even worse as students return in the fall. 

Many schools plan to continue to offer virtual learning as an option, leaving them even more vulnerable to hacks.

“We see no evidence that this is abating,” Consortium for School Networking CEO Keith Krueger said. “Criminals are having luck with it, they’re obviously having it with big cases we’re reading about every day. With back to school, we’re bracing ourselves for a real challenge this fall.”

Chat room

Facebook monitored communications between a journalist and a Russian hacker, Sheera Frenkel of the New York Times wrote in a Twitter thread.

Journalist Kim Zetter and Nathaniel Gleicher, Facebook's head of security policy, also weighed in on Twitter.

Securing the ballot

Mesa County’s election equipment passwords ended up online, prompting a state investigation

Republican issues subpoenas for Wis. election information (Associated Press)

Global cyberspace

U.S. warned Brazil that Huawei would leave it 'high and dry' on 5G (Reuters)

Pegasus spyware scandal: Years of questions but no answers for Mexico victims (Reuters)

  • Researchers at the Harvard Kennedy School Belfer Center's Cyber Project released a paper that “seeks to create a road map toward … answering how a 21st century threat can be tackled by the tools available in its own time.”

  • Former NSA director and U.S. Cyber Command commander Mike Rogers and cybersecurity executives speak at the HIMSS Global Health Conference & Exhibition today at 11:30 a.m.

Secure log off