WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan

WhatsApp won’t be adopting Apple’s new Child Safety measures, meant to stop the spread of child abuse imagery, according to WhatsApp head Will Cathcart. In a Twitter thread, he explains his belief that Apple “has built software that can scan all the private photos on your phone,” and says that Apple has taken the wrong path in trying to improve its response to child sexual abuse material, or CSAM.

Apple’s plan, announced on Thursday, involves taking hashes of images uploaded to iCloud and comparing them against a database of hashes of known CSAM images. According to Apple, this lets it keep user data encrypted and run the analysis on-device, while still being able to report users to the authorities if they’re found to be sharing child abuse imagery. Another prong of Apple’s Child Safety strategy optionally warns parents if their child under 13 sends or views sexually explicit photos. An internal memo at Apple acknowledged that people would be “worried about the implications” of the systems.
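To make the matching step concrete, here is a minimal, hypothetical sketch of the compare-against-known-hashes idea in Python. It is not Apple’s implementation (Apple describes a perceptual “NeuralHash” plus a cryptographic matching protocol, so the device never sees the database in the clear); every function and variable name below is invented for illustration.

```python
import hashlib

# Hypothetical database of hashes of known abuse images. Apple's real
# system ships perceptual "NeuralHash" values in blinded/encrypted form;
# a plain set of SHA-256 digests is used here only for illustration.
KNOWN_HASHES: set[str] = set()

def image_hash(image_bytes: bytes) -> str:
    """Hash the raw image content. A real perceptual hash would survive
    resizing and re-encoding, which SHA-256 does not."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_icloud_upload(image_bytes: bytes) -> bool:
    """On-device check run before an image is uploaded.

    Returns True on a match. In Apple's described design, a match
    produces an encrypted "safety voucher" rather than an immediate
    report, and an account is flagged for human review only after a
    threshold number of vouchers accumulate.
    """
    return image_hash(image_bytes) in KNOWN_HASHES
```

The design choice critics focus on is visible even in this toy version: the device owner cannot inspect what is in the hash database, so whoever controls its contents controls what the scanner flags.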

Cathcart calls Apple’s approach “very concerning,” saying it would allow governments with different ideas of which images are and are not acceptable to request that Apple add non-CSAM images to the databases it compares images against. Cathcart says WhatsApp’s system for fighting child exploitation, which relies in part on user reports, preserves encryption as Apple’s does and led the company to report over 400,000 cases to the National Center for Missing and Exploited Children in 2020. (Apple is also working with the Center on its CSAM detection efforts.)

WhatsApp’s owner, Facebook, has reasons to pounce on Apple for privacy concerns. Apple’s changes to how ad tracking works in iOS 14.5 started a fight between the two companies, with Facebook buying newspaper ads criticizing Apple’s privacy changes as harmful to small businesses. Apple fired back, saying that the change “simply requires” that users be given a choice on whether to be tracked.

It’s not just WhatsApp that has criticized Apple’s new Child Safety measures, though. The list of people and organizations raising concerns includes Edward Snowden, the Electronic Frontier Foundation, professors, and more. We’ve collected some of those reactions here as an overview of the criticisms levied against Apple’s new policy.


Matthew Green, an associate professor at Johns Hopkins University, pushed back on the feature before it was publicly announced. He tweeted about Apple’s plans and about how the hashing system could be abused by governments and malicious actors.

The EFF released a statement that blasted Apple’s plan, more or less calling it a “thoroughly documented, carefully thought-out, and narrowly-scoped backdoor.” The EFF’s press release goes into detail on how it believes Apple’s Child Safety measures could be abused by governments and how they decrease user privacy.

Kendra Albert, an instructor at Harvard’s Cyberlaw Clinic, has a thread on the potential dangers to queer children and Apple’s initial lack of clarity around age ranges for the parental notifications feature.

Edward Snowden retweeted the Financial Times article about the system, giving his own characterization of what Apple is doing.

Politician Brianna Wu called the system “the worst idea in Apple History.”

Cryptographer Matt Blaze also tweeted about concerns that the technology could be abused by overreaching governments trying to suppress content other than CSAM.

Epic CEO Tim Sweeney also criticized Apple, saying that the company “vacuums up everybody’s data into iCloud by default.” He also promised to share more thoughts specifically about Apple’s Child Safety system.

Not every reaction has been critical, however. Ashton Kutcher (who has done advocacy work to end child sex trafficking since 2011) calls Apple’s work “a major step forward” for efforts to eliminate CSAM.
