Apple unveiled plans Thursday for software intended to detect and flag sexually explicit images of children in a move critics say threatens user privacy.
The software, to be added to iOS and iPadOS in an update later this year, will perform “on-device” scans of users’ images that are uploaded to iCloud and match them against a database of known sexually explicit images of children, the company announced Thursday. If enough images match content in the database, Apple will manually review them and report the user to the National Center for Missing and Exploited Children (NCMEC), according to the announcement.
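Apple did not publish implementation code alongside the announcement, but the flow it describes (fingerprint each photo on the device, compare it against a database of known images, and escalate to human review only once a match threshold is crossed) can be sketched roughly as follows. This is an illustrative Python sketch only; the function names, the cryptographic digest standing in for the perceptual fingerprint a real system would use, and the threshold value are placeholders, not Apple’s implementation.

```python
# Illustrative sketch only -- not Apple's implementation. Apple's announcement
# describes on-device matching against a database of known images, with a match
# threshold before manual review; the names and numbers below are placeholders.
import hashlib
from typing import Iterable, Set

MATCH_THRESHOLD = 10  # hypothetical value; not disclosed in the announcement

def image_fingerprint(image_bytes: bytes) -> str:
    # Stand-in fingerprint. A real system would use a perceptual hash that
    # tolerates resizing and re-encoding, not a cryptographic digest.
    return hashlib.sha256(image_bytes).hexdigest()

def count_database_matches(photos: Iterable[bytes], known_hashes: Set[str]) -> int:
    # Count how many photos queued for iCloud upload match the known database.
    return sum(1 for photo in photos if image_fingerprint(photo) in known_hashes)

def should_escalate_for_review(photos: Iterable[bytes], known_hashes: Set[str]) -> bool:
    # Only accounts whose match count crosses the threshold are escalated to
    # manual review and, if the matches are confirmed, reported to NCMEC.
    return count_database_matches(photos, known_hashes) >= MATCH_THRESHOLD
```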
The company will also add new tools to its Messages app that alert children and their parents if they receive or send sexually explicit images, using on-device scans to detect sexual content in message attachments. Apple said it will blur the sexually explicit photo, provide the child with “helpful resources,” and make sure they are “reassured.”
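The Messages feature is described as a separate, simpler flow: an on-device classifier flags an explicit attachment, the image is blurred, and the child is warned before viewing, with parental notification for child accounts. A rough sketch of that flow, assuming a hypothetical classifier `looks_explicit` in place of Apple’s on-device model:

```python
# Illustrative sketch of the Messages flow described above -- not Apple's code.
# `looks_explicit` stands in for Apple's on-device classifier; the notification
# behavior shown here is a simplified placeholder.
from dataclasses import dataclass
from typing import Callable

@dataclass
class IncomingAttachment:
    image_bytes: bytes
    blurred: bool = False
    warning_shown: bool = False

def looks_explicit(image_bytes: bytes) -> bool:
    # Placeholder for an on-device ML classifier over the attachment pixels.
    return False

def handle_attachment(att: IncomingAttachment,
                      is_child_account: bool,
                      notify_parent: Callable[[], None]) -> IncomingAttachment:
    # Blur flagged images and surface the warning; notify a parent only for
    # child accounts where that option applies.
    if looks_explicit(att.image_bytes):
        att.blurred = True
        att.warning_shown = True
        if is_child_account:
            notify_parent()
    return att
```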
Some cybersecurity experts and privacy advocates worry the software creates an opportunity for abuse and threatens user privacy.
“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts,” India McKinney and Erica Portnoy of the Electronic Frontier Foundation wrote in a blog post.
“That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change,” they said.
Matthew Green, a cryptography professor at Johns Hopkins University who leaked details on the software Wednesday before it was announced, warned the program could be used by governments to access citizens’ private communications.
The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t “hurt” anyone’s privacy.
But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal.
— Matthew Green (@matthew_d_green) August 5, 2021
He pointed to a letter that then-Attorney General William Barr, along with officials from the United Kingdom and Australia, sent to Facebook CEO Mark Zuckerberg asking for “lawful access mechanisms” to Facebook’s encrypted messaging services on the basis of combating child pornography.
“Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems,” Green tweeted.
Apple has made privacy a key part of its brand, launching a privacy-focused ad campaign in May and famously refusing an FBI request to unlock a terrorist’s iPhone. The company stressed the new software would respect user privacy, noting it does not gather data on images that do not match the database.
“No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this,” privacy whistleblower Edward Snowden tweeted Thursday. “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.”
Apple did not immediately respond to the Daily Caller News Foundation’s request for comment.