Apple will scan iPhones for child sex abuse images, winning praise and raising concerns

Published: August 7, 2021 3:38 PM EDT
Updated: August 20, 2021 5:00 PM EDT

Apple has announced a new feature that is both winning praise and raising concerns. The tool, scheduled to roll out later this year, will scan photos and text messages on Apple devices for known images of child sex abuse.

Jim Lewis, an expert in cybersecurity, said Apple “has gone out of its way to make this as privacy friendly as possible.”

“There will be part of the program that has access to data…what they call hashes of imagery, in other words, the picture reduced to a numeric formula,” said Lewis, senior vice president at the Center for Strategic and International Studies in Washington, D.C. “Apple will use that numeric formula to look for things, images, that match it on your device.”
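The "numeric formula" Lewis describes is a hash: the image is reduced to a short fingerprint, and only fingerprints are compared. The sketch below illustrates that general principle with an ordinary cryptographic hash from Python's standard library. It is a simplified illustration only; Apple's actual system uses a perceptual hash it calls NeuralHash, plus additional cryptographic protections, and the sample hash database here is entirely hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known abusive images.
# (In the real system, these derive from material catalogued by the
# National Center for Missing & Exploited Children.)
known_hashes = {
    hashlib.sha256(b"bytes of a known abusive image").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Reduce a picture to its 'numeric formula' (a hash) and check
    whether that fingerprint appears in the database of known hashes."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

# Only the fingerprint is compared during this check, not the photo itself.
matches_known_image(b"bytes of a known abusive image")   # a match
matches_known_image(b"an ordinary vacation photo")       # no match
```

Note that a cryptographic hash, as used here for simplicity, only matches byte-identical files; a perceptual hash like Apple's is designed to also match visually similar copies of the same image.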

If there’s a match, the photos will be shown to an Apple employee. Verified sensitive material will then be forwarded to the National Center for Missing & Exploited Children, and the user’s iCloud account will be locked.

Many users don't read the terms and conditions when they come up; they simply check the box. Dakota Spaide is one such iPhone user. "There's definitely something, some sort of clause that like they got me with. But at this point, I can't function without my phone so I just click it anyway," said Spaide.

The move is drawing both applause and outcry on Twitter, Tom Hanson reports for “CBS This Morning: Saturday.”

Ashton Kutcher called it “a major step forward in the fight to eliminate” child sexual abuse material from the internet.

But security watchdogs are concerned the new software could be exploited by hackers and foreign governments.

And other privacy advocates are concerned that the scanning could sweep in too much other information. They say Apple has long built its reputation on protecting user data from law enforcement agencies and the government, and they worry that scanning private photos could call that commitment into question.

The head of WhatsApp, Will Cathcart, said he’s “concerned.”

“I think this is the wrong approach and a setback for people’s privacy all over the world,” he said in a series of tweets.

“Can this scanning software running on your phone be error-proof? Researchers have not been allowed to find out,” he tweeted.

FGCU Justice Studies professor Pamella Seay says not to just click away and install iOS 15. She believes Apple is crossing a line with the new software that makes even her, an avid Apple user, uncomfortable.

“There is no such thing as privacy anymore. And this just proves that beyond all shadow of a doubt. You have no privacy. And the further this goes, the worse it is,” Seay said.

“Since when did my telephone become a means for law enforcement?” she said. “Is child pornography wrong? Correct. You should never have child pornography and you should never be a part of it. However, Apple can go beyond that.”

Lewis said the Apple program “is designed in such a way that the chances of it making a mistake … of it saying something is child pornography when it’s not … are infinitesimally small.”

“I don’t think this is a serious threat to privacy,” he said.

Lewis said all tech companies are looking at ways to deal with malicious content. “There’s a lot of bad stuff on the internet and it’s more than overdue that they try and step in to change that,” he said.

Apple says its efforts to protect children will evolve and expand over time.

“Should we allow Apple to do this kind of surveillance and to have this kind of intrusive review of everything you create on your phone? How far will it go beyond that?” said Seay.

Seay predicts this will go to court at some point. WINK News reached out to Apple about the update but we have not yet heard back.