Apple’s plan to scan US iPhones raises privacy red flags

Apple has announced plans to scan iPhones for images of child abuse, raising immediate concerns about user privacy and surveillance.

Has Apple's iPhone become an iSpy?

Apple says its system is automated, doesn’t scan the actual images themselves, uses some form of hash-based matching to identify known instances of child sexual abuse material (CSAM), and has some fail-safes in place to protect privacy.

Privacy advocates warn that now that it has created such a system, Apple is on a rocky road to an inexorable extension of on-device content scanning and reporting that could – and likely will – be abused by some nations.

What Apple’s system is doing

There are three main elements to the system, which will lurk inside iOS 15, iPadOS 15 and macOS Monterey when they ship later this year.

Scanning your images

Apple’s system scans all images stored in iCloud Photos to see whether they match the CSAM database held by the National Center for Missing and Exploited Children (NCMEC).

Images are scanned on the device using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

When an image is stored in iCloud Photos, a matching process takes place. If an account crosses a threshold of multiple instances of known CSAM content, Apple is alerted. If alerted, the data is manually reviewed, the account is disabled, and NCMEC is informed.
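To make that threshold mechanism a little more concrete, here is a minimal sketch in Swift of how an on-device matcher against a set of known image hashes might be structured. It is purely illustrative: the CSAMMatcher type, the plain SHA-256 digest, and the threshold value are all assumptions for this example, and Apple's real system uses its NeuralHash perceptual hash plus cryptographic safety vouchers rather than a simple lookup.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only; not Apple's implementation. A plain SHA-256
// lookup stands in for NeuralHash, and the threshold value is a placeholder.
struct CSAMMatcher {
    // Hypothetical database of known-bad image digests (hex-encoded SHA-256).
    let knownHashes: Set<String>
    // Apple has not published the real threshold; 30 here is arbitrary.
    let reportingThreshold = 30
    private(set) var matchCount = 0

    private func digest(of imageData: Data) -> String {
        SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
    }

    // Called for each image before it is uploaded to iCloud Photos.
    // Returns true only once the account has crossed the reporting threshold.
    mutating func shouldFlagAccount(afterChecking imageData: Data) -> Bool {
        guard knownHashes.contains(digest(of: imageData)) else { return false }
        matchCount += 1
        return matchCount >= reportingThreshold
    }
}
```

The point the sketch tries to capture is that nothing about an individual match is surfaced until the account crosses the threshold.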

The system isn’t perfect, however. The company says there’s a less than one-in-one-trillion chance per year of incorrectly flagging a given account. Apple has more than a billion users, so that works out to roughly a one-in-1,000 chance each year that some account, somewhere, is wrongly flagged (a billion accounts times a one-in-a-trillion rate is about one in 1,000). Users who feel they have been mistakenly flagged can appeal.


Scanning your messages

Apple’s system uses on-device machine learning to scan images in Messages sent to or received by minors for sexually explicit material, warning parents if such images are identified. Parents can enable or disable the system, and any such content received by a child will be blurred.

If a child attempts to send sexually explicit content, they will be warned and the parents can be told. Apple says it does not get access to the images, which are scanned on the device.
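Here is a similarly hedged sketch of the Messages flow just described. Apple has not published an API for this feature, so the looksSexuallyExplicit classifier and the IncomingImagePolicy type below are hypothetical stand-ins, intended only to show that the decision points sit on the device.

```swift
import Foundation

// Hypothetical stand-in for an on-device ML classifier; in Apple's design
// the evaluation happens locally and nothing is sent to Apple.
func looksSexuallyExplicit(_ imageData: Data) -> Bool {
    // A real implementation would run a local ML model here.
    return false
}

struct IncomingImagePolicy {
    let isChildAccount: Bool
    let parentalNotificationsEnabled: Bool

    enum Action {
        case showNormally
        case blurAndWarnChild
        case blurWarnChildAndNotifyParents
    }

    func action(for imageData: Data) -> Action {
        guard isChildAccount, looksSexuallyExplicit(imageData) else {
            return .showNormally
        }
        return parentalNotificationsEnabled
            ? .blurWarnChildAndNotifyParents
            : .blurAndWarnChild
    }
}
```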

Watching what you search for

The third part consists of updates to Siri and Search. Apple says these will now provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when people make what are deemed to be CSAM-related search queries, explaining that interest in this topic is problematic.

Apple helpfully informs us that its program is “ambitious” and the efforts will “evolve and expand over time.”

A little technical data

The company has published an extensive technical white paper that explains a bit more about its system. In the paper, it takes pains to reassure users that it learns nothing about images that don’t match the database.

Apple’s technology, called NeuralHash, analyzes known CSAM images and converts them to a unique number specific to each image. Only another image that appears nearly identical can produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value.

As images are added to iCloud Photos, they are compared against that database to identify a match.

If a match is found, a cryptographic safety voucher is created, which, as I understand it, will also allow an Apple reviewer to decrypt and access the offending image in the event the threshold of such content is reached and action is required.

“Apple is able to learn the relevant image information only once the account has more than a threshold number of CSAM matches, and even then, only for the matching images,” the paper concludes.
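Conceptually, the voucher-and-threshold arrangement the paper describes could be modeled along these lines. This is a sketch of the data flow only; the real design uses private set intersection and threshold secret sharing, so the server cannot even tell which individual vouchers matched, something the simple boolean field below does not capture.

```swift
import Foundation

// Conceptual model of the safety-voucher flow; the cryptography that keeps
// individual matches hidden from the server is deliberately omitted.
struct SafetyVoucher {
    // Encrypted information about the image; in Apple's design it can only
    // be decrypted once the account exceeds the match threshold.
    let encryptedImageInfo: Data
    // Simplification: the real protocol hides this bit from the server too.
    let matchedKnownHash: Bool
}

struct ThresholdReview {
    let threshold: Int  // placeholder value for this sketch

    // The server learns nothing below the threshold; above it, only the
    // matching vouchers become reviewable.
    func vouchersEligibleForReview(_ vouchers: [SafetyVoucher]) -> [SafetyVoucher] {
        let matches = vouchers.filter { $0.matchedKnownHash }
        return matches.count >= threshold ? matches : []
    }
}
```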

Apple is not unique, but on-device analysis may be

Apple isn’t alone in being required to share images of CSAM with the authorities. By law, any US company that finds such material on its servers must work with law enforcement to investigate it. Facebook, Microsoft, and Google already have technologies that scan such materials being shared over email or messaging platforms.

The difference between those systems and this one is that analysis takes place on the device, not on the company servers.

Apple has always claimed its messaging platforms are end-to-end encrypted, but that claim becomes somewhat semantic if the contents of a person’s device are scanned before encryption even takes place.

Child protection is, of course, something most rational people support. But what concerns privacy advocates is that some governments may now attempt to force Apple to search for other materials on people’s devices.

A government that outlaws homosexuality might demand such content is also monitored, for example. What happens if a teenage child in a nation that outlaws non-binary sexual activity asks Siri for help in coming out? And what about discreet ambient listening devices, such as HomePods? It isn’t clear the search-related component of this system is being deployed there, but conceivably it could be.

And it isn't yet clear how Apple will be able to protect against any such mission-creep.

Privacy advocates are extremely alarmed

Most privacy advocates feel there is a significant risk of mission creep inherent in this plan, which does little to maintain confidence in Apple’s commitment to user privacy.

How can any user feel their privacy is protected if the device itself is spying on them, and they have no control over how?

The Electronic Frontier Foundation (EFF) warns this plan effectively creates a security backdoor.

“When Apple develops a technology that’s capable of scanning encrypted content, you can’t just say, 'Well, I wonder what the Chinese government would do with that technology.' It isn’t theoretical,” warned Johns Hopkins professor Matthew Green.

Alternative arguments

There are other arguments. One of the most compelling of these is that servers at ISPs and email providers are already scanned for such content, and that Apple has built a system that minimizes human involvement and only flags a problem in the event it identifies multiple matches between the CSAM database and content on the device.

There is no doubt that children are at risk.

Of the nearly 26,500 runaways reported to NCMEC in 2020, one in six were likely victims of child sex trafficking. The organization’s CyberTipline (which I imagine Apple is connected to in this case) received more than 21.7 million reports related to some form of CSAM in 2020.

John Clark, the president and CEO of NCMEC, said: “With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in CSAM. At the National Center for Missing & Exploited Children we know this crime can only be combated if we are steadfast in our dedication to protecting children. We can only do this because technology partners, like Apple, step up and make their dedication known.”

Others say that by creating a system to protect children against such egregious crimes, Apple is removing an argument some might use to justify device backdoors in a wider sense.

Most of us agree that children should be protected, and by building this system Apple has eroded an argument that some repressive governments might use to force the matter. Now it must stand against any mission creep on the part of such governments.

That last challenge is the biggest problem, given that Apple, when pushed, will always follow the laws of the governments of the nations in which it does business.

“No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this,” warned noted privacy advocate Edward Snowden. If they can scan for CSAM today, “they can scan for anything tomorrow.”

Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
