Comment: Apple’s child protection measures get mixed reactions from experts

Ben Lovejoy

Aug. 6th 2021 6:12 am PT

@benlovejoy

The announcement yesterday of Apple’s child protection measures confirmed an earlier report that the company would begin scanning for child abuse photos on iPhones. The news has seen mixed reactions from experts in both cybersecurity and child safety.

Four concerns had already been raised before the details were known, and Apple’s announcement addressed two of them …

CSAM scanning concerns

The original concerns included the fact that the digital fingerprints used to identify child sexual abuse material (CSAM) are deliberately fuzzy, to allow for crops and other image adjustments. That creates a risk of false positives, either by chance (concern one) or through malicious action (concern two).
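As an illustration of what “deliberately fuzzy” means in practice, here’s a minimal sketch using a difference hash (dHash), a simple perceptual hashing technique. It’s a stand-in for illustration only: Apple’s system uses its own NeuralHash algorithm, whose details differ.

```python
from PIL import Image  # Pillow

def dhash(path: str, size: int = 8) -> int:
    """A simple perceptual 'difference hash': shrink to grayscale, then
    record whether each pixel is brighter than its right-hand neighbor.
    Resizing, recompression, and minor edits barely change the bits."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two images "match" if their hashes differ by only a few bits. That
# tolerance lets a fingerprint survive re-encodes and minor edits, and
# it is also what opens the door to false positives.
```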

Apple addressed these by announcing that action would not be triggered by a single matching image. Those who collect such material tend to have multiple images, so Apple said a certain threshold would be required before a report was generated. The company didn’t reveal what the threshold is, but did say that it reduced the chances of a false positive to less than one in a trillion. Personally, that completely satisfies me.
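To make that threshold logic concrete, here’s a back-of-envelope sketch. All numbers are hypothetical (Apple has published neither its threshold nor its per-image false-positive rate); the point is simply that requiring several independent matches drives the combined odds down very fast.

```python
from math import comb

def false_report_probability(n_photos: int, per_image_fp: float, threshold: int) -> float:
    """P(at least `threshold` false matches) across a library of innocent
    photos, modeling each image as an independent Bernoulli trial with
    probability per_image_fp (a binomial tail). Illustrative model only."""
    # Tail terms shrink extremely fast, so a short run of the sum suffices
    # (and avoids overflowing floats with huge binomial coefficients).
    upper = min(threshold + 30, n_photos)
    return sum(
        comb(n_photos, k) * per_image_fp**k * (1 - per_image_fp) ** (n_photos - k)
        for k in range(threshold, upper + 1)
    )

# Hypothetical numbers: a 10,000-photo library, a one-in-a-million chance of
# any single image falsely matching, and a report threshold of 5 images.
print(f"{false_report_probability(10_000, 1e-6, 5):.2e}")  # ~8e-13: under 1 in a trillion
```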

However, two further risks remain.

The Electronic Frontier Foundation (EFF) highlighted the misuse risk, pointing out that there is no way for either Apple or users to audit the digital fingerprints. A government can tell Apple that a fingerprint database contains only CSAM hashes, but there is no way for the company to verify that.

Right now, the process is that Apple will manually review flagged images, and only if the review confirms abusive material will the company pass the details to law enforcement. But again, there is no guarantee that the company will be allowed to continue following this process.

Cryptography academic Matthew Green reiterated this point after his pre-announcement tweets.

The EFF says this is more than a theoretical risk: in Hong Kong, for example, criticism of the Chinese government is classified on the same level as terrorism, and is punishable by life imprisonment.

iMessage scanning concerns

Concerns have also been raised about the AI-based scanning iPhones will conduct on photos in iMessage. This scanning doesn’t rely on digital fingerprints, but instead uses machine learning to try to identify nude photos.

Again, Apple has protections built in. The scanning applies only to suspected nude photos, and only to child accounts that are part of family groups. The child is warned that an incoming message might be inappropriate, and then chooses whether or not to view it. No external report is generated; at most, a parent is notified where appropriate.
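A rough sketch of that decision flow, as described above, might look like the following. Everything here (the names, the 0.9 score threshold, the parent-alert flag) is a hypothetical reconstruction, not Apple’s implementation.

```python
from dataclasses import dataclass

@dataclass
class MessagesAccount:
    is_child_in_family_group: bool  # the feature applies only to these accounts
    parent_alerts_enabled: bool     # configured by the parent

def handle_incoming_photo(account: MessagesAccount, nudity_score: float) -> str:
    """Illustrative gating for the iMessage feature as described in this
    article. The 0.9 threshold, names, and return values are invented;
    Apple has not published its implementation."""
    if not account.is_child_in_family_group:
        return "deliver_normally"   # adult accounts are never affected
    if nudity_score < 0.9:
        return "deliver_normally"   # no suspected nudity detected on-device
    # Suspected nude photo: blur it and warn the child first. The child then
    # chooses whether to view it; declining ends the matter. No report ever
    # leaves the device, and law enforcement is never contacted.
    if account.parent_alerts_enabled:
        return "warn_child_then_notify_parent_only_if_viewed"
    return "warn_child_only"
```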

But again, the slippery slope argument is being raised. These are all controls that apply right now, but the EFF asks: what if a repressive government forces Apple to change the rules?

The organization also argues that false matches are a definite risk here.

Again, that’s not an issue with Apple’s current implementation due to the safeguards included, but creating technology that can scan the contents of private messages has huge potential for future abuse.

The EFF also highlights an issue raised by some child-protection experts: that a parent or legal guardian isn’t always a safe person with whom to share a child’s private messages.

Some of the discussion highlights that tricky tightrope Apple is trying to walk. For example, one protection is that parents are not automatically alerted: The child is warned first, and then given the choice of whether or not to view or send the image. If they choose not to, no alert is generated.

David Thiel was one of many to point out the obvious flaw there.

Apple’s child protection measures can’t please everyone

Everyone supports Apple’s intentions here, and personally I’m entirely satisfied by the threshold protection against false positives. Apple’s other safeguards are also thoughtful, and ought to be effective. The company is to be applauded for trying to address a serious issue in a careful manner.

At the same time, the slippery slope risks are very real. It is extremely common for a government – even a relatively benign one – to indulge in mission-creep. It first introduces a law that nobody could reasonably oppose, then later widens its scope, sometimes salami-style, one slice at a time. This is especially dangerous in authoritarian regimes.

Conversely, you could argue that by making this system public, Apple just tipped its hand. Now anyone with CSAM on their iPhone knows they should switch off iCloud, and abusers know if they want to send nudes to children, they shouldn’t use iMessage. So you could argue that Apple shouldn’t be doing this at all, or you could argue that it should have done it without telling anyone.

The reality is that there’s no perfect solution here, and every stance Apple could take has both benefits and risks.

Yesterday, before the full details were known, the vast majority of you opposed the move. Where do you stand now that the details – and the safeguards – are known? Please again take our poll, and share your thoughts in the comments.

Photo: David Watkis / Unsplash
