Apple wants to scan user photos to hunt for child abuse (Updated: It's official)

Update: No sooner had we published the rumor than Apple confirmed it with a blog post.
Published on August 5, 2021

[Image: Apple iPhone 12 Pro vs iPhone 12 Pro Max cameras. Credit: Robert Triggs / Android Authority]
TL;DR
  • A new report alleges that Apple plans to subvert iPhone privacy in the name of stopping child abuse.
  • Reportedly, the company plans to scan user photos for evidence of child abuse. If found, the algorithm would push that photo to a human reviewer.
  • The prospect of Apple employees reviewing innocent photos of a user’s children after a false flag is certainly concerning.

Update, August 5, 2021 (04:10 PM ET): Not long after we published the article below, Apple confirmed the existence of its software that hunts for child abuse. In a blog post titled “Expanded protections for children,” the company laid out plans to help curb child sexual abuse material (CSAM).

As part of these plans, Apple will roll out new technology in iOS and iPadOS that “will allow Apple to detect known CSAM images stored in iCloud Photos.” Essentially, all media stored in iCloud Photos will be scanned on-device. If the software flags an image as suspect, it will send it to Apple, which will decrypt and review the image. If the content is indeed illegal, Apple will notify the authorities.

Apple claims there is a “one in one trillion chance per year of incorrectly flagging a given account.”
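Apple’s post describes the flow only at a high level: photos are matched on-device against fingerprints of known CSAM, and an account is escalated to human review only after matches are found. Purely as an illustration of that flow, the sketch below models it with ordinary SHA-256 digests. This is not Apple’s implementation: the real matcher reportedly uses perceptual hashing, so edited or re-encoded copies of a known image still match, which a plain cryptographic digest cannot do. Every name, the empty digest set, and the threshold value here are hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only, NOT Apple's system. It approximates "match photos
// against a database of known images, and escalate to a human reviewer only
// after enough matches" using exact SHA-256 digests of the image bytes.
struct OnDeviceMatcher {
    /// Hex digests of known illegal images distributed to the device (hypothetical).
    let knownImageDigests: Set<String>
    /// Number of matches required before anything is flagged for human review (hypothetical).
    let reviewThreshold: Int

    /// Returns true if the account should be escalated to a human reviewer.
    func shouldEscalate(photoLibrary: [Data]) -> Bool {
        let matchCount = photoLibrary.filter { imageData in
            let digest = SHA256.hash(data: imageData)
            let hex = digest.map { String(format: "%02x", $0) }.joined()
            return knownImageDigests.contains(hex)
        }.count
        return matchCount >= reviewThreshold
    }
}

// Hypothetical usage: no real database or photo library ships with this sketch.
let matcher = OnDeviceMatcher(knownImageDigests: [], reviewThreshold: 30)
let userPhotos: [Data] = []   // would be the user's iCloud Photos media
print(matcher.shouldEscalate(photoLibrary: userPhotos))  // false
```

Requiring a threshold of matches rather than acting on a single hit is presumably part of how a per-account false-flag rate as low as the one Apple quotes could be achieved.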


Original article, August 5, 2021 (03:55 PM ET): Over the past few years, Apple has pushed hard to solidify its reputation as a privacy-focused company. It frequently cites its “walled garden” approach as a boon for privacy and security.

However, a new report from the Financial Times throws that reputation into question. According to the report, Apple plans to roll out a new system that would rifle through user-created photos and videos on Apple products, including the iPhone. The reason Apple would sacrifice iPhone privacy in this way is to hunt for child abusers.


The system is allegedly known as “neuralMatch.” Essentially, it would use software to scan user-created images on Apple products. If the software finds any media that could feature child abuse, including child pornography, it would notify a human employee, who would then assess the photo and decide what action should be taken.

Apple declined to comment on the allegations.

iPhone privacy coming to an end?

Obviously, the exploitation of children is a huge problem and one that any human with a heart knows should be dealt with swiftly and vigorously. However, the idea of someone at Apple viewing innocuous photos of your kids that neuralMatch accidentally flagged as illegal seems like an all-too-real problem waiting to happen.

There’s also the idea that software designed to spot child abuse now could be trained to spot something else later. What if instead of child abuse it was drug use, for example? How far is Apple willing to go to help governments and law enforcement catch criminals?

It’s possible Apple could make this system public in a matter of days. We’ll need to wait and see how the public reacts, if and when it does happen.