
Apple Delays Release of Child Abuse Scanning Tech After Backlash

"We have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
iPhone. Image: Getty Images

Apple is delaying the rollout of features that are designed to combat the spread of child abuse imagery on its products, the company said in a statement on Friday.

The move comes after a fierce backlash from outside researchers, academics, and the information security community, some of whom argued that one of the features, which would scan photos stored on users' iPhones and uploaded to iCloud for violating content, could create privacy and security risks of its own.


"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple said in the statement.

In short, one of Apple's features would compare hashes of users' photos to a list of hashes of known child abuse imagery provided by outside groups such as the National Center for Missing and Exploited Children (NCMEC). Tech companies have deployed similar systems for years, using them to scan files stored in the cloud. Apple's system, by contrast, would have moved some of this scanning onto the device itself.
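As a rough illustration of how hash-list matching works in general (not Apple's actual design, which used a perceptual "NeuralHash" plus cryptographic private set intersection), here is a minimal Python sketch assuming a plain SHA-256 exact match against a hypothetical hash list:

```python
# Minimal sketch of hash-list matching, assuming exact SHA-256 matching.
# Real deployments (e.g. PhotoDNA, Apple's NeuralHash) use perceptual
# hashes that tolerate resizing and re-encoding; this illustration does not.
import hashlib
from pathlib import Path

# Hypothetical hash list; in practice such lists are supplied by groups
# like NCMEC, not assembled by the platform itself.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of_file(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_list(path: Path) -> bool:
    """True if the file's hash appears in the known-content hash list."""
    return sha256_of_file(path) in KNOWN_HASHES
```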

One of the main concerns from the security community was whether this sort of system could later be adapted to other use cases beyond child abuse, such as searching for terrorist or political content. When Motherboard asked during an earlier call with journalists whether Apple would pull out of the Chinese market if the government there demanded the company leverage this system for another purpose, the Apple representative said a decision like that would be above their pay grade.

Another feature Apple announced would scan iMessage attachments sent or received by children's accounts. If the system detected sexually explicit content, it would blur the image and show the child a warning. If the child opened the image anyway, the device would then inform the child's parents.
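A hypothetical sketch of that decision flow might look like the following; the classifier and notification hooks are assumptions, since Apple never published the on-device model or its exact policies:

```python
# Hypothetical decision flow for the iMessage feature as described above.
# The classifier and the action names are illustrative assumptions.

def looks_sexually_explicit(image: bytes) -> bool:
    """Placeholder for Apple's unpublished on-device image classifier."""
    return False  # stub: always benign in this sketch

def handle_child_attachment(image: bytes, child_opens_it: bool) -> list[str]:
    """Return the actions taken for an attachment on a child's account."""
    actions = []
    if looks_sexually_explicit(image):
        actions.append("blur_image")   # image is blurred before display
        actions.append("warn_child")   # child sees a warning prompt
        if child_opens_it:
            actions.append("notify_parents")  # parents informed on view
    return actions
```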

When Apple first announced the features, some researchers pointed to the conflict between Apple's stance as a privacy-centric tech company and the idea of scanning images on users' devices. Matthew Green, who teaches cryptography at Johns Hopkins University, tweeted "The promise could not have been more clear" along with an image of a recent Apple advert that reads "What happens on your iPhone, stays on your iPhone."

Apple has defended the feature as more privacy-protective than server-side scanning, and as guarded by multiple layers of safeguards against abuse, including human review of content once the system detects a user crossing a certain threshold of offending material.
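As a sketch of the threshold idea only: Apple's real design used threshold secret sharing, so the server learns nothing about an account's matches until the threshold is crossed, but a naive per-account counter conveys the gating logic. The threshold value of 30 reflects a figure Apple executives later cited publicly, though it should be treated as illustrative here:

```python
# Naive sketch of threshold gating before human review. Apple's actual
# system used threshold secret sharing rather than a plaintext counter.
from collections import defaultdict

MATCH_THRESHOLD = 30  # illustrative; Apple later cited roughly 30 matches

match_counts: dict[str, int] = defaultdict(int)

def record_match(account_id: str) -> bool:
    """Record one hash match; return True once the account's matches
    reach the threshold and should be escalated to human review."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= MATCH_THRESHOLD
```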

Update: This piece has been updated to include additional context around the planned features and the backlash against them.
