Apple revisits plans for new child safety tools after privacy backlash

On Friday, the company said it would pause testing of the tool in order to gather more feedback and make improvements.

The plan centers on a new system that, if it ultimately launches, would check iOS devices and iCloud photos for images of child sexual abuse. It also includes an opt-in feature that would notify minors and their parents when sexually explicit image attachments are sent or received in iMessage, and would blur those images.

Apple’s announcement last month that it would begin testing the tool aligns with a recent increased focus on child protection among tech companies – but it was light on specific details and was quickly greeted with indignant tweets, critical headlines and calls for more information.
So on Friday, Apple (AAPL) said it would delay the rollout of the features.

“Last month we announced plans for features to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material,” the company said. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

In a series of press calls explaining the planned tool last month, Apple stressed that consumer privacy would be protected because the tool would turn photos on iPhone and iPad into unreadable hashes, or complex strings of numbers, stored on users’ devices. Those numbers would be compared against a hash database provided by the National Center for Missing and Exploited Children (NCMEC) once the images were uploaded to Apple’s iCloud storage service. (Apple later said other organizations would be involved in addition to NCMEC.)

Only after a certain number of hashes matched NCMEC’s photos would Apple’s review team be alerted; reviewers could then decrypt the information, deactivate the user’s account, and notify NCMEC, which could in turn alert law enforcement to the existence of potentially abusive images.
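Apple’s actual system relies on a perceptual hashing scheme and a privacy-preserving matching protocol whose details go beyond this article. As a rough sketch of the threshold idea described above, the following Swift snippet uses an ordinary SHA-256 digest as a stand-in for the perceptual hash and a plain set lookup in place of the real matching protocol; the function names and the threshold value are hypothetical, not Apple’s.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for a perceptual image hash: a plain SHA-256 digest of
// the image bytes, rendered as a hex string. (A real perceptual hash would be
// robust to resizing and re-encoding; SHA-256 is not.)
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Count matches between a user's uploaded photos and a database of known hashes,
// and only escalate for human review once a match threshold is crossed.
// The threshold value is illustrative only.
func shouldEscalateForReview(uploadedPhotos: [Data],
                             knownHashes: Set<String>,
                             threshold: Int) -> Bool {
    let matchCount = uploadedPhotos
        .map(imageHash)
        .filter { knownHashes.contains($0) }
        .count
    return matchCount >= threshold
}

// Example: no alert is raised until enough individual photos match.
let known: Set<String> = [imageHash(Data([0x01, 0x02]))]
let uploads = [Data([0x01, 0x02]), Data([0x03, 0x04])]
print(shouldEscalateForReview(uploadedPhotos: uploads, knownHashes: known, threshold: 2)) // false
```

The point of the threshold is that a single match, which could be a false positive, never triggers human review on its own; only an accumulation of matches does.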

Many child safety and security experts have praised the intent of the plan, recognizing a company’s ethical responsibilities and obligations regarding the products and services it creates. But they also said the efforts presented potential privacy concerns.

“When people hear that Apple is ‘searching’ for child sexual abuse material (CSAM) on end-user phones, they immediately think of Big Brother and ‘1984,’” Ryan O’Leary, head of privacy research and legal technology at market research firm IDC, told CNN Business last month. “This is a very nuanced problem and one which at first glance can seem quite frightening or intrusive.”

Critics of the plan have applauded Apple’s decision to put the test on hold.

Digital rights group Fight for the Future called the tool a threat to “privacy, security, democracy and freedom” and urged Apple to shelve it permanently.

“Apple’s plan to scan photos and messages on the device is one of the most dangerous proposals of any tech company in modern history,” Fight for the Future director Evan Greer said in a press release. “Technologically, this is equivalent to installing malware on millions of devices, malware that can easily be abused and cause enormous damage.”

Correction: A previous version of this story incorrectly stated the name of the digital rights group Fight for the Future.
