Apple revisits plans for new child safety tools after privacy backlash
On Friday, Apple said it would pause the rollout of the tools in order to gather more feedback and make improvements.
The plan centers on a new system that, if eventually launched, would check images on iOS devices and in iCloud Photos for child sexual abuse material. It also includes an opt-in feature that would warn minors and their parents about incoming or sent sexually explicit image attachments in iMessage and blur them.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Only after a certain number of hashes matched the NCMEC photos would Apple's human review team be alerted, so it could decrypt the information, disable the user's account, and alert NCMEC, which in turn could notify law enforcement about the existence of potentially abusive images.
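The flow described above — hash each image, compare against a database of known hashes, and trigger human review only once the number of matches passes a threshold — can be sketched roughly as follows. This is an illustrative outline only: the hash function, threshold value, and database here are hypothetical stand-ins, and Apple's actual proposal relied on a perceptual hashing scheme (NeuralHash) plus cryptographic threshold techniques, not a plain cryptographic hash or a server-visible counter.

```python
import hashlib

MATCH_THRESHOLD = 30  # illustrative value, not Apple's actual threshold


def image_hash(image_bytes: bytes) -> str:
    """Stand-in hash; a real system would use a perceptual hash,
    so visually similar images map to matching values."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(images, known_hashes) -> int:
    """Count how many of a user's images match the known-hash database."""
    return sum(1 for img in images if image_hash(img) in known_hashes)


def should_flag_for_review(images, known_hashes, threshold=MATCH_THRESHOLD) -> bool:
    """Only past the threshold would the human review step be triggered."""
    return count_matches(images, known_hashes) >= threshold
```

In the real design, the match count would not be directly visible to the server; the threshold gating is what the cryptographic machinery enforces, and human review precedes any report to NCMEC.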
Many child safety and security experts have praised the intent of the plan, recognizing a company's ethical responsibility regarding the products and services it creates. But they also said the effort raised potential privacy concerns.
“When people hear that Apple is ‘searching’ for child sexual abuse material (CSAM) on end-user phones, they immediately think of Big Brother and ‘1984’”, Ryan O’Leary, head of privacy research and legal technology at market research firm IDC, told CNN Business last month. “This is a very nuanced problem and one which at first glance can seem quite frightening or intrusive.”
Critics of the plan have applauded Apple’s decision to put the test on hold.
“Apple’s plan to analyze photos and messages on the device is one of the most dangerous proposals of any tech company in modern history,” Evan Greer, director of Fight for the Future, said in a statement. “Technologically, this is equivalent to installing malware on millions of devices — malware that can easily be abused and cause enormous damage.”
Correction: A previous version of this story incorrectly stated the name of the digital rights group Fight for the Future.