Apple is delaying the launch of a tool intended to scan iOS devices for images of child sexual abuse material (CSAM) after privacy advocates raised concerns about the risk of misuse and surveillance by authoritarian governments.
The technology, called NeuralHash, was first announced in August and was due to roll out with iOS 15. Apple said it was taking time to make improvements, and did not say when the new scan tech would be deployed.
NeuralHash uses machine learning to scan images on Apple phones and tablets before they are uploaded to iCloud. It then compares the resulting identifiers, known as hashes, against a database of hashes of known child sexual abuse imagery provided by the National Center for Missing and Exploited Children (NCMEC).
If matches accumulate past the scanning tool’s threshold, Apple conducts a human review and then blocks the cloud upload, shuts down the account and alerts law enforcement.
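As a rough illustration of the on-device matching flow described above, the sketch below (in Swift) treats hashes as opaque strings and counts matches against a threshold before escalating for review. It is not Apple’s actual NeuralHash or its private set intersection protocol; the type names, the threshold value and the set-based lookup are assumptions made purely for illustration.

    import Foundation

    // Illustrative only: Apple's real pipeline uses NeuralHash (a perceptual hash)
    // and a private set intersection protocol, neither of which is reproduced here.
    // Hashes are treated as opaque strings and the threshold logic is simplified.
    struct SafetyVoucher {
        let imageID: UUID
        let matchedKnownHash: Bool
    }

    final class UploadScreener {
        private let knownHashes: Set<String>  // hypothetical stand-in for the NCMEC-supplied hash database
        private let reportThreshold: Int      // illustrative threshold, not Apple's actual figure
        private var matchCount = 0

        init(knownHashes: Set<String>, reportThreshold: Int) {
            self.knownHashes = knownHashes
            self.reportThreshold = reportThreshold
        }

        // Called before an image is uploaded to cloud storage: record whether its
        // hash matches a known entry.
        func screen(imageHash: String, imageID: UUID) -> SafetyVoucher {
            let matched = knownHashes.contains(imageHash)
            if matched { matchCount += 1 }
            return SafetyVoucher(imageID: imageID, matchedKnownHash: matched)
        }

        // Only once the match count crosses the threshold would the account be
        // escalated for human review.
        var shouldEscalateForReview: Bool {
            matchCount >= reportThreshold
        }
    }

    // Usage: two matches against a threshold of two triggers escalation.
    let screener = UploadScreener(knownHashes: ["hashA", "hashB"], reportThreshold: 2)
    _ = screener.screen(imageHash: "hashA", imageID: UUID())
    _ = screener.screen(imageHash: "hashB", imageID: UUID())
    print(screener.shouldEscalateForReview)  // true

The point of the design, as Apple described it, is that a single match does nothing on its own; only an accumulation of matches past the threshold triggers review.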
The proposal created a fierce backlash among privacy advocates, who warned that the tool could be hijacked by an authoritarian state to detect political imagery or identify oppressed minorities.
Other tech companies, such as Facebook and Google, carry out similar scanning for CSAM, but they do so in the cloud. Critics took issue with the scanning being done on the device itself, arguing that while Apple’s intentions were good, the move undermined the company’s marketing slogan “what happens on your iPhone, stays on your iPhone”.
Apple sought to address these concerns with an FAQ and a series of interviews. However, the CSAM scanning technology continued to prove damaging to Apple’s carefully curated privacy image.
In a statement sent to media outlets on Friday, Apple said: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
The company has also delayed the rollout of separate updates that would see an algorithm scan for sexually explicit images sent via iMessage on a minor’s phone. If enabled, it would have notified parents that their child had viewed sexually explicit imagery.
Critics warn that it provides a tool for parents to carry out surveillance on children, while supporters say it can help shield children from graphic content.
Security researchers also demonstrated that it was possible to trick the hashing algorithm into flagging non-CSAM content, raising the risk of images being sent deliberately to create false positives.
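The weakness the researchers exploited stems from perceptual hashing itself: unlike a cryptographic hash, a perceptual hash deliberately maps visually similar, and sometimes quite different, images to the same value. The toy “average hash” below (written in Swift and unrelated to Apple’s real NeuralHash) shows how two different inputs can collide; the function and pixel values are purely illustrative.

    import Foundation

    // Toy "average hash": each pixel becomes true if it is brighter than the image's
    // mean brightness. This bears no relation to Apple's NeuralHash, but it shows the
    // structural property attackers target: many distinct images share one hash.
    func averageHash(_ pixels: [UInt8]) -> [Bool] {
        let mean = pixels.reduce(0) { $0 + Int($1) } / pixels.count
        return pixels.map { Int($0) > mean }
    }

    // Two clearly different 2x2 "images" (hypothetical pixel values) that hash
    // identically, because only each pixel's relationship to the mean matters.
    let imageA: [UInt8] = [10, 200, 10, 200]
    let imageB: [UInt8] = [90, 250, 90, 250]
    print(averageHash(imageA) == averageHash(imageB))  // true: a collision, i.e. a potential false positive

An attacker who can engineer such collisions against the real database could, in principle, flood a target’s account with innocuous-looking images that nonetheless register as matches.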
While the news was welcomed by privacy campaigners, Andy Burrows, the NSPCC’s head of child safety online policy, said it was an “incredibly disappointing delay”.
He added: “Apple were on track to roll out really significant technological solutions that would undeniably make a big difference in keeping children safe from abuse online and could have set an industry standard.”
The CSAM delay caps off a difficult week for Apple, in which hundreds of employees shared horror stories under the hashtag #AppleToo.
South Korea also forced Apple to allow software developers to use their own payment systems, effectively preventing Apple from collecting commissions on in-app purchases. Apple made a further climbdown in response to an antitrust probe in Japan, allowing “reader apps” worldwide to link to their own external websites and so avoid the so-called “Apple tax”.
Data privacy is becoming an increasingly significant challenge for Silicon Valley juggernauts, as outlined in a recent GlobalData report.