Apple’s carefully cultivated reputation for preserving its users’ privacy is at risk after some 90 privacy rights groups signed an open letter urging the tech giant to abandon plans to start scanning children’s messages for nudity and sexual content.
They’ve added their voices to the thousands of privacy advocates who have already condemned Apple’s new initiatives. If implemented, the new features would also see adult users’ content scanned for images portraying sexual abuse of children.
The organisations behind the letter fear these features could end up putting children in harm’s way and warn that autocratic regimes could use them to spy on their citizens.
“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the letter, organised by the US-based non-profit Center for Democracy & Technology (CDT), said.
The news comes as data privacy is increasingly becoming a major challenge for Silicon Valley juggernauts, as outlined in a recent GlobalData report.
“Data privacy regulation is the biggest threat to Big Tech,” the researchers wrote. “At stake is the very future of the ad-funded business model that supports Facebook, Google, and many others.”
While Apple has so far managed to stay above the fray by not selling ads or sharing user data with third parties, the CSAM initiatives could backfire and change that.
“Apple’s sweeping privacy changes represent a thinly-veiled mass surveillance programme under the guise of protecting children, which will be exploited by governments and companies,” Ella Jakubowska, policy advisor at European Digital Rights, told Verdict. “There is no ‘good’ way to undermine encryption. Once the backdoor has been opened, anyone will be able to let themselves in – and that puts all of us at risk.”
Apple and the privacy of children
The Cupertino-headquartered company has a long history of branding itself as a champion of user privacy.
Back in 2014, Apple CEO Tim Cook published an open letter seemingly condemning many of his fellow Silicon Valley executives for leveraging users’ private data to make a profit.
“But at Apple, we believe a great customer experience shouldn’t come at the expense of your privacy,” he wrote.
Since then, the iPhone maker has doubled down on its commitment to keep its users’ privacy safe from the snooping eyes of governments and anyone else.
Over the years, Cupertino has actively refused to unlock phones belonging to terrorism suspects when asked by law enforcement agencies and has been a long-time advocate for end-to-end encryption, which prevents private messages from being read by anyone outside the conversation. It also added ad-blocking support to its Safari web browser in 2015.
More recently, Apple has given iPhone users the ability to prevent apps like Facebook from tracking them across the web. At a recent product launch, nearly every feature announced was centred on privacy, including the ability to create temporary anonymous email addresses in iCloud and to disable tracking pixels in its Mail app.
That’s why Cupertino’s plan to start scanning users’ Apple devices as well as the messages children receive has come as a bit of a surprise to privacy and human rights advocates who believe it runs counter to the iPhone maker’s primary unique selling point.
“It’s so disappointing and upsetting that Apple is doing this, because they have been a staunch ally in defending encryption in the past,” Sharon Bradford Franklin, co-director of CDT’s Security & Surveillance Project, told Reuters.
What is NeuralHash?
Apple’s initiative to protect children is really two initiatives. The first would scan children’s iMessages for nudity or images of a sexual nature. If such content is detected, children under the age of 13 would see a notification on their Apple device warning that opening the message would send an alert to their parent’s phone.
The second initiative, which most of the press has focused on, centres around Apple’s NeuralHash technology. Introduced by Cupertino on 5 August, the technology scans users’ images on iPhones and iPads before they are uploaded to iCloud.
The original plan was for NeuralHash to compare image fingerprints, known as hashes, with a database of hashes of known child sexual abuse imagery provided by the National Center for Missing and Exploited Children (NCMEC).
If the number of matches crosses the CSAM scanning tool’s threshold, Apple would block the cloud upload, shut down the account and, after conducting a human review, alert law enforcement.
Apple made a point of highlighting the system’s privacy safeguards, limiting scans to photos bound for iCloud and requiring around 30 matches before an alert would be generated.
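In rough outline, that matching step works like a lookup with a counter: each photo queued for iCloud is hashed on the device, the hash is checked against the list of known CSAM hashes, and nothing is surfaced for review until an account’s match count passes the threshold. The Python sketch below is an illustration of that idea only; the hash function, database contents and threshold handling are placeholders, and Apple’s actual design wraps the comparison in cryptographic protections so that, according to the company, no one learns about matches below the threshold.

```python
# Illustrative sketch only -- not Apple's implementation. The hash function and
# database are placeholders; the real NeuralHash is a perceptual hash, and the
# comparison happens under cryptographic protections rather than in the clear.
import hashlib
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # Apple's stated number of matches before human review


def image_hash(image_bytes: bytes) -> str:
    """Placeholder hash; a real perceptual hash survives resizing and recompression."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(uploads: Iterable[bytes], known_csam_hashes: Set[str]) -> int:
    """Count how many queued uploads hash to an entry in the known-CSAM list."""
    return sum(1 for img in uploads if image_hash(img) in known_csam_hashes)


def should_escalate(uploads: Iterable[bytes], known_csam_hashes: Set[str]) -> bool:
    """Nothing is flagged for human review until the match count crosses the threshold."""
    return count_matches(uploads, known_csam_hashes) >= MATCH_THRESHOLD
```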
Despite these reassurances, Apple’s plan to, in effect, surveil children and other users on their own devices has backfired. Within days, thousands of organisations, privacy advocates and private citizens vehemently condemned the new system.
The protests forced Cupertino to backtrack somewhat over the weekend, saying that the system would now only hunt for images that have been flagged by clearinghouses in multiple countries.
On Tuesday, researchers posted code for a reconstructed Python version of NeuralHash on GitHub, claiming to have reverse-engineered it from previous versions of iOS. The resulting generic version suggests that while the system copes well with image resizing and compression, it struggles with cropping and rotation.
Another GitHub user later managed to create a collision, meaning two different pictures produced the same hash, casting doubt on the NeuralHash system’s reliability, The Verge reported.
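A collision in this context simply means two visually different images that produce the same hash, so an innocuous photo could be mistaken for an entry in the database. Assuming a hypothetical neural_hash function along the lines of the reconstructed model, checking for a collision amounts to comparing outputs on two distinct inputs:

```python
# Hypothetical check -- `neural_hash` stands in for the reconstructed NeuralHash model.
from typing import Callable


def is_collision(image_a: bytes, image_b: bytes,
                 neural_hash: Callable[[bytes], str]) -> bool:
    """Two different images that hash to the same value constitute a collision."""
    return image_a != image_b and neural_hash(image_a) == neural_hash(image_b)
```

For a perceptual hash, producing the same value for slightly altered copies of the same picture is the intended behaviour; identical values for unrelated pictures are the failure mode the researchers demonstrated.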
Apple has said it will have safeguards in place to prevent collisions.
The open letter
The new open letter organised by the CDT joins the voices arguing against Apple’s plans to check users’ devices for CSAM and to scan children’s messages for sexual content.
The group of co-signatories includes Access Now, American Civil Liberties Union, Big Brother Watch, Center for Democracy & Technology, Digital Rights Watch, Tech For Good, The Sex Workers Project of the Urban Justice Center, The Tor Project, and Simply Secure.
Echoing criticism voiced over the past two weeks, the signatories warned that while Apple’s heart seems to be in the right place, the technology could be exploited by governments to scan not just for CSAM but for other material too.
“Those images may be of human rights abuses, political protests, images companies have tagged as ‘terrorist’ or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them,” the letter said.
“And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance, and persecution on a global basis.”
The letter also warned that the notifications parents get whenever their children open a message of a sexual nature on an Apple device “could threaten the child’s safety and wellbeing.”
“LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk,” the letter warned. “As a result of this change, iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent.”
What will it mean for Apple’s image?
Apple takes pride in its image as a bastion of user privacy. However, the recent data privacy firestorm could seriously hurt these efforts.
“The backlash is already there and is badly damaging Apple’s image of being a data privacy champion,” Laura Petrone, principal analyst at GlobalData, told Verdict. “Its changes appear in stark contrast to what it has previously advocated on data privacy and security.
“In the past, Apple has rejected US government requests to help break into the iPhones of suspected terrorists and denounced the idea of building backdoors into its gadgets that would make them inherently less secure. Now it is accused of doing just that, i.e. providing a backdoor to further surveillance and weaken users’ control over their data.”
Petrone added that Apple’s CSAM initiatives should be seen in a wider context in which many lawmakers and regulators are still struggling to create new rules for technologies like artificial intelligence (AI) and to rein in the power of Big Tech.
“I think the main concern Apple’s move raises is that a private company is trying to set the rules on data privacy and online content but also on the use of AI, i.e. on areas that even regulators across the world struggle to tackle,” she said.
“In the absence of consolidated rules and a set of principles on how to use these technologies, Apple’s initiative risks overshadowing the actual impact that the new changes will bring about.”