Apple has decided to postpone the introduction of its child-protection software features following widespread criticism of the plan, The Verge reports.
Apple unveiled the child-protection policies last month. They included a method to combat the spread of child sexual abuse material (CSAM) by using an algorithm to match the content of images on users’ phones against previously known CSAM.
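In broad strokes, that kind of matching boils down to computing a fingerprint of each image and checking it against a database of fingerprints of known material. The sketch below illustrates the idea only; Apple’s actual system uses a perceptual hash (“NeuralHash”) that tolerates resizing and re-encoding, whereas a cryptographic hash is used here purely because it is built into Python. The hash set and function names are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known material.
# (This example value is simply the SHA-256 digest of the bytes b"foo".)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def image_fingerprint(data: bytes) -> str:
    """Compute a fingerprint of the raw image bytes.

    A real system would use a perceptual hash, so that trivially
    altered copies of an image still match; SHA-256 only matches
    byte-identical files.
    """
    return hashlib.sha256(data).hexdigest()

def matches_known_material(data: bytes) -> bool:
    """Return True if the image's fingerprint is in the database."""
    return image_fingerprint(data) in KNOWN_HASHES
```

The privacy debate described in this article turns largely on *where* this comparison runs: on Apple’s servers, on the user’s device, or split between the two.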
The initiative received extensive criticism. Among other things, a number of human rights organisations around the world pointed out that the “child protection” approach, while well-intentioned, could infringe on users’ privacy and create a surveillance tool. The CEO of Epic Games – who might not be an entirely unbiased commentator – also criticised the plan in strong terms.
Apple initially defended its plan, claiming it had been widely misunderstood, and worked hard to explain the intricacies of where, how and by whom the images would be scanned. (Craig Federighi denied that the company had any intention of spying on users’ phones, insisting that scanning would only take place if and when images were uploaded to iCloud Photo Library – but slightly muddied the waters by conceding that an initial phase of scanning would take place on the phone.) Now, however, it has agreed to postpone the launch of the feature while it works to improve the implementation.
The features were slated to launch later this year but will now be delayed. Apple has not announced a new timeframe, saying only that it will “take additional time” to respond to criticism and work on the project.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said in a statement to The Verge.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
But not everyone is satisfied by this concession. The Electronic Frontier Foundation (EFF), while saying it’s pleased by the delay, has urged Apple to abandon the plan completely.
“EFF is pleased Apple is now listening to the concerns of customers, researchers, civil liberties organisations, human rights activists, LGBTQ people, youth representatives, and other groups, about the dangers posed by its phone scanning tools,” the group said in a statement (via Cult of Mac).
“But the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely. The features Apple announced a month ago, intending to help protect children, would create an infrastructure that is all too easy to redirect to greater surveillance and censorship.”
This article originally appeared on
Macworld Sweden. Translation and additional reporting by David Price.