09.01.2023
Apple on Thursday provided its fullest explanation yet for abandoning, last year, its controversial plan to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos. Apple's statement, shared with Wired and reproduced below, came in response to child safety group Heat Initiative's demand that the company "detect, report, and remove" CSAM from iCloud and offer more tools for users to report such content to the company.
"Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it," Erik Neuenschwander, Apple's director of user privacy and child safety, wrote in the company's response to Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.In August 2021, Apple announced plans for three new child safety features, including a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the U.S. with iOS 15.2 in December 2021 and has since expanded to the U.K., Canada, Australia, and New Zealand, and the Siri resources are also available, but CSAM detection never ended up launching.
"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander wrote. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types."
Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." The plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.
Apple's latest response to the issue comes at a time when the encryption debate has been reignited by the U.K. government, which is considering plans to amend surveillance legislation that would require tech companies to disable security features like end-to-end encryption without telling the public.
Apple says it will pull services including FaceTime and iMessage in the U.K. if the legislation is passed in its current form.
This article, "Apple Provides Further Clarity on Why It Abandoned Plan to Detect CSAM in iCloud Photos," first appeared on MacRumors.com.