
Apple on Thursday provided its full explanation for abandoning, last year, its controversial plan to detect known child sexual abuse material (CSAM) stored in iCloud Photos.
Apple's statement, shared with Wired and reproduced below, comes in response to a demand from child safety group Heat Initiative that the company "detect, report, and remove" CSAM from iCloud and offer more tools for users to report such content to the company.
"Child sexual abuse material is abhorrent and we are committed to breaking the chains of coercion and influence that make children susceptible to it," wrote Erik Neuenschwander, Apple's director of user privacy and child safety, in the company's response to Heat Initiative. However, he added that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, not even one built specifically to preserve privacy.
"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander wrote. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types."
In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child safety resources for Siri. Communication Safety launched in the U.S. in December 2021 with iOS 15.2 and has since expanded to the U.K., Canada, Australia, and New Zealand, and the Siri resources are also available, but CSAM detection never launched.
Apple initially said that CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." The plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.
Apple's latest response to the issue comes at a time when the encryption debate has been reignited by the U.K. government, which is considering plans to amend surveillance legislation in a way that would require tech companies to disable security features like end-to-end encryption without telling the public.
Apple says it would pull services including FaceTime and iMessage from the U.K. if the legislation is passed in its current form.