A year ago, Apple announced the cancellation of one of its most ambitious iCloud projects: a scanner designed to detect child pornography in the content stored in the company's cloud. The move was entirely unexpected and came without any explanation from the company. A few hours ago, however, Apple published a note explaining that scanning media files within iCloud would have opened a door to malicious behavior, and that it therefore had to stop developing the tool.
Privacy, the reason behind halting the development of CSAM detection
The scanner that searched Apple users' iCloud photos for child pornography was one of the most important projects in the company's new era of service protection. The term actually used was CSAM (Child Sexual Abuse Material). It was nothing less than a tool to detect users who stored this kind of material in Apple's cloud and bring them to the attention of the competent authorities. You can read more about how the scanner worked in this article, where we also reported Apple's cancellation of its development.
Related article:
Apple abandons project to scan iCloud photos for child pornography
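For context on the mechanism being discussed: Apple's announced design relied on a perceptual hash ("NeuralHash") computed on the device and compared against a database of hashes of known CSAM, so only previously identified material could ever produce a match. The sketch below is purely illustrative, not Apple's implementation; it uses SHA-256 as a stand-in for the perceptual hash, and the database contents and reporting threshold are simplified assumptions.

import Foundation
import CryptoKit

// Illustrative sketch only. Apple's announced design used a perceptual hash
// ("NeuralHash") matched on-device against a database of hashes of known CSAM.
// SHA-256 is a stand-in here to show the matching pattern; unlike a perceptual
// hash, it does not survive resizing or re-encoding of an image.

struct KnownHashDatabase {
    private let knownHashes: Set<Data>

    init(hashes: Set<Data>) {
        self.knownHashes = hashes
    }

    // A photo is flagged only if its hash is already in the database,
    // so content that has never been identified before is never matched.
    func matches(_ imageData: Data) -> Bool {
        knownHashes.contains(Data(SHA256.hash(data: imageData)))
    }
}

// Demo with stand-in data: one "known" image and one unrelated image.
let knownImage = Data("bytes-of-a-known-image".utf8)
let otherImage = Data("bytes-of-an-unrelated-image".utf8)

let database = KnownHashDatabase(hashes: [Data(SHA256.hash(data: knownImage))])

print(database.matches(knownImage)) // true  - would count toward a reporting threshold
print(database.matches(otherImage)) // false - never flagged

The point of the database approach, as Apple described it, was that the system never classified new images on its own. The risk Neuenschwander describes below is precisely that the same matching pipeline could later be pointed at a different database, covering other kinds of content.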
In a new statement published in Wired, Apple gives its reasons for shelving the development of this child pornography scanner. The statements come from Erik Neuenschwander, the company's director of user privacy and child safety:
Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit. It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door to bulk surveillance and could create a desire to search other encrypted messaging systems across content types.
Apple remains committed to breaking the chain of child pornography and the trafficking of this kind of material. However, after extensive privacy and security research conducted with teams of digital rights experts and child safety advocates, the company concluded that scanning media files within iCloud would open a scenario never seen before.
Apple may resume this project in the future, but for now its statements are consistent with the decision made a year ago to end it, even though the original idea was a good one.