Apple oddly forewarns child pornographers that their photos on iDevices will be scanned for illegal content starting this fall
Today Apple announced that it is adding a series of new child-safety features to its next big operating system updates for iPhone and iPad. As part of the iOS 15 and iPadOS 15 updates later this year, the tech giant will implement a feature to detect photos stored in iCloud Photos that depict sexually explicit activity involving children.
According to Apple, its method of detecting known child sexual abuse material (CSAM) is "designed with user privacy in mind." The company says it is not directly accessing customers’ photos but instead is using a device-local, hash-based matching system to detect child abuse images. Apple says it can’t actually see user photos or the results of such scans unless there’s a hit.
If a user's photos match entries in the CSAM database, Apple manually reviews each report to confirm the presence of sexually explicit images of children, then disables the user's account and files a report with the National Center for Missing & Exploited Children (NCMEC).
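To make the description above more concrete, below is a minimal, hypothetical Swift sketch of the general idea of device-local hash matching against a database of known hashes. It is not Apple's actual implementation: Apple's published design uses a perceptual "NeuralHash" plus cryptographic techniques such as private set intersection rather than the plain SHA-256 comparison used here, and the names `knownHashes`, `matchThreshold`, and `shouldReport` are invented purely for illustration.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: exact-hash matching stands in for Apple's
// perceptual NeuralHash and private set intersection protocol.
struct HashMatcher {
    /// Hypothetical set of known CSAM hashes shipped to the device.
    let knownHashes: Set<String>
    /// Hypothetical number of matches required before anything is reported.
    let matchThreshold: Int

    /// Hash a photo's raw bytes. A real system would use a perceptual hash
    /// that tolerates resizing and re-encoding; SHA-256 is used here only
    /// to keep the sketch self-contained.
    func hash(of photoData: Data) -> String {
        SHA256.hash(data: photoData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    /// Count how many photos in the library match the known-hash set.
    /// Only past the threshold would a (hypothetical) report step run;
    /// below it, nothing about the photos leaves the device.
    func shouldReport(library: [Data]) -> Bool {
        let matches = library.filter { knownHashes.contains(hash(of: $0)) }.count
        return matches >= matchThreshold
    }
}

// Usage: two fake "photos", a one-entry hash database, threshold of 1.
let photoA = Data("example photo bytes".utf8)
let photoB = Data("another photo".utf8)
let matcher = HashMatcher(
    knownHashes: [SHA256.hash(data: photoA).map { String(format: "%02x", $0) }.joined()],
    matchThreshold: 1
)
print(matcher.shouldReport(library: [photoA, photoB]))  // true
```

The point of the threshold in this sketch is the same as in Apple's description of its system: individual photos and non-matching libraries never trigger human review, which is how the company squares the scanning with its "user privacy in mind" framing.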
The question becomes: why is Apple giving notice to those who store child pornography on iDevices? It comes across as though Apple's position on "privacy" trumps catching criminals who possess such deviant pornography. In reality, Apple's legal team may be attempting to stave off potential lawsuits from those caught with child pornography on their iPhones and iPads without ample notice.
Apple has published a technical summary of its child sexual abuse material (CSAM) detection system, presented in full below:
Apple – CSAM Detection Technical Summary (via Scribd)