The UK's National Society for the Prevention of Cruelty to Children says Apple is clearly behind many of its peers in tackling child sexual abuse.
In late 2022, Apple abandoned its plans to roll out its iCloud photo-scanning tool. The tool, called neuralMatch, would have scanned images before they were uploaded to iCloud's online photo storage, comparing them against a database of known child abuse imagery via mathematical fingerprints known as hash values.
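To make the hash-matching idea concrete, the sketch below shows the general pattern of comparing a file's fingerprint against a set of known hashes. It is not Apple's implementation: neuralMatch was reported to rely on a proprietary perceptual hash, whereas this illustration substitutes an ordinary SHA-256 digest, and the `KNOWN_HASHES` set and helper functions are hypothetical names introduced here for illustration only.

```python
# Minimal illustrative sketch of hash-based matching against a list of known
# fingerprints. NOT Apple's algorithm: the real system reportedly used a
# perceptual hash; a plain SHA-256 file digest stands in here to show the idea.
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints of known prohibited images.
# In a real deployment this would be supplied by a clearinghouse database.
KNOWN_HASHES: set[str] = set()

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known(path: Path) -> bool:
    """True if the file's fingerprint appears in the known-hash set."""
    return file_hash(path) in KNOWN_HASHES
```

A cryptographic digest like SHA-256 only matches byte-identical files; perceptual hashing, which the reporting attributes to Apple's design, is meant to also match visually similar copies of an image.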
However, the software met with pushback from digital rights groups, who voiced concerns that it would inevitably be used to compromise the privacy and security of all iCloud users. Child safety advocates decried the rollback of the feature.
Sarah Gardner, chief executive officer of Heat Initiative, a Los Angeles non-profit focused on child protection, said: "Apple does not detect CSAM in the majority of its environments at scale, at all. They are clearly underreporting and have not invested in trust and safety teams to be able to handle this."
Apple is failing to effectively monitor its platforms or scan for images and videos of the sexual abuse of children, child safety experts allege, raising concerns about how the company can handle growth in the volume of such material associated with artificial intelligence.
The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accuses Apple of vastly undercounting how often child sexual abuse material (CSAM) appears in its products. In a year, child predators used Apple’s iCloud, iMessage and FaceTime to store and exchange CSAM in a higher number of cases in England and Wales alone than the company reported across all other countries combined, according to police data obtained by the NSPCC.
Through data gathered via freedom of information requests and shared exclusively with the Guardian, the children’s charity found Apple was implicated in 337 recorded offenses of child abuse images between April 2022 and March 2023 in England and Wales. In 2023, Apple made just 267 reports of suspected CSAM on its platforms worldwide to the National Center for Missing & Exploited Children (NCMEC). That is in stark contrast to its big tech peers: Google reported more than 1.47m cases and Meta more than 30.6m, per NCMEC’s annual report.
All US-based tech companies are obligated to report all cases of CSAM they detect on their platforms to NCMEC. The Virginia-headquartered organization acts as a clearinghouse for reports of child abuse from around the world, viewing them and sending them to the relevant law enforcement agencies. iMessage is an encrypted messaging service, meaning Apple is unable to see the contents of users’ messages, but so is Meta’s WhatsApp, which made roughly 1.4m reports of suspected CSAM to NCMEC in 2023.
Richard Collard, head of child safety online policy at the NSPCC, said: "There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple’s services and the almost negligible number of global reports of abuse content they make to authorities. Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the roll out of the Online Safety Act in the UK." For more, read the full report by The Guardian.