Apple CSAM controversy continues: Charity says company under-reporting


The Apple CSAM controversy doesn’t appear to be going away, as a British children’s charity has accused the company of under-reporting incidents on its platforms.

However, the report appears to be based at least in part on a failure to understand how end-to-end encryption works …

The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) pointed to two apparent discrepancies in Apple’s reporting of suspected child sexual abuse material (CSAM).

The first is the huge gulf between the number of cases reported by Apple and other tech giants, as The Guardian notes.

In 2023, Apple made just 267 reports of suspected CSAM on its platforms worldwide to the National Center for Missing & Exploited Children (NCMEC), which is in stark contrast to its big tech peers, with Google reporting more than 1.47m and Meta reporting more than 30.6m, per NCMEC’s annual report.

The second is that there were actually more CSAM convictions involving Apple services in England and Wales than the company reported worldwide.

Through data gathered via freedom of information requests and shared exclusively with the Guardian, the children’s charity found Apple was implicated in 337 recorded offenses of child abuse images between April 2022 and March 2023 in England and Wales […]

“There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple’s services and the almost negligible number of global reports of abuse content they make to authorities,” said Richard Collard, head of child safety online policy at the NSPCC. “Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the roll out of the Online Safety Act in the UK.”

9to5Mac’s Take

Some of the court cases cited by the NSPCC relate to CSAM being sent via iMessage or FaceTime. Both are end-to-end encrypted services, meaning Apple has no way to see the content of any messages, and thus no way of reporting them. These cases will have been brought after offenders were caught by other means and were then required to provide access to their devices.

However, the other issue here is iCloud. Almost all cloud services routinely scan customer uploads for the digital fingerprints of known CSAM, but Apple does not.
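For context on what that scanning involves: providers typically compute a fingerprint (a hash) of each upload and compare it against a database of fingerprints of already-identified images, without anyone viewing the photos themselves. The Swift sketch below illustrates that matching step only – it is not Apple’s or any provider’s actual implementation, real systems use perceptual hashes such as PhotoDNA or Apple’s NeuralHash rather than a plain cryptographic hash, and the fingerprint list and function names here are hypothetical.

```swift
import CryptoKit
import Foundation

// Hypothetical fingerprint list. In real deployments the entries would be
// perceptual hashes supplied by a clearinghouse such as NCMEC; SHA-256 is
// used here only as a simple stand-in and would not match resized or
// re-encoded copies the way perceptual hashes do.
let knownFingerprints: Set<String> = []

/// Fingerprints an uploaded file and reports whether it matches the list.
func shouldFlagUpload(at fileURL: URL) throws -> Bool {
    let data = try Data(contentsOf: fileURL)
    let digest = SHA256.hash(data: data)
    // Render the digest as a lowercase hex string for comparison.
    let fingerprint = digest.map { String(format: "%02x", $0) }.joined()
    return knownFingerprints.contains(fingerprint)
}
```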

The company cites privacy as the reason, and back in 2021 it announced plans for a privacy-respecting system of on-device scanning. However, the plans leaked ahead of time, and the fallout over the potential for abuse by repressive governments – which could force Apple to search for the signatures of other images, like political protest posters – led the company to first postpone and then abandon them. Apple later ended up making many of the same arguments against scanning that critics had made against its original proposal.

As we noted at the time, an attempt to strike a balance between privacy and public responsibility ended up backfiring badly. If Apple had instead simply carried out the same routine scanning of uploads used by other companies, there would probably have been little to no fuss. But introducing it now would turn the issue back into headline news. The company really is stuck in a no-win situation here.

Photo: Priscilla Du Preez/Unsplash

