
Apple Faces Criticism for Underreporting CSAM Cases




A UK watchdog has accused Apple of underreporting the prevalence of child sexual abuse material (CSAM) on its platforms. The National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity, revealed that Apple reported only 267 cases of suspected CSAM worldwide to the National Center for Missing & Exploited Children (NCMEC) last year. This figure starkly contrasts with the 1.47 million potential cases reported by Google and 30.6 million by Meta.


Other platforms also reported significantly more potential CSAM cases than Apple in 2023, including TikTok (590,376), X (597,087), Snapchat (713,055), Xbox (1,537), and PlayStation/Sony Interactive Entertainment (3,974). All US-based tech companies are required to report any suspected CSAM they detect to NCMEC, which then refers those cases to law enforcement agencies around the world.


The NSPCC also highlighted that Apple was implicated in 337 CSAM cases in England and Wales between April 2022 and March 2023, a number greater than its total worldwide reports for the year. This data was obtained through freedom of information requests to police forces.


As reported by The Guardian, Apple services such as iMessage, FaceTime, and iCloud are protected by end-to-end encryption, which prevents the company from accessing the content users share. WhatsApp, however, uses similar end-to-end encryption and still reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.


"There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple’s services and the almost negligible number of global reports of abuse content they make to authorities," said Richard Collard, NSPCC's head of child safety online policy. "Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the rollout of the Online Safety Act in the UK."


In 2021, Apple proposed a system that would scan images before they were uploaded to iCloud and compare them against a database of known CSAM. However, following backlash from privacy and digital rights advocates, Apple delayed the rollout and ultimately abandoned the detection tools in 2022.


Apple declined to comment on the NSPCC's allegations, instead referring The Guardian to a previous statement. In that statement, Apple emphasized its commitment to user privacy and security, saying, "Children can be protected without companies combing through personal data."
