CUPERTINO (dpa-AFX) - The National Society for the Prevention of Cruelty to Children (NSPCC), a UK charity focused on child protection, has accused Apple of significantly underreporting the amount of child sexual abuse material (CSAM) on its platforms.
According to The Guardian, Apple was linked to 337 documented offenses involving child abuse images in England and Wales between April 2022 and March 2023, yet the company reported only 267 suspected CSAM cases worldwide to the National Center for Missing & Exploited Children (NCMEC) in 2023. These figures stand in stark contrast to those of other major tech companies: Google reported more than 1.47 million cases and Meta more than 30.6 million, according to NCMEC's annual report.
All US-based tech companies are mandated to report potential CSAM cases identified on their platforms to NCMEC.
NSPCC's head of child safety online policy, Richard Collard, expressed concern over the significant gap between the number of child abuse image crimes involving Apple's services in the UK and the comparatively small number of reports the company submits to authorities globally.
Apple did not provide additional comment for this article, instead referring to statements it made in August explaining its decision not to move forward with a program that would have scanned iCloud photos for CSAM. The decision to abandon the program drew criticism from child safety advocates, while digital rights groups had raised concerns that the scanning itself would compromise privacy and security.
Furthermore, Apple's announcement of the upcoming launch of its artificial intelligence system, Apple Intelligence, has raised concerns among child safety experts. Collard said the rush to launch Apple Intelligence is concerning, particularly as AI-generated child abuse material puts children at risk and hampers law enforcement's ability to protect young victims, according to The Guardian.
Copyright(c) 2024 RTTNews.com. All Rights Reserved