{"id":246314,"date":"2024-07-22T15:39:59","date_gmt":"2024-07-22T15:39:59","guid":{"rendered":"https:\/\/michigandigitalnews.com\/index.php\/2024\/07\/22\/apple-accused-of-underreporting-suspected-csam-on-its-platforms\/"},"modified":"2025-06-25T17:14:15","modified_gmt":"2025-06-25T17:14:15","slug":"apple-accused-of-underreporting-suspected-csam-on-its-platforms","status":"publish","type":"post","link":"https:\/\/michigandigitalnews.com\/index.php\/2024\/07\/22\/apple-accused-of-underreporting-suspected-csam-on-its-platforms\/","title":{"rendered":"Apple accused of underreporting suspected CSAM on its platforms"},"content":{"rendered":"<div>\n<p><a data-i13n=\"cpos:1;pos:1\" href=\"https:\/\/www.engadget.com\/tag\/apple\/\" data-ylk=\"slk:Apple;cpos:1;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">Apple<\/a> has been accused of underreporting the prevalence of child sexual abuse material (<a data-i13n=\"cpos:2;pos:1\" href=\"https:\/\/www.engadget.com\/tag\/csam\/\" data-ylk=\"slk:CSAM;cpos:2;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">CSAM<\/a>) on its platforms. The National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity in the UK, says that Apple reported just 267 worldwide cases of suspected CSAM to the National Center for Missing &amp; Exploited Children (NCMEC) last year.<\/p>\n<p>That pales in comparison to the 1.47 million potential cases that Google reported and the 30.6 million reports from Meta. Other platforms that reported more potential CSAM cases than Apple in 2023 include TikTok (590,376), X (597,087), Snapchat (713,055), Xbox (1,537) and PlayStation\/Sony Interactive Entertainment (3,974). 
Every US-based tech company is required to pass along any possible CSAM cases detected on its platforms to NCMEC, which directs cases to relevant law enforcement agencies worldwide.<\/p>\n<p>The NSPCC also said Apple was implicated in more CSAM cases (337) in England and Wales between April 2022 and March 2023 than it reported worldwide in one year. The charity used freedom of information requests to gather that data from police forces.<\/p>\n<p>As <a data-i13n=\"cpos:3;pos:1\" href=\"https:\/\/www.theguardian.com\/technology\/article\/2024\/jul\/22\/apple-security-child-sexual-images-accusation\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:The Guardian;cpos:3;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \"><em>The Guardian<\/em><\/a>, which first reported on the NSPCC&#8217;s claim, points out, Apple services such as iMessage, FaceTime and iCloud all have end-to-end encryption, which stops the company from viewing the contents of what users share on them. However, WhatsApp has <a data-i13n=\"cpos:4;pos:1\" href=\"https:\/\/www.engadget.com\/whatsapp-end-to-end-encryption-chat-backups-175533085.html\" data-ylk=\"slk:E2EE as well;cpos:4;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">E2EE as well<\/a>, and that service reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.<\/p>\n<p>\u201cThere is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple\u2019s services and the almost negligible number of global reports of abuse content they make to authorities,\u201d Richard Collard, the NSPCC&#8217;s head of child safety online policy, said. 
\u201cApple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the roll out of the <a data-i13n=\"cpos:5;pos:1\" href=\"https:\/\/www.engadget.com\/deepfake-porn-uk-ban-online-safety-bill-171007700.html?guccounter=1&amp;guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&amp;guce_referrer_sig=AQAAAKfP-whSsv2dHVr4lFq3JFxL6V2S5PcHxTPdShvcctl_AcFXv0xHScC0HagIfFXW7qrWcMILSoKwvgJCV1xhlqqAik_CRsxF63GCww0etDk7ZUn8Y9FLI6QGIFxVHyAjMVOw0OXrrLUmxNJGfTfoGvqafPryVSWs_wL_bdE5TqbL\" data-ylk=\"slk:Online Safety Act in the UK;cpos:5;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">Online Safety Act in the UK<\/a>.\u201d<\/p>\n<p>In 2021, Apple <a data-i13n=\"cpos:6;pos:1\" href=\"https:\/\/www.engadget.com\/apple-child-safety-ios-15-193820644.html\" data-ylk=\"slk:announced plans;cpos:6;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">announced plans<\/a> to deploy a system that would scan images before they were uploaded to iCloud and compare them against a database of known CSAM images from NCMEC and other organizations. 
But following <a data-i13n=\"cpos:7;pos:1\" href=\"https:\/\/www.engadget.com\/apple-child-safety-csam-detection-explainer-183029927.html\" data-ylk=\"slk:a backlash;cpos:7;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">a backlash<\/a> from privacy and digital rights advocates, Apple <a data-i13n=\"cpos:8;pos:1\" href=\"https:\/\/www.engadget.com\/apple-child-safety-features-csam-delay-132534290.html\" data-ylk=\"slk:delayed the rollout;cpos:8;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">delayed the rollout<\/a> of its CSAM detection tools before ultimately <a data-i13n=\"cpos:9;pos:1\" href=\"https:\/\/www.engadget.com\/apple-advanced-data-protection-imessage-contact-key-183819737.html\" data-ylk=\"slk:killing the project in 2022;cpos:9;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">killing the project in 2022<\/a>.<\/p>\n<p>Apple declined to comment on the NSPCC&#8217;s accusation, instead pointing <em>The Guardian<\/em> to a statement it made when it shelved the CSAM scanning plan. Apple said it opted for a different strategy that \u201cprioritizes the security and privacy of [its] users.\u201d The company told <a data-i13n=\"cpos:10;pos:1\" href=\"https:\/\/www.wired.com\/story\/apple-photo-scanning-csam-communication-safety-messages\/\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:Wired;cpos:10;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \"><em>Wired<\/em><\/a> in August 2022 that &#8220;children can be protected without companies combing through personal data.&#8221;<\/p>\n<\/div>\n<p><a href=\"https:\/\/www.engadget.com\/apple-accused-of-underreporting-suspected-csam-on-its-platforms-153637726.html?src=rss\">Source link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Apple has been accused of underreporting the prevalence of child sexual abuse material (CSAM) on its platforms. 
The National Society for the Prevention of Cruelty<\/p>\n","protected":false},"author":1,"featured_media":246315,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"footnotes":""},"categories":[159],"tags":[],"_links":{"self":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts\/246314"}],"collection":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/comments?post=246314"}],"version-history":[{"count":0,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts\/246314\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/media\/246315"}],"wp:attachment":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/media?parent=246314"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/categories?post=246314"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/tags?post=246314"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}