Apple quietly pulls references to its CSAM detection tech after privacy fears
[QUOTE="Carly Page, post: 4343"] Apple has quietly removed all references to [URL='https://techcrunch.com/2021/08/18/apples-csam-detection-tech-is-under-fire-again/']its child sexual abuse scanning feature[/URL] from its website, months after [URL='https://techcrunch.com/2021/08/05/apple-icloud-photos-scanning/']announcing that the new technology[/URL] would be baked into iOS 15 and macOS Monterey.

[URL='https://techcrunch.com/2021/08/05/apple-icloud-photos-scanning/']Back in August[/URL], Apple announced that it would introduce the feature to allow the company to detect and report known child sexual abuse material, known as CSAM, to law enforcement. At the time, Apple claimed — unlike cloud providers that already offered blanket scanning to check for potentially illegal content — it could detect known illegal imagery while preserving user privacy, because the technology could identify known CSAM on a user’s device without the company having to possess the image or know its contents.

Apple faced [URL='https://techcrunch.com/2021/08/18/apples-csam-detection-tech-is-under-fire-again/']a monumental backlash[/URL] in response. Security experts and privacy advocates expressed concern that the system [URL='https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life']could be abused[/URL] by highly resourced actors, like governments, to implicate innocent victims or to manipulate the system, while others ridiculed it as ineffective at identifying images of child sexual abuse. This led to dozens of civil liberties groups calling on Apple to abandon plans to roll out the controversial feature. Despite a publicity blitz that followed in an effort to assuage fears, Apple relented, announcing a [URL='https://techcrunch.com/2021/09/03/apple-csam-detection-delayed/']delay to the rollout of the CSAM scanning feature[/URL].
The company said: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Now it looks like the feature might have been scrapped altogether. [URL='https://www.macrumors.com/2021/12/15/apple-nixes-csam-references-website/']MacRumors[/URL] first noticed that all mentions of CSAM have been quietly scrubbed from [URL='https://www.apple.com/child-safety/']Apple’s Child Safety webpage.[/URL] Up until December 10, this page included a detailed overview of CSAM detection and a promise that the controversial feature would be “coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.” The updated version of the page removes not only the section on CSAM detection but also scrubs all references to the technology, along with a section offering links to documents explaining and assessing the CSAM process.

Now, Apple’s Child Safety page only contains [URL='https://techcrunch.com/2021/11/09/apple-safety-feature-messages/']a reference to Communication safety in Messages[/URL] and expanded guidance for Siri, Spotlight and Safari Search, the former having debuted on iOS 15 earlier this week. A spokesperson for Apple did not immediately respond when reached for comment. [/QUOTE]