Apple *silently* killed CSAM detection
- Euri
- Dec 15, 2021
- 2 min read
Updated: Dec 16, 2021
It’s gone. Thanks for reading!

First spotted by MacRumors, Apple has removed all references to CSAM detection from its child safety webpage, which suggests the project may have just been killed :/. MacRumors reached out to Apple, and this article will be updated if the company responds. But why wouldn’t Apple implement this feature anymore, after announcing it?
Last August, Apple announced its CSAM detection system. In short, it would scan photos being uploaded to iCloud against a database of known child sexual abuse material. If a photo matched, the match would be flagged to Apple, and once an account’s matches exceeded a threshold, the flagged content would be reviewed by humans and then reported to NCMEC. In my opinion this feature was great, but…
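To make the threshold idea concrete, here’s a loose sketch in Swift. This is purely illustrative and is not Apple’s actual design: the real system used NeuralHash, private set intersection, and threshold secret sharing so Apple couldn’t see matches below the threshold. All names and the threshold value here are assumptions for the example.

```swift
import Foundation

// Hypothetical sketch of threshold-based matching only. Apple's real pipeline
// (NeuralHash + private set intersection + threshold secret sharing) is far
// more involved and keeps sub-threshold matches invisible to Apple.
struct MatchResult {
    let matchCount: Int
    let flaggedForHumanReview: Bool
}

/// Compare a user's photo hashes against a database of known-bad hashes and
/// only flag the account for human review once matches cross a threshold.
func evaluate(photoHashes: [String],
              knownHashes: Set<String>,
              threshold: Int = 30) -> MatchResult {
    let matches = photoHashes.filter { knownHashes.contains($0) }.count
    // Below the threshold nothing is surfaced; above it, a human reviewer
    // would check the flagged content before any report goes to NCMEC.
    return MatchResult(matchCount: matches,
                       flaggedForHumanReview: matches >= threshold)
}

// Example usage with made-up hash strings.
let userHashes = ["a1b2", "c3d4", "e5f6"]
let database: Set<String> = ["c3d4", "ffff"]
let result = evaluate(photoHashes: userHashes, knownHashes: database, threshold: 2)
print("Matches: \(result.matchCount), review: \(result.flaggedForHumanReview)")
```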
Many people were concerned about it, but not me at first; I saw no problem and figured Apple would perfect it. Then I watched a Front Page Tech episode that messed with my head, soooo I didn’t think twice before tweeting something. A few weeks later, Apple published an FAQ document for those who were concerned, but people either didn’t read it or read it and stayed concerned anyway. A month after the announcement, the feature was delayed so Apple could improve the system based on feedback. Aaand a few months later, Apple seemingly gave up: it didn’t ship in iOS 15.2, and the references are now gone, thanks to people. By people, I mean customers, advocacy groups, researchers, and others.
Do you think Apple only removed it temporarily and this feature will return? Let us know down in the comments!
NEW UPDATE: Apple responded to MacRumors’ report and said the project isn’t dead yet; the references were likely removed while Apple keeps working to perfect the CSAM detection system.