Yes, you can find more information here: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

From the overview:

> Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child-safety organizations. Apple further transforms this database into an unreadable set of hashes, which is securely stored on users’ devices.
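For what it's worth, here is a heavily simplified sketch of what "on-device matching against a database of known hashes" looks like conceptually. This is not Apple's actual implementation: the real system uses a perceptual hash (NeuralHash) and blinded hashes with a private set intersection protocol, and the types and names below are invented for illustration.

    import Foundation
    import CryptoKit

    struct KnownHashDatabase {
        // In the real design this is an unreadable, blinded blob shipped
        // with the OS; a plain Set<Data> stands in here for illustration.
        let knownHashes: Set<Data>

        func contains(_ hash: Data) -> Bool {
            knownHashes.contains(hash)
        }
    }

    func imageHash(for imageData: Data) -> Data {
        // Stand-in for a perceptual hash. SHA-256 only matches exact bytes,
        // unlike NeuralHash, which tolerates resizing and recompression.
        Data(SHA256.hash(data: imageData))
    }

    func matchesKnownCSAM(imageData: Data,
                          database: KnownHashDatabase) -> Bool {
        // The point being debated: this check runs on the device itself,
        // not on Apple's servers.
        database.contains(imageHash(for: imageData))
    }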



Yes, but my understanding was that they do this for iCloud-synced photos.

If iCloud is off, do they still do this? Your quote actually doesn’t contradict that, which is my main hangup. If you turn on iCloud, you forfeit certain expectations.

I’ll read through it carefully now.

EDIT: It was the very first sentence of the intro:

> CSAM Detection enables Apple to accurately identify and report iCloud users who store known Child Sexual Abuse Material (CSAM) in their iCloud Photos accounts.

I don’t get it. It’s their platform. Other image platforms do this matching. Old film shops used to do this matching. Why is this evil?


The sticking point is that the matching happens on-device, not on their servers. Sure, it only happens for photos that will be synced, but it’s still your device doing the matching and snitching.

There’s also the fact that “only scans local photos marked for upload to iCloud” is a technically thin barrier: a switch that could very easily and quietly be flipped in the future.


If you scan in the cloud, photos not synced to the cloud are 100% not going to be scanned - they are JUST NOT THERE. If you scan on device, you are one "if(" condition away from "accidentally" scanning other photos. See the issue now?
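To make that concrete, here's a purely hypothetical sketch (structure and names invented, not Apple's code) of the kind of gate being described:

    import Foundation

    struct Photo {
        let data: Data
        let isMarkedForICloudUpload: Bool
    }

    func matchAgainstKnownHashes(_ photo: Photo) {
        // ...on-device hash matching would happen here...
    }

    func processLibrary(_ photos: [Photo]) {
        for photo in photos {
            // The entire "only photos headed to iCloud" promise lives in
            // this one condition. Remove it, or gate it on a remotely
            // pushed flag, and every local photo gets scanned with no
            // visible difference to the user.
            if photo.isMarkedForICloudUpload {
                matchAgainstKnownHashes(photo)
            }
        }
    }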



