
Scanning in iCloud means that Apple can see the content of all your scannable data there. Scanning on-device is compatible with Apple never having access to your data in unencrypted form. If Apple has a legal obligation to ensure that iCloud does not store CSAM/etc., then either you have to scan on device before upload _or_ you have to store iCloud data without E2E encryption. From a privacy perspective, on-device scanning before upload is obviously better.
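To make the trade-off concrete, here's a minimal Swift sketch of the scan-before-upload flow, assuming a fingerprint database shipped to the device. The helpers `loadFingerprintDatabase`, `reportMatch`, and `upload` are hypothetical names, and the exact SHA-256 match is a stand-in: Apple's actual proposal used a perceptual hash (NeuralHash) with private set intersection, not a plain hash lookup.

    import CryptoKit
    import Foundation

    // Hypothetical helpers, named for illustration only.
    func loadFingerprintDatabase() -> Set<Data> { [] }  // fingerprints shipped to the device
    func reportMatch(_ fingerprint: Data) { }           // e.g. attach a voucher for human review
    func upload(_ ciphertext: Data) { }                 // send to iCloud

    let knownFingerprints = loadFingerprintDatabase()

    func scanThenUpload(photo: Data, key: SymmetricKey) throws {
        // On-device scan: hash the photo and check it against the database.
        // (A real system would use a perceptual hash; an exact SHA-256 match
        // is a stand-in here and would miss any re-encoded copy.)
        let fingerprint = Data(SHA256.hash(data: photo))
        if knownFingerprints.contains(fingerprint) {
            reportMatch(fingerprint)
        }

        // Upload: the server only ever receives ciphertext, so the photo
        // library can stay end-to-end encrypted.
        let sealed = try AES.GCM.seal(photo, using: key)
        upload(sealed.combined!)  // combined is non-nil with the default nonce size
    }

Server-side scanning would instead require the photo to reach Apple's servers in plaintext (or under keys Apple holds), which is exactly the access described above.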


> Scanning on-device is compatible with Apple never having access to your data in an unencrypted format.

Only if you exclude the network transmission that follows it, which is the easy part and needs no special code. The privacy concern comes from those two things together. Sure, if you take away one half of a bad thing so that the bad thing is no longer possible, what remains isn't bad anymore. But the concern is the whole process, and on-device scanning is the key not-yet-implemented component.

> If Apple has a legal obligation to ensure that iCloud does not store CSAM/etc. then either you have to scan on device before upload _or_ you have to store iCloud data without E2E encryption.

Apple does not have that legal obligation. If they can't decrypt the content on their servers, then their only response to a government-issued warrant would be to hand over encrypted data.

Also, CSAM is not the real concern. The concern is that this would be used against dissidents in authoritarian countries. On-device scanning takes us a step toward becoming one of those countries ourselves and further empowers the existing ones.


This doesn't necessarily follow. Law enforcement having near-realtime access to everything you ever photographed is a worse situation than them having to know in advance which things you might be storing, so they can add a fingerprint to a database and then wait until your device matches and uploads it.

CSAM is a serious problem, and you're ignoring Apple's moral (or even reputational) obligation to try to address it, however poorly.

We shouldn't be hyperbolic and flatten all levels of badness, or we'll never find a compromise (which is, I suspect, just how you want things).


" If Apple has a legal obligation to ensure that iCloud does not store CSAM/etc"

My understanding is that in the USA, companies like Apple cannot be legally obligated to ensure that iCloud does not store CSAM. Something about the US Constitution, but I can't remember what. Apple is legally obligated to report CSAM if they come across it themselves, though.

This appears to be the case in Europe as well, though it may not always stay that way. The EU appears to be working on legislation that could compel cloud providers to scan for CSAM: https://9to5mac.com/2022/05/11/apples-csam-troubles-may-be-b...

I'd welcome sources on this from others; the last time I dug into it was a year ago.


> Something about the US Constitution, but I can't remember what.

The First Amendment comes into play. The government cannot compel Apple to write software in a certain way, such as "write your encryption so that you hold keys that can access all of your users' data"; that would be compelled speech. So if the government serves Apple with a warrant, Apple can only provide the encrypted data or whatever metadata it has, not decrypted content.



