
https://stopncii.org/how-it-works/ explains that "Your content will not be uploaded, it will remain on your device", and "Participating companies will look for matches to the hash and remove any matches within their system(s) if it violates their intimate image abuse policy."

In principle, both promises can be kept, with humans checking the matches (if any) against their rules. (In practice, I have no idea how it will work out.)



It's very likely that the image can be reconstructed from perceptual hashes. Perceptual hashes make two promises, too:

* that the original image can't be inferred from the hash, and

* that similar images should get similar (if not the same) hashes

and these two promises are in serious conflict, given what gradient-based inversion methods have achieved over the last 10 years.
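To make the second promise concrete, here's a minimal sketch of the classic "average hash" idea in Python, assuming Pillow is installed; this is an illustration of the general technique, not StopNCII's actual algorithm. Small edits (resizing, recompression, mild crops) flip only a few bits, so near-duplicates land at a small Hamming distance, and that smooth, lossy mapping is exactly the kind of function gradient-based methods can learn to approximately invert.

    from PIL import Image

    def average_hash(path: str, size: int = 8) -> int:
        """Return a 64-bit perceptual hash of the image at `path`."""
        # Downscale to an 8x8 grayscale grid: this discards detail, so
        # resized/recompressed/lightly edited copies collapse to similar grids.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        # One bit per pixel: is it brighter than the grid's mean?
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > avg else 0)
        return bits

    def hamming(a: int, b: int) -> int:
        """Count differing bits; a small distance means 'similar image'."""
        return bin(a ^ b).count("1")

With a hash like this, hamming(average_hash("a.jpg"), average_hash("a_resized.jpg")) will typically be near 0, while two unrelated images land around 32 bits apart (the expectation for random 64-bit hashes).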


Yup, you got it: the content itself remains only on the device, the hashing is done in-browser, and the only part of the original content that makes it into the system is the hash. Once a platform that is part of the program downloads those hashes and matches content against them, some amount of verification still has to be applied. It's on the participating companies themselves to review the content that matched the hash and decide whether it actually violates their policies on NCII.
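For illustration, here's a hedged sketch of that platform-side step in Python; the threshold, names, and review queue are assumptions for the example, not StopNCII's implementation:

    MATCH_THRESHOLD = 10  # max Hamming distance counted as a match (assumed)

    def find_matches(upload_hash: int, registered_hashes: set[int]) -> list[int]:
        """Return every registered hash within the distance threshold."""
        return [h for h in registered_hashes
                if bin(upload_hash ^ h).count("1") <= MATCH_THRESHOLD]

    def handle_upload(upload_hash: int, registered_hashes: set[int]) -> str:
        # A hash match alone is never grounds for removal: a human reviewer
        # checks the flagged content against the platform's NCII policy first.
        if find_matches(upload_hash, registered_hashes):
            return "queue_for_human_review"
        return "allow"

The key design point is the last step: because perceptual hashes produce near-matches rather than exact ones, a human review stage sits between "hash matched" and "content removed."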




