https://stopncii.org/how-it-works/ explains that "Your content will not be uploaded, it will remain on your device", and "Participating companies will look for matches to the hash and remove any matches within their system(s) if it violates their intimate image abuse policy."
In principle, both promises can be kept, with humans checking the matches (if any) against their rules. (In practice, I have no idea how it will work out.)
Yup, you got it: the content itself remains only on the device, the hashing is done in-browser, and the only part of the original content that makes it into the system is the hashes. Once a participating platform downloads those hashes and finds matching content, some verification still has to happen: it's on the participating companies themselves to review the content that matched the hash and decide whether it actually violates their policies on NCII.
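To make the "hash locally, upload only the hash" part concrete, here is a minimal browser-side sketch. It is not StopNCII's actual code: the service reportedly uses a perceptual hash (PDQ) so that resized or re-encoded copies still match, whereas this sketch uses SHA-256 via the Web Crypto API just to keep the example self-contained, and the submission endpoint is hypothetical.

```typescript
// Sketch of the client-side pattern: the image bytes stay in the browser,
// and only a digest is ever sent over the network.
// Assumptions: SHA-256 stands in for the real perceptual hash (PDQ),
// and https://example.invalid/api/submit-hash is a made-up endpoint.

async function hashFileLocally(file: File): Promise<string> {
  // Read the file into memory in the browser; the bytes never leave this function.
  const bytes = await file.arrayBuffer();
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  // Convert the digest to a hex string for transport.
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

async function submitHash(file: File): Promise<void> {
  const hash = await hashFileLocally(file);
  // Only the hex digest is uploaded; the original image stays on the device.
  await fetch("https://example.invalid/api/submit-hash", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ hash }),
  });
}
```

One reason the human-review step matters: a perceptual hash is deliberately tolerant of small changes, so a match is a strong signal but not proof, which is why the participating companies still check matched content against their own policies before removal.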