My point was, what if you have content in there that is CSAM in some places but isn't in others (for instance, drawings)? If Apple employees report it to the authorities in a state where it isn't illegal, they've just suspended your account and reported you without any real grounds. So, like I said, you get into this trap: do Apple employees start judging whether the match "should" count? What if a picture isn't actually pornographic but made it into the database anyway (say, a child in underwear; maybe it's there because of a connection to an abuse case, but it isn't a picture of abuse per se)? Again, is this random person at Apple going to be making judgement calls about the validity of matches against a government-provided database? Because, again, I don't believe this can ever work. Maybe those are edge cases, sure, but my point is that as soon as you allow some Apple employee to make a judgement, you are introducing new risks.
>>My point was, what if you have content in there that is CSAM in some places but isn't in others (for instance, drawings)? If Apple employees report it to the authorities in a state where it isn't illegal, they've just suspended your account and reported you without any real grounds.
The only CSAM Apple will flag has to come from multiple organizations in different jurisdictions; otherwise, those hashes are ignored.
And since no credible child welfare organization is going to have CSAM that matches stuff from the worst places, there's no simple or obvious way to get those hashes to match across databases.
>>The only CSAM Apple will flag has to come from multiple organizations in different jurisdictions; otherwise, those hashes are ignored.
Have they actually said they would do that? I was under the impression that they just use the database of hashes provided by the American child-abuse-prevention authority (NCMEC).
>>And since no credible child welfare organization is going to have CSAM that matches stuff from the worst places
I'm not sure I understand what you mean; can you expand?
In [1], "That includes a rule to only flag images found in multiple child safety databases with different government affiliations — theoretically stopping one country from adding non-CSAM content to the system."