> It’s truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours.

Well said.

It's very easy to understand that content on a physical disk I own will not be scanned by some government-friendly surveillance program, and content I upload to somebody else's servers will. I am comfortable with [subsets of] my files existing on both sides of this distinction, but only because I understand it.

Now Apple is letting the surveillance apparatus reach into my own physical hardware, blurring the lines with some overwrought, proprietary crypto-gobbledygook solution. Given the details they've provided, I get why they might have thought this was in line with their general privacy ethos of keeping everything on-device, but the fact is that it is too complicated to build a reliable understanding of in my mental model of my own data privacy--and that's for somebody (me) who is quite technical and has even dabbled in crypto.

Now that I no longer understand Apple's practices as a steward of my private data, I can no longer trust them with it. It's a real shame.



I feel like there's a third option here, though, in that Apple is very likely trying to get ahead of what's likely to be requests from the government that they won't be able to just not abide by. This already happens to every hosting and storage provider, and it's entirely plausible that politicians will start publicly wringing their hands to justify an actual invasion of privacy where "the government" will be able to scan devices (and even require it from manufacturers) in the interest of "saving the children". This (and I realize it assumes good intentions on behalf of the actors involved) seems to be an attempt to get ahead of that kind of situation by saying "we'll scan things if we're 99% sure they're breaking the law, but we don't want to violate people's privacy". The iCloud vs Facebook comparison is great and noble and all, but how long do we honestly think that'll stand up before the politicians move past that argument and on to the next one?

When it stops being about the children, it'll be about the terrorists. When it stops being about the terrorists, it'll be about whatever the next excuse is. This seems like a compromise between those unfortunate scenarios - upholding privacy (we don't see the content) while still attempting to solve for the problems that privacy inherently allows for (we can still identify CP).


> Apple is very likely trying to get ahead of what's likely to be requests from the government that they won't be able to just not abide by

No. Don't make excuses for them. If it were true, it would be a shocking revelation that might result in thousands of cases being overturned.

If the NCMEC searches by providers are being done as a result of government coercion then the fourth amendment requires there be a search warrant. Since there isn't one, these searches would all be unlawful if there was coercion.

Thus far, tech companies like Google and AOL have gone out of their way to testify that there is no coercion (I haven't found a case involving Apple).

If there were coercion it would be a shocking conspiracy. I wouldn't argue that it's impossible: the USG has done some shockingly unethical and illegal things in the past, but it fails Occam's razor.

Moreover, the excuse doesn't actually make Apple look better: instead of being a corporation compromising user privacy and security for commercial gain, they'd be an active participant in a conspiracy to violate the US constitution.

Apple is doing this because they believe they'll make more money this way, and because they believe the complaints raised here about people rejecting devices betraying their trust are fundamentally inconsequential.

Show them otherwise, don't make excuses. Don't blame the government. The government may well be to blame too, but Apple's actions are Apple's fault.


>> If there were coercion it would be a shocking conspiracy.

This comment reads as if the 2001-2010 decade never happened. Secret wiretaps. Secret evidence. Overseas torture facilities. Imprisonment without charges. What makes you think that checks and balances would miraculously appear after having disappeared for over a decade?

NOTE: this is a USA-centric viewpoint


> NOTE: this is a USA-centric viewpoint

Those are USA-centric examples.

I'm sure the viewpoint is just as valid across five eyes and even worse in places like Saudi Arabia, Turkey, China et al.

I'm from Australia, allegedly a 1st world democracy - and secret wiretaps, secret evidence (and even secret trials!), overseas torture facilities, imprisonment without charges - are all public policy, covered by the media, and enshrined in laws passed by both sides of the two-party political system.


>Australia

Don't forget raiding journalists' homes and public broadcasters (using illegal warrants!) for reporting on proposed secret spying and hacking laws.

Nice "democracy".


I watched "Secret City" and was aghast at the secret trials, evidence, etc. Is all that really happening today in Australia?


"Covered by the media" is a stretch: Murdoch has been hard at work ignoring all of it.


I would not argue that the conspiracy is impossible, but if it exists then it makes Apple's actions less excusable instead of more excusable.

If the conspiracy exists Apple is an active participant in secretly and illegally undermining the constitutional rights of well over a hundred million Americans. If the conspiracy does not exist, then they're simply lawfully invading people's privacy in order to improve their profits.


Those are examples of shocking conspiracies that got uncovered. Illegal wiretapping and Guantanamo Bay ended up in the news and we're still talking about them now.


But nobody apologized for them or went to prison. There is little reason to believe they don't continue.


Snowden did. (Well, effectively did. He'd totally go to prison if he stepped foot in a country that'd extradite him to the US.)

But you're right, none of the people who broke all the laws Snowden showed us were getting broken has suffered any consequences.


> If the NCMEC searches by providers are being done as a result of government coercion then the fourth amendment requires there be a search warrant. Since there isn't one, these searches would all be unlawful if there was coercion.

Not a US citizen, so I’m not familiar with this argument. As I understand it the government isn’t searching anything themselves so would the requirement for a search warrant apply?

I think what the US government is requiring is that if a service provider comes across CSAM they must report it. Since I don’t see why searching your own service would be anything to do with search warrants either, is it constitutional to make such a requirement for other reasons?

Finally even if Apple could avoid this requirement, which they seem to have largely been doing until now, was it viable for them to continue like that? Even without a legal mandate to actively search for CSAM, knowing there must be millions of unreported CSAM images on their service can’t have been a comfortable position to be in.


If the government requires or otherwise coerces someone to perform the search, then the searching party is acting as an agent of the government and is equally subject to the fourth amendment. Similarly, the government can't create a privileged entity like NCMEC, allow it to search without a warrant, and pretend it's not the government's doing.

What is permitted is that if a private entity, of its own free will and in its own self-interest, searches and finds something, the government can require it to make a report. Just as you say.

But the government cannot require them to search in the first place, not without the fourth amendment's protections against unreasonable searches extending to those searches.

Apple can decide that the commercial value of reducing the risk that they're accused of helping perverts is greater to them than the commercial value of respecting their customers' privacy. Indeed. But let's call that what it is, and not make excuses that they're forced to search: they're not. They are choosing to search.


Take your argument one step further - the government knows this and starts issuing warrants for exactly these situations. Now Apple is in a position where they have to violate their users' privacy and share content in order to comply with the search warrants. The new system allows them to comply with warrants without ever knowing or needing to know what the contents of users' phones are, and only in cases where it's known that illegal content exists. Apple can't divulge any other content from the users' devices because it doesn't have it or know what it is in the first place.


A warrant must name the specific persons/places/things to be searched. An automatic dragnet search can't be authorized by a warrant.

> it’s known that illegal content exists

A point of clarification: The system divulges the encryption keys to content when the content's "perceptual hash" matches one in the database the specific user is being tested against. Apple claims that the hashes are provided by NCMEC "and other parties". It is already known that the NCMEC database has at least some entries for non-illegal content. Who knows what these databases might contain in the future. There is certainly no structural requirement in the system that the content be illegal, or any specific kind of illegal content.
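
To make the mechanics concrete, here is a toy sketch in Python of matching a "perceptual hash" against a blocklist. This is emphatically not Apple's NeuralHash (whose network and parameters aren't public); it uses the classic, much simpler "average hash", and the function names and distance threshold are illustrative assumptions:

    # Toy perceptual-hash matching. NOT Apple's NeuralHash; the classic
    # "average hash" (aHash) stands in to show the general mechanics.

    def average_hash(pixels):
        # pixels: an 8x8 grid of grayscale values (0-255)
        avg = sum(sum(row) for row in pixels) / 64
        bits = 0
        for row in pixels:
            for p in row:
                bits = (bits << 1) | (1 if p > avg else 0)
        return bits  # 64-bit fingerprint; near-duplicate images get nearby bits

    def hamming(a, b):
        return bin(a ^ b).count("1")

    def matches_blocklist(image_hash, blocklist, max_distance=4):
        # Whoever controls `blocklist` controls what gets flagged; nothing
        # in the mechanism cares whether the hashes came from CSAM or cartoons.
        return any(hamming(image_hash, h) <= max_distance for h in blocklist)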

For users in Thailand the Apple database could just as well be full of hashes of cartoons insulting to the monarch (which are illegal there, as I understand it) and no one would be able to tell, at least not before Apple users started being rounded up and summarily executed.


Yes, but the criterion for a warrant can be nothing more than reasonable suspicion. So a warrant could be issued for access to someone's entire device regardless of whether objectionable content exists on it or not. Theoretically, a government could issue a warrant for every single person's phone and get what they want. This prevents anything even close to that from happening because users aren't required to provide the keys and Apple won't have them unless that content matches what's on the list.

On your second point, yes… that’s a very real possibility and one that I’m sure Apple has already considered and has responded to.


> Theoretically, a government could issue a warrant for every single person’s phone and get what they want.

For that you'd need a sham court full of rubber-stamping judges handing out bogus search warrants for every single questionable and illegal request that comes to them. That seems like a stretch of the imagination. What judge would do _that?_

We could call that the FISA Court, right?


No, you need just one: "every phone connected to at least two of these N base stations over this 48-hour window".

Which is quite a modern take on a historically reviled phenomenon, so you should call them what they are: general warrants.


Glad I’m not the only one.


> Not a US citizen, so I’m not familiar with this argument. As I understand it the government isn’t searching anything themselves so would the requirement for a search warrant apply?

There's a difference between cops paying someone to break into a suspect's house to steal evidence without a warrant, and a 3rd party that, while doing business, comes across material that might be criminal evidence.

The former case, in the US, falls under the doctrine of the "fruit of the poisonous tree"[1]. Evidence collected illicitly won't be allowed in court, and any evidence or argument based on the tainted evidence can be thrown out, as well.

I think the line is blurred with the system in the OP, where it seems like it kind of fits the latter case. It's fuzzy to me when parties are working in tandem with law enforcement, even going as far as running law enforcement's own systems voluntarily, or collecting or giving up information without a warrant when simply asked. To me, that seems a lot more like the former case, where the government persuades someone to collect evidence on their behalf without warrants or subpoenas.

[1] https://en.wikipedia.org/wiki/Fruit_of_the_poisonous_tree


> voluntarily

In the US, that is currently where we've drawn the line.

They can cooperate, but it has to be voluntary, and the private party can't be government-by-another-name (e.g. not NCMEC, which has special legislative permission to handle child porn and is overwhelmingly funded by the government), but an actual private party.


Hasn't this already been tested in a US court? I can't remember what it was called, but since you're giving away your privacy to a third party, you've given up your right to privacy, period.



Thank you. Third-party doctrine was what I was thinking of. So I'm guessing putting things on your phone will soon be seen as crossing into third-party territory, if it hasn't already.


It's not hard to imagine that this kind of automated scanning might be used as an argument for that: "Clearly you had no expectation of privacy for information on your phone..."


Bingo. If you choose devices that aren't open, you're choosing to give away your freedoms. I made the jump to LineageOS earlier this year because I could see the signs that Apple was getting too comfortable with government. Now I've got confirmation that my hunch was right.


I’m not making excuses. The government can mandate that Apple monitor content uploaded to its services. All the other major hosting providers do this already. The only difference here is that Apple is trying to maintain encryption while also giving an out. They can’t give more information to the government if they don’t have it.


> The government can mandate that Apple monitor content uploaded to its services.

In the US, if the government were to mandate that Apple search users' content, then Apple would be acting as an agent of the government and the searches would require a warrant per the fourth amendment. This is the unambiguous case law.

If Apple is being forced they should say so explicitly rather than secretly participating in an unlawful conspiracy to violate their customers' fourth amendment rights.

Other providers, such as Google, have been unambiguous in their testimony in court that they are not being coerced; for example, quoting US v. Miller (6th Cir. 2020):

> Companies like Google have business reasons to make these efforts to remove child pornography from their systems. As a Google representative noted, “[i]f our product is associated with being a haven for abusive content and conduct, users will stop using our services.” McGoff Decl., R.33-1, PageID#161.

> Did Google act under compulsion? Even if a private party does not perform a public function, the party’s action might qualify as a government act if the government “has exercised coercive power or has provided such significant encouragement, either overt or covert, that the choice must in law be deemed to be that of the” government. [...] Miller has not shown that Google’s hash-value matching falls on the “compulsion” side of this line. He cites no law that compels or encourages Google to operate its “product abuse detection system” to scan for hash-value matches. Federal law disclaims such a mandate. It says that providers need not “monitor the content of any [customer] communication” or “affirmatively search, screen, or scan” files. 18 U.S.C. § 2258A(f). Nor does Miller identify anything like the government “encouragement” that the Court found sufficient to turn a railroad’s drug and alcohol testing into “government” testing. See Skinner, 489 U.S. at 615. [...] Federal law requires “electronic communication service providers” like Google to notify NCMEC when they become aware of child pornography. 18 U.S.C. § 2258A(a). But this mandate compels providers only to report child pornography that they know of; it does not compel them to search for child pornography of which they are unaware.


I think you’re missing the point. If the only thing stopping the government is a warrant (which the government itself can authorize) then there’s nothing stopping the government from just issuing those warrants. It may take slightly longer to do so, but they will get it, and all they need is reasonable suspicion to make that happen. This system makes it impossible for Apple to provide anything to the government other than material that is already criminal. If the government entity gets a warrant, they can simply get access to the whole device. This, in theory, prevents that situation because everything is encrypted and only things that are confirmed as being CSAM are provided. They literally can’t provide something they don’t have.


To be equivalent, they'd have to get courts to issue warrants for hundreds of millions of US Apple product users, people whom there is no reason to suspect of any wrongdoing. I suspect the courts would take issue with that.

Besides, if they could get the warrants they could get access to all the data in any case (e.g. by getting access to the user's backups to steal the user's credentials and then accessing the iCloud accounts).



Yes, but effectively the end result is the same without violating users' privacy. If none of what's on their device matches the hashes, then nothing will have been done, and Apple would be able to encrypt those backups also, which would invalidate the whole 2nd part of your complaint.


Or they could just not invade the users' privacy, encrypt the backups, and have no access to the users' data.


They're not invading the users' privacy. They have to comply with a lawful government order. This is allowing them to do that without violating users' privacy.


Apple has not been ordered by the government to scan users' images.

Because the scanning is happening without a warrant, if the government compelled Apple to perform the search, the search would be illegal. It is only legal for Apple to perform this search because it is in no way compelled by the government.

The government cannot obtain a blanket warrant against everyone. This kind of dragnet scanning has to be voluntarily performed by a private party for it to be lawful.

By all means, please prove to us that the government has secretly ordered Apple and other companies to scan users' private data: if you do so, it will result in overturning tons of convictions due to the unlawful searches, which were concealed by perjury from the government and tech companies, who have consistently claimed that the scanning is completely voluntary and in their own self-interest.


They could make themselves unable to access any data and still comply with lawful orders. There's no rule that you have to preemptively design systems to make data extraction easier.

This planned system means less violation of privacy in cases where a warrant exists, and more violation of privacy where there is no warrant. That's not exactly an amazing tradeoff.


I don’t follow. This system is exactly what you’re describing. Apple is not able to access any of your data. The most they get access to is the signature of the file (which would already be matched with known CSAM content) and a potential thumbnail of the CSAM content.


I feel like it should be obvious that I'm describing a system where they don't add this code.

And depending on what's in the database, signatures and thumbnails can be really bad.


>Thus far, tech companies like Google and AOL have gone out of their way to testify that there is no coercion (I haven't found a case involving Apple). If there were coercion it would be a shocking conspiracy.

Isn't Google a company with billion dollar government contracts? No coercion required...


If Google is being forced to search their users' private data for fear of losing government contracts then they've perjured themselves in court by claiming otherwise.

If that were the case and Google honestly testified that they performed the searches based on the belief that the government would withhold business from them, it is likely that the warrantless searches would be disallowed. Courts have found that even covert encouragement can apply. What matters is that if it's functionally the government's choice whether the searching does or doesn't happen, then it's subject to the fourth amendment.


>If Google is being forced to search their users' private data for fear of losing government contracts then they've perjured themselves in court by claiming otherwise.

Doesn't need to be fear. Doesn't even need anybody to insinuate something threatening.

The conflict of interest is enough: willingly doing anything you're asked and playing the good boy out of greed, to keep or get more contracts.


This seems to imply "the government" is some single-minded entity. Anyone who has had to deal with "the government" knows that it is rare that the same department can stick to a purchasing strategy.


> Apple is doing this because they believe they'll make more money this way

And possibly, as mentioned in an earlier post on HN, as a defense against Anti-trust litigation.


Yeah - my read of this is Apple wants to increase their e2e adoption while still handling CSAM in a way that gets them an okay from the regulators.

Ben Thompson's suggested approach for them in this article is to just do the unencrypted iCloud model with normal CSAM reporting like FB does and make that the tradeoff.

Apple actually wants things to be more secure by encrypting everything e2e with this CSAM hash approach to still fulfill that obligation (and protect themselves from future regulatory requirements that are more broad).

Reasonable people can disagree on these approaches to policy, but it's not obvious that Apple's move here is strictly worse for end users - if they do make moves to e2e then it's arguably better for user privacy to do it the way they're doing it.


It's been said elsewhere, but it bears repeating: E2EE is useless if the ends are compromised.

Furthermore, since the end is compromised, it's no longer a true end. The true end is the user and a system compromised in this way doesn't offer any guarantees to the user.


I don’t agree with Apple's approach. But it’s worth being clear that the hash scanning they propose doesn’t fundamentally break e2e, except for files which match the hash. Yes, it might be possible to adversarially create collisions, but that is an edge case.

What is actually bad about their plan is the potential for scope creep and wider scanning in the future. Your own handwritten messages and personal photos can be secure even while this destroys many freedoms and lives.


Why are you assuming the end is compromised? What makes you say it is in the first place?


The "end" will happily send information to Apple such that under certain circumstances (that Apple and various organizations get to decide) low-resolution versions of the "end-to-end encrypted" messages can be obtained. That seems "compromised" by any reasonable definition.

Edit: thanks dpkonofa, I made the description vague in the hope that it's more correct.


That's not what happens. Read the white paper. The only thing sent is the hash and signature (which, at worst, essentially amounts to a thumbnail of CSAM). No part of the encrypted content is ever sent to Apple.


So Apple might end up with thumbnails of the "end-to-end encrypted" content I send, and you think that it counts as end-to-end encryption even if (a low-resolution version of) my messages can end up in a third party's hands. Did I get that right?


I'm really curious if the potential for the wrong files to be exfiltrated is going to force corporations and law firms to prohibit the use of iPhones and Macs for work.

Personally I've been dying for the next gen of M1 Macs to come out. I also wonder if this tech has some magical way of getting around a hosts file and Little Snitch. If I can't gap my data from Apple it would raise ethical issues with my storing client secrets on the machine.


Yes. The only way they would get the signature and low-res version is if it’s already been identified as CSAM (with a 1-in-a-trillion chance of a false collision). If they’re not getting your content, it can still be E2E encrypted.


> with a 1-in-a-trillion chance of a false collision

I don't think that's exactly where the "one in a trillion" claim comes from. Rather, it's that a single matching hash isn't enough to trigger the reporting; there needs to be multiple matches, and when there are enough of them to cross an unspecified threshold, then the reporting is triggered. There's theoretically only a one in a trillion chance of that threshold being crossed without having actual CSAM matches.

If I understand the white paper correctly, this even goes a step farther than that; they can't decrypt the signatures of the images corresponding to the matched hashes until the threshold is passed, because those images form a kind of decryption key together.
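
As a back-of-the-envelope illustration of why a threshold (rather than a single match) drives the false-report rate down: if you model innocent collisions as independent events with some per-photo probability p, the chance an account gets flagged is a binomial tail. A minimal sketch in Python; the numbers are made up for illustration, since Apple hasn't published its per-image collision rate or the threshold value:

    # Hypothetical numbers, not Apple's: per-photo false-match rate p,
    # n photos in a library, account flagged only after t or more matches.
    from math import comb

    def flag_probability(n, p, t, terms=50):
        # P(X >= t) for X ~ Binomial(n, p). The first `terms` terms dominate
        # the tail when n*p is small, and avoid overflow for large n.
        return sum(comb(n, k) * p**k * (1 - p)**(n - k)
                   for k in range(t, t + terms))

    print(flag_probability(10_000, 1e-6, 1))   # ~1e-2: a single match is plausible
    print(flag_probability(10_000, 1e-6, 10))  # ~3e-27: threshold crossings are rare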

On a technical level, I'm actually pretty impressed. They absolutely could set up E2E encryption and still implement this system, and it largely assuages my worries about false matches of innocent photos (with the extremely big caveat that a false match has a very high potential of ruining someone's life). As the linked article points out, though, the real privacy concern here comes from having this matching capability on-device at all, because once it's there, limiting the data set to just this one provided by NCMEC becomes a matter of company policy. If an agency of any government demands Apple add their data set, they can no longer say, "we can't do that without drastically compromising the way our devices and services work," because it will be public knowledge that this is in fact how their devices and services work already.


The 1-in-a-trillion claim was debunked, https://www.hackerfactor.com/blog/index.php?/archives/929-On...:

> Facebook is one of the biggest social media services. Back in 2013, they were receiving 350 million pictures per day. However, Facebook hasn't released any more recent numbers, so I can only try to estimate. In 2020, FotoForensics received 931,466 pictures and submitted 523 reports to NCMEC; that's 0.056%. During the same year, Facebook submitted 20,307,216 reports to NCMEC. If we assume that Facebook is reporting at the same rate as me, then that means Facebook received about 36 billion pictures in 2020. At that rate, it would take them about 30 years to receive 1 trillion pictures.
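
The blog's arithmetic is easy to reproduce from the figures quoted above (a quick sanity check, nothing more):

    # Reproducing the blog post's back-of-the-envelope estimate.
    fotoforensics_pics = 931_466        # pictures received in 2020
    fotoforensics_reports = 523         # reports to NCMEC in 2020
    rate = fotoforensics_reports / fotoforensics_pics      # ~0.056%

    facebook_reports = 20_307_216       # Facebook's 2020 NCMEC reports
    est_facebook_pics = facebook_reports / rate            # ~36 billion/year

    years_to_a_trillion = 1e12 / est_facebook_pics         # ~28 years
    print(f"{rate:.3%}  {est_facebook_pics:.1e} pics/yr  "
          f"{years_to_a_trillion:.0f} years to reach 1 trillion")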


//No part of the encrypted content is ever sent to Apple.// Except all the data before it is encrypted, if it matches the data Apple wants to see. And don't anyone dare say this is a backdoor. It just scans for arbitrary secret files a 3rd party is interested in, and sends out info whenever it finds something interesting. That is perfectly fine bc shiny Apple said so.


This is wrong. Apple never gets the contents of the file. Read the white paper.


Yeah, the good cop, bad cop game.

Bad cop: I want access to all your information.

Good cop: Oh, these guys are after you. Let me see if I can intervene for you. I can implement a selective, limited search that makes these bureaucrats happy and at the same time will protect you. I am on your side, I hate them too. This is the best practical solution.


> problems that privacy inherently allows for

I wonder how we ever survived with the level of privacy we had before the advent of the internet. It must've been a living hell. /s

Privacy is not a problem; it is necessary in order for freedom from tyranny to be possible. Even if all the governments of the world mandate that all communication devices must be compromised like this, criminals will still be able to have private communication channels, but lawful citizens won't.


I don't claim to have an ideal solution here, but to your point: There weren't highly-effective global distribution networks for CSAM "before the advent of the internet" either. This capability has created a market for these people to produce/obtain this material, and thus creates financial incentives for people to abuse children in this way more than in pre-internet times. There needs to be some way of combating this that can keep up with this growth curve, and that answer can't simply be to throw up our hands and say "the criminals will always win, so there's no solution worth doing".


We should tackle this from the other end though. That there are children so abused and isolated whilst ignored by society is shameful. We need solutions that treat children as humans rather than a product or contraband.


This makes it sound like there aren’t entities out there who are interested in such solutions. If you have one that fulfills that need, I’m sure Apple and others are open to hearing it.


This is a great response and covers what I think the crux of the issue is. Technology has exacerbated a problem and, rather than simply bow to that, they tried to come up with a solution. Whether that solution addresses the negatives of the problem without introducing other, worse negatives remains to be seen. On paper, it feels like they have addressed the problem in a reasonable way if you trust that they want to make money while absolving themselves of liability. They can’t violate users’ privacy if they don’t have the ability to do so.


//they tried to come up with a solution.// ...they decided to spy on us. There, I fixed it for you. Making me total dictator would be a "solution" too; I mean, something has to be done, nothing else is working or could possibly work. Me for dictator: I promise to be benevolent, who can argue with that? At worst it is a thing reasonable people can disagree about. /s


But I fail to see how iCloud backups are a good place to tackle it.

And I doubt this system, which looks for known files, is going to do very much to impact production.


I never said privacy was the problem, so your argument is irrelevant. What I said is that privacy (true privacy) introduces problems that we may want to solve for. You can simultaneously be an advocate for user privacy while agreeing that CSAM is bad and should be minimized. The question is how do you maintain privacy while still being able to act (especially if mandated by a government) on known illegal actions. In the past, that wasn’t possible. Apple is suggesting that this is possible but with the giant caveat that you have to be willing to put trust in the way the system is developed.


I don't necessarily disagree with the "getting ahead of it" thing, and I do see the benefits of this strategy wrt. keeping data private in the face of escalating govt overreach, but I'm simply not comfortable with the surveillance happening on my own physical device. It means trusting proprietary software to work exactly the way it claims to, which in my own experience is often not the case.


I share that concern and completely understand. My only response to that concern, though, is that what you’re asking for is not possible without submitting the content in question to Apple and I would rather they not have it at all and therefore not even have the ability to go down that route.

It’s like a double airlock. If your content never leaves your device, there’s no way for them to provide it to someone upon request. It’s definitely a complicated situation but I have yet to see someone provide a solution for how to achieve what Apple has while never transferring your data to them in the first place.


> what you’re asking for is not possible without submitting the content in question to Apple

This unfortunately appears to be true, and part of me is impressed by [the idea of] the solution they've come up with. But, like I said in my first comment, it means in practice that I have an opaque and unaccountable surveillance program running on my personal physical hardware, which is undeniably an escalation. Whatever they claim it's doing is both completely unverifiable and probably complicated enough, and subject enough to change, that it's practically impossible to form a complete and persistently correct understanding of the attack surface it exposes—even if it were open-sourced and exhaustively documented, which it isn't and presumably never will be.

I am much more comfortable simply knowing that anything I upload to somebody else's servers is subject to snooping, and anything I keep on my hardware is not. That's how the other cloud storage providers do it, and I may not like it, but it's at least easy to account for when I'm thinking about my own data security and privacy. It's admirable that Apple tried to come up with something better, but (I'd argue) from certain important perspectives it's arguably worse.

Software is a chronic disease. Once you let it in, it never goes away, and changes beyond the scope of "fixes" seem to infallibly make it more intrusive and/or harmful. We are like frogs with many bodies, each being boiled in a different pot by a different chef. I have no interest in yet another unaccountable daemon inhabiting my private person.


While I agree, in principle, I think that cat's been out of the bag for a while. The government will continue to mandate things away from end-to-end encryption (they're already trying to pass bills to make it illegal) and this seems like the most reasonable solution I've seen thus far that still allows for end-to-end encryption while addressing the concerns of governments around the world. If you want what's on your device to stay on your device, I think you have to live in a place that doesn't exist - namely, a world without governments or the internet.


> If you want what's on your device to stay on your device, I think you have to live in a place that doesn't exist - namely, a world without governments or the internet.

This isn't true quite yet. It's pretty easy for me to keep some of my data truly private (e.g. on a secure Linux machine with an encrypted disk, if I'm really paranoid) while still participating in modern society, using the internet, and so on.

Apple has simply removed itself from the set of vendors whose products don't preclude uncompromised local data security.


//It’s like a double airlock. If your content never leaves your device, there’s no way for them to provide it to someone upon request.// Except they just built themselves a way to get at our data on our devices, which very much does transfer our data.


No, they didn’t. Apple never gets any of the content on your device. They get signatures and, at best, a visual representation like a thumbnail.


Whether scanning happens on the iPhone or in iCloud does not counter the problem that the dragnet nature of this makes it easy to weaponise.

Getting CSAM pictures in the iCloud photo library of someone you don't like does not even require physical access sometimes: For example, WhatsApp has a "Save to camera roll" option that's enabled by default: just send a bunch of bad pictures via WhatsApp, those will get synced to iCloud after a little while, and now that person is in big trouble.


>It's very easy to understand that content on a physical disk I own will not be scanned by some government-friendly surveillance program, and content I upload to somebody else's servers will. I am comfortable with [subsets of] my files existing on both sides of this distinction, but only because I understand it.

According to Apple, the only content being scanned are the images you are storing in iCloud. So how is this breaking your rule?


> According to Apple, the only content being scanned are the images you are storing in iCloud

That's the whole point of this debate. TODAY (if you trust that Apple is honest) it's only stuff being stored in iCloud. But there is no technical reason for this to be true - just a "policy" reason. The scanner is already deployed to devices; we already let in the trojan horse. Tomorrow, the rules could change (and we might not know).


This is breaking my "rule" because the detection is being moved device-side as part of a complicated scheme I don't believe it's realistically possible for an outsider to have a complete and persistently correct understanding of. Thus, I do not trust myself to fully understand the attack surface I'm exposing, and that is something I'm not personally comfortable with.

I thought this was fairly clear from the later parts of my comment.


"The USA PATRIOT act will only be used against terrorist threats." 12 years later, Edward Snowden proved otherwise.

The world is a much different place than when I grew up in the 80s. I can't imagine what it would look like in another 40 years. I hope I'm dead by then, because it doesn't seem like it's going to a place I want to live in.



