Yep. Too many people design systems (both technical and otherwise) under the assumption 'their team' will always be in control, and it'll be their enemies who suffer.
Or that hey, this power will never be abused because the system won't allow it, and those in control will always have our best interests in mind.
Don't collect more data than you have to, limit the power of any systems you implement, and always design things under the assumption your enemies will take control of it in future.
> Don't collect more data than you have to, limit the power of any systems you implement, and always design things under the assumption your enemies will take control of it in future.
This is great advice if you want to design Good systems, but under barely-regulated capitalism, those who design Good systems will be out-competed and put out of business by those who design Profitable systems :(
I remember when Schmidt came out with this line thinking the same thing.
But back then it seemed like a distant and improbable scenario - I objected to his statement more on a point of principle than out of any realistic sense that things were going to get that bad in the near future.
Turns out they did.
(before anyone says it - yeah. Back then there were probably already enough examples of law enforcement overreach - not to mention the decades-long injustice of the "war on drugs". I need to learn to be more cynical)
> But back then it seemed like a distant and improbable scenario
I am trying to understand this perspective. I was there and nothing seemed improbable or remote. The remoteness was merely a function of technical & economic conditions. Historic precedents, domestic and foreign, past or near present, all pointed the same direction, underlining high probabilities.
Pessimism is rarely the correct inclination, with the exception of questions concerning freedom, power, security, and control. It is appropriately rational to question and highlight worst case outcomes in such cases.
This same pattern is happening yet again ('surprise!') with generative AI. Maybe it is necessary to reassure people that 'yes, this technology is very cool' even as the red flags are raised.
It is a very simple thought, backed by unassailable historic evidence: Humans enjoy lording it over other human beings. We should never create systems that permit a tiny tiny subset to realize such base desires. A very simple idea, truly.
Discussions on surveillance and misinformation often involve people advocating for granting more power to the government to prosecute those who they believe violate their value system - until the value system changes and now you become the criminal. As an example, this is why breaking E2EE with backdoors to stop pedophilia, revoking immunity to social platforms for users' speech and the like are bad ideas - some day your values will become abhorrent, and the same tools that you used against others will be used against you.
I have nothing to hide, beyond a bit of casual piracy, at least until laws change significantly, but people I care about might.
My distaste for corporate stalking is, if I'm perfectly honest, at least partly a selfish discomfort at losing what privacy I still have, but it is mostly concern about what some will enforce against others using information like this.
Is the issue that Meta and Google comply with the law (turning the data over to crime investigators), or that the law itself is medieval? I mean, both are concerning in their own way. But the latter seems like a much, much bigger problem.
I think the issue is that we are giving these companies the ability to reveal sensitive information about us.
We should rethink how we share our data and the costs that it has.
I don't think Meta and Google are to blame here. Other than encouraging us to give them our data unprotected (as well as trying to syphon up as much as they can get their hands on in the background).
> I don't think Meta and Google are to blame here. Other than encouraging us to give them our data unprotected (as well as trying to syphon up as much as they can get their hands on in the background).
That bracketed "as well as" is a 'king huge "other than".
Even the smaller "other than" that is stated as such is enough to make the premise that they carry no blame seem pretty silly to me.
Yes. But the bracketed part doesn't apply to this article IIUC because it is talking about chat logs which were "consciously" given. If the article was talking about web history assembled by ad tracking scripts then I can definitely agree that Meta and Google have a large portion of the blame.
This comment is a fantastic example of the meta-battle which the tech industry has been waging. They have worked very hard to change the very questions in the privacy debate. In their terms, collection of the data itself is never under debate; all debates are framed in terms of how they are allowed to use data. In this case, the failure isn't the law or the company's compliance with it. It was the collection and retention of the data in the first place.
That is the problem: we have nothing to hide until someone changes the law. Suddenly those things that were legal yesterday become the crimes of today.
And as their data was harvested, consumers were told: Relinquish your private data to us, it's a fine and normal thing to do, we are trustworthy corporate citizens and privacy is a concern expressed only by those who wear hats of tinfoil.
It's a good question actually. The law was on the books but not being enforced because of Roe. But then the supreme court says Roe doesn't apply and law is fine as-is. Isn't this legally different from an ex-post-facto law?
From the point of view of the company, for whom it is just a pot of data with no context until analysed, maybe.
It will certainly seem retroactive to someone who would have acted differently - so the data wouldn't be available to be handed over - had the current laws been in force at the point where they could still have done something to avoid the data being collected¹ ².
--
[1] "Generated" is too benign a word here IMO, hence using "collected" instead
[2] "inferred" might be a better choice as the data could be incorrect³ but that still seems to imply less agency than the companies have in their very deliberate stalky behaviour
I used "generated" viewed from the user, "collected" would be from the perspective of the company. I wonder whether there is a legal difference in the case of law changes, which date would be taken?
The classic form of retroactive application of laws would be if someone performed an action, the law was backdated so that action becomes illegal and the performer becomes a criminal.
In this case we're talking not about direct action as the action is implied via the data collected. So if the action was performed before the backdating of the law but the data was collected after the backdating, is the performer a criminal?
> That is the problem: we have nothing to hide until someone changes the law.
As it happens the Dutch authorities were pretty good with collecting ethnicity and confessional data in the inter-war period, then the very bad guys came along and we know what followed.
Data collection at scale and especially data centralization has always been a mistake, too bad many of the livelihoods of us here depend on exactly that.
In the US’s case I don’t think it would take an invasion to put people’s lives at risk because of innocuous (at the time) data collection on sensitive societal issues. This article is just proof of that.
The second part is true, the first part not so much. Enforcing the law creates stability, which is a good trait in and of itself (all else about the law in question being equal).
You have some arguments for that I'm not aware of perhaps?
Companies don't get a choice. The "out" some companies are using is to encrypt everything with keys only on the customer's device. They then provide all data they have, as ordered, on request. This, of course, does not include the encryption key. However, as far as I know essentially only Signal does it these days.
WhatsApp famously did this before, and then Facebook killed it over "regulatory concerns". I don't know for sure, but the previous owner of WhatsApp and founder of Signal implied that Facebook got threatened by states into doing that.
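To make that "keys only on the customer's device" idea concrete, here's a minimal sketch of client-side encryption using the Python cryptography library's Fernet recipe (the function and variable names are purely illustrative, not any provider's actual API): the key is generated and kept on the device, so the provider only ever stores an opaque token it can hand over under a warrant but never read.

```python
# Minimal sketch of client-side encryption: the key never leaves the
# user's device, so the provider can only hand over ciphertext.
# Assumes `pip install cryptography`; all names here are illustrative.
from cryptography.fernet import Fernet

# Generated and stored locally (e.g. in the device keystore), never uploaded.
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

def upload_to_provider(message: str) -> bytes:
    """Encrypt on-device; only this opaque token is sent to the server."""
    return cipher.encrypt(message.encode("utf-8"))

def read_back(token: bytes) -> str:
    """Decryption is only possible on the device holding the key."""
    return cipher.decrypt(token).decode("utf-8")

stored_blob = upload_to_provider("private conversation")
print(stored_blob)             # what the provider (and a warrant) can see
print(read_back(stored_blob))  # what only the key holder can recover
```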
But FB/Google/Amazon/... are the tip of the iceberg. The companies really used for "enforcing the laws" (and for using very harsh measures against individuals just to make some government department's job a little bit easier) are banks.
(note the wording here: "the IRS has full authority to". They can do this at will. This process has been used to cause problems for political opponents as well. Nobody seems to care)
Except in this case, they are doing both. To do what you're asking (not turn over data on abortion seekers), they would be breaking the law if they received a valid warrant.
Now... They could comply with the law by not hoovering up as much data as they do, thereby becoming useless to both advertiser and law enforcement alike....
The companies profit from this data, but at this point they no longer have a choice: they have to collect the data or stop offering services in a lot of geographical areas. The point that they can avoid it by not collecting it was valid some 15 years ago, but no longer.
One of the main objections of companies is, by the way, that most governments refuse to pay the sometimes extensive development and infrastructure costs for this, instead just threatening the companies with (often illegal) measures to force their compliance.
Nothing, really. Here is the list of that ethnic group you want to eliminate in conformance with the law you just passed on mandatory abortion and sterilization, oh my great overlord. Congratulations on your successful putsch, by the way.
Both are problems, but naive people expect Google and Facebook to do differently. Google has always been intertwined with the government, and Facebook was absorbed early on. I am not denigrating people by using the word naive. Those who are most naive need protection and knowledge. More tyrannical/medieval rule seems to be trending.
>Facebook and Google comply with the law.
>When presented with a valid warrant they hand over the data requested.
The article seems to imply that the big social media companies should selectively comply with a valid warrant based on what crime the accused has committed.
I think you should either have a problem with the entire procedure or agree that the procedure is valid.
> The article seems to imply that the big social media companies should selectively comply with a valid warrant
They already selectively comply: "According to internal statistics provided by Meta, the company complies with government requests for user data more than 70% of the time".
Purely opinion, but I'm sure a non-trivial number of government requests for user data are invalid, warrantless, or unfulfillable (contain incorrect information).
> As we have said in prior reports, we always scrutinize every government request we receive to make sure it is legally valid, no matter which government makes the request. We comply with government requests for user information only where we have a good-faith belief that the law requires us to do so.
But they seem to apply legal discretion as to which requests to follow, which is mostly expected. When Meta receives a request/warrant they must use their judgement to determine whether it's legal or not.
> only where we have a good-faith belief that the law requires us
Given their track record, it's frightening that we're depending on the "good-faith belief" of Meta/Facebook and Alphabet/Google to make such legal decisions that affect people's lives.
But I suppose the alternative could be even worse, where they comply with any and all government requests for data, regardless of legal validity and requirement.
And the other 30% is unknown so we can’t say they selectively comply. It’s quite possible the other 30% are invalid warrants, in which case there’s nothing to comply with.
Social media companies can't decide that a warrant is "invalid". Only a court can decide that.
From the article:
> Goldman indicated examples where internet services affirmatively go to court to protect user interest, "but those are the exceptions." "There's thousands of requests for every one of those cases"
> Social media companies can't decide that a warrant is "invalid". Only a court can decide that.
They can, however, decide “this is worth pushing back against” vs “this is not worth pushing back against” - that 30% represents the number of times that Meta’s lawyers believed it was worth pushing back and they were proved correct
I’m sure you’re aware, but there’s pretty obviously a huge difference between the police requesting info because they want to make an arrest and any other government branch asking Meta for data on anything else. My guess is Meta’s cooperation rate with police would be much higher than 70%.
For me, the takeaway is that they collect and persist too much personalized data about users, and it's a shame that people only start caring when it affects abortion access.
Now, Google, to their credit, claims[1] they now purge information about users who visit abortion clinics or related places, but ... that isn't very reassuring. Even if they excise some related portion of user data, they still have enough other data to figure it out once law enforcement has access -- and there's more stuff the law would be after than just abortion! You'd be expecting Google to play whack-a-mole with every latest "activity that needs protection"!
But yes, you're correct, it's far too late to identify what Google's doing wrong after abortion is illegal, and after Google has that data about you, and after they're served with a warrant on that basis.
User chat logs and search history released by social media companies to police can be used to prosecute people for abortion, even when they are being investigated for other reasons.
> User chat logs and search history released by social media companies to police can be used to prosecute people for abortion, even when they are being investigated for other reasons.
Firstly, can you even prosecute a woman for abortion? Aren't they legal?
Secondly, if the abortion is illegal, it's not unusual for the state to prosecute someone when the investigation of that person reveals other crimes.
What would you propose instead? That any evidence of secondary criminal activity uncovered during an investigation of the primary activity be ignored?
You should do the bare minimum to familiarize yourself before wading in to this particular topic.
Abortion legality varies by state in the US. This is a recent development.
I wouldn’t expect to be prosecuted, or even investigated, for typing “how people get away with bank robbery” into Google. Or for watching The Dark Knight.
> Firstly, can you even prosecute a woman for abortion? Aren't they legal?
In case you've been living under a rock for the last while: several American states have banned abortions[0][1] after the Supreme Court overturned Roe vs Wade. The federal government failed to implement any laws to safeguard access to abortions so overturning Roe vs Wade was all that conservatives needed.
Some states have exceptions for rape and incest, some don't. Texas even offers a sizeable bounty for reporting abortions. Out of fear of prosecution, this has already resulted in medical care being refused to women carrying stillborn children or suffering other pregnancy complications fatal to either the mother or the child.
As for secondary criminal activity: I agree, if the police finds other illegal acts during a legal investigation, they should be allowed to act on that. This is the proof that the whole "if you've got nothing to hide" narrative surrounding state surveillance is dangerous.
> I agree, if the police finds other illegal acts during a legal investigation, they should be allowed to act on that.
I'm not sure this principle is a good one. Almost everyone has probably broken a law or two (yes, this means there are too many bad laws, let me know if you've an idea of how to solve that), so by investigating someone for some random thing they haven't done you've got a good chance of finding something. This de facto gives imprisonment powers to the police and prosecutor's office, giving plenty of opportunity for corruption.
> I'm not sure this principle is a good one. Almost everyone has probably broken a law or two (yes, this means there are too many bad laws, let me know if you've an idea of how to solve that), so by investigating someone for some random thing they haven't done you've got a good chance of finding something. This de facto gives imprisonment powers to the police and prosecutor's office, giving plenty of opportunity for corruption.
Well, that's not going to work out well - LEOs investigating a shoplifter should just ignore the corpse lying in the backyard?
It's just not going to fly - crimes are crimes, and if you want a crime to be not-a-crime then follow the legal process in your jurisdiction to make it so.
>yes, this means there are too many bad laws, let me know if you've an idea of how to solve that)
I, in fact, do. All laws come with baked-in sunset dates. No exceptions. Furthermore, it's clear there is a need for some sort of secondary legislature or sub-committee of the primary dedicated to the repeal of bad law. Then again, if that worked, we wouldn't necessarily be in the mess we're in.
> I, in fact, do. All laws come with baked-in sunset dates. No exceptions.
The problem with this, I imagine, is that when the sunset date comes around and a new political party is in control, they will let the law lapse, and laws will be ping-ponging back and forth, assuming they even get the votes to go back into effect.
This would be terrible for very important laws like the Civil Rights Act.
What’s the sunset date on these abortion bans? Is it more than 9 months? Is there any reason to believe they will be repealed at their sunset date? How long do we tolerate the injustice? What do you tell the woman who wants an abortion today?
Bad laws aren’t a mechanical problem. They’re a people problem. Repealing law is something the current legislatures are perfectly capable of. The hard part of repealing “bad law” is defining “bad”. A secondary repeal committee would have the same difficulty as our existing legislature.
If you don’t like a law then go do something to change it. In case you doubt the feasibility of this recall that is exactly what happened with abortion.
What sounds like a trip into medieval times? Government control over corporations? Corporation oversight over private lives? Isn’t that what the new generation of enlightened and progressive people wanted?
But wait, isn't that what was happening in medieval times: the church ruled over the peasants, the kings ruled over the states and everyone was happy living in a liberal and free world - well at least they believed that.
That was the theory; in practice, the very limited communication (hard to handle fast-evolving situations when couriers from Rome to Amiens take a few weeks at best) & enforcement means (you have 100 guys with spears? We are 200 with pitchforks, come get us) led to a much more decentralized power structure than what is stereotypically understood.
Technically it was illegal the whole time in many places and the supreme court only last year recognized that they never had the power to prevent states from enforcing their own laws on the matter.
In states where it was illegal it may not even be ex-post-facto for them to prosecute for events that took place prior to last year's supreme court ruling.
Anyway, the real takeaway should be that businesses should not be collecting this kind of data in the first place. If they don't collect it, then they have nothing to turn over.
> Technically it was illegal the whole time in many places and the supreme court only last year recognized that they never had the power to prevent states from enforcing their own laws on the matter.
I don't agree with this "technical" interpretation at all. Nobody would have said in 2020 that technically abortion is illegal in many places. Just because the Supreme Court changed its mind doesn't wipe everyone else's minds.
It doesn't matter what you agree with. Plenty of legal scholars said exactly that, so you're wrong.
Roe ruled that certain state laws criminalizing abortion could not be enforced. But Roe's ruling was found to be unconstitutional and invalid. It was invalid the day Roe was ruled, not the day it was overruled.
Those same states laws criminalizing abortion, which were on the books before Roe and are still on the books, were always legal and enforceable because Roe never was. That is what the court determined last year.
> Plenty of legal scholars said exactly that, so you're wrong.
LOL. What exactly, numerically, does "Plenty" mean, and how does it compare to "all"? Of course, both the majority and minority of the Supreme Court in the Dobbs opinion are legal scholars, but they disagreed vehemently with each other.
Your response sounds very Orwellian to me. Oceania had always been at war with Eastasia.
Slavery was once legal everywhere until it wasn't. It became illegal in some parts of the country but eventually it became illegal everywhere. Now we can look back and think how obvious it is that slavery is wrong.
Murdering the unborn was once legal everywhere until it wasn't. It became illegal in some parts of the country and will eventually become illegal everywhere. One day we will look back and think how obvious it is that murdering the unborn is wrong.
You're missing the point. The transition from legal slavery to illegal slavery was not a smooth one. Consensus did not magically arise. People did not simply relent to legal authorities. There was a massively bloody Civil War fought over it. Regardless of what you think about how "One day we will look back" (and note how Confederate pride is still a thing today), you can't expect this to be like an ordinary legal issue. The transition is extremely divisive, and there will be resistance. Nobody could be neutral during the Civil War, and it's going to be difficult for corporations to remain neutral now, when they want pro-choice people to use their services. Abortion is legal in the headquarter state of the big tech companies, and those companies employ many pro-choice workers.
E2E encryption doesn't mean they don't know who is messaging whom. If there is suspicious contact, can they request who contacted a pharmacy? Then they can ask the pharmacy why so-and-so contacted them.
A better option would be something like Briar which hides IPs via Tor.
Is it really? I always assumed it was like Zoom end to end encryption where one of the “ends” is a Facebook data center. How can a user prove the claim of end to end encryption?
Almost all of WhatsApp has been E2EE for years, based on the same protocol Signal uses. This goes for text messages (personal and groups) and calls. Cloud backups are not encrypted by default, but encryption can be enabled.
WhatsApp doesn't have an open source client so verification is difficult. However, if someone were able to break the encryption, I'm sure it'd be in the headlines of most newspapers.
One exception is WhatsApp business: I don't know the details, but Facebook offers a service where they will do some chat automation for your business which means they must receive the keys.
In terms of security: key changes are automatically accepted. By default they are hidden, but by toggling a setting, a message will be inserted into the chat every time a user's keys change. QR code key validation has been in the app for years now, though I doubt many users are using the feature.
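For anyone curious what that QR validation actually compares: conceptually, both phones derive a short code from the pair of identity keys and the users check that the codes match over a channel the server doesn't control. A rough, standard-library-only sketch of the idea (the keys below are random placeholder bytes, and this derivation is not the real Signal/WhatsApp safety-number algorithm):

```python
# Conceptual sketch of out-of-band key verification (not the actual
# Signal/WhatsApp safety-number algorithm): both users derive a short
# code from the pair of identity keys and compare it in person / via QR.
import hashlib
import secrets

# Placeholder identity public keys; in a real app these come from the client.
alice_identity_key = secrets.token_bytes(32)
bob_identity_key = secrets.token_bytes(32)

def safety_code(key_a: bytes, key_b: bytes, digits: int = 12) -> str:
    """Derive a short human-comparable code; sorting makes it symmetric."""
    material = b"".join(sorted([key_a, key_b]))
    digest = hashlib.sha256(material).hexdigest()
    return str(int(digest, 16))[:digits]

# Both phones compute the same code; a mismatch suggests a key swap (MITM).
print(safety_code(alice_identity_key, bob_identity_key))
print(safety_code(bob_identity_key, alice_identity_key))  # identical
```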
How do you tell the difference between true E2EE and Zoom E2EE where FB decrypts the message in the middle? Or otherwise backdoors the exchange, perhaps outside the Signal protocol? Ultimately you are trusting Facebook to tell the truth here.
There was a bit of a song and dance when Whatsapp adopted the Signal protocol. Certainly if you choose not to back up your Whatsapp messages, your old messages aren't available when you switch phones.
If they're not end-to-end encrypted, they're engaging in a lot of deception to indicate that they are.
Thanks, I don’t have much experience with WhatsApp. I don’t have a lot of faith in Facebook. Especially post-Snowden.
If you think you need E2EE you can really only achieve that on an open system you control and have intimate knowledge of. You can’t trust precompiled binaries.
Something something trusting trust.
This isn’t a problem technology can solve. Women shouldn’t need to be information security experts just to ask questions about their own bodies.
Except you provide the key to the app and the app is controlled by FB. There’s really no way to prove the key stays on your device. Or that your messages aren’t just forwarded without encryption to a FB datacenter.
IMO the most realistic way to improve this situation is to get more people using E2EE chat apps, and fight against any attempt by government to weaken or ban encryption.
Meta the company hates this and is spending money on ads trying to get people to use WhatsApp with end to end encryption for things like this.
Even with E2EE, doesn't Meta know who is messaging whom on WhatsApp? This doesn't seem completely secure. Briar is distributed and uses Tor to hide user IPs.
Although it probably sucks for the individuals at the receiving end of the scales of justice, I believe this is an important public education campaign which will hopefully raise awareness of the scale and extent of data collection.
There seems to be a disconnect between data being collected from a terminal, how rich that data can be, and what it can be used for. If you use a digital keyboard, your every keystroke can be, and probably is being, logged - we used to call this spyware; now even the keyboard app on your phone has clipboard sync (and it's built into Windows too!).
People need to be aware, for example when activating JavaScript (and most don't know what that is), of how much the various APIs are collecting and storing, which is used to build a "fingerprint" of your device.
The web (and digital devices in general) is actively hostile, anyone who uses noscript can attest to that and anyone who goes onto any news media website and opens "network" via web dev console can see how much data is flying to god knows where to do god knows what.
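To make the fingerprinting point concrete, here's a conceptual sketch (in Python rather than the browser JavaScript that actually does this, and with made-up attribute values): a handful of attributes readable through ordinary APIs, none identifying on its own, hash into a fairly stable identifier that survives clearing cookies.

```python
# Conceptual sketch of device fingerprinting: individually innocuous
# attributes, combined and hashed, become a fairly stable identifier.
# (Real fingerprinting runs as JavaScript in the browser; this just
# illustrates the principle, and the attribute values are made up.)
import hashlib
import json

observed_attributes = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
    "screen": "2560x1440x24",
    "timezone": "America/Chicago",
    "language": "en-US",
    "installed_fonts_hash": "c0ffee42",
    "canvas_render_hash": "deadbeef",
}

def fingerprint(attrs: dict) -> str:
    """Stable hash over sorted attributes; survives cookie clearing."""
    canonical = json.dumps(attrs, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()[:16]

print(fingerprint(observed_attributes))
```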
If you want to defy the state you need to be a master spy; this means actually thinking about how you research, and probably learning a thing or two from Snowden (are you on wifi? open hotspot? is your device logging anything, if so, what? do you need to destroy the device afterwards in case it gets forensically inspected? If you could read /var/log what would be there and would it reveal anything about your situation? If you can't read /var/log then your device is actively hostile and not worth the risk).
Privacy is effort, and, in these cases, the 'seekers' are not spending enough effort on hiding their tracks and they are like fish in shallow water, easy pickings.
The ones who are clued up are not the ones who end up making the press, because they know exactly how to cover their tracks, and make it impossible to prove or disprove a fact, which, thankfully at least at this moment, is how justice works (for the most part).
Tragically enough, Facebook Messenger does actually support E2EE. However, it needs to be enabled per contact/chat. Facebook's bad UX may just have become a driving force behind the conviction.
"An investigation by ProPublica found online pharmacies that sell abortion medication such as mifepristone and misoprostol are sharing sensitive data, including users' web addresses, relative location, and search data, with Google and other third-party sites — which allows the data to be recoverable through law enforcement requests."
"we also know that social media isn't likely to stand up to illegitimate law enforcement requests, because of the fact that they fear their own liability, or because of the fact that it's just too costly to stand up."
Interesting to realize that I now have to equally self-censor whether I'm using WeChat (China) versus any US-based service (Western world) for communication.
Unless you're working on sensitive or compartmentalized tech, and talking about it on the platform, I don't even see why you'd need to self-censor on WeChat. The Chinese, for all their faults, don't seem to care what you do in your spare time.
That said, I don't know if I'd feel comfortable discussing my medical history over Gmail or Facebook, even if ostensibly "private."
Today I learned about pyrography and a particularly dangerous technique called "fractal burning", where people use high-voltage electricity to burn patterns into wood.
The result is quite beautiful, but I can see why some countries might want to regulate or prohibit it. Personally, I think people should be allowed to take such risks if they choose to do so.
> The Chinese, for all their faults, don't seem to care what you do in your spare time.
No, they'll just dutifully record all activities in a dossier, until something interesting presents itself to be leveraged against you, or until they decide at a later date that what you do is distasteful.
Abortions are now a crime in several American states, close if not equal to murder. These companies just do what they've always done when it comes to murder cases: hand over the data when the authorities demand it through warrants. As they should, in theory.
Facebook and Google are not to blame here; they're simply doing what the law demands of them. These companies are not above the law, they cannot refuse warrants. Blame Nebraskan and American federal law for this situation.
With a bit of luck, this situation will make these companies put more priority on E2EE. Had these conversations happened over Signal, there would likely have been nothing to find or hand over.
>One legal expert said social platforms may cooperate with police even if not legally required to.
From my reading of the article this is purely speculative, however. There's no actual assertion that FB/Google are doing more than complying with valid warrants, other than observing that this appears to be the case with other types of warrants. So I guess one could fault these companies for not fighting tooth and nail over these warrants in a way they wouldn't for other warrants, but that seems like a weak condemnation.
> From my reading of the article this is purely speculative, however.
Apple publishes statistics about their shared data, and whether data shared was shared because of a warrant or request. The company very often just hands over data without a warrant, a simple request is all that's needed. I doubt they are unique in that regard.
Yes, but in the one case specifically mentioned in the article, Facebook said the warrant was valid. Hence further commentary about how companies are not simply doing as required in these specific cases is speculative. Seeing further discussion on this, it seems unlikely that the warrants in question made any specific reference to abortion to begin with. So there isn't even a notion in many instances that these social media companies could provide extra scrutiny unless they made this determination on their own.
Maybe social media companies should fight tooth and nail over every data request, but somehow I think most people don't want this. The same people who would be outraged at Facebook turning over data in an abortion case are probably the ones who are fine with say Facebook turning over data related to the January 6th protestors. Is there actually a non viewpoint-based principled stance behind the outrage, or is this just an instance of working the ref to your team's advantage?
> Abortions are now a crime in several American states, close if not equal to murder. These companies just do what they've always done when it comes to murder cases: hand over the data when the authorities demand it through warrants. As they should, in theory.
Companies just hand over data when simply asked, they don't even need a warrant.
And as far as I know it isn't. At least these companies certainly comply with lawful warrants. They may be more helpful in cases of their choosing (e.g. CSAM) but are they really expected to refuse when a court orders them to assist with an investigation on the basis that they (to the extent possible for a company) disagree with the law in question?
I miss the good old days when you could just open a telephone book and no one would know what you were looking up. Now everything you do is recorded by a third party.
Since communication was unencrypted, listening in was easy though. Even the instant messengers, up to 2005-2007, sent their messages in cleartext. So listening in was easy, but since storage was much more expensive, I don't think they bothered with recording them too much.
Regarding your conclusion, leaving FB and G is not even the hardest part. The hardest part is that your network probably won't follow.
Note that you can turn on E2E encryption in FB Messenger (as well as WhatsApp). It does lose a few features, but people should probably be aware it is possible.
Meta still knows who is messaging whom. Sending a message to a pharmacy or certain doctors would raise flags. E2EE isn't completely safe. Briar is decentralized (no central company or servers to watch) and uses Tor to hide IPs.
This is key. Both parties need to trust that their messages are staying within the realms of WhatsApp itself, and with the usage of disappearing mode, not leaving any traces of a conversation.
They should probably be using encryption within the chat itself (and not, you know, speaking in plain English) to add another layer. Perhaps changing the keys frequently via an agreed method (thinking about how to do that safely without leaving another trace) to render older messages 100% undecryptable.
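A rough sketch of that extra layer, assuming both parties have agreed on a passphrase out of band (the library choice, names, and parameters are mine, not a recommendation of any particular scheme); rotating the passphrase and forgetting old ones is what would make older messages unrecoverable:

```python
# Sketch of an extra encryption layer on top of the messenger: the chat
# app only ever sees the ciphertext string. Assumes `pip install
# cryptography` and a passphrase agreed out of band; names are illustrative.
import base64
import hashlib
from cryptography.fernet import Fernet

def key_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    """Derive a Fernet key from the shared passphrase (PBKDF2-HMAC-SHA256)."""
    raw = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)
    return base64.urlsafe_b64encode(raw)

# Rotate the passphrase periodically; once forgotten, old messages stay sealed.
salt = b"agreed-out-of-band"          # fixed, non-secret; agreed with the peer
cipher = Fernet(key_from_passphrase("this week's passphrase", salt))

to_paste_into_chat = cipher.encrypt(b"meet at the usual place").decode()
print(to_paste_into_chat)                           # what the messenger sees
print(cipher.decrypt(to_paste_into_chat.encode()))  # what the recipient reads
```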
But yeah, chances are all of that data is going to be accessed, and they agreed to it! It's right there in the privacy policy people don't bother reading.
The story is Roe v. Wade was overturned due to heinous corruption of the US political process in favour of the religious right and free market libertarians.
Roe v. Wade was a corruption of the political process (and I say this as a non-American pro-abortionist). If you want to have a law on abortion, make an actual law.
I mean, SCOTUS said otherwise for fifty years. They decide what is and isn't legitimate and it was fair play until last year.
I agree with you in principle but AFAIK "actual laws" have never been necessary when the SCOTUS has declared a Constitutional right. Individual states don't need to codify a right to keep and bear arms, or freedom of speech or religion, nor can they choose to re-enable slavery or segregation, because those have all been decided at a higher level.
Yes, but a caveat by SCOTUS has the lifetime of a given group of justices. If you want that law to be effectively immortal, it needs to be implemented through the legislature, not backdoored through judicial fiat.
"Legislating from the bench" is considered poor form in legal circles for a reason.
>And it's somewhat unclear to me why this should be a federal issue at all... isn't murder a state issue?
That's an odd framing for a self-described "pro-abortionist" to use, but no, obviously federal law against murder exists[0].
But of course the question of whether or not abortion is a matter of murder, fundamental bodily autonomy or both is a quagmire not worth getting into. Not that it's relevant to Roe v. Wade, or its repeal, because that rested on the question of the existence of a fundamental right to privacy and stare decisis.
I mean, read the dissenting opinions on Dobbs[1]. I think a good case is made there as to why this shouldn't be an issue left to the states, and why Roe wasn't repealed because it was bad law, but because the court was stacked with ideologues who were opposed to Roe for religious reasons.
You should be at least as angry about that as Roe itself, if not more so because that represents a far more egregious corruption of the system, but I suspect you aren't.
Safeguarding the privacy of a user's personal information is a technological problem that should be solved through the development and correct usage of cryptographic primitives that secure that data.
Wrong. It's a people problem. Any technological measure capable of reliably recovering information will be utilized or forced to be tapped by law enforcement. The problem is the law. The solution is not tech. The solution is changing the law. The cryptographic primitive approach is an elitist wet dream. The moment you employ it in enough sketchy contexts, then it itself will be seen as evidence of criminal doing. The UK/Europe has already been sniffing in that direction w.r.t possession of heavily encrypted phone handsets/telephony systems.
The story is Roe v. Wade was overturned due to the US political process working exactly as it as designed to, with the religious right telegraphing their play for decades, and progressives doing nothing about it except trying to fundraise on the aftermath.
That's not entirely true. They also bought candles and tote bags.
More seriously, there has been substantial opposition to the anti-abortion long game, but it's not easy to secure abortion rights in Nebraska from NY without federal law. But Roe made securing abortion federally a little moot, and maybe not the hill a slim majority wants to die on when they have other priorities. Then they didn't get justices to retire when they could be replaced, although it's questionable whether that would have been allowed to happen anyway. Bad strategy, but not complete inaction. They have been trying to oppose laws in states, but gerrymandering means that the Right gets unfairly more representation in states like Georgia.
The framing is inflammatory. What most people can agree upon is that the surveillance state is out of control.
I'm sure I'll be flamed into oblivion here, but it is worth considering the headline from the perspective of the anti-abortionists. If we cannot empathize or attempt to understand those who disagree, what is the point of having a discussion?
"FB and Google hand over user data, help to prosecute baby killers"
Reasonable people can disagree on the topic of abortion or at what stage of pregnancy it is acceptable. HN is not the place where I want to have that discussion. It has already been explored at depth elsewhere.
I've often been puzzled by why the abortion argument is seen as a binary. The majority of people in the developed world feel that the abortion legality line should be somewhere between conception and birth. Honestly, based on my somewhat biased sampling, there's probably a similar number of people who think infanticide should sometimes be legal as people who think a single fertilized egg has a right to life.
This is definitely a big part of the problem. Media outlets are looking for outrage. They need only conflate a ban on late term abortion with a blanket ban. For the other side, vice versa.
Think of it this way though: 50 years of legal precedent was overturned last year. Women who are young enough to get abortions have lived their entire lives not being considered murderers, legally. Abortion was an explicitly protected legal right. Now suddenly, a woman is transformed overnight from a citizen freely exercising their legal right into an illegal murderer — but of course only in certain localities. How are they supposed to deal with this massive U-turn? Can we not empathize with them?
It's one thing to make providing abortions illegal. It's quite another thing to prosecute the women.
Abortions were a convenient way to keep women locked into the workforce. There could be no escape from wageslaving. Now that the population is in the shitter, they criminalize abortion.
That's at least a tertiary factor in "freeing" women.
1. The creation of domestic appliances that dramatically shorten household chores (washing clothing is a main one).
2. The shift of the bulk of paid work toward less physical labour.
Both meant there was less time needed to maintain a household - previously a full-time, if unpaid, job - and that women had suitable roles available in the workforce, whereas manual labour would have meant they likely couldn't compete with men for employment.
While society gained extra productivity from these changes, it's debatable whether households gained financially, as it was likely a powerful force behind the mid-century inflation.
I'd place centralised schooling on that list of things which freed up more labour. Contraceptives are recognised for this too, and I'm sure they led to a sexual liberation at least.
Indeed. A household with a single wage earner will have a lot more free time. It's not just cleaning and cooking, it's paying bills, organising holidays, going shopping, arranging events/parties, socialising with neighbours, taking kids places, birthday cards/presents, keeping up with local council issues to be able to vote sensibly... the list goes on.
Honestly, given where the productivity gains of women in the workforce went (not to the actual workers), we'd probably have a better quality of life if only half the household adults worked.
Wages stagnate when there is an abundance of workers and a shortage of jobs. Corporations don't care about the empowerment of women; they care about keeping wages down across the board.
It's why they were offering women several thousand dollars in reimbursements to travel for an abortion. Those employers were unwilling to provide the same amount to women who wanted to keep their baby.
Good question really. I suppose it depends on whether having had an abortion is a crime. I've only peripherally followed the news but I think some places do want to make it a crime.
Disingenuous. Contraception and abortions freed women from being locked into patriarchal family structures (see the thousands of years of history where women were passed from father, to brother, to husband with no choice in the matter).
Not entirely, but it is a contributing factor: women aren't stuck having children and staying with some guy because they can't support themselves. It isn't the only benefit nor the only outcome of having these - we've had different sorts of abortion and imperfect contraception for thousands of years, and women are still stuck in such societies. Safe contraception and safe abortions keep women from dying. Women are able to have fewer children, spaced further apart, making for healthier women and healthier children - even in such traditional societies, assuming folks have access to these things.
Unless they change the laws and suddenly you have something to hide.