ardy42's comments | Hacker News

> misinformation is combated with MORE discussion, not less!

The problem with that idea is that it takes more effort to debunk a lie than to tell one. It also takes more effort to absorb a debunking than a lie. That's why disinformation works.

Here's an example: JFK ate babies occasionally, and the media hushed it up. Oswald was actually a secret high-level CIA operative, and was so outraged by this that he assassinated JFK for it.

It took me two seconds to write that. How much effort would it take you to debunk it?

It's just not practical to put all the burden of combating misinformation on each individual's shoulders. It's also necessary to stop the spread of misinformation. That doesn't need to be done by a central authority, but people who've been convinced by a lie will perceive that as "censorship" by one.

> scientific consensus is not arrived at when every scientific paper says the same thing. this is a fundamentally wrong view of science and also reality. on any given topic, the corpus includes opposing conclusions. eventually we figure out why and discern the underlying principles.

Scientific consensus is also not arrived at by publishing literally every crackpot idea and answering each with "more discussion." Science has several mechanisms for "censoring" bullshit and misinformation (e.g. peer review), and it couldn't function without them. "More discussion" is saved for cases where those mechanisms failed.


> Scientific consensus is also not arrived at by publishing literally every crackpot idea and answering each with "more discussion." Science has several mechanisms for "censoring" bullshit and misinformation (e.g. peer review), and it couldn't function without them.

What counts as a "crackpot idea?" We don't have to dabble in hypotheticals about JFK eating babies. We have real examples from current political events that show we're not talking about "slippery slopes" here. We have rolled down the slope with stunning speed.

In March 2020, the Surgeon General suggested that the idea that wearing masks was effective at preventing the spread of COVID was a crackpot idea: https://thehill.com/policy/healthcare/485332-surgeon-general... ("Seriously people- STOP BUYING MASKS! They are NOT effective in preventing general public from catching #Coronavirus[.]").

By October 2020, the Surgeon General was calling that same assertion a "myth": https://twitter.com/surgeon_general/status/13189727242078986... ("There is a currently circulating MYTH suggesting masks don’t work to prevent spread of COVID-19.").

I have a degree in aerospace engineering--I totally get that scientific understanding evolves. But it doesn't evolve like that. The truth is that the Surgeon General's March 2020 statement was ill-advised and overly-certain, and so was the October 2020 statement. Whether masks are effective at limiting the spread of COVID is quite uncertain. Mask-wearing rates vary quite dramatically between countries with similar COVID death rates: https://www.economist.com/graphic-detail/2020/07/08/face-off.... By June 2020, the U.S. had mask-wearing rates of 75%. Denmark, Sweden, and Norway were under 20%. Out of those, Sweden and the U.S. have death rates (per population) 5-10 times higher than Denmark and Norway.

Despite that uncertainty, I think most people worried about "misinformation" would use mask-denialism as a motivating example for why restrictions are needed. So what are the restrictionists really advocating for here?


And it was damned foolish to say "masks don't work" if what they wanted the public to understand was "please leave surgical and N95 masks for healthcare workers. We are exploring the effectiveness of cloth masks".

THAT would have been honest: it would have explained the reason they didn't want the general public using masks, and it would have hinted at an alternative while not directly confirming that masks work (or don't work).

NOT TO MENTION that the CDC probably could have asked South Korea, Taiwan, Japan, or any other country where mask usage was common, "How well do masks work?" and been pointed at a few relevant studies, right? But no, they made a very fishy statement to the public claiming masks don't work for normal people.

/rant Sorry. You hit a nerve. Pretty frustrated that the CDC would throw away its credibility like that.


> The issue is actually pretty uncertain, and government bodies are making categorical statements for political reasons

I think it's more complicated than just politics. As I was saying elsewhere (https://news.ycombinator.com/item?id=26139732), public health officials initially advised against mask-wearing for the general public for a very particular reason (possible shortages for medical frontline workers). As far as public health policy is concerned, where you cannot pass a certain threshold of complexity in communicating best practices to grandmas around the nation, "masks work" is a good enough message, and it stands on pretty solid science: https://www.medrxiv.org/content/10.1101/2020.07.31.20166116v...


> It took me two seconds to write that. How much effort would it take you to debunk it?

How long would it take you to establish enough credibility to be able to make an accusation like that and have people actually take your word for it? There might be a few nutters out there who are so predisposed to hate JFK that they'll believe anything negative about him, but most people - even those who dislike him - would rightfully question such an outlandish statement made by someone with no credentials.

Dishonest people retain credibility when their supporters are trapped in echo chambers designed to keep the truth out. Censorship is a powerful tool for establishing and maintaining echo chambers. We need to fight echo chambers, not promote censorship.


> How long would it take you to establish enough credibility to be able to make an accusation like that and have people actually take your word for it?

Keep this in mind: Q is literally some dude on 4chan/8chan with a tripcode.

> There might be a few nutters out there who are so predisposed to hate JFK that they'll believe anything negative about him, but most people - even those who dislike him - would rightfully question such an outlandish statement made by someone with no credentials.

I make no claim that my example lie is a good example of misinformation/disinformation. It was only meant to show the asymmetry of effort implicit in "more discussion."

The key thing about getting a lie to stick is hitting the right emotional buttons with it. And it's so easy to broadcast lies nowadays that you can even discover those buttons stochastically, by just throwing random lies out there and seeing what sticks.

Furthermore, if your goal is not to convince anyone of anything in particular, but just to gum up a society (which is the goal of disinformation, properly understood), you don't even need to find particular lies with a broad appeal across society. You just need enough lies that enough people fall for one or two.


I believe it is reasonable to speculate that QAnon members are generally trapped in extreme, right-wing echo chambers. Echo chambers enable people to retain undeserved credibility.


> I believe it is reasonable to speculate that QAnon members are generally trapped in extreme, right-wing echo chambers. Echo chambers enable people to retain undeserved credibility.

That's not true. For instance:

https://www.startribune.com/conspiracy-theories-of-qanon-fin...

> Conspiracy theories of QAnon find fertile ground in an unexpected place – the yoga world

> QAnon's conspiracy theories have taken root among yogis and other adherents of natural medicine.


I'm not sure how that demonstrates QAnon members are not generally trapped in right-wing echo chambers. Are yoga practitioners exempt from right-wing echo chambers?


> I'm not sure how that demonstrates QAnon members are not generally trapped in right-wing echo chambers. Are yoga practitioners exempt from right-wing echo chambers?

I suppose a significant number of yoga teachers/influencers could be secret dittoheads, but the idea kind of beggars belief.

One of the interesting things about QAnon is that it offered on-ramps to groups outside the stereotype of people who would go for such a theory (e.g. "save the children"). People in right-wing echo chambers were definitely more susceptible, but it's a mistake to be reassured by that.

Also, particular echo chambers aren't some kind of primordial entity. They start all the time and they often grow. So even if something like QAnon requires one, that just means there's one more step.


Are yoga practitioners usually liberal? Is that a thing? My perception has always been that yoga communities tend to attract those interested in "alternative medicine", a group which certainly has its own share of echo chambers. Given the apparent ideologically insular nature of both groups, I'm not surprised that there would be overlap between them.

Echo chambers are not a new phenomenon, but they have certainly become more powerful with the rise of the internet. Never before have we been so easily able to surround ourselves with groups of like-minded individuals. But what I find even more concerning are algorithmically-driven content feeds which are tailored to suit the preferences of each individual user.

Algorithmically-driven, tailored content feeds basically automate the creation of echo chambers. It all sounds well and good to the user - after all, they get access to more of the type of content they prefer. However, those feeds almost inevitably learn to always provide the user exclusively with content that reinforces their preexisting ideas and opinions. They'll eagerly spread things like QAnon if it results in increased user engagement.
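
To make that mechanism concrete, here is a minimal toy sketch in Python. It is not any real platform's ranking system; the topic names, click probabilities, and the epsilon-greedy policy are all invented for illustration.

    import random
    from collections import defaultdict

    # Hypothetical topic labels, invented purely for this illustration.
    TOPICS = ["cooking", "yoga", "politics", "conspiracy", "sports"]

    def simulate_feed(rounds=5000, explore=0.1, seed=1):
        """Epsilon-greedy feed: usually show the topic with the best observed
        click-through rate, occasionally try something else."""
        rng = random.Random(seed)

        # Assume the simulated user starts with only a mild lean toward one topic.
        click_prob = {t: 0.20 for t in TOPICS}
        click_prob["conspiracy"] = 0.35

        shown = defaultdict(int)    # times each topic was recommended
        clicked = defaultdict(int)  # times the user engaged with it

        def ctr(topic):
            return clicked[topic] / shown[topic] if shown[topic] else 0.0

        for _ in range(rounds):
            if rng.random() < explore:
                topic = rng.choice(TOPICS)        # occasional exploration
            else:
                topic = max(TOPICS, key=ctr)      # exploit the best-performing topic
            shown[topic] += 1
            if rng.random() < click_prob[topic]:  # user clicks probabilistically
                clicked[topic] += 1

        return shown

    if __name__ == "__main__":
        for topic, count in sorted(simulate_feed().items(), key=lambda kv: -kv[1]):
            print(f"{topic:12s} shown {count:5d} times")

Left to run, a loop like that ends up showing the simulated user almost nothing but whichever topic it finds most "engaging", and the mild initial lean usually decides which topic that is. That is the narrowing effect described above, in about thirty lines of toy code.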

I don't think there's anything particularly special about QAnon compared to any other politically-charged conspiracy group. I think they just got lucky and once they passed a certain threshold of popularity, the algorithms did what they do best.


So you'd like to make lying illegal...? I have an amendment to show you; it's actually the very first one!


Your example actually tells us something. Nobody would believe your JFK baby-eating story. It is easy to write a fake story, but it is not easy to have lots of people believe your fake story. "Misinformation" can spread because it seems plausible to enough people, not because it is "bullshit" like your example.


> "Misinformation" can spread because they seem plausible to enough people

Conspiracy theories are only believed by those who already mistrust the target. If there are a lot of conspiracy theories revolving around something/someone, you have a trust problem.


How about Jewish space lasers?


The problem with calling this example out is that people will believe this stuff if they're far enough down the rabbit hole.

On Jan 20th, Biden and Harris were supposed to be arrested, and their pedo evidence was supposed to be shown to all, along with evidence of election fraud.

The next one is, what, March 7th?

A lot of people think an ancient all powerful being will re-appear and lift up adherents on high, and punish "bad" non believers.


I wish you were right, but you are not. It is very easy to have lots of people believe a fake story.

QAnon conspiracy theories are incoherent and absurd yet are embraced by thousands and cause needless harm to many.


> It is easy to write a fake story, but it is not easy to have lots of people believe your fake story.

I thought so too before Pizzagate, Q-Anon....


Perhaps a better example: Jewish people are telling you the earth is round so that they can distract you from the fact that they're kidnapping children and drinking their blood.

A fantastic video on the topic of difficult-to-debunk but easy-to-produce content: https://www.youtube.com/watch?v=JTfhYyTuT44.


> It's political censorship disguised as fact checking. If they were really concerned about "fake news", they would apply the same standards to the other side. Remember how just a couple years ago almost all of the major media outlets were pushing a conspiracy theory about "Russian Collusion"? Yet nobody is talking about shutting ABC, CBS, NBC, or MSNBC down, kicking them off cable, or banning them from the internet.

You're misremembering. Russian collusion was an allegation that needed to be investigated because of 1) the actions of Russian intelligence agencies to influence the election in ways advantageous to Trump, and 2) weird things members of the Trump campaign did that were suspicious. However, the major media outlets only reported on that; they didn't go on to claim that there was actual collusion.

For instance, here's the first page of results from a major media outlet's search for "Russian collusion" (ending the day before the Mueller report):

https://www.nytimes.com/search?dropmab=true&endDate=20190321...

> BRIEFING Judge Doubles Down on Scrutiny of Roger Stone’s Book

> POLITICS Roger Stone’s New Instagram Post and Book Draw Scrutiny After Gag Order

> POLITICS Trump Lawyer ‘Vehemently’ Denies Russian Collusion

> POLITICS ‘Collusion Delusion’: Trump’s CPAC Speech Mocks Mueller Inquiry

> POLITICS Trump Says There Was ‘No Collusion’ With the Russians

> OPINION The Russians Were Involved. But It Wasn’t About Collusion.

> POLITICS Indictment Details Collusion Between Cyberthief and 2 Russian Spies

> OPINION How Will ‘Collusion’ Play in the Midterms?

> OPINION Can We Please Stop Talking About ‘Collusion’?

> OPINION Oh, Wait. Maybe It Was Collusion.

That last piece sounds like it's the closest to saying there was collusion. Let's see what it says:

> What remains to be determined is whether the Russians also attempted to suborn members of the Trump team in an effort to gain their cooperation. This is why the investigation by the special counsel, Robert Mueller, is so important.

That's not pushing a conspiracy theory.

On the other hand, Fox News, OANN, NewsMax, etc. lied to the extent that they're legitimately worried the conspiracy theories they were pushing will cost them a lot of money in defamation lawsuits:

https://www.cnn.com/2020/12/19/business/fox-smartmatic-news-...

https://www.cnn.com/2021/02/05/media/lou-dobbs-fox-show-canc...

https://www.cnn.com/videos/business/2021/02/03/newsmax-mike-...

However, I don't fault you too much for this confusion. There are a lot of liars out there who've found a lot of success "arguing" with false equivalencies to keep their followers loyal to the cause.


But if there are 34 billion tethers, $18 million is like a rounding error.
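For scale (taking those figures at face value): $18 million out of $34 billion is roughly 0.05%, which really is rounding-error territory.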


The difficulty is that so much infrastructure has been built around the assumption of car use, and that will be very expensive and take a long time to redesign and replace.


> Surely there must be a technical trail to investigate.

There probably is, but I'm guessing the really definitive records would require a subpoena to obtain (e.g. business records of the domain registrar or hosting company).


> It‘s a pretty strong case for Len to be Satoshi, admittedly. And it would explain the mysterious silence from Satoshi Nakamoto - he's dead.

I think it's pretty likely that the person (or people) who went by Satoshi is dead, and probably died before their Bitcoin hoard was worth much.

If not, you'd have to explain why someone is giving up the opportunity to never have to work for someone else ever again, and the possible explanations are less believable to me: e.g. 1) already having so much FU money they're secure in their lifestyle w/o the Bitcoin, 2) being either extremely ascetic or happy to work for someone else to support themselves, 3) deliberately choosing to destroy it for some reason, 4) losing all the keys in an accident (more understandable for a rando playing around than for the creator), etc.


I mean they probably mined other Bitcoin along the way. I’m pretty sure they could have cashed in at points without touching those earlier addresses.


Was there any scientific utility to these payloads, or were they added in order to create some spacecraft porn? How much did the equipment for this weigh?

IIRC, a downward facing camera is useful for precisely locating the landing location, but I'm having trouble seeing a purpose for the upward facing camera besides getting cool sky-crane video.


There is an engineering use for it, and that is to see all the dynamic behaviour that other sensors might not capture. This is rocket science, and they do want to know how the landing went.


> Per their press conference today, in the Q&A section. https://youtu.be/gYQwuYZbA6o?t=4025

What was the model? The recording was garbled, and all I could hear was "...we're using a commercial computer, an Intel ???? PC. It's running Linux..."


https://link.springer.com/article/10.1007/s11214-020-00765-9 has details:

> The DSU is an off-the-shelf computer-on-module (CoM) from CompuLab Ltd with an Intel Atom processor and solid-state memory. The DSU runs the Linux operating system, along with additional software to communicate with the EDLCAM sensors, perform the EDL data collection sequence, manage the data storage and compress the collected data files. The DSU uses a high-density connector to provide connectivity to the high-speed USB3, USB2, gigabit ethernet and SATA interfaces.

> The main DSU is located inside the rover body. A second DSU, the descent stage DSU, is located on the descent stage. In both DSUs the CoM is connected to a custom electronics board that provides connectivity for all the USB devices. The two DSUs are almost identical to each other and communicate with each other through a gigabit ethernet link. The rover DSU includes a 480 GB solid-state flash memory drive (SSD) for data storage, provides a gigabit Ethernet link between both DSUs, and implements the high-speed serial communication protocol to communicate to the rover computer.



> Griddy is simply an energy provider you can choose to hedge your bets and save money if you are smart. Plenty of people using it shut off their power or switched providers when they were informed of the incoming price surge. That some households ignored the warnings shouldn't be a reason a service like Griddy can't exist.

It's not reasonable to expect households to monitor their electricity rates or alert emails that closely, nor to be prepared to drop everything and switch providers at a moment's notice for an entirely unexpected reason. I know I don't immediately read every email I get.

And even if someone was monitoring that closely, or was ready to jump, there's no guarantee they'd be able to switch:

https://6abc.com/griddy-gridy-texas-power-bills-what-is-ener...:

> In a rare move Sunday night, Griddy sent out an email to all 29,000 of its customers, urging them to switch to a different provider. Thigpen says she tried Monday morning, but no one was taking new customers....

> We reached out to a few providers here in Texas. They are not taking customers. Some say they may accept new customers by next Wednesday, when they say the weather has improved. That was the earliest 'maybe' answer we could get.


> This article seems to suggest the answer is a fully automated smart home, with some kind of AI to intelligently manage your power usage. Sounds awesome, but I don't think that's ever going to be a reality outside the valley.

It also sounds awful. It would objectively be a regression from reliable electricity to unreliable electricity, like in an undeveloped country.

The only thing such a technology would do is give the power companies a BS way to shift the blame for their planning fuck ups onto consumers, because technically it would be the consumer's equipment that killed the power.

