
Not a dumb question. The shortest (and, at a glance, unsatisfactory) answer is that it works, and therefore it evolved that way.

Going into detail: first consider that for a feature to be evolutionarily selected for, two things have to be true:

1. It must increase the fitness of the organism that carries it, i.e. the likelihood of its carrier having descendants as compared to non-carriers (or be a side effect of another feature that improves fitness enough to be a net positive, etc.).

2. It must be inheritable (and, in sexually reproduced organisms, mutually compatible during embryonic development).

Once such a feature has reached dominance in a given population, as long as it continues to be important for fitness it cannot really be deprecated in favour of an alternative built from scratch, even if that alternative is arguably better.

That's why, for instance, vertebrate optic nerves connect to our retinas on the inside of the eyeball, resulting in us having a blind spot. Cephalopods, on the other hand, evolved their eyes independently the "reasonable" way, connecting their nerves from behind the eyeball. There's no way a vertebrate could mutate from scratch for its optic nerve to connect to the retina from behind without causing absolute mayhem in embryonic development. Our hacky solution for the blind spot? Let the brain hide it in software.

Going back to your question: some spots of the body being more sensitive than others became critical for evolutionary fitness long before nervous systems were complex enough to generate conscious qualia, let alone involve them consistently in decision making. Furthermore, mapping specific nerves to intensity of feeling in the CNS would require complex hardcoding of something that is much easier to solve with "this place important, have more nerves", and might even conflict with the fitness benefit of a CNS with enough neuroplasticity to keep learning throughout the development and lifetime of an organism.

So, in summary, the solution of having more nerves where it matters is simple, good enough, and has no reason to be rolled back in favour of a radically different alternative.


A large part of Voyager 1 & 2's success isn't just technological; it's the ability to take advantage of a specific planetary alignment for a gravity assist [1] that occurs only once every 175 years [2].

[1] https://en.wikipedia.org/wiki/Gravity_assist [2] https://en.wikipedia.org/wiki/Voyager_1#/media/File:Voyager_...
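For intuition: the speed gain comes from the planet's own orbital motion. In the planet's frame the flyby speed in equals the flyby speed out, but transforming back to the Sun's frame adds up to twice the planet's orbital velocity. A minimal 1-D sketch of that bookkeeping (idealised head-on flyby with rounded numbers, not actual Voyager trajectory data):

  # Idealised 1-D gravity assist: an elastic "bounce" off a moving planet.
  # Relative speed is conserved in the planet's frame; shifting back to the
  # Sun's frame adds twice the planet's orbital velocity.
  v_planet = 13.1  # km/s, roughly Jupiter's orbital speed (assumed round figure)
  u_in = 10.0      # km/s, spacecraft speed toward the planet, Sun's frame

  u_rel = u_in + v_planet   # approach speed as seen from the planet
  u_out = u_rel + v_planet  # leaves at the same relative speed, Sun's frame
  print(f"in: {u_in} km/s, out: {u_out} km/s, gain: {u_out - u_in:.1f} km/s")

Real assists are 3-D and gain less than this 2x upper bound, but this is why the alignment mattered: each planet's orbital motion pays for the speed-up.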


Ironic: from reading the article, it actually takes a while to find the research...

https://www.biorxiv.org/content/10.64898/2026.03.18.712572v1

>>> PDF with the images

https://www.biorxiv.org/content/10.64898/2026.03.18.712572v1...


"The TurboQuant paper (ICLR 2026) contains serious issues in how it describes RaBitQ, including incorrect technical claims and misleading theory/experiment comparisons.

We flagged these issues to the authors before submission. They acknowledged them, but chose not to fix them. The paper was later accepted and widely promoted by Google, reaching tens of millions of views.

We’re speaking up now because once a misleading narrative spreads, it becomes much harder to correct. We’ve written a public comment on OpenReview (https://openreview.net/forum?id=tO3ASKZlok).

We would greatly appreciate your attention and help in sharing it."

https://x.com/gaoj0017/status/2037532673812443214


The fact that they hijack scrolling to artificially limit scroll speed is insane to me. It feels like I'm trying to navigate through molasses.

I spent some time investigating this, and the claim is not accurate - Claude Code itself does not have code that spawns `git reset --hard origin/main`.

Most likely, the developer ran `/loop 10m <prompt>` or asked Claude to create a cron task that runs every 10 minutes and refreshes & resets git.
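To be concrete about what I mean, a purely hypothetical crontab entry of the kind that would produce exactly this behaviour every 10 minutes (the repo path is made up; this is not something Claude Code ships with):

  # hypothetical crontab entry, illustrative only
  */10 * * * * cd /home/dev/myrepo && git fetch origin && git reset --hard origin/main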


I would actually suggest this is symptomatic of the real problem: money in politics.

Elected officials (and some appointed, like SCOTUS) keep changing laws and precedents to allow more and more money in politics. They can't quit all that dark money - without a lot of funding, you don't get elected. Usually the best funded candidate wins.

There was an anonymous op-ed from a congressman some years back which bemoaned the reality: that 60% of their time was dedicated to meeting with donors for reelection campaigns instead of working on real problems.


Microslop for a while now seems to be testing exactly how much you can abuse the user before they move somewhere else. Windows is a prime example. Everything is ads, tracking, popups, annoyances, etc.

They have got away with it for a while because a lot of users have largely been stuck, but they are in real trouble now with Apple providing meaningful competition.


Earth's "radio bubble" is well over 100 light years across now. If there are aliens out there, they are probably already on their way to ask us in person why Ross, the largest Friend, doesn't simply eat the others.

Solar is nothing less than a revolution in Pakistan. Almost every home and factory has solar installed on its roof. More affluent houses have almost gone off-grid; others are selling back to the grid, and those who can't afford a full system have their own small-scale 12V solar panels to run fans in Pakistan's scorching summer and save on electricity bills. It has all been done by people independently, without much support from the government, because the ROI is much better on solar than paying the grid (if you use the full potential of your installed capacity, payback can be as low as one year, and afterwards it is effectively free).

I have one on my own roof: 6 kW with a 5 kWh battery backup, which cost me roughly 700K PKR (about $2,500). Now I can run the AC without thinking about electricity bills, and most importantly I no longer face the inconvenience of the grid being unavailable, in some cases for 24 hours.
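Back-of-the-envelope on why the ROI claim is plausible; every number below except the system cost is my assumption for illustration (tariffs and sun hours vary a lot across Pakistan):

  # Rough solar payback sketch; tariff and sun hours are assumed values
  system_cost_pkr = 700_000    # 6 kW system + 5 kWh battery (from above)
  capacity_kw = 6.0
  peak_sun_hours = 4.5         # assumed daily average
  tariff_pkr_per_kwh = 60.0    # assumed grid tariff

  yearly_kwh = capacity_kw * peak_sun_hours * 365
  yearly_savings_pkr = yearly_kwh * tariff_pkr_per_kwh
  payback_years = system_cost_pkr / yearly_savings_pkr
  print(f"~{yearly_kwh:,.0f} kWh/yr, payback ~{payback_years:.1f} years")

Under those assumptions it lands near the one-to-two-year payback people report; a lower tariff or partial self-consumption stretches it out.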

Now Pakistan is facing an energy crisis not because it does not have enough electricity, but because it has too much: people are generating their own, and due to the nature of its contracts with electricity-producing companies, the government has to pay them according to installed capacity, not according to what is actually generated.

According to a government report, 116,816 GWh was consumed commercially in 2021; in 2024 it stood at 111,110 GWh, and in 2025 and 2026 it will likely be even lower.

Isn't it insane?


Reddit is notorious for being awful at real-life interactions.

Just look at the relationship subreddits: the first answer is always divorce; it's become a meme.

But beyond romantic relationships, I think a lot of us have seen how it can impact work relationships. I've had venture partners clearly rely on AI (robotic email responses and even SMS), and that warped their perception and made it harder to connect. It signals laziness and a lack of emotional intelligence.

AI should enhance and enable connection, not promote isolation; imo this is a real problem.

It should spark curiosity, create openings for conversations, and point out our biases to make us better at connecting with other people. I hope we get to a point where most people are made kinder by AI. I'm seeing the opposite at the moment; interested in hearing others' experiences with this.


this kind of "action"/"settlement" is too funny:

>"As part of a settlement, OkCupid [...] will be prohibited from misrepresenting its privacy policies."

>"Under the proposed settlement, OkCupid and Match are permanently prohibited from misrepresenting or assisting others in misrepresenting: [...]"

every company should already be "prohibited from misrepresenting its privacy policies" and the collection/controls stuff.

12 years, including intentional obstruction of the FTC investigation, and we get "please don't do that again". (Dad voice: I'm not surprised, just disappointed.)


AWS has similar RAM consumption. I close Signal to make sure it doesn't crash and corrupt the message history when I need to open more than one browser tab with AWS in the work VM. I think after you click through a few pages, one AWS tab was something like 1.4 GB (edit: found it in my message history; yes, it was "20% of 7GB" = 1.4 GB precisely).

Does anyone else have the feeling they run into this sort of thing more often of late? Simple pages with just text on them that take gigabytes (AWS), or pages that look simple but take everything your browser has to render them at what looks like 22 fps? (Reddit's new UI and various blogs I've come across.) Or the page runs smoothly but your CPU lifts off while the tab is in the foreground? (e.g. DeepL's translator)

Every time, I wonder if they had an LLM try to get some new feature or bugfix to work and it made poor choices performance-wise, but it passes the unit tests, so the LLM thinks it's done, and it also looks good visually on their epic developer machines.


It’s incredible how polarizing the AI rush is. I keep the perspective that the technology is an absolute step change, but I have no idea where the cards will fall. I take a lot of issue with this style of article; I get the sense that the authors are being overly defensive.

The cost to serve tokens is absolutely profitable today, and that’s been true for at least a year. What’s unclear is how R&D and capex fit into the picture. I am not that pessimistic on this front either, though. For the data center build-outs, demand for tokens still exceeds supply. On the R&D front, well, most of us here on HN have benefited from decades of overinflated engineering salaries paid by companies that were often not profitable - not only unprofitable, but usually without a plan for success. In this current rush, supply cannot keep up with demand; it's a much easier math problem when you have something that people want (tokens) and you need to figure out profitability when including R&D.


Send it to Tim Cook's email. It worked for me in getting a DisplayPort DSC bug fixed. After Catalina, later macOSes lost the ability to drive monitors at refresh rates higher than 60 Hz.

Apple support tortured me with all kinds of diagnostics, followed by a WontFix a few weeks later. I wrote the email and it got fixed in Sonoma :)

https://egpu.io/forums/mac-setup/4k144hz-no-longer-available...


I think this post potentially mischaracterises what may be a one-off issue for a certain person as if it were a broader problem. I'm guessing some context has been corrupted?

I am not aware of actual code removal, but skirting in that direction, there was a movement just a couple of years back to replace words that had become more offensive than they were in the recent past. One example is renaming master to main.

I am not stating any opinion for or against any words or terms in this context.


The vendor they used, Clearview AI, does not allow you to request data deletion unless you live in one of the half-dozen states that legally mandate it.

https://www.clearview.ai/privacy-and-requests

I have suddenly become very interested in New York's S1422 Biometric Privacy Act.


I’ve been sitting on this page for two minutes and it’s still not sure whether I’m a bot lol. What did I do in a past life to deserve this :(

There is a terrific documentary, 'It's Quieter in the Twilight', about the aging and dwindling team that still runs both Voyager missions: https://www.youtube.com/watch?v=F6L9Du_IFmI

From the last few paragraphs:

> There is an official way for travelers to bypass long TSA waits if they’re willing to spend: hiring concierge services to escort them through security.

> Perq Soleil is an airport arrival and departure assistance service that can help travelers through TSA in about a minute flat by accessing alternative lines usually reserved for airport staff and airline personnel. The company — which operates in more than 300 airports and 150 countries — charges a base rate that varies by location.

Talk about burying the lede. Apparently the airports "highly discourage" line-sitters, but if you use services that have pre-bribed the airports, you can skip the lines entirely.


> If you have a public website, they are already stealing your work.

I have a public website, and web scrapers are stealing my work. I just stole this article, and you are stealing my comment. Thieves, thieves, and nothing but thieves!


The thing that has been bothering me for a while is that the USB spec allows for software detection of capabilities: you can read the e-marker data and see the supported protocols, speeds, voltages, etc.

But there is no standard for USB controllers to present this data to the OS, so it's stuck in the low-level firmware and never passed up. In theory we could have a popup box that tells you that both your computer and the other device support higher speeds/more power, but your cable is limiting it.
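As a sketch of what surfacing that could look like: the cable's e-marker answers a Discover Identity request with a Cable VDO whose bit-fields encode its speed and current limits. The decoding below is illustrative only; the field positions are simplified from the USB PD spec's passive-cable VDO layout, and `read_cable_vdo()` is hypothetical, since as said there is no standard OS interface to call.

  # Hypothetical sketch: decoding a simplified passive-cable e-marker VDO.
  # Bit positions are illustrative; read_cable_vdo() does not actually exist.
  CURRENT = {0b01: "3 A", 0b10: "5 A"}  # VBUS current-handling codes
  SPEED = {0b000: "USB 2.0", 0b001: "USB 3.2 Gen1",
           0b010: "USB 3.2 Gen2", 0b011: "USB4 Gen3"}

  def describe_cable(vdo: int) -> str:
      speed = SPEED.get(vdo & 0b111, "unknown")         # low bits: max speed
      amps = CURRENT.get((vdo >> 5) & 0b11, "unknown")  # bits 6..5: current
      return f"cable supports {speed} at up to {amps}"

  # vdo = read_cable_vdo()            # the missing standard OS call
  print(describe_cable(0b0100_0010))  # -> cable supports USB 3.2 Gen2 at up to 5 A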

Apple seems best able to do this since they control the hardware and OS, yet they aren’t doing it either. Users are just left to be confused about why things are slow.


Yes, it felt like the universe provided in this case. I invest in my batch mate. We end up becoming best friends; our wives become best friends. Biotech VCs are afraid click chemistry isn't hot enough; I lead multiple rounds, become the biggest investor, and it becomes my biggest investment. The year I get cancer, click chemistry wins a Nobel Prize. And I become a patient. Unbelievable.

This study doesn't correct for the baseline exponential decay due to inflation, which would better highlight the meaningful variations. By comparing based on 1914 dollars, it also makes older variations look relatively more extreme and newer inflationary events look less extreme. You must compare apples to apples.
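The correction being asked for is mechanical: divide each year's nominal value by that year's price index so every point is in the same constant dollars. A minimal sketch (the nominal series is made up; the CPI figures are roughly historical CPI-U values, with 1914 at about 10.0):

  # Deflating a nominal series into constant 1914 dollars
  nominal = {1914: 100.0, 1950: 240.0, 2020: 2600.0}  # made-up nominal values
  cpi = {1914: 10.0, 1950: 24.1, 2020: 258.8}         # ~historical CPI-U index

  base = cpi[1914]
  real_1914 = {yr: val * base / cpi[yr] for yr, val in nominal.items()}
  # Whatever variation survives this is real; the baseline exponential
  # drift of the price level has been divided out.
  print(real_1914)  # all three come out near 100 "1914 dollars"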

Finally, the events are quite cherry-picked. It is a conclusion looking for a result: the statistical justification for choosing those 4 events simply isn't evident when you look at the data itself. There is no mathematical rule you could apply to the dataset that would distinctly highlight those 4 periods.


I implemented Contracts in the C++ language in the early '90s as an extension.

Nobody wanted it.

https://www.digitalmars.com/ctg/contract.html


"Bias towards women" would be understood by most readers as favouring them. I would have written "bias against women" here.

The best analogy I can think of (quite similar to this one) is that the internet is low Earth orbit and AI is the Kessler syndrome. We abandon the place not to hide ourselves, but because it is saturated with garbage, and anything you try to put up there will only result in even more garbage being generated, without any positive effect.

The ideal solution would be to remove the garbage, but right now we can't even detect it, let alone figure out a way to get rid of it. Besides, it's a zero-sum game: why bother cleaning up when you can just effortlessly pump out more garbage in hopes that some of it will remain in orbit long enough to benefit you?


This article makes no real sense to me.

>You think of something new and express it - through a prompt, through code, through a product - it enters the system. Your novel idea becomes training data. The sheer act of thinking outside the box makes the box bigger.

This was the same before: if you had a novel idea and made a product out of it, others followed. Especially since LLMs are not (so far) learning on the fly: Claude Opus 4.6's knowledge cutoff was August 2025, so every idea you type in after that date goes into future training data but isn't available to the current model; you only have to be fast enough. And LLMs/AI agents like Claude enable exactly the speed you need to bring out something new.

The next thing is that we also have open-source and open-weight models that anyone with a decent consumer GPU can fine-tune and adapt, so it's not only in the hands of a few companies.

>We will again build and innovate in private, hide, not share knowledge, mistakes, ideas.

Why should this happen? The moment you make your idea public, anyone can build it. This leads to greater proliferation than before, when the artificial barrier of having to learn to code prevented people from getting what they wanted or creating what they wanted to create.


C++ is the last language I'd add to any list of languages used for correct-by-design software - it's underspecified in terms of semantics, with huge areas of UB and IB. Given its vast complexity at every level - from the preprocessor to template metaprogramming and concepts - I simply can't imagine any formal denotational definition of the language ever being developed. And without a formal semantics for the language, you cannot even start to think about proofs of correctness.
