Hacker News

Why not? In both cases the result is losing the thread of thought.


Because analogy can be useful in explaining things, or it can be worse than useless - it ties our thinking up into side quests that have nothing to do with the matter at hand.


...No, no, that's not how ADHD works. It's difficult to sum up concisely how wrong this is, but I invite you to do some serious research into ADHD, how it functions, and the great variety of ways in which it can present in different people. It's quite a poor analogy.


I'm aware that anything to do with the brain has a variety of presentations.

Could you try to put a couple sentences down on how ADHD is an inapt metaphor for failure modes in this case?

It's lazy to claim something is wrong without offering a useful point as to how it's wrong. I trust in your ability to summarize.


For additional context/discussion, I feel this comment[0] elsewhere in the thread put it well.

The reply to that comment also has some information I feel is helpful to show the breakdown here. It mentions that lack of attention presents in only 15-20% of ADHD cases. This isn't ADHD; it is something new. The fundamental underpinnings do not relate, and so the analogy/metaphor does not facilitate a better understanding of the situation.

On the contrary, it makes LLM "attention" out to be something entirely different from what it actually is. Without attention, models don't become easily distracted; they are easily distracted regardless. What attention actually provides is context-sensitivity: without it, LLMs primarily fail to disambiguate between different meanings of identical words, because they can't take the surrounding sentence structure into account when assigning meaning.
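To make that concrete, here's a minimal sketch of what attention actually does, using scaled dot-product attention with hand-made toy vectors (nothing from any real model; the embeddings and the "nature"/"finance" feature dimensions are invented for illustration). The ambiguous word "bank" gets a different contextual representation depending on whether it attends over "river" or "money":

```python
import numpy as np

def attend(q, K, V):
    # Scaled dot-product attention: softmax(q . K^T / sqrt(d)) . V
    scores = q @ K.T / np.sqrt(q.shape[-1])
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ V

# Hypothetical 4-d embeddings; dims 2 and 3 stand in for
# "nature" and "finance" features respectively.
emb = {
    "bank":  np.array([1.0, 1.0, 0.0, 0.0]),  # ambiguous on its own
    "river": np.array([1.0, 0.0, 2.0, 0.0]),
    "money": np.array([0.0, 1.0, 0.0, 2.0]),
}

contextual = {}
for context in ("river", "money"):
    KV = np.stack([emb[context], emb["bank"]])   # the two-word "sentence"
    contextual[context] = attend(emb["bank"], KV, KV)

# Same input word, two different contextual vectors:
print(np.round(contextual["river"], 2))  # picks up the "nature" dim
print(np.round(contextual["money"], 2))  # picks up the "finance" dim
```

Remove the attention step and "bank" keeps the same vector in both sentences; the model doesn't get "distracted," it just loses the ability to tell the two senses apart.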

Hopefully I don't have to dive into the psychological and chemical specifics of ADHD to demonstrate that this is fundamentally not at all what ADHD is. Again, there is no underlying harmony between this mechanism and how ADHD affects human attention in that 15-20% of cases, and so there is no analogy.

The only similarity is that they both use the word "attention". If they'd used a different label, we wouldn't even be having this conversation right now.

[0] https://news.ycombinator.com/item?id=42585600


It’s lazier to claim something is correct without offering a useful point as to how it’s correct. I trust in your ability to theorize.



