
It’s a good question: what exactly is this decay, and why is it called undeniable? Is it even true? If I think back on what programming was like thirty years ago up through today, everything about computing has, on average, steadily improved, gotten easier, more reliable, and higher quality. All operating systems crash far less often than they used to. Computing devices, from desktops and laptops to phones, routers and NASes, and household appliances, have all become faster, better, and cheaper, with more features and higher utility.

There are some ways I could see the author being somewhat justified, especially when it comes to the need for debuggers. Software is getting more layers, and both the amount and the complexity of it are going up. Debuggers are super useful for helping me understand the libraries I use that I didn’t write, and how my own code interacts with them (a small sketch of that below). There are also a lot more people writing code than there used to be, and as that number grows, the distribution skews toward beginners. The number of languages in popular use also seems to be going up, and the diversity of coding environments is increasing. I don’t know that I would frame all this as ‘decay’, but it does mean that we’re exposed to higher volumes of iffy code and abstractions over time.
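To make the debugger point concrete, here's a minimal sketch, assuming Python and the standard-library pdb debugger (the stdlib json module stands in for any dependency you didn't write): dropping a breakpoint just before the call into that layer and stepping in with `s` shows exactly what it does with your data, instead of guessing from the docs.

    import json
    import pdb

    def serialize(record):
        pdb.set_trace()              # pause here; type `s` to step *into* json.dumps
        return json.dumps(record, sort_keys=True)

    if __name__ == "__main__":
        print(serialize({"id": 1, "name": "example"}))

The same workflow applies with any debugger that can step across the boundary into third-party code, which is exactly the "layers I didn't write" situation described above.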


