The Unix model is a mess, the WOSA is a mess, personal computing hardware is a stack of hacks, programming environments are a poorly integrated mess, software 'engineering' is still back in the age of siege engineering rather than modern engineering discipline, and emacs still takes ages to boot.
Computers are marvelous productivity and life-enhancement tools, but following the easy path has led to stagnation in the mainstream.
upvoted for "emacs still takes ages to boot" -- hit me right where it counts.
I wonder, though: is reckless advancement really better than taking the time to do it right? A lot of people are finding success with the fail-and-iterate model; isn't that essentially what we've identified as the problem here?
Incremental improvements are still improvements, but one runs the risk of settling into a local maximum. When the landscape shifts significantly (say, with networking, SMP, or GPGPU, none of which were prevalent back when companies wrote their own minimal operating systems for each new machine), we build on what we already have rather than breaking it down and factoring the new capabilities into our way of thinking. We know it would be a good idea to do so, but that kind of rethinking tends to be confined to academic or research-oriented projects rather than widespread commercial pursuits.
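The local-maximum risk is the same one greedy optimization runs into. Here's a toy sketch (the function and step size are made up for illustration): a climber that only ever takes single improving steps gets stuck on the nearer, smaller peak and never reaches the taller one.

```python
import math

def f(x):
    # Two peaks: a local maximum near x = -2 (height ~1) and a
    # global maximum near x = +2 (height ~2).
    return math.exp(-(x + 2) ** 2) + 2 * math.exp(-(x - 2) ** 2)

def hill_climb(x, step=0.1):
    # Keep taking whichever single step improves f; stop when
    # neither neighbor is better, i.e. at the nearest peak.
    while True:
        best = max((x - step, x, x + step), key=f)
        if best == x:
            return x
        x = best

x = hill_climb(-3.0)   # start on the slope of the smaller peak
print(round(x, 1))     # settles near -2, far from the global max at +2
```

Incremental iteration from where the industry already stands behaves the same way: every step is an improvement, yet the whole trajectory can converge on the wrong hill.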
There's a good reason for this: backwards compatibility, in terms of concepts, education, maintaining a skilled workforce, and retaining the value of prior investments in hardware and software.