
"Did I just read a 'because it's popular, it's better' argument?"

I could try to weasel out, and I kinda will, because that's a bit stronger than my argument really is, which is that if something is popular, it is likely to be better. (Well, that's not the entirety of my argument; I'll come back to that in a second.)

When you assert the strong form, as you just did, the argument becomes exceedingly silly, because rephrasing almost ANYTHING short of mathematical formulas or laws of physics as a Universal Truth makes it at least somewhat silly. But when you look at it as a probability rather than an absolute, yes, you should expect more popular things to be better than less popular things, no? Not necessarily overwhelmingly so, but more often than not, right? And quite often, when you scratch an argument for why something that lost (rather than some new innovation) is better than the thing that became popular, what you find underneath is a mismatch between what the arguer thinks is better and what the people who chose the popular thing think is better. Maybe they're wrong about what's better, or maybe they were right about what was better given certain constraints that Moore's Law or the cloud or whatever has since made irrelevant, and now they're persisting due to network effects.

But to go back to the original poster, who's a Perl programmer trying to figure out what to do with himself: I think writing off the popularity of languages that followed Perl as being due to faddishness, rather than actually investigating why people chose to develop new languages, learn them, and build tools for them, puts an obstacle between yourself and learning new things. You may learn that people had bad reasons, or that they had good reasons at the time but you can now do something better than what they had to choose from then. Either way, you'll learn something new, rather than closing yourself off by dismissing whatever replaces what you know best as a "fad" and its adopters as driven by something other than merit.

It's sort of the liberal flip-side of Chesterton's fence -- Chesterton says that you shouldn't tear down a fence until you can understand why it was built, i.e. you shouldn't change something unless you understand it. The converse is that if everyone keeps tearing down a fence, you should understand why before you put the fence back up again.



I think writing off the popularity of languages that followed Perl as being due to faddishness...

You may be responding to something that isn't in the article at all.

I first learned Ruby in 2000, mostly because Dave and Andy sent me a copy of the just-published Pickaxe book. There were few reasons to use Ruby in the English-speaking world back then, unless you were a language magpie or had very simple needs and didn't already have a scripting language in your toolkit.

I didn't do much with it until 2004, when Dave told me to check out the nascent Rails. (I'd already failed to convince my employer to cover Ruby in more detail even though many of the clever and insightful people I knew had started to pay attention.) We published the first Rails article anyway, and it was a good thing we did.

Even then, I didn't use Rails seriously myself because it and Ruby offered no benefits and only drawbacks. It was slower and had fewer libraries and worse tools than what I was already using. (Unlike a lot of Rails adopters, I wasn't primarily working in either J2EE or PHP.)

In 2014, even with Ruby and Rails off of their 2007 peaks, the metrics have changed a little bit. Ruby and Rails have better job prospects, in my experience, while the tooling and ecosystem and languages themselves are roughly equivalent. (One's better in some aspects and the other is better in others.)

Funny thing, though. It seems like around here Rails or Django are the old guard--the conservative technologies no one will lift an eyebrow at you using--and things like single-page client applications served by Node.js are the exciting technologies with new libraries and frameworks announced every week.

Some of those will succeed. Some will fail. Given that I'm not 14 or 19 or 22 anymore, that I spend my days working hard to deliver value for clients and/or employers, and that I have other things to do on nights and weekends than sit in front of a computer, where do I focus my time and energy and resources so that I can both get things done well now and continue to be employable for the next fifteen or twenty or thirty years?

That is what the article was about, not calling everything but Perl a fad.


I highlighted the quote that bothered me; it may not be relevant to your point but I actually think it is.

Why is Node.js popular now? Because people want to create web applications that are responsive without requiring the client to constantly poll the server to see if there's new data, and Node.js does that better than existing tools.

Why are NoSQL databases popular now? Because right now, it's a lot cheaper to scale out by buying more computers than it is to scale up by buying better computers, and current relational databases are bad at scaling out that way (compared to other parts of the stack). Now, I think that NoSQL sometimes throws out the baby with the bathwater, and I do think there are cargo cult NoSQL users who are convinced that if they go around throwing out babies they'll eventually get some dirty bathwater. But NoSQL does solve real problems, and (to the point here) those are problems that would have been really difficult to predict 15 years ago, because you'd have needed to know how the economics of buying computer hardware would change in the future.

I don't think it is possible, at this point in the development of these technologies, to future-proof your skills if you want to work on new projects. You can keep learning the new things, or wait until other people find out which of the new things are actually going to last and then learn those. You can use your current skills to maintain legacy systems. Or you can get into something like management, where you can leave specific technical details to other, younger people and focus on broader issues, like relations between your developers and the people they're serving.


Why are static site generators popular now? Because not everyone's building a social network for marmots that has to serve 300,000 image macros a second. Plus c'est la même chose.


"But when you look at it as a probability rather than as an absolute, yes, you should expect more popular things to be better than less popular things, no? "

Possibly in areas where no serious money is involved, but I haven't experienced this in the big corporate/government world myself. It can occasionally be true, but it's certainly not the rule.


I think this generally applies in that case: "often what you find underneath is a mismatch between what the arguer thinks is better and what the people who chose the popular thing think is better."

A simple example of this would be going with the bloated, more expensive, more error-prone product. It may just be that service contracts, existing relationships, available support, or even just name recognition ("Nobody ever got fired for buying Microsoft...") trump technical merit in some cases.



