You say you've never understood the benefit of unit tests but you list three really great things: they help you debug stuff while you're writing it, they help you understand what you're writing and the edge cases, and they help others understand your code later. That's all great stuff.
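To make those three benefits concrete, here's a minimal sketch in Python (the slugify function and its behavior are hypothetical, purely for illustration): writing the edge-case tests forces you to pin down behavior while you write, and the test names document that behavior for the next reader.

    import re
    import unittest

    def slugify(title):
        # lowercase, collapse runs of non-alphanumerics into single dashes
        return re.sub(r'[^a-z0-9]+', '-', title.lower()).strip('-')

    class TestSlugify(unittest.TestCase):
        def test_basic(self):
            self.assertEqual(slugify("Hello World"), "hello-world")

        def test_edge_cases(self):
            # deciding these cases up front is half the value of the tests
            self.assertEqual(slugify(""), "")
            self.assertEqual(slugify("  --  "), "")
            self.assertEqual(slugify("C++ & Rust!"), "c-rust")

    if __name__ == "__main__":
        unittest.main()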
I agree, but come on. It's a cop-out. Anyone can retroactively fit a model. It's much harder to have a model that makes good predictions. It's tiring when everything from insect populations to depression in humans to polar vortexes is retroactively attributed to AGW. Of course, existing models would never predict these things; they are simply observed, and then a climate-related cause is sought out and pinned on them.
I agree that it's lazy to just say "AGW" for any problem.
But I also think it's lazy to dismiss theories that point to AGW as a cause simply because no model predicted it in advance.
There are an infinite number of possible effects (including amphibian fungi); modeling each specific one and its knock-on effects is an exercise in futility. It's much more reasonable to simply be responsible about attributing any effects we do observe, and to focus modeling efforts on the most devastating predicted effects of AGW rather than every potential one.
My main concern is that things are not going to be properly researched, because climate change is the simple answer; why look further? I work at a university and talk to lots of scientists and academics in my free time. These researchers are just as human as anyone else: simple answers are just as appealing to them, and knee-jerk reactions are just as common.
This is a false dichotomy and betrays a lack of understanding of the actual work of the scientific community here. The mortality of the fungus and amphibian behavior are both affected by climatic conditions. It is far from “easy” to attribute the epidemic to global warming, but careful studies are illustrating how different temperature envelopes and humidity levels can cause this fungus to thrive in amphibian populations. I suggest you take a few minutes to read this study; it might change your perspective.
Actually neither of those scenarios seems right to me. At some point scientists should generate a falsifiable hypothesis and then test to falsify that hypothesis.
You are correct that this step seems to be omitted in much popular writing about science.
This is a sciency-sounding position that actually misrepresents science. Since scientists are not gods, climate science, just like macroeconomics, astrophysics, and paleontology, cannot readily “test to falsify a hypothesis.” We are restricted to observing the universe and hoping to come across useful data we can compare and contrast. To imply that these disciplines are unscientific because they cannot regularly and cleanly isolate and test variables is itself a disservice to science.
No, that pretty much is the definition of modern (Popperian) science. At least the sort of science that we should require in order to justify rewiring trillion-dollar economies.
This "astronomy makes no predictions" meme is goofy. Halley got a comet named after him by predicting the year of its return. That was a completely falsifiable hypothesis. If you like something a bit more current, there are plenty of falsifiable hypotheses concerning e.g. neutrino mass that various underground/under-ice detectors will test in the next few years.
Such hypotheses are not inherently impossible for climate science.
Why would you expect us to have good models for unprecedented events of unprecedented scale? I’m all for high standards but this isn’t exactly a simple thing you’re pining for.
It's hard to imagine many other organizations with their level of experience on the subject. The analysis of the issue seems fairly spot on, and their own agenda seems fairly transparent.
Literature? This is a blog post. It’s not like I’m losing a critical piece of the Super Bowl advertising zeitgeist by ignoring the identity of the authors. The Super Bowl is not a figurative tool to be used as the backdrop for religion in modern advertising. If this was reposted anonymously and still stood on its own, would we have to do this song and dance?
The content of this article stands on its own or it doesn’t. Ignoring it because it was written by someone with different values from your own is puerile. That was my original point.
I mean, I was being tongue-in-cheek, but it really seems like a "Why do you look at the speck of sawdust in your brother's eye and pay no attention to the plank in your own eye?" kind of situation.
No, other than the fact that the HN audience is largely western and the Catholic Church has had much more cultural influence in western society than any Islamic institution has.
I think the implication is that the criticism may be valid, but that there is hypocrisy involved with respect to the author; i.e., it's not really about the criticism, but about the author/religion.
The original "Monorepos please don't" article really just convinced me how great monorepos are when you aren't at scale. So you know, put your shit in a monorepo, and then when it gets painful, break it out.
That process will take you 3-7 years, depending on how many resources you throw at it. Can your business survive 3-7 years of #seriouspain?
This article hand-waves over many of the criticisms while ignoring a few cold realities. If you're following an infrastructure as code pattern and/or if you run bare metal, at some point you _WILL_ determine that some things are too sensitive to keep in the monorepo. Here is one of the places where coupling will screw you the hardest.
Your lightweight production deployment repo will have a hard dependency on some nightmarish 35-40 GB monorepo. Your collaboration tools like Rietveld/Gerrit will choke under the load, and you will struggle to get big enough servers to maintain it. You'll do things like push to one target and pull from another. You'll deal with all sorts of transient failures trying to push or pull. Your CI/CD platform will start taking an eternity to do anything.
Monorepos absolutely result in coupling, and coupling is one of those nasty things you don't realize is a problem until you're drowning in it.
None of the above-mentioned complaints are theoretical. I've lived through them all.
I imported a 20-year-old SVN monorepo to git with hundreds of thousands of commits and tens of thousands of branches/tags, and it was under 10 GB. Removing a few large .tgz files that were inadvertently committed brought it down to 5 GB.
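For anyone wondering what that cleanup looks like in practice, it's roughly this (a sketch using git-svn and git-filter-repo; the URL and glob are made up):

    # one-time import of the SVN history (URL/layout are hypothetical)
    git svn clone https://svn.example.com/repo --stdlayout bigrepo
    cd bigrepo

    # drop the accidentally committed tarballs from all of history;
    # --force because a git-svn import doesn't look like a fresh clone
    git filter-repo --invert-paths --path-glob '*.tgz' --force

    git count-objects -vH   # confirm the pack actually shrank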
Linux has 25Mloc and ~800k commits; I think the pack is on the order of 2GB?
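Both numbers are easy to check on any clone:

    git rev-list --count HEAD   # total commits reachable from HEAD
    git count-objects -vH       # "size-pack" is the on-disk pack size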
I don't doubt that 40GB nightmarish monorepos exist, I'm just wondering how and why.
Linux is a highly focused project trying to accomplish a single thing well, with rigorous standards.
If you have 100 developers working average US work schedules and making 5-10 commits per workday (a debatable number that depends on culture, but I'm averaging between the "big" commit and lots of small commits), you're going to end up with well over 100k commits _per year_. And many large startups have a multiple of that number of developers, and they're much, much messier than kernel devs.
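Spelled out: 100 devs × 5 commits × ~250 US workdays is 125k commits per year; at 10 commits/day it's 250k. So 100k/year is the conservative end of the range.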
Referencing the ideal case as a counter-example is a bit silly.