At this point, there should be a bot running that archives any post that starts getting traction. Or maybe have HN serve a cached copy to offload the traffic? Who knows; it just seems like every blog post is killed the instant it reaches the front page.
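Something like that wouldn't be hard to wire up. A rough sketch in Python, using the public HN Firebase API and the Wayback Machine's save endpoint (the score threshold and polling interval are just guesses):

    import time
    import requests

    HN_API = "https://hacker-news.firebaseio.com/v0"
    SCORE_THRESHOLD = 50  # arbitrary "traction" cutoff
    archived = set()

    while True:
        top_ids = requests.get(f"{HN_API}/topstories.json").json()
        for item_id in top_ids[:30]:  # roughly the current front page
            if item_id in archived:
                continue
            item = requests.get(f"{HN_API}/item/{item_id}.json").json()
            url = item.get("url")
            if url and item.get("score", 0) >= SCORE_THRESHOLD:
                # Ask the Wayback Machine to snapshot the page
                requests.get(f"https://web.archive.org/save/{url}")
                archived.add(item_id)
        time.sleep(300)  # poll every 5 minutes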
Sorry, the server wasn't big enough to handle everyone coming from HN. But it's working again.
BTW: The blog is also available via IPFS: /ipfs/QmdrFfK8yc6UkPK7aCcxAZzRM7K3ZP4JPfyjjbiHsHEDVH
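(For anyone without a local IPFS node: any public HTTP gateway can resolve that CID. Something like this, with ipfs.io as one example gateway:)

    import requests

    CID = "QmdrFfK8yc6UkPK7aCcxAZzRM7K3ZP4JPfyjjbiHsHEDVH"
    # ipfs.io is one public gateway; a local node serves the same
    # /ipfs/<CID> path on localhost:8080
    html = requests.get(f"https://ipfs.io/ipfs/{CID}").text
    print(html[:200])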
> Or maybe have HN serve a cached copy to offload the traffic?
I've suggested this before; the usual problem people bring up in response is that it deprives these websites (which run on advertising) of their ad impressions.
Not honestly sure what percentage of Hacker News users are browsing without an ad-blocker, though...
I'm not sure if you've ever visited https://lobste.rs, but it has this built in: it caches every submitted story. It's essentially Hacker News with a few tweaks and a bit more functionality.
It does. But without a CDN or other shared cache in front, those headers only govern each client's private cache. Since I've now visited the site, I have it cached, but that doesn't help the next unique visitor.
The site sets and respects cache headers, too. (And, I'll add, it makes only one¹ request aside from the main HTML, for some CSS; the site is refreshingly minimal, yet still looks nice.)
¹It also makes a request for MathJax, but that one gets blocked as mixed content, because it's made over plain HTTP while the site itself is served over HTTPS.
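If you want to check what caching a server actually permits, it's easy to inspect the headers yourself; a quick sketch (the URL is just a stand-in):

    import requests

    # Fetch only the headers, not the body
    resp = requests.head("https://example.com/post.html")
    for header in ("Cache-Control", "ETag", "Last-Modified", "Expires"):
        if header in resp.headers:
            print(f"{header}: {resp.headers[header]}")
    # "Cache-Control: public, max-age=..." lets a shared cache (CDN)
    # reuse the response across clients; "private" restricts reuse
    # to each browser's own cache.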