
The problem is scale. Beyond a certain scale it’s all a net negative: social networks, bitcoin, ads, machine learning, automated trading, big this big that, etc.

Unfortunately for fellow developers, software enables massive scale.


A lot of process and management is about dealing with low performers - by which I don’t mean incompetent people but people lacking motivation, or with the wrong intuitions, etc. Our hiring process can’t reliably filter out low performers, and when they get in it’s difficult to fire them, so we invent ways to raise the bottom line through processes.

And FWIW I don’t think you can solve this by always hiring the “best” either, at least not beyond a certain team size.


I agree in principle. However, JSON tooling has also gotten so good that other formats, when not carefully optimized and used correctly, can be worse than JSON. For example, IME stock protocol buffers can be slower than a well-optimized JSON library (as much as it pains me to say this).

Yeah the raw parse speed comparison is almost a red herring at this point. The real cost with JSON is when you have a 200MB manifest or build artifact and you need exactly two fields out of it. You're still loading the whole thing into memory, building the full object graph, and GC gets to clean all of it up after. That's the part where something like RX with selective access actually matters. Parse speed benchmarks don't capture that at all.

> The real cost with JSON is when you have a 200MB manifest or build artifact and you need exactly two fields out of it.

There are SAX-like JSON libraries out there, and several of them work with a preallocated buffer or similar streaming interface, so you could stream the file and pick out the two fields as they come along.
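As a rough illustration of that streaming idea (this is a hand-rolled sketch, not any particular SAX-like library's API): scan the file in fixed-size chunks and pull out just the wanted fields with a pattern match, never building the object graph. It assumes the wanted fields are string values with unique key names; a real streaming parser tracks nesting and escapes properly.

```python
import io
import re

def extract_fields(stream, wanted, chunk_size=1 << 16):
    """Yield (key, value) pairs for `wanted` string fields as they stream by."""
    patterns = {
        k: re.compile(r'"%s"\s*:\s*"((?:[^"\\]|\\.)*)"' % re.escape(k))
        for k in wanted
    }
    remaining = set(wanted)
    tail = ""
    while remaining:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        buf = tail + chunk
        for key in list(remaining):
            m = patterns[key].search(buf)
            if m:
                yield key, m.group(1)
                remaining.discard(key)
        # keep a short tail so a match straddling the chunk boundary isn't lost
        tail = buf[-256:]

# a ~200 KB document where we only care about two small fields
doc = io.StringIO('{"name": "demo", "big": [%s], "version": "1.2.3"}'
                  % ",".join("0" for _ in range(100000)))
found = dict(extract_fields(doc, {"name", "version"}))
# found == {"name": "demo", "version": "1.2.3"}
```

Peak memory here is one chunk plus a small tail, regardless of document size.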


You still have to parse half the entire file on average. Much slower than formats that support skipping to the relevant information directly.

yep, this is exactly the kind of use case that caused me to design this format.

as parser: keep only indexes into the original file (input); don't copy strings or parse numbers at all (unless the strings fit in the index width, e.g. 32-bit)

That would make parsing faster, and there would be very little in terms of a tree (JSON can't really contain full-blown graphs), but it's rather complicated, and it would require hashing to allow navigation, though.
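A toy sketch of the index-only idea (assuming a flat object with plain string values and no escape sequences, unlike a real parser): the "parse" records (start, end) offsets into the input for each value, and materializing a value is deferred until it's actually needed.

```python
import re

def index_strings(buf: str):
    """Return {key: (start, end)} offsets of string values in a flat JSON object."""
    spans = {}
    for m in re.finditer(r'"([^"]+)"\s*:\s*"([^"]*)"', buf):
        spans[m.group(1)] = m.span(2)  # offsets only, no string copy at parse time
    return spans

doc = '{"name": "demo", "version": "1.2.3"}'
spans = index_strings(doc)
start, end = spans["version"]
assert doc[start:end] == "1.2.3"  # decoded lazily, on demand
```

The navigation problem mentioned above shows up here as the `spans` dict: some hashed lookup structure over the offsets is needed to find a field without rescanning.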


yep. I built custom JSON parsers as a first solution. The problem is you can't get away from scanning at least half the document bytes on average.

With RX and other truly random-access formats you could even optimize to the point of not even fetching the whole document. You could grab chunks from a remote server using HTTP range requests and cache locally in fixed-width blocks.

With JSON you must start at the front and read byte-by-byte till you find all the data you're looking for. Smart parsers can help a lot to reduce heap allocations, but you can't skip the state machine scan.
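To illustrate the access pattern (this is a generic sketch, not the RX API): with a random-access format you can compute which fixed-width block holds the data you want and fetch only that block with an HTTP Range request.

```python
import urllib.request

def block_range_header(block_index: int, block_size: int = 4096) -> str:
    """HTTP Range header value covering one fixed-width block."""
    start = block_index * block_size
    return f"bytes={start}-{start + block_size - 1}"

def fetch_block(url: str, block_index: int, block_size: int = 4096) -> bytes:
    """Fetch a single block of a remote file; the server must support ranges (206)."""
    req = urllib.request.Request(
        url, headers={"Range": block_range_header(block_index, block_size)})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# block 2 of a file with 4 KiB blocks covers bytes 8192..12287
assert block_range_header(2) == "bytes=8192-12287"
```

Fetched blocks are fixed-width, so they cache cleanly on disk keyed by block index.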


Great post!

The minimized repro seems like something many other eBPF programs will do. This makes me wonder why such kernel issues weren’t found earlier. Is this code utilizing some new eBPF capabilities in recent kernels?


Thanks!

The new spinlock that the problem is in was introduced in kernel 5.15, which is relatively new. You also need to be hooking context switches, sampling at a high enough frequency to hit the problem, and using the ring buffer to emit those events. Outside of CPU profilers like ours, I don't think there are many other eBPF applications with this type of setup.


Sounds like what we need is Facebook pages, except as a free service from the government or non-profit.

Back in the day, there was this thing called the "Yellow Pages"! :-)

I believe the yellow pages were typically printed by private companies, often the telephone companies, so in a way Facebook is an apt comparison!

Did you need an account to read the Yellow Pages?

No, you received piles of them on your doorstep whether you liked it or not.

Wouldn't ISPs give you a bit of web space with your internet plan back in the day? (I'm too young to have been around for that but I've heard it used to be a thing)

Yes, but that's an ugly address tied to your provider. And you had to learn to build a website (in FrontPage?) and use FTP. Also, expectations of websites were different. They were allowed to be fun and didn't have to care about different kinds of devices, accessibility, and all these things.

Back in the day™ this worked somewhat, because people who were online had some level of technical interest; otherwise they wouldn't have used the Internet. The average restaurant owner doesn't have that interest. They like cooking or talking to customers at the bar, not doing web design. They probably only use the desktop/laptop for preparing numbers for tax purposes, unless they can fully outsource that.


Ah, fair enough

Do you have any justification in mind for the “free service” being funded by tax payers? Why should it be free for the people who need it, and why should tax payers fund it?

Such a proposal doesn't need justification. You can merely disagree.

Anyhow. The justification is that it is an important part of a communications infrastructure.

Just like the government finances roads, etc.


I'm not disagreeing with you, but shouldn't free Internet access come before that?

We should be making sure everyone has internet access, but hosting some basic pages is about 1000x cheaper, so no I don't think free internet access should come before that.

Internet access doesn't seem to be an issue.

Politics is also about making practical choices to advance humanity.


Converted to dollars, the value is far greater than the cost of a single bomb dropped on strangers that aren't a threat to me, so I don't need to justify it until someone can justify to me the bombs, the oil and gas subsidies, the bailouts, the...

>the value is far greater than the cost of a single bomb dropped on strangers that aren't a threat to me

Such a weird comparison. Just so we are tuned in, can you list some things that are of less value to you than a single bomb on a stranger?


My point is I don't want bombs dropped on strangers, so, in terms of things the government spends money on, there's nothing of less value to me than a single bomb dropped on a stranger. Of all the things the government spends its money on, I'd rather any one of those things take 100% of the budget than have even a penny go to dropping a bomb on a stranger, even if that significantly decreased my quality of life.

I just really don't like my government killing people far away that pose no threat to me.


> Do you have any justification in mind for the “free service” being funded by tax payers? Why should it be free for the people who need it, and why should tax payers fund it?

Because the government should provide useful services. It should be funded by tax dollars because I'm tired of libertarians, and it's well demonstrated that the free market has consumer-hostile incentives that I'm sick of.


Alright cool.

You're assuming the local government-employed webmaster won't favor his friends' restaurants.

Craigslist basically is this, and it's more or less free.


Forgive me for assuming that the government owned service would be more transparent/serve the people better than a privately owned, closed source, platform that's explicitly funded by ads and so is transparently corrupt. Even your worst case scenario for this would be equivalent to what we already have.

> Your assuming the local government employed webmaster won't favor his friends restaurants.

Oh my! Mic drop! You got me! Corporate-owned sites would have to be unbiased, right? It's not like a business would ever do something as disreputable as favoring a restaurant that paid for the favored treatment, or try to steer you to affiliated businesses. Inconceivable!

But seriously now: a government-run site would be way better and have fewer biases. In the US, there's a good chance it'd be run by civically minded people, and there's about zero chance that conflict of interest would be baked into its "business" model.


Trump would be running that right now if it existed.

One pattern I've seen is that a team with a decently complex codebase will have 2-3 senior people who have all of the necessary context and expertise to review PRs in that codebase. They also assign projects to other team members. All other team members submit PRs to them for review. Their review queue builds up easily and average review time tanks.

Not saying this is a good situation, but it's quite easy to run into it.


Can’t argue with the quote. However my current boss has been pushing this to the extreme without much respect for real-world complexities (or perhaps I’m too obtuse to think of a simple solution for all our problems), which regrettably gives me a bit of pause when hearing this quote.


Reminds me of another good one: Make everything as simple as possible, but not simpler. (-- probably not Einstein)


I know the world has moved on but like, use emails, man.


C++ dev here. It’s helping me just fine. Not as much as frontend folks but not far behind either.

It’s not quite at the place where LLMs can take over 100% coding, but give it a few more months.


To be clear, I never said or implied it wasn't helping C++ devs. I said I find it helps HTML/JS development more.

And I don't even think it'll stay that way, just that it's what I've seen so far.

That's anecdotal on my part, I apologize for that.


This is why I get turned off hearing about Rust (and I use it from time to time). You can say Rust is good without saying other languages are bad.

