I’m curious if anyone knows how Google internally works around the lack of support for gRPC in the browser? Perhaps gRPC is not used for public APIs?
The lack of browser support in the protobuf and gRPC ecosystem was quite surprising and one of the biggest drawbacks noted by my team while evaluating various solutions.
Back in the day, it wasn't used for private APIs either. Different teams had come up with different ways of encoding protobuf-style messages as JSON for web apps.
For the best browser-side performance, you usually want the browser's native JSON.parse(), and that doesn't really let you use unmodified protobufs. In particular, you can't use 64-bit ints, since that isn't a native JavaScript type, while server-side folks use 64-bit ints routinely. So if the server-side folks decided on 64-bit integer IDs, you need workarounds like encoding them as strings on the wire.
JavaScript has BigInt now, but still doesn't natively support decoding 64-bit integers from JSON.
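A minimal sketch of that string-encoding workaround (the payload and field names here are made up for illustration): the server writes int64 fields as JSON strings, and the client converts them to BigInt after parsing:

    // Hypothetical wire format: the server serializes the int64 `id` as a JSON string,
    // because JSON numbers round-trip through IEEE 754 doubles and are only exact up to 2^53 - 1.
    const body = '{"id":"9007199254740993","name":"example"}';

    interface UserWire {
      id: string;   // int64 on the server, string on the wire
      name: string;
    }

    const wire = JSON.parse(body) as UserWire;
    const id: bigint = BigInt(wire.id);   // exact 64-bit value

    console.log(id);              // 9007199254740993n
    console.log(Number(wire.id)); // 9007199254740992 -- precision already lost as a plain number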
It didn't seem like the gRPC folks understood the needs of web developers very well.
Is decoding performance typically a problem for web UIs? The lackluster performance of binary protobuf decoding in browsers (and unmarshaling BigInts from JSON) seems much less problematic than (1) using a 200 for unary error responses, (2) choosing a wire format that's _always_ opaque to the network inspector tab, and (3) having really poor generated code.
> It didn't seem like the gRPC folks understood the needs of web developers very well.
Agreed. Being fair to the team that designed the protocol, though, it seems like browsers weren't in scope at the time.
Quite often I find that Man suffers only because a second Man finds joy in the suffering of the first. Detachment will serve you well in alleviating the suffering that would otherwise arise from the uncaring, undirected whims of the universe, but will leave you unprepared to deal with willful and directed malice.
In the spirit of curiosity and playfulness I would like to respond with a bold claim:
The distinction between self and other is simply a concept. Certainly useful in many situations, but ultimately just one perspective - one tool at the mind’s disposal.
One could reasonably argue this is the central teaching of Buddhism. Alan Watts helped articulate these ideas to western audiences in many of his lectures:
“You see, the point is that an enormous number of things are going on inside us of which we are not conscious. We make a very, very arbitrary distinction between what we do voluntarily and what we do involuntarily, and we define all those things which we do involuntarily as things that happen to us rather than things that we do. In other words, we don’t assume any responsibility for the fact that our heart beats, or that our bones have such and such a shape.
If you become aware of the fact that you are all of your own body, and that the beating of your heart is not just something that happens to you, but something you’re doing, then you become aware, also—in the same moment and at the same time—that you’re not only beating your heart, but that you are shining the sun.”
I will respond with a bold claim of my own: vague platitudes such as this are a great way to instill a sense of learned helplessness that rationalizes suffering as "just the way the world is" while failing to acknowledge that the world can be changed to reduce the causes of suffering and not merely reduce the personal effect of suffering. This helplessness is perpetuated by those in positions of power as a means of effective societal control, which is to say, those who would stand to lose the most if their adherents realized the truth about who is the source of their suffering.
I’ll bite. The data/graphs look suspicious and convincing enough, so WTF did happen?
I was hoping for an answer but alas there was none. Anyone have plausible theories? Is this an unexplained mystery or just an artifact of S curve growth?
A lot happened around that time: Technology and automation really started taking off, sending productivity way up. Global shipping costs plummeted, ushering in a new era of international trade. The first personal computers started entering the market. Trade relations with China opened up around that time. Households started moving from single-earner to dual-income.
It was a time of rapid change. That particular website is usually used to suggest that the only thing that changed was the gold standard, but it's been debunked and refuted all across the internet.
Still begs the question... why would we see inflation when all of those trends are deflationary? Can't possibly be because the Fed is printing money out of thin air...
I think the gold standard ended because of globalization. After all, gold wasn't widely distributed enough to allow a global economy to develop, so the USD had to decouple from gold.
Other sources -- sorry no links -- ascribe the genesis of such "social rules" to a US WW II norm of austerity among elites, who might otherwise have found ways to seize a greater share of the economy's wealth for their own dissipative pleasures.
Or the threat of Communism, which was still at least officially trying to inspire revolution around the world. If you have someone trying to inspire your workers to revolt, you'd better be giving them a better deal than the other guy is offering.
I think it's more what happened in 1980 with Reaganomics and the Volcker Fed, followed by Clintonomics and the shift towards "third way" neoliberalism. We've actually had 40 years of what I'd call Republican free-market economic policies and the decimation of unions and of the average worker's power to get a higher nominal wage. Most of the trends start closer to 1980.
> to produce one litre of petrol it takes 1.29 kg of oil, of which 85% (1.1 kg) is carbon. And as only 1/10,750 of the carbon remains from the plants that were buried millions of years ago, our one litre of petrol is the result of 1.1 x 10,750 = 11,825 kg of carbon from ancient plants. Finally, as plants are approximately half carbon, that means that 23.65 tonnes of plants were required to make just one litre of the petrol available at your local station
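For anyone who wants to check the quoted arithmetic, here is a quick back-of-the-envelope version using only the numbers in the quote (it lands on ~23.6 t rather than 23.65 t just because the quote rounds 1.0965 kg up to 1.1 kg before multiplying):

    // Back-of-the-envelope check of the quoted petrol-to-ancient-plants figures.
    const oilKgPerLitre = 1.29;           // kg of oil per litre of petrol
    const carbonFraction = 0.85;          // ~85% of that oil is carbon
    const survivalFraction = 1 / 10_750;  // fraction of ancient plant carbon that ends up preserved
    const plantCarbonFraction = 0.5;      // plants are roughly half carbon by mass

    const carbonKg = oilKgPerLitre * carbonFraction;        // ~1.1 kg of carbon per litre
    const ancientCarbonKg = carbonKg / survivalFraction;    // ~11,800 kg of ancient carbon
    const plantTonnes = ancientCarbonKg / plantCarbonFraction / 1000;

    console.log(plantTonnes.toFixed(1)); // "23.6" tonnes of ancient plants per litre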
Thanks! I was going to make a similar comment myself but HN isn’t too keen on these ideas yet. Glad to see other like-minded folks here.
The Wikipedia page for plasma cosmology is extremely biased, so I suggest anyone interested in these topics should read the previous comment I made and some of the resources I linked there.
Isn't that just the nature of exploring things that are hard to experiment/interact with? (It would be difficult to directly speed up time on cosmological scales or manipulate galactic clusters for experimentation)
How Wikipedia explains consensus:
> The term non-standard is applied to any theory that does not conform to the scientific consensus. Because the term depends on the prevailing consensus, the meaning of the term changes over time. For example, hot dark matter would not have been considered non-standard in 1990, but would be in 2010. Conversely, a non-zero cosmological constant resulting in an accelerating universe would have been considered non-standard in 1990, but is part of the standard cosmology in 2010.
This is a thoughtful comment but IMO your phrasing is too strong and potentially misleading.
As far as science is concerned, the jury is still out on stellar nucleosynthesis. The leading theory - that gravitational collapse causes fusion near the core - may be entirely incorrect, and there is mounting evidence suggesting so, including a plethora of modern observations that the theory fails to explain.
For example, the so-called 'coronal heating' problem has existed for 75 years, since the solar corona was first demonstrated to contain plasma with temperatures of 1 million kelvin and above, much higher than the photospheric surface temperature of approximately 6000 K [1].
As another stunning counterexample, current gravitational models do not explain observed properties of the solar wind like its spatial variation and periodicity in time (thanks, Voyager 1 & 2), among many other things.
See [2] for a dense intro to more unresolved problems in solar physics.
Alternative (non-mainstream) theories like “plasma cosmology” are gaining traction in the scientific community for this reason. Note, though, that many papers do not use that particular phrase to describe their work.
Unfortunately (for intellectually curious folks), the contributors to Wikipedia pages on these physics topics are extremely … biased. Reading [3] may leave you with the impression that “plasma cosmology” has been completely debunked by the scientific community, when in fact nothing could be further from the truth ([4][5] are examples to support this claim).
JS engines have had many serious vulnerabilities in their JIT optimizers, it’s not myopic at all and is a well known technique in the industry.
I agree that some folks aren’t executing untrusted ruby code so they wouldn’t have to worry about this - but how many PaaS/SaaS products out there are? Or how about third party dev tools that are blindly downloaded and executed on local workstations or CI pipelines?
> JS engines have had many serious vulnerabilities in their JIT optimizers, it’s not myopic at all and is a well known technique in the industry.
HotSpot and V8 are both written in C++ and get more use than any other JIT on Earth.
Can you provide a link to a CVE caused by JIT miscompilation and explain how Rust would have been able to prevent the bug in a way that C++ wouldn't?
> I agree that some folks aren’t executing untrusted ruby code so they wouldn’t have to worry about this - but how many PaaS/SaaS products out there are?
This is what Xen, KVM, and Hyper-V do.
> Or how about third party dev tools that are blindly downloaded and executed on local workstations or CI pipelines?
Are you suggesting a Ruby JIT shouldn't generate machine code that corresponds to the Ruby program, but somehow magically prevent stupid developers from doing stupid things?
Adding to this great comment with my own experience at work, where we extensively use signal processing and time-series compression algorithms to visualize large amounts of biological signals in the browser.
In our benchmarks and tests, WebAssembly is within 10-20% of native code performance for algorithms like the fast wavelet transform and bit packing.
WASM lets us aggressively optimize ahead of time when compiling, and be less sensitive to JS engine performance pitfalls around JIT optimization and garbage collection.
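To give a feel for the general shape of that setup (the module name and exports below are hypothetical, not our actual code): the numeric kernel is compiled to WebAssembly ahead of time, and the TypeScript side just shuttles typed arrays through the module's linear memory:

    // Hypothetical example: calling an AOT-compiled WASM kernel from TypeScript.
    // "dwt.wasm" and its exports (alloc, forward_dwt) are illustrative names only;
    // alloc is assumed to return an 8-byte-aligned byte offset into linear memory.
    async function runForwardDwt(samples: Float64Array): Promise<Float64Array> {
      const { instance } = await WebAssembly.instantiateStreaming(fetch("dwt.wasm"));
      const { memory, alloc, forward_dwt } = instance.exports as {
        memory: WebAssembly.Memory;
        alloc: (byteLen: number) => number;               // returns a byte offset into linear memory
        forward_dwt: (ptr: number, len: number) => void;  // transforms the buffer in place
      };

      // Copy the input into the module's linear memory...
      const ptr = alloc(samples.length * Float64Array.BYTES_PER_ELEMENT);
      new Float64Array(memory.buffer, ptr, samples.length).set(samples);

      // ...run the transform with no JavaScript in the hot loop...
      forward_dwt(ptr, samples.length);

      // ...then copy the result back out for rendering.
      return new Float64Array(memory.buffer, ptr, samples.length).slice();
    }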
Throwing in my data: for physics simulation code, WASM vs native was identical, provided the native code was compiled with automatic SIMD optimizations disabled. On the one hand, that speaks well of the idea of WebAssembly; on the other, disabling SIMD is highly artificial. It's good to see basic SIMD being made standard, though the 256-bit-wide SIMD instructions are still being finalized, and those will be necessary to really have a chance at evening out performance vs. native.