> In summary: our modern computers' sheer power is camouflaging poor software performance. The difference between using native and Electron apps is a huge reduction in the upper limit of how many things you can do at the same time on your machine, or a lower ceiling on how many heavy-load tasks your system can handle before it breaks.
The same can be said about a lightweight web page versus a React SPA with tons of routers and a vdom. Maybe the page is fine when it is the only one open, but when other SPAs are also open, even typing becomes sluggish. Please don't use modern computers' sheer power to camouflage poor software performance. Always make sure the code uses as few resources as possible.
That brings to mind a Python "performance" talk I was recently listening to on YouTube. The first point the presenter brought up was that he thinks developers' laptops need to be more modern so that Python doesn't feel so slow. I had to stop the video right there, because that attitude isn't going anywhere.
You know what? I actually believe in having developers work (or maybe just test) with slower computers (when writing native apps) or with crippled networking (when doing web work) in order to force them to consider the real-world case of not being in a comfy office with top-notch computers and ultra-high-bandwidth connections for testing.
I agree with this approach. I used to always have hardware that was no more than two years old and mid-to-high spec. When I helped troubleshoot my family's and extended family's devices and internet connections, I saw how normal people suffer on slow systems and networks. I have since switched to older devices and don't have gigabit internet at home. Every web and app designer should have to build or test under constraints like these.
I think dev containers can help here. You have a laptop that can run your editor and a browser. The actual build is done on a remote machine, so we're not kneecapping you by making you compile Kotlin on a mid-range machine, but your laptop still needs to be able to run the site.
Heheh, no. I'm in my 30s. My opinion comes from experience. I like to travel a lot, and several trips have brought me to places where the norm is a subpar connection. Taking 30 seconds to load the simplest bloatware-infested blog that doesn't even display text without JavaScript enabled teaches you a thing or two about being judicious with technology choices.
This is giving me flashbacks to editors of yore: EMACS, Eight Megabytes And Constantly Swapping. I remember reading almost exactly the same comments on Usenet in the 80s and 90s.
It’s also 2024 and you still can’t share JavaScript objects between threads. Do not underestimate the horror that is tracing garbage collection with multiple mutator threads. (Here[1] is Guile maintainer Andy Wingo singing praises to the new, simpler way to do it... in 2023, referring to a research paper from 2008 that he came across a year before that post.)
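To make the first point concrete: in Node's worker_threads, postMessage and workerData structured-clone objects rather than share them; the only true sharing is raw bytes via SharedArrayBuffer. A minimal sketch (the self-spawning setup is mine, for illustration):

```typescript
// Demonstrates that a JS object passed to a worker is cloned, not shared:
// mutations in the worker never touch the original. Only the bytes of a
// SharedArrayBuffer are actually shared between the two threads.
import { Worker, isMainThread, workerData } from "node:worker_threads";

if (isMainThread) {
  const obj = { n: 0 };
  const shared = new SharedArrayBuffer(4);
  const view = new Int32Array(shared);

  const worker = new Worker(new URL(import.meta.url), {
    workerData: { obj, shared }, // obj is structured-cloned; shared is not copied
  });

  worker.on("exit", () => {
    console.log(obj.n);                 // 0  -- the worker mutated its clone
    console.log(Atomics.load(view, 0)); // 42 -- the shared memory did change
  });
} else {
  workerData.obj.n = 42;                                   // mutates the copy
  Atomics.store(new Int32Array(workerData.shared), 0, 42); // mutates shared bytes
}
```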
That’s not entirely surprising. Emacs’s UI is a character-cell matrix with some toolkit-provided fluff around it; VSCode’s is an arbitrary piece of graphics. One of these is harder than the other. (Not as much harder as VSCode is slower, but still a hell of a lot.)
Getting the same number of engineers as today, or possibly fewer, who actually care and know about performance could work. There’s a reason applications are, relatively speaking, so much slower than they were in the 80s. It’s crazy.
Anyone who believes this can prove it by taking down an existing popular product with a better-engineered, better-performing competitor built for the same cost.
I was using computers in the 80s. They did a very small fraction of what we ask them to do now and they didn't do it fast.
> Some kind of principle of social stability has kicked in.
The same goes for finance. For example, SIFIs (systemically important financial institutions) are effectively exempt from regulation and even bankruptcy. It is human nature to manage/avoid risk, and the system will always want to maintain itself. It is kind of pointless to point fingers and say, oh, this company/nation/regime censors this. Everyone does it in their own way.
>There is more money to be made by stakeholders treating health problems than there is in preventing them. And so there is a lot more money and effort invested in developing pharmaceutical and treatment solutions than there is in preventing them – even though that would be far, far better for society as a whole.
The health care system in the US is very broken. This paper [1] outlines it really well. It's not just that more money is made treating health problems than preventing them. Because of how health insurance is structured in the US, there is also an incentive for doctors to prescribe the more expensive treatment instead of the cheaper, more effective one.
And how did email become the de facto verification method? 2FA like TOTP or a security key should be the standard. It can be used offline and requires no connection to shady corporations.
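For anyone curious why TOTP works offline: both sides just hash a shared secret with the current 30-second time step (RFC 6238), so no third party is involved at verification time. A minimal sketch on Node's crypto; real implementations use base32 secrets, tolerate clock drift, and compare codes in constant time:

```typescript
import { createHmac } from "node:crypto";

// Minimal TOTP (RFC 6238) sketch, assuming a raw binary shared secret.
function totp(secret: Buffer, timeStepSeconds = 30, digits = 6): string {
  // Counter = number of time steps since the Unix epoch.
  const counter = Math.floor(Date.now() / 1000 / timeStepSeconds);

  // Encode the counter as an 8-byte big-endian message (RFC 4226).
  const msg = Buffer.alloc(8);
  msg.writeBigUInt64BE(BigInt(counter));

  // HMAC-SHA1 of the counter under the shared secret.
  const hmac = createHmac("sha1", secret).update(msg).digest();

  // Dynamic truncation: low 4 bits of the last byte pick an offset.
  const offset = hmac[hmac.length - 1] & 0x0f;
  const code =
    ((hmac[offset] & 0x7f) << 24) |
    (hmac[offset + 1] << 16) |
    (hmac[offset + 2] << 8) |
    hmac[offset + 3];

  return (code % 10 ** digits).toString().padStart(digits, "0");
}

// Client and server derive the same 6-digit code from secret + time alone.
console.log(totp(Buffer.from("12345678901234567890")));
```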
That's because there is no such thing as "truly secure", there can only be "secure under an assumed threat model, where the attacker has these specific capabilities: ...". I agree that software engineering is getting away with chaos and insanity compared to civil or other engineering practices, which have to obey the laws of physics.
Reminds me of the One World Trade Center rebuild, and "if you want a 747-proof building, you're building a bunker".
Translate the internet to the real world, and basically every building (IP address) is getting shot at, hit by planes, nuked, bioweapons are stuffed into the mail slot, and lock-picked all day, every day.
This study would make a lot of sense; it's why many studies in the area have found that talk therapy in conjunction with medication is far more effective, with less relapse, than medication alone. Dr. Grof has done quite a bit of work in that area.
Personalised brain circuits require personalised therapy, and talk therapy with a well-trained therapist seems to be the key to treating the cause instead of the symptom. All the different fancy medications are "throw stuff against the wall and see what works".
The parent is asking how Bun compares to Deno, not Node. TSC, tsx, and ts-node are Node.js-related. I also would like to know. Deno seems mature and ready; it runs .ts files from the command line. Of course, there is no point in picking one JS runtime over another. I would follow the WinterCG group and use something runtime-agnostic like Hono. It already has a React-like front end built in (lightweight JSX).
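To illustrate the runtime-agnostic point: Hono is built on the web-standard Request/Response types that WinterCG runtimes share, so (as far as I can tell) the same app object can be served by Bun, Deno, or Node with at most a small adapter. A rough sketch:

```typescript
import { Hono } from "hono";

const app = new Hono();
app.get("/", (c) => c.text("Hello from any runtime"));
app.get("/health", (c) => c.json({ ok: true }));

// Bun: exporting the app as default is enough; Bun serves its fetch handler.
// Deno: Deno.serve(app.fetch);
// Node: use the @hono/node-server adapter, serve({ fetch: app.fetch }).
export default app;
```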
I don't think Bun "definitely can" run most Node code. Both Deno's[1] and Bun's[2] Node compatibility are incomplete. Bun being perceived as more compatible is mostly clever marketing.
From the beginning, Bun was designed to be a drop-in replacement for Node.js. That’s why Bun implements Node’s globals. That’s also why Bun automatically detects when CommonJS is used in the entry point and ensures CommonJS is loaded. require and many other Node.js features “just work” in Bun.
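An easy way to probe that claim yourself: hand Bun a plain CommonJS file that leans on Node globals and node: builtins and see whether it runs unchanged. A hypothetical check (file name mine, and it only exercises modules Bun is known to implement):

```js
// check.cjs -- run with `bun check.cjs` and with `node check.cjs`;
// under Bun's compat layer the output should match Node's.
const path = require("node:path"); // CommonJS require resolves node: builtins
const os = require("node:os");

console.log(__dirname, __filename);           // Node-style module globals
console.log(path.join(os.tmpdir(), "x.txt")); // builtin APIs behave as in Node
process.exit(0);                              // process global is present
```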