Feels fairly simple. A decent chunk of people always disliked SPAs, due to their very obvious disadvantages. Another chunk of people always liked them, either due to novelty or to their, again, fairly obvious advantages.
Depending on who you talk to or what online forums you frequent, you may have had an impression that the first group was far smaller (or larger) than the second group. Due to changes in the media you consume, or changes to the culture of various online forums, it may seem to you like the first group is now larger (or smaller!) than the second instead.
On the margin I suppose a few people who liked SPAs due to the novelty may have got bored with them, but most people, I think, have had pretty static views about SPAs, because the pros and cons are pretty obvious.
Yeah, okay, some prominent engineer tweeted something bad about SPAs last month and it went semi-viral, but prominent engineers have been tweeting and writing bad things about SPAs for the entire existence of SPAs. And more than a few of those have gone pretty viral.
A better question might be "How come I saw this tweet on my feed, but didn't notice all the earlier ones?", but I don't think that's super interesting.
Google Maps was a justified SPA. But when I’m looking at a normal-ish web page and can’t open hyperlinks in new tabs because they aren’t actually hyperlinks, or the back button doesn’t go back, you’ve broken fundamental navigation. Back and forward shouldn’t refresh lists and reset scroll.
Too often SPAs completely reinvent browser navigation within themselves. And I know the argument is that it’s indistinguishable, but from a performance standpoint, you’re now running a second copy of browser functions in RAM/CPU for every open tab.
The real answer to why everyone hates SPAs now is that there are well funded companies providing frameworks that are touting their SSR capabilities as a massive advantage, seeding the internet with content about how SPAs are the worst.
Something similar happened about a decade ago when everyone started loving SPAs.
SPAs didn’t become popular because of cell phones. In fact, outside of Facebook for a short period of time, there has been a strong consensus over the entire “everything must be a SPA” period that mobile apps should be native. Facebook, once it realized its mistake, tried to fit its massive “build once run anywhere” peg into a tiny “native for mobile, HTML5 for the web” hole through React Native, not SPAs.
SPAs have their place. Although I’m not a huge fan (though less of a hater now, thanks to the fact that at least the JS world has largely agreed that typing is good), I have a highly successful SPA that’s been in use, with almost no maintenance besides basic security upgrades, for nearly 7 years now.
The reason is twofold. First, we used a proper framework that fully supports building a SPA, EmberJS, though it could have been anything other than trying to cobble together a variety of react-* libraries that only barely work together. Second, our application actually was a heavily interactive, user-driven application with most of the activity happening on the same page. There are only a few minor other pages (such as settings), so the application itself is largely a single page, which is why a SPA fits it so well.
I’m increasingly convinced that so much of the trend following we’re seeing in the front end development world today is a result of the fact that ReactJS absolutely burst onto the scene when it was released because it had some game-changing ideas, but never had the courage to become a framework.
Instead, it insisted on being a thin UI layer forcing others to fill the gaps, which has led to all sorts of startups/open source projects popping up trying to convince others that theirs is the one true way.
So instead of being a sober discussion around the pros and cons of using certain technologies in certain scenarios, it’s being recast to resemble the fashion industry where everyone is insisting that their trend is the next big thing.
> So instead of being a sober discussion around the pros and cons of using certain technologies in certain scenarios, it’s being recast to resemble the fashion industry where everyone is insisting that their trend is the next big thing.
If they were still easily accessible, you could go back and read identical drama play out on 50 year old usenet newsgroups and mailing lists. Our industry is built on the genius of many clever tinkerers who could get any tool to meet any task, and plenty enough stubborn contrarians and manic hype-builders ready to argue to their death (or until the next fashion comes through) that the one true light has finally been glimpsed.
Meanwhile, countless quiet and productive teams privately figured out how to use a tool well and just stuck with it for as long as it still made sense, not letting themselves get so wrapped up in the public hullabaloo -- just like you did with your EmberJS project.
The fashion debates reflect that a lot of work is actually art, while those many quiet, healthy projects reflect the engineering processes that see us get paid better than most other artists.
I’ve viewed it as backlash for the way they were promoted heavily as the solution for web development based on claims about being faster or more reliable. One of the weird things about that period was that people would say something about performance, you could point to WebPageTest.org or RUM stats showing the opposite was true, and have people just trying to handwave around it. After a decade of those problems not going away, I think a lot of people have been realizing that there is something fundamental at work.
I’d like to think that as a field we’ve learned something about measuring claims or not picking your architecture based on what someone in a different business with different users is doing, but experience suggests otherwise.
There is nothing sudden about it: SPAs are yet another one of the so-called "modern web" cancers of the Internet. They go completely against the protocols and architecture they run on, and the only reason they became so popular to begin with is all the hype from big business, like so much else.
A web application or a website, by its very nature, is a series of small requests. When someone sends all requests in one big lump of a page, they have just defeated this very important architecture. Not to mention that they break searching on pages and a whole lot of other things.
> A web application or a website, by its very nature, is a series of small requests. When someone sends all requests in one big lump of a page, they have just defeated this very important architecture.
This whole take is just bizarre. Literally none of this matters to users except to the extent that it affects performance. While I would agree that many or most SPAs are implemented extremely poorly and inefficiently for reasons related to what you're talking about, and that this poorness spans both technical faults and UX problems, it's a very myopic view to complain that the foremost function of the web should be to rigidly adhere to a particular set of architectural patterns that are in flux and often outdated by modern standards.
Being able to bookmark deep links matters. I've worked places where users complained they couldn't bookmark their work because everything is just "the page".
That's the fault of the application developers, not SPAs in general. SPAs support routing. For instance, React Router uses the browser's history API to add the navigated pages to the history and to change the path the user sees; if a user then bookmarks that page, when they reopen it React Router and co. should take them back to the same area of the app they were in before.
Not defending SPAs, just pointing out that they are not to blame for this.
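To make the routing point concrete, here's a minimal sketch assuming React Router v6; the route paths and components are invented for illustration:

```tsx
// Bookmarkable routes in a SPA, assuming React Router v6.
import { BrowserRouter, Routes, Route, Link } from "react-router-dom";

function Dashboard() {
  return <h1>Dashboard</h1>;
}

function Settings() {
  return <h1>Settings</h1>;
}

export default function App() {
  return (
    <BrowserRouter>
      {/* Real <a href> links: middle-click, copy, and bookmark all work. */}
      <nav>
        <Link to="/">Dashboard</Link> <Link to="/settings">Settings</Link>
      </nav>
      <Routes>
        {/* On a bookmark or reload, the router matches the URL and
            restores the same view, as described above. */}
        <Route path="/" element={<Dashboard />} />
        <Route path="/settings" element={<Settings />} />
      </Routes>
    </BrowserRouter>
  );
}
```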
If your web page has to reimplement basic web browser functionality, like scrolling, or the back button, or linking, you have strayed far from the path to wisdom.
>Literally none of this matters to users except to the extent that it affects performance.
Yeah, just render my browser's back button dysfunctional, fam. Doesn't matter to me at all. Take all my control away and lock me into your walled garden of totally custom, approved-only interactions.
Do you care about your back button because it's an architectural principle of the web, or because it provides a useful function? Go back and read my comment carefully and you'll probably realize you're arguing against a strawman.
Ask any user what "hypertext" is and you'll get a confused look; similarly, give them a black-on-white, no-CSS page and they'll think it looks quite ugly and isn't a good experience. SPAs are commonly horrible to use, but restricting everything to the model of the internet as it existed in the 80s is not a good thing either.
Just because my mother-in-law doesn't use the word hypertext doesn't mean she doesn't know how to use a link, or that she isn't frustrated when some sites break them.
I'm not an expert, but it looks like things are starting to move towards a nice optimum. Many of the hot new frameworks do server side rendering and only send JavaScript to the client as needed.
I think you misunderstand what an SPA is. It also sends a series of requests as you navigate the website. It doesn't break search because everything is rendered to HTML.
It does break search. A traditional SPA returns to the client an empty HTML page (except for header stuff) and a script to parse. The script then populates the DOM with the elements. The problem is that crawlers are not really good at waiting for the "render" to happen, and running JavaScript just to see what's on the page opens a whole other can of worms. SPAs are awful for search engines, and that's why we get the hybrid monsters that Next.js, Nuxt, SvelteKit and co. produce nowadays (not saying they are bad from a user perspective though; I much prefer web apps made with them over client-side rendered SPAs).
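Concretely, this is roughly what a crawler that doesn't run JavaScript gets back from a client-side-rendered SPA (a sketch; the URL is made up):

```ts
// Fetch an SPA route the way a non-JS crawler would: just the document.
const res = await fetch("https://spa.example.com/articles/42");
const html = await res.text();
console.log(html);
// Typically little more than:
//   <html><head>…</head>
//   <body><div id="root"></div><script src="/bundle.js"></script></body></html>
// The article's content only exists after bundle.js executes, which a
// crawler that doesn't run JavaScript never sees.
```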
We spent the last decade trying to fix the whole mess that client-side "rendering" created on the web, and we'll probably spend the next decade trying (and failing) to standardize things such as web components, declarative shadow DOM, among other things. That's because we as developers keep insisting on complexity: hydration, SPAs, bundlers (and in consequence bundle bundlers such as Vite). The only true solution is to bet on simplicity, but that's not sexy, and sex sells.
In conclusion: the React revolution and its consequences have been a disaster for the human race.
The SEO argument is a strawman. Large swaths of the Internet return a blank page if you try to access them with JS turned off. Are you saying search engine crawlers haven't figured out how to deal with that?
There is nothing inherent in SPAs that makes load times slower. If your REST API takes 5 seconds to serve up JSON, what's to say your SSR-rendered page won't also take 5 seconds?
> Are you saying search engine crawlers haven't figured out how to deal with that?
Except for Google, which has spent a tremendous amount of effort on that, yes I am.
> There is nothing inherent in SPAs that makes load times slower.
I never said there was. The difference is only for crawlers, which now have to run JavaScript, whereas before they could just retrieve the document and index it as it was served, instead of trying to determine when the SPA has finished loading the page.
Not convincing. Even a PHP page wouldn't be indexed as it was served if it loads JS scripts. And if your page isn't running JavaScript, comparing SPA and SSR is moot. One could just use one of the hundred-odd HTML templating tools to produce a static page.
To this day the Google crawler is the only one that gets even close to good search results for SPAs. That's because they have to run a full browser to load every page just to parse it. Every. Single. Page. That's why it is hard to do correctly. Read a bit about their current approach here:
> Keep in mind that server-side or pre-rendering is still a great idea because it makes your website faster for users and crawlers, and not all bots can run JavaScript.
Recently, in 2019, Bing Bot gave up and started using the same approach as Google:
So nowadays having a SPA isn't as bad for SEO as it used to be, but again, and I'll keep hammering on this, that is only because we spent the entire last decade throwing millions of dollars and hundreds of hours of engineer time at making SPAs a viable option.
Saying that an SPA returns an empty HTML page is a relic of the old days. If you're already aware of Next, Nuxt, etc., then you know that the most popular frameworks of each ecosystem already support SSR, whether you're building an MPA, a SPA or a hybrid.
Next basically is React and the React team implicitly admits this.
Traditionally Next.js isn't really a tool for developing a single page application (although you could CSR every page if you wanted to), since single page applications are... single page. These are all multi-page full-stack frameworks that make use of incremental static regeneration and hydration to get the best of both worlds.
We agree though, it's just a naming thing really. I make the distinction like this because traditionally a SPA can be served straight from a CDN since it's just a single file, whereas all of these frameworks either require a server to render pages on request or prerender all the applicable pages and leave a few of them to totally client-side "render".
Also note what Vercel (at the time called Zeit, mostly known for Now) called Next.js back in 2018 when they originally released it:
> A web application or a website, by its very nature, is a series of small requests. When someone sends all requests in one big lump of a page, they have just defeated this very important architecture.
I have a hard time seeing this as how most SPAs work.
I see this as how most SSR systems work. Which does seem broken & bad. A user has no potential to understand this world. It's just a mainframe generating some pre-rendered experience that has no real data inside of it.
So actually, no. I think SPAs have moral value. The alternatives that have sprung up recently seem bankrupt, with no legs to stand on. I don't understand where this post comes from. There's an extremely hostile, negative, high-bullshit, pissy attitude here that isn't backed up by a single thing I can recognize:
> There is nothing sudden about it: SPAs are yet another one of the so-called "modern web" cancers of the Internet.
Most SPAs seem to be the last defender of doing what you asked for?
I've built so many SPAs in virtually every popular framework.
I'm over it. Even the simplest SPAs end up being way more effort than they are worth IMO. I'd write thin API layers and it seemed great.
Recently though, I moved to all backend with some jQuery. I render all the views on the backend, and if they need to be swapped out I make a simple ajax call to pull the rerendered data and swap the div. It's like 2 lines of code for me.
Everything is so much faster, primarily because JS has so many gotchas. And so many times where you wonder "wait, why isn't the data right here? Shouldn't this have blown up?"
There is no appreciable difference to the user here but I can work in a backend lang and sparingly use JS and it's a pleasure.
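A minimal sketch of that pattern, assuming jQuery is loaded on the page and a hypothetical /fragments/orders endpoint that returns server-rendered HTML:

```ts
declare const $: any; // jQuery, assumed present globally on the page

function refreshOrders(): void {
  // Ask the backend to re-render just this view's HTML fragment...
  $.get("/fragments/orders", (html: string) => {
    // ...and swap it into the page. No client-side templating involved.
    $("#orders").html(html);
  });
}
```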
> Recently though, I moved to all backend with some jQuery. I render all the views on the backend, and if they need to be swapped out I make a simple ajax call to pull the rerendered data and swap the div. It's like 2 lines of code for me.
Sounds like one of those bad SPAs in the making. You know, without browser navigation, without a URL that can be revisited.
SPAs certainly were not a "zero interest rate phenomenon".
It was early in the 2010s that I saw all the web design shops who made Ruby on Rails sites in my small town suddenly start building all their apps in Angular and struggling.
I'd ask them why they were doing this, and it wasn't masochism; rather, they believed that small-town web clients demanded SPAs and that they couldn't sell user-friendly RoR apps that were affordable to develop.
Zero interest rate phenomenon is building native everywhere. iOS, Android, Windows, MacOS, Linux, web. Got money to burn? Build it 4, 5, or 6 times! Then keep it up to date! Even if you're a rocking engineering org, that's an ouch.
Web apps/SPAs/PWAs can be a great value choice if you need to hit a lot of platforms on a budget.
Angular definitely started out as kind of a dog. Some modern frameworks I'd put up there with Ruby, maybe better. Don't chase the hype. Lots of great tooling out there. Prove something out before you run with it. It's not THAT hard.
I built my first SPA for a company I worked for in 2006! It was funny trying to explain to the lead engineer that it was possible to build a web app without page refresh. He didn't believe me at first.
And that was already a while after Outlook Web access, and even later than people using hidden iframes to simulate XHR before XHR existed. They might not have been common, but they're old by now.
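For the curious, the hidden-iframe trick looked roughly like this (a sketch; the endpoint and element IDs are made up, and it only works same-origin):

```ts
// Fetch data through a hidden iframe instead of XMLHttpRequest.
const frame = document.createElement("iframe");
frame.style.display = "none";
frame.src = "/messages?since=12345";
frame.onload = () => {
  // Read the document the server sent back and update the page in place.
  const payload = frame.contentDocument?.body.textContent ?? "";
  document.querySelector("#inbox")!.textContent = payload;
};
document.body.appendChild(frame);
```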
A well done SPA is pretty much indistinguishable from server-side rendering. Perhaps even better. They use real links which you can copy and middle-click, but if you regular-click, JS hijacks the event, changes the page without a full page reload, and then inserts a history event, so it’s just like a normal link but slightly better.
The hate for single page apps is about bad ones. Which are unfortunately common.
Except that it's not just like a normal link. It's a button dressed up as a link. A translinkingbutton, very current in this day and age; I hope I don't offend any buttons.
What happens if you press shift? Or cmd? And in which browsers/OSes? What if you use enter to open links; will it have a tabstop? What about accessibility? You're in essence re-creating many parts of the web browser.
> The hate for single page apps is about bad ones. Which are unfortunately common.
It’s not a button. It’s an actual a tag with an href. Additionally it has a click handler. If you open the link in any way other than clicking, it uses the href like a normal link.
Things like React Router make it trivial to create real links that can additionally change the page without a reload when they are clicked. But you can still middle-click to open in a new tab, still hover to see the link, still navigate history normally, etc.
You don't really seem to understand how the History API works. They're normal anchors with a handler attached. When the handler is executed, the transition to another page is handled client-side. If you open the link in another tab, or bookmark it, or deep link in any other way, the SPA loads in its entirety and shows the right content anyway. Most of the time, with lazy loading and caching, the actual number of bytes on the wire is less than serving a whole server-generated page of content.
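A vanilla sketch of what libraries like React Router do under the hood; renderRoute() is a hypothetical client-side router, everything else is standard DOM and History API:

```ts
declare function renderRoute(path: string): void; // assumed router hook

document.addEventListener("click", (event) => {
  const anchor = (event.target as HTMLElement).closest("a");
  // Only hijack plain left-clicks on same-origin links; modified clicks
  // (new tab, new window) fall through to normal browser navigation.
  if (!anchor || anchor.origin !== location.origin) return;
  if (event.metaKey || event.ctrlKey || event.shiftKey || event.button !== 0) return;

  event.preventDefault();
  history.pushState({}, "", anchor.href); // update URL + history, no reload
  renderRoute(anchor.pathname);
});

// Back/forward buttons fire popstate; re-render the matching view.
window.addEventListener("popstate", () => renderRoute(location.pathname));
```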
Almost every Google site is a SPA and works flawlessly. Spotify web works very well, even better than server side rendering, as it can continue audio playback between pages.
Imo the main issue with SPAs is they can be very fragile and don’t handle errors because they were built lazily. And since it’s all async, the spinners keep spinning forever even though the request has already failed.
> Imo the main issue with SPAs is they can be very fragile and don’t handle errors because they were built lazily.
I think that’s the key point: the more browser functionality you take over, the more of a resource commitment you’re making to support that extra code. If you’re Google or Spotify, you can easily afford that and in the latter case you have a strong argument for necessity because there’s no way to keep the music playing across page-loads.
The key part is making sure that you’re taking on an appropriate level of work for your team and site. A local restaurant website should probably prioritize loading quickly since most people are looking for fast answers and the owner doesn’t want to pay to maintain a heavy tool chain which requires frequent maintenance.
The API is mature but usage is not. I still see cases where companies have screwed up state management on at least a weekly basis, so clearly the SPA-building community is still lagging behind where the server-side world was three decades ago.
The solution to bad URL routing in SPAs is distributing our work across a madcap, wild-ass front-end/back-end split with ill-defined borders & no protocols? What's happening now is 500% more complicated.
But the magic is well packaged & available by default.
The SSR world has frameworks: but as another commenter pointed out, React stopped at being merely a library. And so it's been anarchy, deciding how to SPA, with many folks not realizing how many needs/libraries they really needed to have.
I'm hugely in favor of SPAs, but there's been a lack of well-integrated sensible approaches. The attempt to identify & only solve a narrow window of concerns has hurt our ability to imagine what apps are, left huge voids all over.
I've always hated them, but it's good to see the tide turning. They're slow (one of the first signs of an SPA: a page with its own custom loading indicator), they require JS for things that otherwise wouldn't need it, and things like links don't work right. I do wonder if the otherwise superfluous --- and constantly changing --- JS requirement is part of the "weaponised change" approach of Google to furthering its control over the browser, since it's relatively trivial to create a browser that's usable on sites that need nothing more than HTML forms, but substantially more difficult to create one that will work with even the simplest of SPAs.
Anything that requires some kind of SEO is a dumb idea for a SPA. If you're making Jira or something, a SPA makes sense. The issue is, the bovine tech herd sees new thing and applies new thing to everything. Everything is React now. Now we have news websites as SPAs, the worst possible use case for a SPA.
- SPAs make sense for "applications". Google docs, Gmail, Trello, Slack, web-based music video creators, etc. Things that need high levels of interactivity.
- SPAs don't make sense for "websites". Blogs, forums, search engines, news sites, etc.
Of course, many things exist somewhere on the continuum between "website" and "application", and then the tradeoffs get messier.
Gmail arguably doesn't need a high level of interactivity - that's why they also have the basic HTML version.
Because like all fads with narrow applicability, idiots in big enterprise assume that All New Things Must Become the Only Way.
At $dayjob I regularly see "SPAs + microservices" getting deployed for static content that could have been literally just HTML, CSS, and JPG files.
I've seen a design recently for a trivial app that had two Kubernetes clusters in the architecture diagram. Two!
Not to mention that virtually all SPAs are infected by Node and NPM, directly or indirectly. JavaScript package management seemed "so easy" at first, but has slowly and inevitably morphed into a nightmare.
Minutes of CPU time was insufficient to disentangle the dependency graph of one particular Angular app in the effort to upgrade its packages. If hundreds of billions of machine instructions aren't up to the task, a squishy meat brain definitely isn't either.
Maybe SPAs are a great idea if they are your only product and are truly used to develop web applications that are constantly maintained. For corporate web sites they're idiotic beyond belief.
The ecosystem of web standards, browsers and search engines is mostly optimized with MPAs in mind. That's the base experience for web. Things have only gotten more complicated as we've built abstractions on top of MPAs.
In the earlier iterations, SPAs broke navigation, broke SEO, had no decent state management solution, and required a good understanding of reflow/repaint to avoid performance issues.
The ergonomics of SPA frameworks have improved drastically, and metaframeworks like Next.js have mostly fixed those issues. But the myriad of options and paradigms might be exhausting for some devs.
Some experienced devs might be nostalgic for the document-centric PHP days. I imagine that many newer devs have only recently been exposed to the pleasure of building a simple MPA website, as they were initially thrust straight into React.
For example, I recently chose to use Hugo for my blog. Just wanted something dead simple to build a static website without all the bells and whistles of a SPA.
It's not sudden, I've hated them since they've existed with only a handful of exceptions such as gmail. Most things on the web would be better without them.
As many folks mentioned already, the hate isn't new. Teams keep trying to use a one-size-fits-all approach. Using SPAs for traditional websites was never a good idea, but people kept trying. SPAs tend to be pretty cheap and easy to build, especially if you cut corners, which makes them good for getting an MVP out the door quickly (even if it's gonna suck). When trying to build "app like" products, and given enough effort, they worked well and people made tons of money from them.
Now though, there's a couple of entities who benefit a lot ($$$) by telling people "Actually, this SPA thing sucks, look at OUR solution that will solve all your problems!". And they're VERY vocal about it (much more vocal than people who actually dislike SPAs).
A lot of trend setting in the tech industry is just people ruffling feathers or sowing confusion on purpose in order to then sell a solution. Right now we have a FEW of those: a bunch of people making money from training you to solve the problem you didn't know you had, as well as a couple of YouTube "influencers" making money from the confusion.
Because you normally need to use a framework plus the front-end ecosystem, and it's hard (as in: not something you can improvise; you need to be good at it and have years of experience in it). Many developers on the back-end think the website should be a thin veneer over a service. When they find out they can't just hack on it, but need to learn React/npm/react-router/table virtualization/some component framework, they get irritated, because they've always seen front-end as a "lower" discipline for "worse" developers. That said, a well-written SPA can express a class of web-based applications that are indistinguishable from normal web pages (with anchors etc.) but can do much more than a traditional server-generated web page. Since native apps essentially live in paid, closed, walled gardens, long live SPAs.
Love me a well-built SPA. Great tech. I certainly don't mind if others don't build them. So many people try to advocate the "one true way" of tech instead of acknowledging that it really depends on the use case which tools end up being superior.
I have been building SPAs with SSR for years; if you truly know what you are doing, they work very well. Yes, there are some downsides to SPAs, but the friction removed by the speed of the app does amazing things for business. We increased conversion rates and sales like crazy because of the experience of the website. It is true I had to do some extra work to make it so fast, but damn, the growth we had with the SPA was amazing. Now keep in mind that in our case it made sense because our users are buying 3000+ products in one session.
Your website is _fast_; I'm impressed. The problem is, it's a very rare exception. Most SPAs I interact with daily (like banks - BofA, Citi, Chase, etc.) are a complete and absolute disaster; anyone who touched that code must be fired for incompetence and banished from the profession forever (e.g., Citi has 30,000 developers plus all the money in the world, yet their backward-ass, forever-spinning, busted SPA gets a 7/100 in Lighthouse).
Whenever this “SSR vs SPA” question comes up, I feel like it’s the wrong question to ask, because neither is ideal.
Server-rendered or static sites make sense in some specific use cases (e.g. docs, blogs), but even for those classic use cases they’re not ideal (e.g. full page reloads every time).
On the other hand, SPAs make sense in some specific use cases (e.g. Gmail), but even for those classic use cases they’re not ideal (e.g. first render takes time).
I think we always need both server and client rendering, regardless of the use case, and the question we need to ask is how can we achieve this in an ergonomic way.
There’s some progress being made there (e.g. Next.js) but I feel like it’s not quite there yet.
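For a concrete picture of that hybrid model, here's a minimal sketch assuming the Next.js pages router; the data is a stand-in. The first request is rendered on the server, then the page hydrates and later navigation happens client-side:

```tsx
import type { GetServerSideProps } from "next";

type Props = { renderedAt: string };

export const getServerSideProps: GetServerSideProps<Props> = async () => ({
  props: { renderedAt: new Date().toISOString() }, // runs on the server
});

export default function Page({ renderedAt }: Props) {
  // The same component re-renders on the client after hydration.
  return <main>First rendered on the server at {renderedAt}</main>;
}
```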
The hybrid approach really is the best of both worlds, and it makes perfect sense to do the heavy lifting on the server initially and then turn over everything to the client rather than doing the heavy lifting again for every single subsequent request. Every time this comes up though there is the snarky "lol SSR isn't new, are JS devs just learning about it???" type response here.
Imagine if this was reversed and mobile apps suddenly started switching to SSR. Instead of downloading the app once and making lightweight JSON requests (analogous to what SPAs are doing), they froze for seconds while the next page is rendered on the server while losing all of the local state that you had typed in. Sure would be a lot of hate for that when they have been working fine for a decade. You'd probably argue, that it sounds like that is just a bad implementation of SSR. Not all SSR is like that surely?
This is how silly this weekly SPA vs SSR debate is.
Some mobile apps are like this. Even some desktop apps won't move forward without their connection to the mothership. Yes, it's frustrating. I like offline-capable systems.
The money isn't there for offline ability on most websites. So... I can appreciate it from a distance.
I hate everything that's mobile-first. And I prefer simple sites not bogged down by tons of JavaScript. Which is not so surprising considering this is my favorite website. In fact it's not so surprising to find a general desktop-centric view here at all, in my opinion.
I got to hate them when AngularJS put out its new version, which was nigh on impossible to upgrade to from the original. They should have given it a different name.
AngularJS burned me so hard that I didn't even bother to look at React.
I’ve noticed that a lot of developers tend to hate things that they’ve tried building but found too hard. People enjoy doing things they’re good at and hate doing things they suck at.
There are a lot of SPAs that shouldn’t be SPAs, but that doesn’t mean all SPAs are bad. This isn’t your grandpa’s web browser anymore. We’ve moved on from the web being all about hyperlinked documents. The browser is also an app runtime, just like your smart phone.
If server-side rendering were the only good way to build web apps, then it stands to reason that it’s the only good way to build phone apps with equivalent functionality. Maybe phone apps should also start re-rendering the entire UI on the server every time the user taps something.
It’s quite nice working on an SPA again after a few years working on an SSR site. Auth is simple, there are no sync issues between the client and the server, and I never have to worry about caching.
"Suddenly"? I don't know any users who liked them ever, and that has been a long time now. Designers and clients seemed to like them. Not users or developers.
Q: Why does it matter?
A: 90% of the SaaS product features you see are half-baked; SPAs are the cure for the mess. Solving core problems requires a rebirth of the application, but when ARR is big you can't just go and rebase the code.
Either you're born with it or your product will need a rebirth.
(SPAs have their own cons, but the pros weigh more.)
- Client devices are sufficiently performant for anything you throw at them. Yes, even the $100 Chinese Android smartphones that taxi drivers in India use.
- The last argument I've heard is "I don't want to expose a JSON API to the world", but that's false security because scraping a web page is not difficult.
This is engineering, not religion, so an absolute assertion is unlikely to be true.
The whole point of SPAs is to run code on the client instead of servers. That tells us that load time will favor SSR for any site where the code and/or supporting data is much larger than the displayed result. This means that the question of what’s better depends on how much data you’re talking about, whether you do enough dynamic behavior to make up for the unavoidably slower initial load time, and whether doing that dynamic work locally is a net win relative to doing fewer requests over a higher-latency / lower-bandwidth network. This is different for every application, so there isn’t a global optimum. Hybrid approaches (progressive enhancement, hydration) are also popular for blending the two, as are approaches like CDN edge execution, which again suggests different sweet spots for different types of applications.
The correct answer is to measure and adjust. I have seen exactly zero SPAs which delivered the authors’ original expectations for performance so I’d always rely on data.
This can also be repeated for reliability: SPAs can be great for reducing server load if you can run a lot of work on the client, but that comes at the cost of giving up control of more of the runtime environment and complicating monitoring and support. The right balance is going to depend on what you’re doing, who your users are, and where you’re running servers.
The other big factor to consider is development time. Writing an SPA requires shipping a lot more code to the client and taking on more responsibility to duplicate what browsers have built-in - for example, they usually have significant accessibility challenges which are easy to miss until late in the cycle. That extra code necessitates more work on things like packaging and since you’re running it on other people’s computers deployment is more complicated. Again, different teams will have different sweet spots based on their staffing, needs, and chosen tools.
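A minimal sketch of the "measure what users actually experience" advice above, using the standard Navigation Timing API; the /rum endpoint is hypothetical:

```ts
// Run after the window load event so loadEventEnd is populated.
const [nav] = performance.getEntriesByType(
  "navigation"
) as PerformanceNavigationTiming[];

if (nav) {
  const metrics = {
    ttfb: nav.responseStart - nav.startTime,            // network + server
    domInteractive: nav.domInteractive - nav.startTime, // parse + execute
    load: nav.loadEventEnd - nav.startTime,             // full page load
  };
  // sendBeacon survives page unload, unlike a plain fetch.
  navigator.sendBeacon("/rum", JSON.stringify(metrics));
}
```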
But the runtime behavior will near-certainly far-prefer a thicker client model, where more is retained & locally available.
This was obvious & clear in the aughts, as web-apps emerged from purely server-side systems. Having a client that could re-render based on dynamic data was vastly better than trying to find portions of HTML to re-render & send.
It seems still likely & obviously true now. But folks sure AF love load time metrics!
This is why you have to measure what your users actually experience and understand how they want to use your app. If you do, you learn that browser caching does not have the dramatic impact thick client proponents often assume and connections are similarly slower and less reliable. That produces unsatisfying experiences which are more memorable than the times where your app loads quickly.
> Having a client that could re-render based on dynamic data was vastly better than trying to find portions of HTML to re-render & send.
This can be true but again measurements will show it isn’t true far more than claimed in the SPA marketing materials. This is due to the reasons I mentioned in my original comment: if your SPA has to make a number of network calls to update the view with a modest amount of visible data, it’s probably going to lose compared to the same code running on a faster processor in a data center with a network path to the data which has greater bandwidth and 1-2 orders of magnitude lower latency.
The break-even point is going to depend on the type of application you have, how much data you need to touch, how well that can be cached locally or on a CDN, and where you run servers and their performance characteristics, and the usage patterns of your users.
This is why the hybrid approaches end up being most commonly successful: if you’re dogmatic about using one approach exclusively you’ll miss out on the optimizations which make sense for your users by trying to follow an approach designed for someone else in a different business.
> Client devices are sufficiently performant for anything you throw at them
Possibly true of the devices themselves (although dubious generally), but definitely not true of wireless networks. If you can deliver a rendered page in 10 KB versus a 1 KB doc and 120 KB of framework to render that same page, you win. That’s true regardless of whether it’s a 3G connection or 5G.
SPAs can hold a small advantage in the aggregate, so if you expect your users to be bouncing around routes quite often within a session, you might amortize the cost of the view framework and bundles, but that’s probably not the typical case on mobile, particularly in developing regions.
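A back-of-envelope sketch of that amortization argument; all numbers are illustrative, not measurements:

```ts
const KBPS = 500; // assumed throughput on a mediocre mobile link, KB/s
const RTT = 200;  // assumed round-trip latency, ms

const ssrFirstView = (10 / KBPS) * 1000 + RTT;        // 10 KB page  -> ~220 ms
const spaFirstView = ((1 + 120) / KBPS) * 1000 + RTT; // shell + JS  -> ~442 ms
const spaNextView = (2 / KBPS) * 1000 + RTT;          // 2 KB JSON   -> ~204 ms

// The SPA only wins if the user navigates enough within one session to
// amortize that first-view gap, which is the point made above.
console.log({ ssrFirstView, spaFirstView, spaNextView });
```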
120 KB is largely overblown. Especially with gzip, it reduces even further. There aren't many countries left where the time to load 1 KB vs 120 KB on mobile or broadband Internet would be noticeable.
Why does everyone like skinny jeans and not bell bottoms? Light mode or dark mode? Fashion trends change. Cynically, there's also a measure of job security in chasing trends. Gotta keep up with the latest whizbang framework.
The web has a huge anti- crowd all around it. People detest that there's this terrible awful wretched browser pretending like it can do code. Which is 98% gatekeeping sad-pants whiner-shit.
There's problems, but the amplification problem is uber for real, and most of the complaints just don't frigging matter; this is 98% a Bullshit Asymmetry problem: it takes vastly more energy to care about & refute troll-ish petty behavior than it takes to generate it. Positive behavior rarely gets amplified. There's an outsized representation for anger/negativity, as there almost always is, everywhere, forever.
Personally I think it's absurd to favor servers. AKA mainframes. Thick client web systems have a fundamental embrace of user-agency, in a way that thin-client systems are utterly unable to negotiate. It's cheap and easy to take digs on how bad things have been, and 100%, there's been a pretty poor developmental track record & much could & should be better in the client side universe. But it's still 100% better a RFC 8890 paradigm - The Internet Is For Users - than literally all the rest of computing, which has nothing for users & offers them diddly squat in terms of agency.
Alex's critique / pouring gas on the fire is wrong for a simple reason: we've been asleep at the wheel & embraced little change. URL routing, incremental loading, offline capability, & a host of interesting sprawling new web capabilities that could improve everything are still drastically unadopted. We have a lot of libraries for orthogonal concerns, but there have been essentially zero successful efforts in the last decade to define a system even as comprehensively cross-cutting as the first successful JavaScript framework, Backbone, was. We have reduced scope since, and left orgs to cobble together orthogonal libraries, which has strengths, but makes it harder to adopt & leverage the high minded better potentials of what could happen. The web just hasn't evolved or advanced significantly, in spite of some churning iteration for how we all React.
SPAs are wonderful, but we've crashed against the limits of what we are willing to explore, there's been a lack of a projected way forward, and the upsurge in anti-SPA SSR is basically just an off-gassing, a pent-up pressure to do anything other than what we've been up to. While it has some virtue, I still ascribe most of its interest to the fact that client-side systems have been at a standstill & real coherence & sensible integrative holistic imagining of how we might better do SPAs has been obviously absent for a while now.
GraphQL was some attempt to respin the front-end world, but I think by now we know it's not really much different or better. Still, there are ideas captured there - of front-ends that are not so hand-crafted and artisanally made - that we are still far from, and there are countless capabilities we should be tapping - like good Service Worker patterns, and much more webworker and distributed computing ideas - that have made essentially no progress. At some point we need to upgrade & enhance what we expect of SPAs, but while the tooling has changed somewhat (React moving from classes, to HoC, to hooks, Redux to whatever), there's been little actual embrace of real change, of the real web capabilities that have come about.
SPAs are still one of the best options out there, even though their means of production are essentially unchanged for a decade & the hipsters are desperately hungry for cooler/better. Even though we've hit this kind of so-so local maximum, there's just nothing anywhere near as effective, and the whole SSR pitch may fit some roles, but for like 98% of systems, a SPA has a ton of advantages in being a more unified, isolated, independent thick client that just networks back to a bunch of dumb resources, which is a better/simpler & less overhead model - albeit one with some more loading cost - than the radically weirder & more complex models of splicing apart the baby & having part of the work happen one place & part of the work happen elsewhere. And none of that is good for the user anyways. I look forward to a new novel exciting future, but this is not the RFC 8890 way forwards. SPAs are kind of stuck, but still the best version of the web we have.