One can see it that way, granted. When I zoom all the way out, all of consumer computation has existed as sort of an addendum or ancillary organ to the big customers: government, large corporations, etc. All our beloved consumer tech started out as absurdly high priced niche stuff for them. We've been sold the overflow capacity and binned parts. And that seems to be a more-or-less natural consequence of large purchasers signing large checks and entering predictable contracts. Individual consumers are very price sensitive and fickle by comparison. From that perspective, anything that increases overall capacity should also increase the supply of binned parts and overflow. Which will eventually benefit consumers. Though the intervening market adjustment period may be painful (as we are seeing). Consumers have also benefited greatly from the shrinking of component sizes, as this has had the effect of increasing production capacity with fixed wafer volume.
SGI got its start building high-end graphics and simulation systems for aerospace and other industries, and 3dfx was founded by SGI alumni. Gaming grew out of that. Even Intel's first GPU (the i740) was co-developed with Real3D, whose technology traced back to GE Aerospace's flight-simulator group.
Flight simulators just had more cash for more advanced chips, but arcade hardware like the Sega Model 1 (Virtua Racing) was, via Virtua Fighter, an inspiration for the PlayStation, and before that there were already crude 3D games on both PC and Amiga.
Games were always going to go 3D sooner or later. The real pressure of the high-volume, competitive market gave us more and more capable chips, until they were capable enough for the kind of computation neural networks need, faster than a slow-moving specialty market could have managed.
> Flight simulators just had more cash for more advanced chips
Yes. That is my point. The customers willing to pay the high initial R&D costs opened up the potential for wider adoption. This is always the case.
Even the gaming GPUs which have grown in popularity with consumers are derivatives of larger designs intended for research clusters, datacenters, aerospace, and military applications.
No question that chip companies are happy to take consumers' money. But I struggle to think of an example of a new technology that was invented and marketed to consumers first.
Computers themselves were non-consumer to begin with, but the Personal Computer was what first carried the technology across the moat to consumers, and once that happened the rest was mostly a matter of time, imho.
Many 3D games like Doom, Quake, Flight Unlimited, etc. ran purely on software rendering, since CPUs were already providing enough oomph to render fairly useful 3D graphics in the mid '90s. CPU power was enough, but consoles and arcades showed that there was more to be had (nothing was holding games back at that point).
And even then, the capital investment in game consoles (Atari, NES, SNES, PS1, PS2, etc.) and arcade games (like the 3D titles mentioned above) was big enough to fund custom chipsets not used or purposed for anything else. (I also think the barrier to entry for making competitive custom chips was a tad lower in the '80s and '90s; just consider the Cambrian explosion of firms making x86 and, later, ARM chips during the '90s.)
Yes, there were vendors that focused on high-end commercial customers, and yes, many alumni of those firms contributed a ton of expertise toward what we have today.
But if you look at which companies survived and pushed the envelope over the longer run, it was almost always the ones competing in the consumer market, and it was only when those consumer chips needed ever more advanced processing that we crossed the point where the chips became capable of running neural networks.
In fact, I'd say that had the likes of SGI prevailed, we would have had to wait longer for our GPU revolution. Flight simulators and the like were often focused on "larger/detailed" worlds; PS2-era chips with higher polygon counts and more memory would have been fine for simulator developers for a long time (since more detail in a military scenario would have been fine).
Leisure games have always craved fidelity on a more "human" level. To implement "hacks" for things like custom dynamic lighting models, then global illumination, subsurface scattering, etc., we needed arbitrary programmability, since the raw power wasn't there (the most modern ray-tracing chips are only _starting_ to approach those levels without too-ugly hacks).
Wolfenstein 3D was released before 3dfx existed, was purely CPU-rendered, and is generally considered the father of modern 3D shooters. Even without the scientific-computing angle, GPUs would have been developed for gaming simply because it was a good idea that clearly had a big market.
> When I zoom all the way out, all of consumer computation has existed as sort of an addendum or ancillary organ to the big customers: government, large corporations, etc.
Perfectly stated. I think comments like the one above come from a mentality that the individual consumer should be the center of the computing universe and big purchasers should be forced to live with the leftovers.
What's really happening is the big companies are doing R&D at incredible rates and we're getting huge benefits by drafting along as consumers. We wouldn't have incredible GPUs in our gaming systems and even cell phones if the primary market for these things was retail entertainment purchases that people make every 5 years.
The iPhone wasn't designed or marketed to large corporations. 3dfx didn't invent the voodoo for B2B sales. IBM didn't branch out from international business machines to the personal computer for business sales. The compact disc wasn't invented for corporate storage.
Computing didn't take off until it shrank from the giant, unreliable beasts of machines owned by a small number of big corporations to the home computers of the 70s.
There's a lot more of us than them.
There's a gold rush market for GPUs and DRAM. It won't last forever, but while it does high volume sales at high margins will dominate supply. GPUs are still inflated from the crypto rush, too.
3Dfx was not the inventor of the GPU. There’s a long history of GPU development for corporate applications.
The iPhone wasn't the first mobile phone. Early mobile phones were very expensive and targeted at businesses that wanted their executives in touch.
You’re still thinking from a consumer-centric view. Zoom out and those consumer companies were not the first to develop the products. You didn't even think about the actual originators of those types of products because you don’t see them as a consumer.
The consumer-centric view is powerful. As was stated above, the original products were often niche, expensive items; the power of the consumer market was what really drove the engineering and science forward and made those companies into market powerhouses. It also arguably killed Intel and built TSMC into the most technically advanced company in the world.
I'm sorry, but I'm not sure if you're implying you dislike Apple's approach to what the user is allowed to do, or suggesting we should only talk about general purpose computing devices. If it's the latter, sure, the iPhone's not an innovation in that space, discard it from my list of examples. If it's the former, I'll give you that too, but it was still the first of its kind, by a large margin.
(I remember the huge window in which phone companies desperately put out feature phones with sub-par touch screens, completely missing the value to consumers. The iPod Touch should've been warning enough... and should've been (one of) my signal(s) to buy Apple stock, I guess :-)
Advances in video cards and graphics tech were overwhelmingly driven by video games. John Carmack, for instance, was directly involved in these processes, and 'back in the day' it wasn't uncommon for games, particularly his, to be developed to run on tech that did not yet exist, in collaboration with the hardware guys. Your desktop was outdated after a year and obsolete after two, so it was a very different time from today, where your example is not only completely accurate but really an understatement: a good computer from 10 years ago can still do 99.9% of what people need, even things like high end gaming are perfectly viable with well dated cards.
> a good computer from 10 years ago can still do 99.9% of what people need, even things like high end gaming are perfectly viable with well dated cards.
HN is strange. I have an old gaming build from 7-8 years ago and while it can do high end games on low settings and resolution, it doesn’t hold a candle to even a mid-range modern build.
“viable” is doing a lot of work in that claim. You can tolerate it at low res and settings and if you’re okay with a lot of frame rate dips, but nobody is going to mistake it for a modern build.
You’re also exaggerating how fast video cards became obsolete in the past. Many of us gamed just fine on systems that weren’t upgraded for 5-6 years at a time.
I'll take the absurd extreme end of my claim. Here [1] is a video of somebody running modern games on a GeForce GTX 1080 Ti, a card that was high end... 8 years ago. And he's doing it on high-to-ultra settings in 4K, and it's still doing fine. Spend a couple hundred on a "new" video card and he'd be rocking a stable 60+ FPS in everything; in some games he's hitting that even with his current card!
And back in the early 2000s, even bleeding edge current-year rigs would struggle with new games like Doom 3, Far Cry, Crysis, and so on. Hardware was advancing so rapidly that games were being built in anticipation of upcoming hardware, so you had this scenario where high end systems bought in one year would struggle with games released that year, let alone systems from 5-6 years prior.
Obviously if you're referencing CRPGs and the like, then yeah - absolutely anything could run them. The same remains even more true today. Baldur's Gate 3's minimum requirement is a GTX 970, a card more than 11 years old. Imagine a 1989 computer trying to run Baldur's Gate 2!
I'm still on PCIe 3.0 on my main machine and the RX 580 works fine for my needs. Outside the scope of the OP, I recently picked up a (new) 5060, not due to the impending memory-production apocalypse but because I wanted to extend my current setup with something I recently read about, LSFG, which was previously posted here but garnered no interest/comments.
I wonder about this...I had thought I would be on PCIe 5.0 by now but I'm still on my AM4 PCIe 4.0 board since AM5 and PCIe 5.0 seem...glitchy and heat prone. And apparently I'm still not saturating PCIe 4.0...
> We wouldn't have incredible GPUs in our gaming systems and even cell phones if the primary market for these things was retail entertainment purchases that people make every 5 years.
Arguably we don't. Most of the improvements these days seem to be on the GPGPU side with very little gains in raster performance this decade.
> with very little gains in raster performance this decade.
I have a flagship 7-8 year old GPU in one machine and a mid-level modern GPU in another.
It’s flat out wrong to claim “very little gains” during this time. The difference between those two GPUs is huge in games. The modern GPU also does it with far less power and noise.
I can’t understand this HN mentality that modern hardware isn’t fast or that we’re not seeing gains.
100%. We’ve seen crazy swings in RAM prices before.
A colleague who worked with me about 10 years ago on a VDI project ran some numbers and showed that, if a Time Machine were available, we could have brought back something like 4 loaded MacBook Pros and replaced a $1M HP 3PAR SSD array :)
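(Purely for flavor, here's a toy sketch of that kind of back-of-the-envelope math. Every figure below is a made-up, round placeholder, not the actual specs or prices from that project.)

    # Toy comparison of a few laptops' aggregate flash vs. an enterprise array.
    # All numbers are illustrative placeholders, not real 3PAR or MacBook Pro specs.
    ARRAY_COST_USD = 1_000_000   # assumed array price
    ARRAY_USABLE_TB = 20         # assumed usable flash capacity
    ARRAY_GBPS = 10              # assumed sustained throughput

    LAPTOP_COST_USD = 5_000      # assumed price of one "loaded" laptop
    LAPTOP_SSD_TB = 8            # assumed internal SSD capacity
    LAPTOP_GBPS = 5              # assumed sequential NVMe throughput

    n = 4
    print(f"{n} laptops: {n * LAPTOP_SSD_TB} TB, ~{n * LAPTOP_GBPS} GB/s, ${n * LAPTOP_COST_USD:,}")
    print(f"array:     {ARRAY_USABLE_TB} TB, ~{ARRAY_GBPS} GB/s, ${ARRAY_COST_USD:,}")
    print(f"cost ratio: {ARRAY_COST_USD / (n * LAPTOP_COST_USD):.0f}x")

With placeholder numbers like these the laptops come out comparable on raw capacity and throughput at a tiny fraction of the cost, which is the spirit of the joke; the array's actual value was in the redundancy, shared access, and management you obviously don't get from a stack of laptops.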