Hacker News

"There is no way the trillions of dollars of valuation placed on AI companies can be backed by any amount of future profit."

This is just a case of the user being unable to see far enough into the future. Yes, there's huge future profit to be had.



I think a lot of this viewpoint comes from the fact that the median software engineer doesn't have much exposure to mature (and therefore often regulated) industries, and to how much make-work and ass-covering paper pushing goes on there.

I have no idea what fraction of our economic productivity is wasted on these sorts of TPS reports, but it's surely so massive that any software that lets us essentially develop more software on the fly to cut it back even slightly is highly valuable.

Previously only the most moneyed interests and valuable endeavors could justify such software, for example banks flagging suspicious transactions. Current AI is tantalizingly close to being able to provide this sort of "dumb first pass set of eyes" over bulk data cheaply, for lesser use cases where "normal" software is not economically viable.
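The "dumb first pass set of eyes" idea can be sketched as a filter: a cheap classifier screens bulk records and only escalates hits to a human. Everything below is hypothetical (the `classify` heuristic, the transaction fields); in a real system the placeholder check would be a call to an inexpensive model.

```python
# Minimal sketch of a "dumb first pass" over bulk records, assuming a
# hypothetical rule-based stand-in where a cheap LLM call would go.

def classify(record: dict) -> bool:
    """Placeholder first-pass check; a real system would ask a model here."""
    # Hypothetical heuristic: flag large transactions to unknown payees.
    return record["amount"] > 10_000 and record["payee_known"] is False

def first_pass(records: list[dict]) -> list[dict]:
    """Return only the records a human should actually look at."""
    return [r for r in records if classify(r)]

txns = [
    {"id": 1, "amount": 50, "payee_known": True},
    {"id": 2, "amount": 25_000, "payee_known": False},
    {"id": 3, "amount": 12_000, "payee_known": True},
]
print([r["id"] for r in first_pass(txns)])  # → [2]
```

The point is the shape, not the heuristic: the model only has to be good enough to shrink the pile, since everything it flags still gets human review.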


AI will not reduce the amount of time wasted on paperwork. It'll massively increase the amount generated and consumed.


The problem is that those same workers have maybe 5% of key stuff they do, based on knowledge and depth they probably wouldn't have without all the surrounding 'TPS' style bs. Definitely not knowledge you can take from 10 separate workers with their 5% and somehow get 1 worker working on that stuff 50% of the time.

Boring ass code reviews come in super handy because of the better familiarity, getting exposure to the code slowly, exposures to the 'whys' as they are implemented not trying to figure out later. The same goes for buyers looking over boring paperwork, team leads, production planners. Automating all of that is going to create worse outcomes.

In a sane world if we could take the fluff away we would have those people only working 5% of the time for the same pay, but we live in a capitalist system where that can't be allowed, we need 100% utilization.


> based on knowledge and depth they probably wouldn't have without all the surrounding 'TPS' style bs.

>Boring ass code reviews come in super handy because of the better familiarity, getting exposure to the code slowly, exposures to the 'whys' as they are implemented not trying to figure out later.

But to what extent is this truly necessary vs a post-hoc justification? Workers are pushed to work right to the limit of "how little can you know about the thing without causing bad results" all the time anyway.

>In a sane world if we could take the fluff away we would have those people only working 5% of the time for the same pay, but we live in a capitalist system where that can't be allowed, we need 100% utilization.

<laughs in Soviet bureaucracy>.

The Catholic Church was making fake work for itself for about 500 years before it caused big problems for them. It's not capitalism that's the problem. It's the concentration of power/influence/wealth/resources that seems to breed these systems.


Aside from the better versions of what AI is visibly doing now (software dev, human language translation, video gen, etc), many of the AI bears are dismissing the potential impact of hooking AI up with automated experimentation so it's able to generate new types of data to train itself. The impact on drug discovery, materials science, and other domains is likely to be very significant. The Nobel Prize in Chemistry for AlphaFold is just a glimpse of this future.


Completely agreed. It won't even displace the people who were diligent in all of those crafts. It will supercharge them. And there will be novel combinations producing new services/products. It's going to be great.


"automated experimentation so it's able to generate new types of data to train itself"

AIs don't understand reality. This type of data generation would need a specific sort of validator function to work: we call this reality. That's what "experimentation" requires: reality.

We already have the opposite happening right now: AI training ingests AI-generated slop while StackOverflow posts dry up. That would seem to point to a degrading AI training set, not an improving one.


A number of startups are working in verifiable domains where they can provide realistic data. This is an interesting thread from one of those startups: https://x.com/khoomeik/status/1973056771515138175

Here's a discussion with Isomorphic Labs (Google DeepMind spinoff) on this line of thinking: https://www.youtube.com/watch?v=XpIMuCeEtSk


Side note: I happened to look at the SO "Community activity" widget earlier this week and was quite surprised to see just how far engagement has fallen off. I don't have historical entries to reference, but I'm _fairly certain_ there used to be hundreds of thousands of users (if not more) online during the middle of an average work day (I'm in America/New_York), and there are currently ... 16,785.


A more sane answer is garbage in, garbage out, and this future never materializes.


Don't send garbage in!


Isn't that about as tangible as "don't write bugs"?


Not really. One is a conscious design choice about what you choose to build as your ethos or magnum opus or what have you. The other is a consequence of dealing with hard technical engineering and scientific matters. :)


If AI kicks off another "industrial revolution" level of productivity gains the profits could be well into the quadrillions of dollars. Sounds ridiculous but remember that the $10 T-shirt you're wearing would have taken a week of expert human labor to produce before the loom, cotton gin, etc.


Indeed. There are trillions of dollars /per year/ paid to workers in the US alone.


Like, there is an argument that can be made here, but "there's just not enough money in the world to justify this" definitely isn't it.


Just because trillions are currently spent on employees does not mean that more trillions exist to spend on AI. And if, instead, one's position is that those trillions will be spent on AI instead of on employees, then one is envisioning a level of mass unemployment, and ensuing violence, that will result in heads on pikes.


Trillions of dollars is pocket change if you wait for enough inflation.



