Why would anyone want to "learn" how to use some non-deterministic black box of bullshit that is frequently wrong? When you get different output for the same input, how do you learn?
How is that beneficial? Why would you waste your time learning something that is frequently changing at the whims of some greedy third party? No thanks.
there is certainly a future where this isn't the case. Learning how to use AI in your workflows will likely be a part of any serious dev's future, but being beholden to a data center does not seem to reflect reality. Consider the 5M-8M models and how powerful they are today compared to what the best models did 2 years ago. If you want to stay on the absolute bleeding edge model-wise, sure, you'll be stuck at a data center for some time...
Why isn't this just seen as a repeat of the original birth of computers? Consider the IBM 350 (3.5 MB), rented in the 50s for thousands per month. Now I have a drawer filled with SD cards that go up to 128 GB that I can't even give away.
Yeah, if we just ignore R&D, fixed costs, depreciation, and the high likelihood that investors were expecting a return, and trust their numbers, then we may say inference turns a profit.
In accounting, almost anything you want can be true, at least for some time.
The weird part is that it's "shitting over the floor" in quite a deterministic manner: every 600 seconds (± less than 0.5 seconds), doing the exact same thing.
The person who posted this bug doesn't seem like the pinnacle of software engineering. To me, this looks like either user error or some corrupt file or context that you should be able to clean up pretty quickly.