
I doubt it's going to go the same way as search. You can't run Google on consumer hardware, but you can run LLMs locally.

At worst, newer models will get worse, and you can just stick to older ones.

You could also argue that proprietary models gated behind an API are better than anything you can run locally, and yeah, maybe those will get worse over time.

They're not going to get any worse than what you can run locally, though. If they do, open models will overtake them, and then we'd be in a better position overall.
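
For anyone who wants to try it, here's a minimal sketch of local inference using the Hugging Face transformers pipeline (the model name is just an example; swap in whatever open-weights checkpoint fits your hardware):

    # pip install transformers torch
    from transformers import pipeline

    # Example model only; any open-weights checkpoint that fits in RAM/VRAM works
    pipe = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")
    out = pipe("The main advantage of running an LLM locally is", max_new_tokens=40)
    print(out[0]["generated_text"])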

> You can't run Google on consumer hardware, but you can run LLMs locally.

You can't run an up-to-date model locally. When I ask Google's models, they have knowledge of things from just a week ago, without using search. You won't get that even from a giant local model.