Hacker News

ChatGPT and Claude are going to find it difficult to compete long-term with Google, MSFT, and Meta, who are rolling out their AI capabilities to billions of users. The former started from zero, and we know most people will choose the default they're given. With MSFT owning half of OpenAI, I suspect it is only a matter of time before ownership officially changes.


It's possible for sure. The big players may flip too soon into commercializing, while the smaller or open-source players are becoming more and more efficient for similar results; they are also targets for partnership or acquisition.

The claim that compute demand is huge is part truth and part PR.

If we look at open source models, aren't many of them improving, and needing fewer and fewer resources (i.e., compute) for better and better results?


Google Cloud Vertex AI lets you run any of the open source models on Google infra; they even have Claude models available. MSFT likely has something similar in their cloud.

Where are people going to run their models? I for one will choose the cloud I already use. It has APIs for the big models and simple deployment of both open source and other proprietary models.

This is completely separate from providing end users a service. How many people self-host or run their own alternatives when there are managed services available? It is unlikely people are going to switch en masse to open source models, especially while there is a price war on SoTA models. It's becoming far cheaper to call a SoTA API than to keep an always-on open source model running.

From my experience, running a small model locally was both slower (tokens/sec, plus overall system slowdown) and produced worse results. I switched to cloud-based APIs and will likely not reverse that decision. Multiple orders-of-magnitude improvements in both performance and quality would need to happen first.


> It is unlikely people are going to switch en masse to open source models

It depends on the task at hand. For complex tasks, there's no way a personal computer can compete with the giants' data centers. But as soon as the software becomes available, users will gladly switch to local AI for personal data search, classification, summarization, etc. This market is potentially huge; for private, sensitive data there is no other way.


Can you point to any other software where this is true? Why do you think this will be different?


Tools like LM studio are interesting.

You can find a model, download it with one click, and use one suited to what your computer can do. You can right-size it.

It's easy for end users to run on an MBP.
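For the curious: once a model is downloaded, LM Studio can serve it over an OpenAI-compatible HTTP API on localhost. A minimal sketch of building such a request (the model name and default port are assumptions; substitute whatever your local server reports):

```python
import json
import urllib.request

def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat-completion payload for a local server."""
    return {
        "model": model,  # hypothetical name; use the model you downloaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask_local(prompt: str, base_url: str = "http://localhost:1234/v1") -> str:
    """POST the prompt to the local endpoint and return the reply text.
    Requires LM Studio's local server to actually be running."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Inspect the payload without needing a running server:
print(json.dumps(build_chat_request("Summarize my notes in one sentence."), indent=2))
```

Nothing leaves the machine here, which is the whole appeal for private-data use cases.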


Do you think the majority of humans will do this when there is a chatbot already available by default or by saying a wake word?

There is a growing constituency that doesn't even own a computer and relies on their phone only.

You have to step out of your shoes, where you are interested in these things, and consider things from a non-technical person's POV. There is a lot of uncertainty in having to decide what model to use... or they can just use the one put in front of them that doesn't drain their battery.


I spend a lot of my time stepping outside of my shoes and making technical things accessible to non-technical people.

People having on-device private AI is something Apple is already pushing. Not sure if you've had a chance to catch up on that lately.

The important parts to explore are what people aren't doing, and what will be ready for people to adopt.

Still, this isn't the perfect example, and I think it's a little unrealistic to use it as the reason to rule out something easily existing on-device. But it is an example.

It's true, you're focusing specifically on how LM Studio works. But I provided an example that makes this a few orders of magnitude easier to use on a desktop, and for those who are interested, that's enough to learn from YouTube.

You should see how non-technical people are learning to use AI and teaching others; it is very eye-opening.


I would look outside of what's happening in LLM / AI and see what things have looked like historically for other technologies. Right now, everything in the AI field is early-adopter territory, not evidence of what will happen when mass adoption arrives.

Search, browser, social media, email, laptops, phones, cloud providers, word processors, business office suites... brand name vs open source... it's a telling story

I'm not against people using local models, but from personal experience, I have a hard time seeing the average consumer choosing this route. People avoid the hassle of self-management and self-hosting.



