> Per Ray, the changes Google has been making seem to have most affected global websites with lots of inline ads and affiliate revenue links for e-commerce, because those are indicated as being more spammy
This part sounds good to me. I’ve been waiting ages for Google to crack down on affiliate spam.
I was actually looking for the “web” filter today as I got the generative experience for the first time, and it wasn’t there. Once it does roll out though I’ll be keeping an eye out for a user script that auto switches to that filter. I can’t stand generative AI and I want it out of my damn search.
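For what it's worth, such a script can be tiny: the "web" filter just adds a `udm=14` query parameter to the results URL. A sketch for a userscript manager like Violentmonkey or Tampermonkey follows; note the parameter is undocumented, so Google could change it at any time.

```javascript
// ==UserScript==
// @name        Default Google to the "Web" filter
// @match       https://www.google.com/search*
// @run-at      document-start
// ==/UserScript==

// Google's "Web" tab selects plain link results via the (undocumented)
// udm=14 query parameter; add it when no result mode is chosen yet.
function withWebFilter(href) {
  const url = new URL(href);
  if (!url.searchParams.has("udm")) {
    url.searchParams.set("udm", "14");
  }
  return url.toString();
}

// In the browser, redirect before the AI-laden page renders.
if (typeof window !== "undefined" && window.location) {
  const target = withWebFilter(window.location.href);
  if (target !== window.location.href) {
    window.location.replace(target);
  }
}
```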
Can I ask what you've been using? I was trying out Kagi for a while, but it seems like they've been pivoting to focus on their own GPT and putting GenAI summaries in the search feed. I'm currently defaulting to DDG, but the search results are so poor half the time that I usually end up going back to Google anyway.
DDG is not much better than Google, as I understand it. I've heard they've been caught selling info.
This is my usage pattern:
1. Main - Home computer: Whoogle running on an RPi 3 (it uses SearXNG)
2. Phone - Bing/SearXNG: Unfortunately the convenience of an app still gets me sometimes, and with GPT-4-powered Copilot, Bing is sometimes an option I use.
3. Work - Brave Search (given I can't use my Whoogle). It generally performs well, but I'm putting my trust in their "we don't track you or sell your info" claim, which, given their crypto ties and the ads within the browser (new tab page, etc.), doesn't make me feel 100% confident.
Kagi isn't really pushing the generative stuff, or any specific features. If you don't like the summaries, you can turn them off (I think the one you mean is Settings → Search → More → Summary Box). You can turn off basically all the extras beyond just the titles and links.
Slowly migrating away from Google search anyway. Phind is working out better - also gen AI, but I trust them more than Google not to fk with the results.
DDG works about as well as Kagi for me. And I enjoy using GPT-4 with the search tool for power assist, since it's better at planning search terms, extracting info, and integrating the search results into a longer conversation and critical analysis.
My usage of Google has dropped dramatically in the last few months, with ChatGPT and Claude answering 80% of my questions with astronomically better answers now that they have context of what I'm doing.
Google really, really screwed up by missing this; now their clumsy catch-up methods reek of desperation and internal chaos.
Google knowingly made their search results shittier and shittier for years in pursuit of improved KPIs, to the point that alternative search engines are better, and so are LLMs that occasionally hallucinate or outright crash.
> Google knowingly made their search results shittier and shittier for years
Unfortunately this extends to YouTube too. Now they have a new shitty trick: you click on a link and they randomly give you a completely different video.
My guess is they did something to cut costs by making each search less resource-intensive. I would pay to have Google search behave as it used to. Nowadays it seems to ignore most of what I tell it to look for, and instead shows what it "thinks" most other people probably mean.
ChatGPT and Claude are going to find it difficult to compete long-term with Google, MSFT, and Meta, who are rolling out their AI capabilities to billions of users. The former started from zero, and we know most people will choose the default they're given. With MSFT owning half of OpenAI, I suspect it is only a matter of time before ownership officially changes.
It's possible for sure. The big guys flip too soon into commercializing, while the smaller or open-source guys are becoming more and more efficient for similar results; they are also targets to be partnered with or acquired.
Compute being in huge demand is part truth and part PR.
If we look at open-source models, aren't many of them improving, and needing fewer and fewer resources (compute, say) for better and better results?
Google Cloud Vertex AI lets you run any of the open-source models on Google infra; they even have Claude models available. MSFT likely has something similar in their cloud.
Where are people going to run their models? I for one will choose the cloud I already use. It has APIs for the big models and simple deployment of both open source and other proprietary models.
This is completely separate from providing end users a service. How many people self-host or run their own alternatives when there are managed services available? It is unlikely people are going to switch en masse to open-source models, especially while there is a price war on SoTA models. It's becoming far cheaper to call a SoTA API than to keep an always-on open-source model running.
From my experience, running a small model locally was both slower (tokens/sec, plus overall system slowdown) and gave worse results. I switched to cloud-based APIs and will likely not reverse this decision. Multiple orders-of-magnitude improvements would need to happen in both performance and quality first.
> It is unlikely people are going to switch en masse to open source models
It depends on the task at hand. For complex tasks, there's no way a personal computer can compete with the giants' data centers. But as soon as the software becomes available, users will gladly switch to local AI for personal data search, classification, summarization, etc. This market is potentially huge; for private, sensitive data there is no other way.
Do you think the majority of humans will do this when there is a chatbot already available by default or by saying a wake word?
There is a growing constituency that doesn't even own a computer and relies on their phone only.
You have to step out of your shoes, where you are interested in these things, to consider things from a non-technical person's point of view. There is a lot of unknown in having to decide which model to use... or they can just use the one put in front of them that doesn't drain their battery.
I spend a lot of my time stepping outside of my shoes and making technical things accessible to non-technical people.
People having on-device private AI is something Apple is already pushing. Not sure if you've had a chance to catch up on that lately.
The important parts to explore are what people aren't doing, and what will be ready for people to adopt.
Still, this isn't the perfect example and I think it's a little unrealistic to use this as the reason to shut down the possibility of something easily existing on-device. But it is an example.
It's true, you're focusing on specifically how LM Studio works. But I provided an example that makes it a few orders of magnitude easier to use on a desktop, and for anyone interested, that's enough to learn from YouTube.
You should see how non-technical people are learning to use AI and teaching others; it is very eye-opening.
I would look outside of what's happening in LLM / AI to see what things have looked like historically for other technologies. Right now, anything in the AI field is early adopters and not evidence for what will happen when mass adoption happens.
Search, browser, social media, email, laptops, phones, cloud providers, word processors, business office suites... brand name vs open source... it's a telling story
I'm not against people using local models, but from personal experience, I have a hard time seeing the average consumer choosing this route. People avoid the hassle of self-management/hosting.