I have been completely shocked by the number of people in the tech industry who seem to genuinely place no value on humanity and so many of its outputs. I see it in the writing of leaders within VC firms and AI companies, but I also see it in ordinary conversations on the Caltrain or in coffee shops.
Friendship, love, sex, art, even faith and childrearing are opportunities for substitution with AI. Ask an AI to create a joke for you at a party. Ask an AI to write a heartfelt letter to somebody you respect. Have an AI make a digital likeness of your grandmother so you can spend time with her forever. Have an AI tell you what you should say to your child when they are sad.
If you want another anecdotal data point: most people I know, both in Japan and Canada, use some sort of AI as a replacement for any kind of query. Almost nobody in my circles is in tech or tech-adjacent circles.
So yeah, it’s just everyone collectively devaluing human interaction.
Because the responses are often distilled down from the same garbage Google serves up, but presented as the opinion of Claude, whom she increasingly trusts.
I use Claude a lot. I have the most expensive Claude Max subscription both for my own consultancy and at client sites, separately. I'm increasingly close to an AI maximalist on many issues, so I'm not at all against extensive use of these models.
But it isn't quick to verify things of its own accord before giving answers, so it isn't suitable as a general-purpose replacement for Google unless you specifically prompt it to search.
Google search results: a dozen sponsored links; a dozen links to videos (which I never use -- I'd rather read than watch); six or seven pages with gamed SEO; if you're lucky, what you actually want is far down near the end of the first page, or perhaps at the top of the second page; the other 700 pages of links are ... whatever. Repeat four or five times with variously tweaked queries, hoping that what you actually want will percolate up into the first or second page.
Claude: "Provide me links to <precise description of what you actually want>". Result: 4 or 5 directly relevant links, most of which are useful, and it happens on the first query.
Claude is dramatically more efficient than Google Search.
> Claude: "Provide me links to <precise description of what you actually want>". Result: 4 or 5 directly relevant links, most of which are useful, and it happens on the first query.
Which, as I pointed out, is not the point, as you're advocating exactly the kind of prompting I said wouldn't be a problem. It's not how she uses it.
Ah, that's a good call-out. I don't use Claude aside from in Cursor; I use ChatGPT for normal queries and it's pretty good about doing searches when it doesn't think it knows the answer. Of course it'll search when prompted, but it'll often search without prompting too. I just mistakenly assumed that your fiancée's usage of Claude implied Claude was actually searching as well.
Google search sucks now because it's been targeted by the spammers and content farms. Before that happened it was pretty good. LLMs will eventually be poisoned the same way, whether by humans or other LLMs.
Garbage in, garbage out + chat bots will be monetized which means they will show you things their ad partners want you to see vs what you actually want.
Frankly, I've found even the free ChatGPT to be more useful when looking for something: I describe what I'm looking for, the must-have features, what I definitely don't mean, etc., and it suggests a few things. This has rarely failed to find what I'm looking for. It's absolutely superior to Google search these days for things that have been around a while. I wouldn't check the news with it, though.
> If automation reaches the point where 99% of humans add no value to the "owners" then the "owners" will own nothing.
I don't think that's right. The owners will still own everything. If or when that happens, I think the economy would morph into a new thing completely focused on serving the whims of those "owners."
> If or when that happens, I think the economy would morph into a new thing completely focused on serving the whims of those "owners."
I think you might be a little behind on economic news, because that's already happening. And it's also rapidly reshaping business models and strategic thinking. The forces of capitalism are happily writing the lower and middle classes out of the narrative.
>> If or when that happens, I think the economy would morph into a new thing completely focused on serving the whims of those "owners."
> I think you might be a little behind on economic news, because that's already happening. And it's also rapidly reshaping business models and strategic thinking. The forces of capitalism are happily writing the lower and middle classes out of the narrative.
No, that doesn't surprise me at all. I'm basically just applying the logic of capitalism and automation to a new technology, and the same thing has played out a thousand times before. The only difference with AI is that, unlike previous, more limited automation, it's likely there will be no roles for displaced workers to move into (just like when engines got good enough there were no roles for horses to move into).
It's important to remember that capitalism isn't about providing for people. It's about providing for people with wealth to exchange. That works OK when you have full employment and wealth gets spread around by paying workers, but if most jobs disappear due to automation there's no mechanism to spread wealth to the vast majority of people, so under capitalism they'll eventually die of want.
What would they get from the plebs? Suppose we went through The Phools and so the plebs were exterminated, then what? Perhaps we'd finally have Star Trek economics, but only for them, the "owners". Better be an "owner", then.
> Perhaps we'd finally have Star Trek economics, but only for them, the "owners". Better be an "owner", then.
I don't think we'll have Star Trek economics, because that would be fundamentally fair and egalitarian and plentiful. There will still be resource constraints like energy production and raw materials. I think it will be more like B2B economics, international trade, with a small number of relevant owners each controlling vast amounts of resources and productive capacity and occasionally trading basics amongst themselves. It could also end up like empires-at-war (which actually may be more likely, since war would give the owners something seemingly important to do, vs just building monuments to themselves and other types of jerking off).
Consider being a significant shareholder in the future as analogous to citizenship as it exists today. Non-owners will be personae non gratae, if they're allowed to live at all.
See also: Citigroup's plutonomy thesis[1] from 2006
tldr: the formal economy will shift to serving plutocrats instead of consumers, it's much more profitable to do so and there are diminishing returns serving the latter
Those nerds can now develop an AI robot to make love to their wives while they get back to blogging about accelerationism with all the time they freed up.
Making predictions on how it will turn out vs. designing how it should be. Up till now, powerful people have needed lots and lots of other humans to sustain their power and way of life, and that dependency gave the masses leverage. Now, I'd like a society where everyone is valued simply for being human. With democracies we got quite far in that direction. Attempts to go even further... let's just say they didn't work out. And right now, especially in the US, the societal system seems to be going back to "power" instead of rules.
Yeah, I see a bleak future ahead. Guess that's life, after all.
In the "learn to love democracy and freedom" sense, sure, but in the economic sense? "Didn't work out" feels like a talking point stuck in 1991. Time has passed, China is the #2 economy in the world, #1 if you pick a metric that emphasizes material or looks to the future. How did they get there? By paying the private owners of our economy to sell our manufacturing base to them piece by piece -- which the private owners were both entitled and incentivized to do by the fundamental principles of capitalism. The ending hasn't been written, but it smells like the lead-up to a reversal in fortune.
As for our internal balance of power, we've been here before, and the timeline conveniently lines up to almost exactly 100 years ago. I'm hoping for another Roosevelt. It wasn't easy then, it won't be easy now, but I do think it's fundamentally possible.
This is the direct result of abandoning religion altogether and becoming a 100% secular society.
I am currently reading the Great Books of the Western World in order to maybe somehow find god somewhere in there, at least in a way that can be woven into my atheist-grown brain, and even after just one year of reading and learning, I can feel the merits.
Accepting Science as our new and only religion was a grave mistake.
Why exactly do you need a deity to tell you to love your fellow man? Do you need God in your life to want to love your children? I think this is not quite right. I suspect the desire in the Valley to create these tools, independent of outcome, is simply about greed, and, for companies like Anthropic, about the ability to use AGI fear as a means to drive investment in themselves from a VC class that loves the idea of obliterating human labor. We need less money in tech; we'll probably get it soon enough.
> why exactly do you need a deity to tell you to love your fellow man?
Because that is not a given, as shown by the entirety of human history. Without God, the only arguments for love, or for what is right, are just what people think/feel/agree on at a certain time and place, which has a lot of variations and is definitely not universal.
> Do you need god in your life to want to love your children?
Most people don't need God to love their children, and those who don't love them might not be convinced otherwise by God.
That said, what do you do exactly for that love? Do you cheat and steal to guarantee their future over others? If not because of some "benefit to society" logical argument that would convince no-one, why would one even care about that and not exploit society for their own benefit?
Almost everyone loves themselves and their family above all others. Only God can tell you to love your neighbors and even your enemies.
There are still many societies around the world where most people are mostly self-centered, and you can see the results. You are taking for granted many values you hold, as if you arrived at them logically and independently instead of learning them from your parents and a society that derived them from God over centuries.
Are we completely ignoring the tonnes of awful things people have done in the name of their god? Belief in a higher power doesn't automatically make you good/bad. The same is true of the inverse.
>Without God, the only arguments for love, or what is right, is just what people think/feel/agree on at a certain time and place, which has a lot of variations and is definitely not universal.
Let's ignore that laws exist for a second... Does God say everybody in Manhattan should reserve the left side of the escalators for people walking up them, and that the right side should be left for people just standing and escalating? No, but somehow a majority of the population figured it out. Society still has rules, both spoken and unspoken, whether God is in the picture or not.
If you are serious about these questions, read Dominion by Tom Holland. He makes a very long and thorough historical case that Christianity has contributed more good than bad over the centuries. (I don’t know what comparable works are for other religions.)
Decoupled from the social systems built by organized religion, our “elites” are taking society to a horrific place.
Could you build up traditions and social structures over time without any deity that would withstand the hedonism and nihilism driving modern culture? Perhaps. But it would require time measured in generations we don’t have.
At the very least, it takes generations to build up shared traditions and values across a society. If you want an atheistic version of that, you would need to start now, and it's going to take a long time to build.
> If you want an atheistic version of that, you would need to start now and it’s going to take a long time to build.
Why do you think we would have to start from zero? Even in highly religious countries not all traditions and values are tied to religion, and even those that are can be disconnected from their religious roots.
I would just say the success rate hasn't been very high so far.
The current evidence suggests that as people have become less religious, society has become more fragmented and individualistic, with less shared values and less sense of community or family.
Shared religious community has been replaced by quality time with screens, not meeting in person in some alternative atheistic community.
Writing off an entire facet of life as toxic, is toxic.
Anything taken to extreme can be harmful, but some of the most grounded and successful (as in, living well) people I know are those with a self-aware religious foundation to lean on. People may bring up examples of religious cults as a reason to discard all religion, but surely the same could be said for the many secular cults. We shouldn't throw out the baby with the bathwater, as they say.
We don't need religion for that; humanism exists as a way of living, for instance.
I don't think (most) people treat science "as a religion".
Some tech leaders seem to have swapped Ayn Rand (who, if you look at the early days, definitely acted like a cult leader) for this AI doomer cult, and as a result seem to be acting terribly.
Religion was much wider spread in the 1800s, but that didn't stop industrialists acting terribly.
I can't say I'm shocked. Disappointed, maybe, but it's hardly surprising to see the sociopathic nature in the people fighting tooth and nail for the validation of venture capitalists who will not be happy until they own every single cent on earth.
There are good people everywhere, but being good and ethical stands in the way of making money, so most of the good people lose out in the end.
AI is the perfect technology for those who see people as complaining cogs in an economic machine. The current AI bubble is the first major advancement where these people go mask off; when people unapologetically started trying to replace basic art and culture with "efficient" machines, people started noticing.
> Friendship, love, sex, art, even faith and childrearing are opportunities for substitution with AI. Ask an AI to create a joke for you at a party. Ask an AI to write a heartfelt letter to somebody you respect. Have an AI make a digital likeness of your grandmother so you can spend time with her forever. Have an AI tell you what you should say to your child when they are sad.
Hell. Hell on earth.