Ah, 'suffering builds character'. I haven't had that one in a while.
Maybe we should not want to get prepared for RealPeople™ if all they can do is break us and disappoint us.
"But RealPeople™ can also elevate, surprise, and enchant you!" you may intervene. They sure can. And still, some may decide no longer to go for new rounds of Russian roulette. Someone like that is not a lesser person; they still have real™ enjoyment in a hundred other aspects of their life, from music to being a food nerd. They just don't make their happiness dependent on volatile actors.
AI chatbots as relationship replacements are, in many ways, flight simulators:
Are they 'the real thing'? Nah, sitting in a real Cessna almost always beats a computer screen and a keyboard.
Are they always a worse situation than 'the real thing'? Simulators sure beat reality when reality is 'dual engine flameout halfway over the North Pacific'.
Are they cheaper? YES, significantly!
Are they 'good enough'? For many, they are.
Are they 'sycophantic'? Yes, insofar as circumstances are decided beforehand. A 'real' pilot doesn't get to choose 'blue skies, little sheep clouds in the sky'; they only get to choose not to fly that day. And the standard weather settings? Not exactly 'hurricane, category 5'.
Are they available, while real flight is not, to some or all members of the public? Generally yes. The simulator doesn't require you to hold a current medical.
Are they removing pilots/humans from 'the scene'? No, not really. In fact, many pilots fly simulators for risk-free training of extreme situations.
Your argument is basically 'A flight simulator won't teach you what it feels like when the engine coughs for real at 1000 ft above ground and your hands shake on the yoke.' No, it doesn't. And frankly, there are experiences you can live without - especially those you may not survive (emotionally).
Society has always had a tendency to pathologize those who do not pursue a sexual relationship as lesser humans. (Especially) single women who were too happy in the medieval age? Witches that needed burning. A guy who preferred reading to dancing? A 'weirdo and a creep'. English knows 'master' for the unmarried, 'incomplete' man, and 'mister' for the one who got married. And today? Those who are incapable or unwilling to participate in the dating scene are branded 'girlfailure' or 'incel' - with the latter group considered a walking security risk. Let's not add to the stigma by playing another tune for the 'oh, everyone must get out there' scene.
One difference between "AI chatbots" in this context and common flight simulator games is that someone else is listening in and has the actual control over the simulation. You're not alone in the same way that you are when pining over a character in a television series or books, or crashing a virtual jumbo jet into a skyscraper in MICROS~1 Flight Simulator.
Now ... why you would want to police the decisions others make (or choose not to make) with their data ... it has a slightly paternalistic aspect to it, wouldn't you agree?
This is the exact kind of thinking that leads to this in the first place. The idea that a human relationship is, in the end, just about what YOU can get from it. That it's just simply a black box with an input and output, and if it can provide the right outputs for your needs, then it's sufficient. This materialistic thinking of other people is a fundamentally catastrophic worldview.
A meaningful relationship necessarily requires some element of giving, not just getting. The meaning comes from the exchange between two people, the feedback loop of give and take that leads to trust.
Not everyone needs a romantic relationship, but to think a chatbot could ever fulfill even 1% of the very fundamental human need of close relationships is dangerous thinking. At best, a chatbot can be a therapist or a sex toy. A one-way provider of some service, but never a relationship. If that's what is needed, then fine, but anything else is a slippery slope to self destruction.
> This is the exact kind of thinking that leads to this in the first place. The idea that a human relationship is, in the end, just about what YOU can get from it. That it's just simply a black box with an input and output, and if it can provide the right outputs for your needs, then it's sufficient. This materialistic thinking of other people is a fundamentally catastrophic worldview.
> A meaningful relationship necessarily requires some element of giving, not just getting. The meaning comes from the exchange between two people, the feedback loop of give and take that leads to trust.
This part seems all over the place. Firstly, why would an individual do something he/she has no expectation to benefit from or control in any way? Why would he/she cast away his/her agency for unpredictable outcomes and exposure to unnecessary and unconstrained risk?
Secondly, for exchange to occur there must be a measure of inputs, outputs, and an assessment of their relative values. Any less effort or thought amounts to an unnecessary gamble. Both the giver and the intended beneficiary can only speak for their respective interests. They have no immediate knowledge of the other person's desires, and few individuals ever make their expectations clear and simple to account for.
> Not everyone needs a romantic relationship, but to think a chatbot could ever fulfill even 1% of the very fundamental human need of close relationships is dangerous thinking. At best, a chatbot can be a therapist or a sex toy. A one-way provider of some service, but never a relationship. If that's what is needed, then fine, but anything else is a slippery slope to self destruction.
A relationship is an expectation. And like all expectations, it is a conception of the mind. People can be in a relationship with anything, even figments of their imaginations, so long as they believe it and no contrary evidence arises to disprove it.
> This part seems all over the place. Firstly, why would an individual do something he/she has no expectation to benefit from or control in any way? Why would he/she cast away his/her agency for unpredictable outcomes and exposure to unnecessary and unconstrained risk?
It happens all the time. People sacrifice anything, everything, for no gain, all the time. It's called love. When you give everything for your family, your loved ones, your beliefs. It's what makes us human rather than calculating machines.
You can easily argue that the warm, fuzzy dopamine push you call 'love', triggered by positive interactions, is basically a "profit". Not all generated value is expressed in dollars.
"But love can be spontaneous and unconditional!" Yes, bodies are strange things. Aneurysms can also be spontaneous, but they are not considered intrinsically altruistic functionality that benefits humanity as a whole by removing an unfit specimen from the gene pool.
"Unconditional love" is not a rational design.
It's an emergent neural malfunction: a reward loop that continues to fire even when the cost/benefit analysis no longer makes sense. In psychiatry, extreme versions are classified (codependency, traumatic bonding, obsessional love); the milder versions get romanticised - because the dopamine feels meaningful, not because the outcomes are consistently good.
Remember: one of the significant narratives our culture has about love - Romeo and Juliet - involves a double suicide due to heartbreak and 'unconditional love'. But we focus on the balcony, and conveniently forget about the crypt.
You call it "love" when dopamine rewards self-selected sacrifices. A casino calls it "winning" when someone happens to hit the right slot machine. Both experiences feel profound, both rely on chance, and pursuing both can ruin you. Playing Tetris is just as blinking, attention-grabbing, and loud as a slot machine, with similar dopamine outcomes, but much safer.
So ... why would a rational actor invest significant resources to hunt for a maybe dopamine hit called love when they can have a guaranteed 'companionship-simulation' dopamine hit immediately?
What do you think of the idea that people generally don't really like other people - that they do generally disappoint and cause suffering. (We are all imperfect, imperfectly getting along together, daily initiating and supporting acts of aggression against others.) And that, if the FakePeople™ experience were good enough, probably most people would opt out of engaging with others, similar to how most pilot experiences are on simulators?
Ultimately, that's the old Star Trek 'the holodeck would - in a realistic scenario - be the last invention of a civilization' argument.
I think that there will always be several strata of the population who will not be satisfied with FakePeople™, either because they are unable to interact with the system effectively due to cognitive or educational deficiencies, or because they believe that RealPeople™ somehow have a hidden, non-measurable capacity (let's call it, for lack of a better term, a 'soul') that cannot be replicated or simulated - which makes it, ultimately, a theological question.
There is probably a tipping point at which the number of RealPeople™ enthusiasts is so low that reasonable relationship matching is no longer possible.
But I don't really think the problem is 'RealPeople™ are generally horrible'. I believe the problem is the availability and cost of relationships - in energy, time, money, and effort:
Most pilot experiences are on simulators because RealFlight is expensive, and the vast majority of pilots don't have access to an aircraft (instead sharing one), which also limits potential flight hours (because when the weather is good, everyone wants to fly. No-one wants the plane up in bad conditions, because it's dangerous to the plane, and - less important for the ownership group - the pilot.)
Similarly: Relationship-building takes planning effort, carries significant opportunity cost, consumes monetary resources, and has a low probability of the desired outcome (whatever that may be; it's just as true for the 'long-term, potentially married relationship' as it is for the one-night stand). That's incompatible with what society expects from a professional these days (e.g. work 8-16 hours a day, keep physically fit, save for old age and/or a potential health crisis, invest in your professional education, the list goes on).
Enter the AI model, which gives a pretty good simulation of a relationship for the cost of a monthly subway card, carries very little opportunity cost (the simulation will stop for you at any time if something more important comes up), and needs no planning at all.
Risk of heartbreak (aka: potentially catastrophic psychiatric crisis - yes, such cases are common) and hell being other people doesn't even have to factor in to make the relationship simulator appear like a good deal.
If people think 'relationship chatbots' are an issue, just you wait for when - not if - someone builds a reasonably-well-working 'chatbot in a silicone-skin body' that's more than just a glorified sex doll. A physically existing, touchable, cooking, homemaking, reasonably funny, randomly-sensual, and yes, sex-simulation-capable 'Joi' (and/or her male-looking counterpart) is probably the last invention of mankind.
You may be right, that RealPeople do seek RealInteraction.
But how many of each RealPerson's RealInteractions are actually that? It seems to me that lots of my own historical interactions were/are RealPersonProjections. RealPersonProjections and FakePerson interactions are pretty indistinguishable from within - over time, the characterisation of an interaction can change.
But, then again, perhaps the FakePerson interactions (with AI), will be a better developmental training ground than RealPersonProjections.
Ah - I'll leave it here - it's already too meta! Thanks for the exchange.