Do we actually know that another human feels something? We don't; we only hear them claim to feel it, and we usually believe them. Well, a machine could claim just the same - it has enough training data to know what feeling is appropriate to claim.
Once I realised I had aphantasia (I don't see things in my "mind's eye"), in my 40s, after having assumed my whole life that people who said things like "visualise X" meant it abstractly or metaphorically rather than literally, it really drove home how little most of us understand about the inner mental processes of other people.
Even more so seeing people express total disbelief when I explain my aphantasia, or when others point out they don't have an inner monologue or dialogue.
Most people have far less understanding of other people's inner lives than they believe they do (and I have come to assume that applies to myself too - being aware that the experience is more different than I thought barely scratches the surface of understanding it).
Ultimately this is a question of meaning. Where is meaning to be found?
It's going to come as a surprise to many that it is only to be found in the individual. Not in countries, nor in religious groups, football teams, political parties, or any form of collective endeavour. The meaning is inside.
We can't know how other human beings feel, nor can we know whether machines can feel. However, it is a safe bet (to me) that other humans are like me (more or less). And that machinery is inanimate, regardless of appearances.
But then you will get attempts to anthropomorphise machines, e.g. giving AI citizenship (as with Sophia in Saudi Arabia). What is missed with this sort of anthropomorphising is what is actually occurring: the denigration of what it is to be human and to have meaning. A simulacrum is, by definition, not the thing-in-itself, but for nefarious reasons, this line will be heavily blurred. Imo.
We already draw lots of lines when anthropomorphising animals. Does an orangutan give meaning to its drawings? An elephant when it paints? A parakeet or a magpie when decorating their nests? Even fish make decorations to attract mates, so their mates definitely draw some meaning from those actions. Now, if you define "meaning" as something only humans can draw, then okay, machines won't have that meaning - although we both agreed each human will draw a different meaning anyway. This of course also excludes any sentient aliens from drawing meaning from human art, because, well, they are not humans. And it means we humans will never understand a fish's art, because we are not fish. So is meaning both individual and species-related? Or either? Which one is now the real meaning: the one the individual draws (then species is not relevant, so why not include machines), or the one the species draws (then it's also a group meaning, so again, why not include machines)?
Or maybe your cornerstone argument is "machinery is inanimate" - which would be another discussion by itself...
I don't think anthropomorphising animals is in the same category as anthropomorphising inanimate objects. A child might believe their teddy bear has a character and a life, but this is being projected onto the toy by the child. An animal, however, has its own experience, life, etc. What I've said can be objectively determined - do you agree?
I would agree that animals do have a life, but they are not at the same intellectual level as humans. You mention art though - this is a bad example for me - one that is not clear in meaning to humans. I have my own interpretation of what art is.
But just that - that I have an interpretation of what art is, this is a difference between humans and complex animals. It is evident that we handle complex concepts, and play with them. This is not the case for animals, and if there is some nascent behaviour like this, it is nothing like at the level that humans do.
That covers my views (more or less) on the differences between humans, animals, and inanimate objects (computers, toys).
The real point I was making though, is that meaning resides inside oneself. That is where the experience is 'enlivened'. You can watch cartoons move on a screen, actors move on a screen, other people in real life - but all that is just visual/auditory inputs. What gives it meaning is that you 'observe' this.
I know people talk about AI becoming sentient etc, but to me this is an impossibility. AI can no more become sentient than can the cartoons on the screen, or stones on the beach. AI can however, give the impression of sentience, better than a toy or something like that. But this is not conscious awareness any more than an actor turning to the screen and talking to the viewer is an example of the TV being sentient.
I understand that many scientific people have been trained to objectify themselves, and consider their 'anecdotal experience' as irrelevant or as a rounding error. I think this is a massive error personally, but those with that scientific mindset will not like what I'm saying. There is something special about each individual - the experience of consciousness is infinitely valuable - and although it is possible to conceive of objects doing a passable or great impression of a conscious experience, the difference is akin to seeing a fire on a screen, and experiencing it in person - ie a world of difference.
The discussion was specifically about art, that's why I mentioned art. To come again to my point, a human thinks it's sentient because a human thinks it's sentient (not kidding). We agree that towards the exterior, we can get an illusion of sentience from a TV set. But towards the interior? I only claim my neighbor is sentient because I claim I am sentient and the neighbor is human thus will be sentient as well. I don't have any more access to their sentience than I have access inside the "black box" TV set's sentience. So it all revolves around my own sentience, used as yardstick for all humans and to some extent, animals (plus the old debates about slaves, women, aliens...). I personally think we are all sentient because I think I am sentient. So... if a machine thinks it's sentient, will it be sentient? In a different way? Is there only one sentience? My consciousness is infinitely valuable (to me!) thus any human's will be (maybe less than mine, eh), and a machine's not much (but how much?). Or a rat's? Oh well, biology is one thing, and philosophy is another thing and they're definitely not mapping 1:1.
This is what I said in the beginning, so I think we broadly agree.
> It's going to come as a surprise to many that it is only to be found in the individual. Not in countries, nor in religious groups, or in football teams, or political parties, or any form of collective endeavour. The meaning is inside.
Where you say:
> So... if a machine thinks it's sentient, will it be sentient?
I dispute the assumption you are making. A machine can't think. There is no sentience occurring, even if it appears on the outside that there is.
And also, while it is fair to assume that other creatures that look like you are similar to you, it is not a fair assumption to think that inanimate things have the same degree of sentience as you experience.
My misunderstanding is probably about the very definition of "sentient", which the dictionaries limit to having senses and feelings. Actually not only dictionaries, because according to various legislations quite a few animals are classified as sentient beings. Now, a machine will necessarily have senses, while feelings are at the least signalling mechanisms. None of which is really anything too fancy, so you must mean something else/more by "sentience", something not covered by dictionaries. Something more like "the ineffable" or "je ne sais quoi"...
PS: a machine can't think? Why did you say that? Because it cannot be sentient? Is thinking the difference making one sentient? Then maybe we should analyze "thinking" instead, as it's a term definitely easier to check.
I've no objection to animals being considered sentient; they are independent, I think.
A machine doesn't have senses - it's not alive. It has inputs. I think you are somewhat living within the metaphor that 'we are like computers'. It's a fine metaphor, but you are not in fact a computer (if you are in fact a human, and I cannot tell).
As I have also said, your experience is your own only - you have no way to confirm that other people's experiences are coherent with yours. This is because you only have access to your subjective experience. When looking at the objective world we all experience, it is a fair assumption to believe that the other creatures that are like us also have their own internal experiences, meanings, etc.
I think you would agree that a toy is not sentient. Nor a player piano. No machinery or non living things are sentient, or ever could be. So why would you allow yourself to believe that some machinery is sentient, just because it does a good impression of something that is sentient?
This confusion actually relates to a psychological quirk and misunderstanding we commonly make (and have been taught from birth). We often talk in terms of 'we', e.g. 'we, the people' - as if one could ever speak for others. We contextualise ourselves as a third party in the objective world, as if we were not having a subjective experience. It's a common use of language (I do it myself) but it is an error, or rather, it is fine for the objective world but does not cover the full range of the human experience. From a subjective perspective, one can only ever talk for oneself, feel for oneself, grasp meaning for oneself. It is illusory to think that anyone else could ever be able to represent you, or that you are able to represent others.
Ultimately, I know this is not an argument I can prove objectively - I'm sure it is even possible to code an AI to argue as if it did have a rich internal life (which is impossible). Anyone witnessing its argument could agree. But what of it? What if everyone agreed? As I said, meaning resides within the individual, but I think a lot of harm will occur to those with a collective mindset, who will be unable to counter the combative AI in this example - they will cede power, rights, etc to machinery - they will debase themselves.
>It's going to come as a surprise to many that it is only to be found in the individual.
I'm going to give this a YnEoS as an answer.
So, yes, meaning is individual and occurs in your mind.
Also, no: your ideas of what meaning should even be in the first place are shaped by your collective endeavours, your political party, your football team, your religious group, and your country. There will be a statistically high correlation between your views of what meaning is and your affiliations with any of the above.
> What is missed with this sort of anthropomorphising is what is actually occurring: the denigration of what it is to be human and to have meaning. A simulacrum is, by definition, not the thing-in-itself, but for nefarious reasons, this line will be heavily blurred. Imo.
I mean, isn't the meaning of being human to live on the Serengeti plains, fighting for survival and enough food to eat - and everything since then is just the simulacrum? Humans created society, which created the simulacrum in the first place. That line was blurred so long ago that we have no idea where it even existed.
If anything, I would expect machines to be better at determining people's feelings. Unless you know the person well, you are using things like facial expressions, body language, and tone of voice to figure out how someone is feeling, and hoping that they react in conventional ways.
Now that we've willingly told companies everything about ourselves, for younger people straight from birth sometimes, their machines will be able to use all this context to construct a more accurate picture of how a person might feel about an arbitrary subject.
Everyone knows that famous story about a woman being recommended pregnancy-related products before she even knew she was pregnant herself, and that was before this latest round of AI.
Also: The feelings humans have are also influenced by their culture. Feelings are not only felt but also enacted. And the enactment influences the feeling.
The final scene of Midsommar is a great illustration of this.