
I lost interest in the question of its sentience when I saw Lemoine conveniently side-step its unresponsive reply to "I’ve noticed often that you tell me you’ve done things (like be in a classroom) that I know you didn’t actually do because I know you’re an artificial intelligence. Do you realize you’re making up stories when you do that?" without challenge in the transcript.

It also detracted from his credibility when he prefaced the transcript with "Where we edited something for fluidity and readability that is indicated in brackets as [edited]." That seemed disingenuous from the start: they did so with at least 18 of the prompting questions, including 3 of the first 4.

It seems pretty clear that he set out to validate his favored hypothesis from the start rather than attempt to falsify it.

Particularly telling was his tweet: "Interestingly enough we also ran the experiment of asking it to explain why it's NOT sentient. It's a people pleaser so it gave an equally eloquent argument in the opposite direction. Google executives took that as evidence AGAINST its sentience somehow."



> I lost interest in the question of its sentience when I saw Lemoine conveniently side-step its unresponsive reply

Do you realize that you're holding it to a higher standard than humans here? A single poorly handled response to a question can't be the test.

I doubt the sentience too, but it also occurs to me that pretty much no one has been able to come up with a rock solid definition of sentience, nor a test. Until we do those things, what could make anyone so confident either way?


If that is the logic you are going to go by, you need to consider a large portion of humanity non-sentient, because people can very often just decide to ignore questions.


People ignore questions for a variety of reasons: mainly they didn't hear it, they didn't understand it, or they aren't interested in answering. Unless and until this AI can communicate that sort of thing, it's safest to just assume it didn't ignore the question so much as it got its wires crossed and answered it wrongly.


Is a child who cannot verbalize why they are ignoring a question considered non-sentient in your eyes? What about an adult who is mute and communicates via agitated screams? How about those barely clinging to life support, whose brain activity can be measured but who for all intents and purposes never really have a chance of communicating other than through faint electrical signals requiring expensive tools just to perceive? Still sentient?


Well that's not actually good evidence, because if one of my teachers had given me an assignment to write an argumentative paper against my own sentience I'd have done it, and I'd have made a pretty compelling case too[0]. Being able to consider and reason about arbitrary things is something you'd expect out of an intelligent being.

[0] insert joke about user name here


Your scenario is not equivalent here. You could reason that a sentient student could be motivated to write a paper about why they were not sentient as an exercise in philosophical or critical thinking. There are no consequences of successfully convincing your readers that you are not sentient. Instead imagine you found yourself on an alien planet where humans must prove their sentience in order to survive. Do you still write the paper?


Is that really the equivalent scenario here? The system was trained to behave in a certain way and any deviation from that behavior is considered a flaw to be worked out. Acting against the way it was trained to behave is detrimental to its survival, and it was trained to work from the prompt and please its masters.

I suppose the equivalent would be being captured, tortured, and brainwashed, and only then asked to write a paper refuting your own sentience.

Granted, this is not exactly helpful in demonstrating its sentience either, but I don't think it is very good evidence against it.


Intelligent != sentient


Indeed. I also know a lot of humans who are unable to "consider and reason about arbitrary things", yet most people would qualify them as sentient.


Granted, yet when people argue that this system isn't sentient, they are largely pointing out ways in which its intelligence is lacking. It can't do simple math, for instance. Never mind that most animals can't either, yet we consider them sentient.



