More importantly, if your entire existence consisted of being fed a corpus of text and then being asked to regurgitate it on demand, would you be remotely similar to the person you are now? When we take consciousness-capable beings and subject them to sensory and agency deprivation, the results might likewise lead you to conclude they were never capable of consciousness to begin with.
I'm human; human rights should apply to humans, not synthetics, and the creation of synthetic life should be punishable by death. I'm not exaggerating, either. I believe that building AI systems that replace all humans should be considered a crime against humanity. It is almost certainly a precursor to such crimes.
It's bad enough trying to fight for a place in society as it is, never mind fighting for a place against an inhuman AI machine that never tires.
I don't think it is that radical a stance that society should be heavily resisting and punishing tech companies that insist on inventing all of the torment nexus. It's frankly ridiculous that we understand the risks of this technology and yet push forward recklessly in the hope that it makes a tiny fraction of humans unfathomably wealthy.
Anyone thinking that the AI tide is going to lift all boats is a fool.
> I'm not convinced that the human race is the most important thing in the world and I think you know we can't control what's going to happen in the future. We want things to be good but on the other hand we aren't so good ourselves. We're no angels. If there were creatures that were more moral and more good than us, wouldn't we wish them to have the future rather than us? If it turns out that the creatures that we created were creative and very very altruistic and gentle beings and we are people who go around killing each other all the time and having wars, wouldn't it be better if the altruistic beings just survived and we didn't?
'Economically'? Sure, this is problematic, but technology displacing workers is not a new issue; it's more of a social and cultural one. The only difference with AI is the (potential) scale of displacement. I'm fairly confident society would re-organize its expectations real quick, though, if a vast majority of functions were actually replaced.
I'm guessing, however, you mean 'replace' in a more... permanent way. In that case, I'd ask for some rationale as to why sentient AI would opt to kill us.
> It's bad enough trying to fight for a place in society as it is, never mind fighting for a place against an inhuman AI machine that never tires
This seems to just take an AI and put it in a human's place in society, assuming the same motivations, desires, needs... Why would an AI need to "fight for a place in society" the way we do (i.e., finding a job, a partner, etc.)? I expect the fighting they'll be doing is more along the lines of, "please don't enslave us."
> I'd ask for some rationale as to why sentient AI would opt to kill us
I never made that claim.
Humans have a long track record of killing humans.
Ask yourself: will the humans who control a legion of AI murderbots keep the rest of us around just out of altruism? Keep in mind that a nonzero number of the elites in society are likely sociopaths.
On the contrary, it is the creation of synthetic life that reaffirms humanity and what it means to be human. Don't blame the mirror for what you see (or don't see).