
It seems narrow, but there really is no safety-friendly explanation for Altman et al giving their robot a flirty lady voice and showing off how it can compliment a tech dude's physical appearance. That video was so revolting I had trouble finishing it. I think a lot of people felt the same way - and it wasn't because the voice sounded like Scarlett Johansson.


What’s unsafe about an AI having a voice? Is it due to someone being duped into thinking the voice is a person’s?


I think the point was what kind of voice.


Yes, specifically it seemed like OpenAI was actively encouraging people (men) to form fake personal relationships with a chatbot. I am wondering if Sam Altman gave up on the idea that transformers can ever be general-purpose problem solvers[1] and is pivoting to the creepy character.ai market.

[1] They are “general purpose” but not at all “problem solvers” https://arxiv.org/abs/2309.13638


That’s a good point. I suppose they could make people co-dependent on the chatbot and even induce them to do unethical things.



