The Google results are almost entirely from places that I control, directly or indirectly. I can delete the repos, retract my papers, ask moderators to remove my comments. For more serious cases there are courts and laws.

There's no reason why language models should be immune from what is standard expected behavior in society.

I'm not raging against the sea; I'm raging against a bulldozer operator who has plugged their ears.



Why should AI be held to different privacy standards than Google?

The papers you referred to in the top comment discuss the ability of AI to infer PII from anonymous data. But that's not what happened here. You are complaining about AI returning non-anonymous data whose results are easily findable via other mechanisms. I'm not sure I understand why AI should be expected to recognize and filter out information that is otherwise public?


>There's no reason why language models should be immune from what is standard expected behavior in society.

So sue OpenAI then. That's your recourse if you believe you've been harmed. I don't think you'll be very successful, given that even people here aren't strongly on your side. I think normies on a jury are going to be even less sympathetic to your arguments than the HN crowd.


