Hacker News

Have you noticed how fast everyone else was able to copy OpenAI? And that's just what we saw, or what someone leaked. History is full of parallel inventions... how long after the US had nukes did Russia get them?


Sounds like more of an argument for preemptively wiping out the competition.

Comparing it to nukes doesn't hold, since social norms/ethics/etc. become irrelevant once a core tenet of society is broken.

One thing that could happen is AI defense outperforming offense long enough for multiple instances to develop - I have no idea what would happen at that point.


I wasn't talking about nukes in the sense of using them; I'm talking about nukes in that, once the technology was known to be possible and useful, everyone with the means built it.


I agree with you - I edited my comment above. If defensive capabilities allow multiple AGIs to develop, I have no clue what the outcome would be - we are talking about predicting superhuman intelligence here.


An interesting thing to consider is whether an AGI could run on small, nerfed hardware, as recent optimizations suggest, or whether you'd need absurd-tier hardware to run it. If it's the latter, then even if it's really smart there's only one, or N, of them, and its thinking speed is still limited by the speed of light. If it's the former, then the moment it leaks, it'd be everywhere.



