Is that the only solution here? We need to destroy billions of lives so that we can potentially prevent "unsafe" super intelligence?
Let me guess, your cure for cancer involves abolishing humanity?
Should we abolish governments when some random government goes bad?
Insufficiently regulated capitalism fails to account for negative externalities, much like a Paperclip Maximising AI.
One could even go so far as to say that AGI alignment and economic resource allocation are isomorphic problems, as in the sketch below.
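To make the analogy concrete, here is a minimal toy sketch (my own illustration, not anyone's actual model): both the firm and the paperclip maximizer optimize a proxy objective that omits a cost term, so the omitted term gets driven arbitrarily bad, and the fix is structurally the same in both cases, namely pricing that term back in.

```python
# Toy illustration of the claimed isomorphism: an unpriced externality
# and a misspecified AI objective are the same failure mode. The
# optimizer maximizes a proxy that omits a cost term, so the omitted
# term is ignored entirely. All names here are hypothetical.

def optimize(proxy_objective, actions):
    """Pick the action that scores best on the proxy objective."""
    return max(actions, key=proxy_objective)

# Each action: (units produced, unpriced harm caused).
actions = [(10, 0), (50, 5), (100, 90)]

# Firm's proxy: output only; pollution is an externality it never sees.
firm_proxy = lambda a: a[0]

# Paperclip maximizer's proxy: paperclip count only; human welfare
# never appears in the objective.
agi_proxy = lambda a: a[0]

# Both pick (100, 90): maximal output, maximal unaccounted harm.
print(optimize(firm_proxy, actions))   # (100, 90)
print(optimize(agi_proxy, actions))    # (100, 90)

# The fix is structurally identical too: internalize the cost
# (a Pigouvian tax for the firm, a corrected reward for the AI),
# and the chosen action changes.
priced = lambda a: a[0] - 2 * a[1]
print(optimize(priced, actions))       # (50, 5)
```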
Historically, governments have done far more physical harm (genocides, etc.) than capitalist companies with advanced tech (yes, I know Chiquita and Dow exist).