
If we trade understanding for optimization, we begin to lose interest in explanations for how the world works. If we lose this desire for understanding, we then don’t have much to say about how the world should be governed, how the world ought to be, or what the good is. If AI provides us with instant answers and an unearned certainty about the world, we lose what Eran Fisher calls the emancipatory interest to defend liberal institutions.[10] For him, if our primary goal becomes AI-derived, optimized knowledge, we learn to see “freedom” as coming from outside of us rather than from an internal drive toward self-determination or self-governance. We become indifferent to the fact that AI might obscure causality or nuance and thereby make it difficult for us to understand how it reaches its results. We won’t care, because we won’t need other humans to help us understand the world. If the algorithm identifies “other” citizens as “threats to the state,” we might lose our desire to challenge the “algorithm’s logic,” since we neither understand it nor are interested in understanding its rationale.


"Each time we submit to the temptation of indulging in the familiar... we move one step closer to becoming illiberal subjects... indulging in the familiar can habituate us away from exploring new ideas. The result can be the death of liberal democratic institutions – slowly, then all at once."


After years of exposure to algorithmic recommendation engines and platform-capitalism models that promise to give us “control” over our information and entertainment diet, we are losing our taste for “outliers.” I define outliers here as cultural content that doesn’t comport with our algorithmically curated views of the world. We’ve slowly become habituated to a deeply illiberal optimization ethic that rejects “outlier” perspectives. Rather than seeing deviations from the “algorithmic models in our heads” as opportunities to grow, we increasingly see outliers as dangerous anomalies to be ignored or ridiculed.

