> I would argue that corporate actors (a state, an army or a corporation) are not true superorganisms but are semi-autonomous, field-embedded systems that can exhibit super-organism properties, with their autonomy being conditional, relational and bounded by the institutional logics and resource structures of their respective organisational fields.
Lotsa big words there.
Really, though, we're probably going to have AI-like things that run substantial parts of for-profit corporations. As soon as AI-like things are better at this than humans, capitalism will force them to be in charge. Companies that don't do this lose.
There's a school of thought, going back to Milton Friedman, that corporations have no responsibilities to society.[1] Their goal is to optimize for shareholder value.
We can expect to see AI-like things which align with that value system.
And that's how AI will take over. Shareholder value!
> Daedong-beob unified the various forms of taxes to a single kind: rice. This, in effect, made growing rice equivalent to growing money, encouraging even more production than strictly necessary.
This is still relevant these days, whenever someone talks about linking a currency (and taxes collected in that currency) to a commodity like gold. The market for the metal becomes distorted, and the overall economy is distorted as well, vulnerable to anything that might impact the mining or refinement of the metal.
Another historical connection might be how the weird status of silver and gold is linked to European colonization.
Here's some advice, which you are free to take or leave (reasonably enough, since my advice is generic and I can't speak to your exact circumstances):
1) Just start doing it. Don't wait for someone to give you permission to prioritize customer needs. Pick up the phone and call them. Go to a store and watch them shop for your product. Talk to people. When you set OKRs/KPIs, just make them about customer needs. Be the change you want to see, especially if you have spiritual buy-in from your manager. But your intentions have to be crystal clear: you want what's best for the customer and the business, and you're going about it in a rational way, at least until you earn some social capital. It cannot be interpretable as political, or they'll construe it as that and fire you.
2) Use bureaucracy jiu-jitsu. Do what you think needs to be done in order to make the customer and the business successful. When someone tries to make you work on their arbitrary BS, give them a form to fill out and tell them it's in your team's triage queue. Make them force you to deprioritize a project which has clear value, and when that happens, make sure you announce loudly to all stakeholders why that project was deprioritized and what's going to be implemented instead. If your stakeholders have even 2 brain cells, they'll do the pushing back for you - you don't need to do it alone.
Lastly, you can't change an entire culture by yourself. For some places, it's just too late. But if you can find allies who feel the same way you do, you can be the beginning of a large change for your whole org just by being a good example.
[1] https://www.nytimes.com/1970/09/13/archives/a-friedman-doctr...