I think it’s a textbook example of HN skimming the summary rather than reading the paper.
The paper basically said:
1) AI adoption may directly affect about 2.2% of wage value,
2) but that’s only the surface: the ripple effects may reach as much as 11.7% of wage value.
That’s it. That’s all the index they came up with measures, nothing else. They didn’t say there would be no displacement, only that the index doesn’t quantify it. In other words, it’s the worst-case scenario.
That's not true. They didn't measure wages; they used wages as a proxy. What they're actually measuring is work done, i.e. tasks.
Last I checked, most people work jobs where there is more work to do than time in the day to do it - which is exactly the condition under which you'd expect the wage-value index to be closely correlated with displacement.
Not only does the article title claim the very thing the paper says it isn't claiming; there's also little reason to believe that outcome would follow, even if the paper hadn't been explicit about not claiming it.
Give it a read and come back with a better critique.