Just anecdotal, but my life observation of DK is that it often involves highly intelligent and competent people in a particular field who then generalize that competence and pontificate, proclaiming, directly or indirectly, a superior understanding to that of certified domain experts (e.g., people with directly related advanced degrees who have worked in the field for decades).
It thus seems more, or at least as much, a psychological effect - in short, a personality type of superiority and know-it-all-ism in people who have never done the deep and hard work to gain or demonstrate any competency in said areas.
A common side observation is, of course, unfounded conspiracy theorizing: the belief that the derided experts have sinister intentions.
Anecdotes are not data, even in plural form. We can witness isolated instances of what seems to be a phenomenon without it actually being part of one. Other posters have already established that some experts can overestimate their expertise as well. The study mentioned by the original post and in my comment seems to suggest that the overestimation bias is prevalent across a wide range of cohorts of expertise: senior students are just as likely to overestimate their talents as freshmen, for instance. This effect likely extrapolates to experts as well; it's just hard to get good data.
No, DK doesn't say there's no bluster, no proclamations, or no artificial assertions of expertise. It doesn't even say that the overestimates are just as prevalent among experts as among laypeople. All it says is that, as near as we can tell, the effect size of the overestimation is accounted for by the statistical autocorrelation, and our best efforts to reproduce the same effect without relying on that autocorrelation have failed.
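For what it's worth, the autocorrelation point is easy to demonstrate with a toy simulation (a rough Python sketch below using made-up, uniformly random scores, not the original study's data): if actual and perceived scores are drawn completely independently and you then make the classic DK-style comparison of perceived vs. actual by performance quartile, the bottom quartile "overestimates" and the top quartile "underestimates" purely because the actual score appears on both sides of the comparison.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # Actual performance and self-assessment drawn independently:
    # by construction there is no psychological link between them.
    actual = rng.uniform(0, 100, n)
    perceived = rng.uniform(0, 100, n)

    # Classic DK-style summary: bin by actual-performance quartile and
    # compare mean perceived vs. mean actual score in each bin.
    edges = np.percentile(actual, [25, 50, 75])
    quartile = np.digitize(actual, edges)
    for q in range(4):
        mask = quartile == q
        gap = (perceived - actual)[mask].mean()
        print(f"Q{q + 1}: actual={actual[mask].mean():5.1f} "
              f"perceived={perceived[mask].mean():5.1f} "
              f"overestimate={gap:+6.1f}")

Perceived averages about 50 in every bin, so the bottom quartile looks wildly overconfident and the top quartile underconfident, even though the data contain no self-assessment bias at all.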
I think there are a lot of ways to accept that the anecdotes you mention occur which require much weaker assertions than DK as a psychological phenomenon, and I would hesitate to jump to DK based on that information.
To be fair, I think the counterargument is that ostensible experts can overstate their ability/skill/knowledge to detrimental effect just as easily, and by virtue of the label ignore the reality of the argument/scenario at hand. That is, experts can overlook mistakes they're making, conflicts of interest, etc. because of their status, and because they overestimate their own ability. There have been studies of this in group decision making in crisis situations, where hierarchies can cause failures because the "leader" becomes overconfident and fails to heed warnings from others in the group.
This all gets really murky quickly in practice because of what "low" and "high" competence mean, and what constitutes the actual scope of expertise with reference to a particular scenario.
>The V-tail design gained a reputation as the "forked-tail doctor killer",[16] due to crashes by overconfident wealthy amateur pilots,[17] fatal accidents, and inflight breakups.[18] "Doctor killer" has sometimes been used to describe the conventional-tailed version, as well.