The Dunning-Kruger effect, the expert blind spot, and why my ideas sound stupid

Born in Hong Kong
Nov 27, 2022

This started as a conversation I had with an acquaintance, a lawyer for workers’ rights. Originally, I was going to write about the content of that conversation, but now I think the mechanics of that conversation are actually more interesting. For context’s sake, however, here is what the conversation was about.

I had this perspective that the platform economy is taking over jobs so fast that the laws protecting workers’ rights can’t keep up. Uber, GrubHub, Etsy, YouTube: none of them are required to provide health insurance for their workers (very US-centric, I know), because the workers are treated as independent contractors. I saw this as an example of how the political institutions of the US are incapable of keeping up with the accelerating rate of social change, and how technology under a capitalistic society will outpace society’s ability to set up new boundaries to protect the rights of its citizens. This can be seen in the struggle to regulate social media platforms. A Facebook account with 1 million followers can say pretty much anything it wants, while a rural radio station with a dozen listeners can be fined for saying “fuck.” Just as social media platforms circumvent regulations that exist only for mass media, ultimately replacing mass media with a lawless information space, the platform economy will follow the same pattern, and we will end up with a similarly lawless economy.

But this isn’t about that discussion. This is about how that conversation went.

My lawyer friend proceeded to dive into the handful of cases he was working on, and the conversation steered away from the broader discussion of whether the existing institutions are capable of solving these problems. Maybe it was because he was not a political scientist; maybe it was because this sort of theoretical discussion was not illuminating, and so we both subconsciously gave up on trying to get something out of it. This also made me feel a bit stupid and inadequate, as I was not as well-versed in the topic as he was, which was why I wanted his opinion on my perspective in the first place, but it seemed to me that he was not interested in entertaining the broader argument. Although the conversation never seemed to dispute my perspectives directly, they fell by the wayside as the conversation became more granular and more about specific cases.

Maybe I was in the Dunning-Kruger valley, I thought.

But maybe he was in the Dunning-Kruger valley as well? Or maybe it was a different valley? Maybe there is another valley higher up the curve?

The Dunning-Kruger effect

People who know a little bit about the Dunning-Kruger effect tend to be overly confident in the application of the Dunning-Kruger effect, which is itself a manifestation of the Dunning-Kruger effect.

The Dunning-Kruger curve itself also seems to have been produced by someone who suffers from the Dunning-Kruger effect. How one self-assesses one’s own comprehension of a topic is far more complex than a simple curve. It is at least an N-dimensional spectrum, with peaks and valleys everywhere, and it depends on many internal and external factors, for example:

(1) Your experience with the subject, which is essentially the only parameter considered in the original Dunning-Kruger effect.

(2) Your scope of expertise on the subject. Say, if the subject is European-led colonialism and you are only an expert on Hong Kong, you may know a lot about the subject in sheer volume of knowledge, but you may not understand colonialism as well as someone with a wider scope (note to self).

(3) Your progression path on the subject. Say, if the subject is magnetism, and you have progressed from an undergrad condensed matter physics education to become an experimental crystallographer, you may know less about the subject than a theorist with the same amount of experience in the field, simply because you are specialized in a different manner.

And so on and so on.

Moreover, most if not all of these factors also depend on the subject matter itself, and some subjects may have a more treacherous N-dimensional plot than others.

Ultimately, I don’t find the Dunning-Kruger effect all that illuminating — no more than the saying “fools are full of confidence and geniuses are full of doubt.” It is like the geocentric model of the universe — elegant and intuitive at first glance but falls apart upon closer inspection.

The sad thing is, unlike the geocentric model, I don’t think an elegant but correct alternative exists, because of the base assumption that there is a consistent correlation between one’s knowledge of a subject and one’s self-assessment. There are simply too many factors involved. Trying to force a pattern onto it would be like forcing a theory onto the observation that “countries in hotter climates tend to be poorer.” Sure, you can argue that the observation seems consistent enough, but to call it an “effect” would be misleading, because it implies causality and heads down a slippery slope where other factors, such as all of world history, are ignored. And I have seen the bottom of this slippery slope. It is full of claims like how people in hotter climates have darker skin, and darker people have smaller brains, etc. It’s bad logic guided by a bad assumption.

Most of all, I consider the Dunning-Kruger effect damaging because everyone who has heard of the effect tends to think of themselves as being slightly on the “good” side of the curve, and the effect becomes something of the opposite of a self-fulfilling prophecy, with most of its victims being moderate intellectuals: those “nerdy” enough to have learned about the effect through a webcomic or a Reddit post.

[Image: The Dunning-Kruger maze]

The expert blind spot

The common usage of the term “expert blind spot” is in education, i.e. an expert’s inability to tell whether a piece of information is “common knowledge” or not. I deal with this phenomenon daily at my job, where, for the sake of readability, I have to argue with authors that certain technical terms be explained, e.g. that a layperson would not know what a perovskite is.

But this is not the definition I’m speaking of. I am speaking of the expert blind spot in a much broader sense.

There is a phrase in Chinese: 钻牛角尖, which literally means “to drill into the tip of a bull’s horn.” It is similar to the saying “can’t see the forest for the trees,” but it also implies a progression: the more someone looks at a forest, the more that person sees only a single tree.

I’d argue that this tends to be more of a problem for what I consider “empirical” subjects, i.e. subjects that rely more heavily on anthropocentric interpretation to make sense of empirical data, or in other words, subjects that require more descriptive than predictive thinking.

History, for one, is almost purely empirical, and it is especially dangerous when one tries to turn a descriptive argument into a predictive one. (See the “countries in hotter climates tend to be poorer” example from earlier.) During this transformation, a narrow expert, trapped at the end of a bull’s horn while also being the most authoritative voice on the subject, may hold too much power, while at the same time being needed by the general population to exercise that power.

For example, during the 1700s, white men might have been the only authoritative commentators on the subject of slavery, given that most black slaves at the time were kept uneducated and illiterate. That is a problem, and one not caused by a lack of expertise, or even of representation: even if black people had been given access to education, the uneducated would have remained unrepresented, and besides, the education would have been offered by the already educated class. This is the dilemma faced by many colonized peoples: either assimilate, which is seen as succeeding in escaping from oppression while also escaping from one’s previous identity, or remain unrepresented and oppressed.

Why do my ideas sound stupid?

There is a reason why the idea in your head sounds stupider after you’ve said it out loud. By writing this essay to articulate these thoughts, I have simultaneously committed the same crime I accuse others of. By criticizing the Dunning-Kruger effect and trying to articulate the expert blind spot, I too have become too reductive in my words.

Every nuance becomes a fractal and exposes more nuances ad nauseam.

I think this is because language is inherently reductive. Words are invented, not discovered; a word is only invented once the concept already exists. In other words, language itself is empirical, and I can only speak of something that conforms to the bias embedded in the language by those who invented it.

There is also a difference between an idea and the words ultimately used to convey it. In a sense, articulating an idea is like measuring the quantum state of a particle: during the process, the wave function collapses to a discrete state, and there is no way to avoid this collapse, because the process itself is the collapse of ambiguity.

To be eloquent is to be reductive.

This statement applies to itself as well.

This is a stream of consciousness post of sorts. There will be many more like this.
