Apologies for the very long hiatus on this blog – I’m hoping to get back into it more regularly, and have a good 10 topics waiting for me to write about. But for now…
Out of the recent horror of another mass murder in America, its President claimed that arming teachers would solve the problem. A large body of experts immediately disagreed, but the delivery of facts from people with genuine expertise has done, and will do, nothing to change his opinion. You see, he thinks he’s an expert, and that’s a problem.
Let me digress for a minute and talk about expertise. Humans have the remarkable capacity not only to learn, but to learn so much about a given area that they become exceptional at it. Elite athletes, musicians, artists, academics, and a vast range of professionals, from engineers to pilots to teachers to psychologists, display an astounding ability to take very complex tasks and make them look easy. Obviously, becoming an expert takes an enormous amount of time and dedicated, effortful practice, but those without expertise can easily assume that, because what experts do looks easy, it must either be easy or not really be that hard. They are of course wrong, and this error forms the basis of the Dunning Kruger effect (see here).
So, experts are amazing but, because we’re human, even when we achieve expertise we bump into another problem: the expert fallacy. It goes something like this: if I’m an expert, because I spent ages becoming that expert, I recognise that I do, in fact, know a lot about my area of expertise. Because of that expertise, I’ll spend the majority of my time doing amazing stuff by being an expert, so my brain gets used to solving problems really well. Because my brain expects to be able to solve problems (associated with my area of expertise) easily, it starts to assume that it should be able to solve problems easily in other areas (outside of my expertise), and it starts to make assumptions about other people’s expertise: “surely what they do isn’t so hard; after all, I know lots about what I know, so I can probably figure out what they do just as easily”. Hey presto, I’ve fallen victim to the expert fallacy: assuming that my expertise qualifies me to be an expert in areas outside of my expertise. And, oh the irony, I’ve also fallen victim to the Dunning Kruger effect.
Now we get into the scarier stuff. Alongside genuine experts there are what I’ll call pseudoexperts. Pseudoexperts have no genuine expertise, but believe that they do. There are a lot of them out there, and many have been given a bigger voice because of recent technological changes (e.g., social media). You’ll find them spouting rubbish health advice, pushing conspiracy theories, or proselytising about whatever they’re fanatical about. You’ll also find them littered throughout the political arena. Politicians, especially those with more power, often believe in their own, infallible “expertise”, despite howling evidence to the contrary. They then go on to assume that they must be experts in everything, because they are “experts”. There are a lot of reasons for this double sophism (the fallacious belief in their own expertise, combined with the expert fallacy): their isolated world view, the reinforcement of their “popularity” (thanks Twitter), excessive attention from and reporting of their statements by the press, their confirmation bias (see here), the lack of dissent from those they work with and, possibly (probably? Read here) their sociopathy (I’m sure I’ve missed a lot of factors here).
Why is this dangerous? Well, apart from the obvious lack of humility, it’s the fact that pseudoexperts don’t actually have expertise. They really don’t know what they’re talking about, but insist (and believe) that they do, meaning that they can’t be contradicted. The more established and embedded one’s world view, the more intractable it becomes, something you can demonstrate to yourself by trying to argue with a fanatic. Fundamentalists are impossible to argue with because any disagreement means that you, not they, must be wrong, and pseudoexperts, once they’ve bought into the belief in their own expertise (and then fallen victim to the expert fallacy) will defend that belief in a way that looks scarily like fundamentalism. So here we have people who don’t know what they’re talking about, insisting that they do. In noncritical situations, that would just look silly, but when they hold positions of power, and many people (including the press) present their mistaken beliefs as factual, we’ve got ourselves a problem. Both America and the UK currently have leaders so invested in their cognitive biases that they’re prepared to watch the entire world burn to prove themselves right.
What can we do about the danger of pseudoexpertise? Well, for one, we can call it out. Blindly liking something on social media because it activates your confirmation bias reinforces the fallacious beliefs of pseudoexperts, so maybe stop doing that. But, more importantly, when the press report the blathering of yet another idiot, do something: let the publication know that you don’t support it, stop buying that publication, or let its advertisers know that you’re unimpressed. When the blathering is revoltingly antihuman (see the US right now), get up and say something, preferably directly to those doing the blathering. The huge protests by young people in America right now are beautiful to watch. Not only are they standing up for something really important, they are also calling out pseudoexperts in a way that might just result in real and lasting change.
What we’re aiming for here is to train the pseudoexperts to expect to be challenged. It’s hard to sustain the worldview that you’re always right when a large and vocal majority tell you otherwise every time you blather. If, instead of reinforcing pseudoexpertise, we call it out for what it is, and we are consistent in that dissent, we might just be able to take some of the sociopathy, and the egocentric, self-serving hypocrisy, out of politics.