
A pandemic of armchair experts

by Daniel Read | Dec 9, 2021

The Professor


How we decide who and what to believe

We appear to live in an age of misinformation. 

Certain broadcasters and social-media celebrities openly promote fake facts or misrepresentations of science and data to their audience, many of whom do not seem to care whether they are right or wrong, as long as they are hearing what they want to hear.

Their promotion of misinformation can be driven by an over-inflated belief in their own judgment and knowledge; often, they simply relish the chance to proclaim their contrarian or ideological views. Sometimes, it’s just about self-interest.

Many of us have at least a few controversial beliefs. We might believe that the death penalty deters crime, that raising the minimum wage decreases unemployment, or that raising business taxes will reduce innovation.

We might even believe that women are not as good at maths as men or that the Earth is flat.

Some of these beliefs we will hold strongly.

But when we attempt to justify our beliefs, we often find the evidence pool is very shallow.

Researchers have identified a chronic ‘illusion of explanatory depth’: we overestimate our understanding of the world.

We can discover this by trying to justify our pet beliefs. To illustrate, when I interrogate myself about why I believe the death penalty is not a deterrent, I find there is not a lot there except consensus among my peer group – some of whom I hope have looked into the evidence – plus some intuition and vague memories of reading a few blog posts or newspaper articles. This is not a lot, but it is perhaps not surprising: we simply don’t have time to be experts on everything.

Sometimes people are described as having fallen prey to the Dunning-Kruger effect, or even as ‘having’ Dunning-Kruger. Donald Trump was one person frequently described in this way.

The Dunning-Kruger effect, however, is a population-level effect, so no individual can ‘have’ it. What it primarily shows is that someone being confident does not mean they are right. There are, in fact, individual differences in confidence, with some people absurdly sure of themselves and others quite diffident.

But the confidence of highly confident but wrong people comes not from their ignorance, but from the fact that they are inherently confident about everything. Some researchers have described this trait as arrogance.

If he had known more, would Trump have been less confident? I doubt it. Trump was (or is) simply full of bluster, and his confidence was unrelated to his knowledge.

What determines the beliefs we adopt when we have a choice?

Scientific evidence can help, but often we believe what we want to believe anyway.

These beliefs might be ‘chosen’ through indoctrination. They might be the result of self-interest or strongly held ideology, such as wealthy people believing taxes rob people of initiative. Or they might be required to fit into a social group.

How do specific beliefs become linked to specific social groups? In some cases, the link is quite clearly defined. 

Strongly religious people generally do not believe in evolution, and atheists are not creationists. Partisanship also produces dispositions to believe. The moral values of conservatives centre on different issues – such as respect for authority – from those of the left, who put more weight on harm prevention. Liberals tend to be more drawn to seeking out change and novelty, both personally and politically, while conservatives have a stronger preference for things that are familiar, stable and predictable.

Often, simply knowing a belief is endorsed by a member of ‘their’ side is enough to get people to support it. 

Many current controversies have this flavour, such as whether Covid vaccines or masks should be required, or whether nuclear power is good for the environment. We look to our peers, and to the authorities and ideologies we respect, and follow their lead.

We are also more likely to follow those who are highly confident, even though confidence is a poor predictor of accuracy. And, of course, those we follow, being human just like us, are probably doing the same thing. 

Armchair experts are just behaving normally

Let’s return to those high-profile broadcasters, social-media celebrities and armchair experts who have been wilfully spreading an avalanche of misinformation. 

They are really no different from everyone else. 

If it is natural to believe things based on little evidence, and to believe things because they fit with our social group and partisan preferences, it should not surprise us that some people hold beliefs quite at variance with ours. Nor should it surprise us that they do so despite what appears to us to be overwhelming contradictory evidence – from their perspective, we are doing the same thing. We should not be surprised if a TV reporter or Twitter celebrity is just as likely as anyone else to believe things based on flimsy evidence.

As individuals, we may have fallen on the side of accepted scientific wisdom (where the bulk of the evidence and experts sit) during the pandemic, but there will probably be other situations where we too have beliefs that are based on our own misjudgements, ideologies or personal gain.

The American writer and political activist Upton Sinclair famously wrote: “It is difficult to get a man to understand something, when his salary depends upon his not understanding it”.

Even a scientist, when hired directly by a pharmaceutical company to evaluate the efficacy of a new blockbuster drug, might be disposed to find evidence of the drug’s effectiveness. 

Conversely, there are probably reasons why a small – but prominent – number of scientists have taken a distinctly outlier stance regarding the pandemic, or other issues, such as climate change.

We need go no further than this to understand why there will be armchair experts proposing all possible positions, and why, once they gain attention and celebrity for doing so, they will stick with those positions.

To give up their position would be to lose all the attention, all the celebrity and all their credibility. Imagine what would happen to Donald Trump if he were to come down on the side of poor refugees. Imagine what would happen to the radio hosts who have built up a large following based on their unwavering libertarian views if they suddenly declared they had changed their minds about masks.

Once committed to a set of beliefs, the armchair expert is in it for the long run.

Originally published by The Conversation and reprinted here with permission.

About Daniel Read

Daniel Read is Professor of Behavioural Science, Warwick Business School, University of Warwick.
