If you change your mind (aka mental agility)

In 1977, Abba sang, hopefully: ‘If you change your mind, I’m the first in line, honey I’m still free, take a chance on me!’

But how optimistic were Bjorn and Benny being? Was the person of their dreams likely to change their mind? Frankly, the odds were against it. We don’t do ‘changing our mind’ very well. And there are reasons for that. But it’s a handy skill to have, and with the world moving very quickly, we are all in need of a scarce commodity – agile thinking.

In this article, I review in outline four of the main cognitive processes which make changing our mind very difficult. I am not talking about changing our mind out of indecisiveness. I am looking at changing a decision in the light of new evidence or information.

1. Ignoring evidence (and favouring values) 

We might like to think that we change our mind on evidence. If the evidence changes, then we should be agile, and change our behaviour to match the evidence. That would be rational.

But in truth, evidence plays a surprisingly minor role in our decision making. We are astonishingly good at ignoring evidence, contriving arguments to discredit perfectly good evidence, or seizing on poor evidence if it suits us. The most extraordinary arguments in favour of ridiculous claims (such as that the earth is flat) are easily found on social media sites. And absurd beliefs are spreading at a time when there is greater access to good empirical data.

What is going on? Why are we ignoring evidence?

One big chunk of the answer is that we only really hear the evidence that accords with the foremost value in our mind. Other evidence is overlooked, ignored, or blotted out.

Here’s an example. For decades, tobacco companies made smoking tobacco seem cool. Being cool, it seems, was a value we could be manipulated into putting into first place, especially amongst the young. In 1963, Americans smoked half a trillion cigarettes, damaging their health with every last one of them.

Whilst we were thinking about how to be attractive, governments around the world responded with data about our health. But as we valued being cool (here and now) more than (long-term) health, we ignored the data. Our brains just tuned it out.

The governments would have done much better to give us data about the value that we were prioritising (i.e. being attractive). Perhaps they should have said that smoking didn’t make you attractive. It just turned your teeth yellow and made you stink.

History showed that the most effective remedy was to simply ban the adverts that made us think that smoking was attractive. We no longer had evidence that smoking was cool. The link between our primary value (being attractive) and our behaviour (smoking) was broken – allowing other values and other connected behaviours to occupy our minds.  We did other cool things, and looked after our health rather better. Our habits, which had seemed immoveable, changed quite rapidly.

If we want to think in an agile way, we must be perceptive about our own values, and see that with fractionally different values, we could change our minds (and behaviours) much more easily.

There are some traditional values in leadership which directly conflict with mental agility. Leaders can value being ‘purposeful’, ‘decisive’ and ‘consistent’. Some leaders are derided when they perform ‘U-turns’. If someone feels that they must honour the values of decisiveness, consistency, and purposefulness, agility is nigh on impossible.

There is a huge training need, for individuals and groups, to build cultures and mindsets that allow for agility and to recognise where agility conflicts with other values.

2. Confirmation bias

In short, this famous bias describes the phenomenon where we like to hear views or data that confirm our existing beliefs. We get satisfaction in hearing affirmation of our views, and feel dissonance when we hear opposing evidence. So, we seek out what we want to hear (i.e. confirmation) rather than what we would best learn from (i.e. opposition).

Our social media feeds and our search engines know this too, and feed us confirmatory data, leading us towards extremism and polarisation.

There are many strands of this bias which are worth investigating in detail, but I can’t let this article become a book! So here are some highlights.

  1. We have a bias towards accepting the first (or original) thought that we hear on a topic. This tendency is heavily influenced by its source: the more we respect the original source, the more we latch onto the initiating idea. Most kids initially accept the ideas of their parents, for example.
  2. We then go through a process of overweighting evidence that confirms that original view, and underweighting counter evidence. We often only seek out evidence that confirms our view, and find counter evidence to be, effectively, invisible.
  3. We tend to operate an ‘inverse proportion’ fallacy. That means that if we receive strong evidence in favour of our proposition, and our belief in it grows, then we assume that the validity of any other point of view diminishes in proportion. But this is not logically valid. There might be good evidence on BOTH sides of an argument. The strength of one position does not necessarily imply a corresponding weakness in the other. But we like to think it does.
  4. Belief persistence. We like things that are ‘ours’. Even ideas. We don’t like to give up things that we have owned (even mentally). So we hang onto them in spite of reason.

No individual, team, or organisation is immune from critical errors caused by confirmation bias, and examples of terrible decisions driven by it, from global leaders to everyday relationships, are easy to find. None of us is safe from such errors until we have really understood this bias.

3. Proximity bias

We are also bad at changing our minds because of our habit of surrounding ourselves with those who agree with us. That fuels confirmation bias, AND activates another bias – proximity bias.

To be agile, we need to accept the premise that new and good ideas can come from anywhere, not just from those immediately around us. But this is not something that our brains will easily accept.

We have a powerful rule of thumb that things that are close to us are more important than things that are further away. Our bias towards things that are proximate to us is easily understood when we think back to a time when our main concerns were predators and prey. Predators and prey which are close to us are, of course, more important than those that are far away.

So if we fill the immediate space around us with friends and colleagues who like us and flatter us, we end up being unable to hear the more distant voices of challenge or dissent. We ignore the dissenting voices that want us to change our minds, partly because we don’t like to hear their challenge, and partly because we have an automatic bias against voices that are not close by.

4. Sunk cost fallacy

Another fallacy that we will readily recognise in our everyday lives is the ‘sunk cost fallacy’: our urge to see things through once we have made a start on them, simply because we have made a start. Dropping an idea in which there has been investment is not intuitively comfortable, but holding on for that reason alone is irrational. If we blindly stick with ideas because we don’t want to give up on them, we will not be agile.

The metaphor you sometimes hear is of holding onto a rising balloon: the longer you hold on, the higher you are carried, and the harder it becomes to let go.

Have you had the experience of taking a wrong turn when driving, and having to decide whether to turn back and start again, or press on and hope that you can make the best of the new route? Intuitively we don’t like facing the certainty that we have wasted time, and turning around, though usually the best option, is hard to stomach. Usually, we persist with our error and keep going. Often we then get hopelessly lost and wonder why on earth we didn’t just accept our earlier error and undo it.

Our desire to see some return on work already put into something is almost irresistible. The cost, time or effort put into something often keeps us committed to ‘seeing it through’ even when we are aware that we would be happier and better off if we just abandoned it.

Conclusion

This article only touches lightly on some of the many issues that affect our daily decision making. Our brains run many cognitive processes which were coded into us when, as a species, we lived very different lives. The rate of change in our society is much faster than the rate of evolutionary change, and our brains are suited to a different world from the one that we now inhabit professionally. With training, we can learn to recognise our weaknesses and to introduce habits that improve our decision making.

The article has been generic, and not tailored to any particular professional environment. We have been working on these issues in the fields of law and law enforcement, and have tailored training available in these professional spaces.

We would be pleased to work with you on your ‘critical thinking’ skills and would be delighted to hear from you.  

Email us