We’re coming up to that time of year when you might find yourself turning your attention to reviewing the Financial Year’s successes (and hopefully no more than a failure or two) and planning for an improved result in the coming year.

Take a moment, though, to consider some of the pitfalls of relying solely on your own input when undertaking this process….

Dr Dean Burnett’s first book “The Idiot Brain” explores the idea that, from a rational perspective, our brain can be illogical, inefficient and sometimes just plain weird. Consider the following examples:

Our memory is unreliable

Unlike computers (with which they are often compared), our brains don’t use logical processes for storing information. Our storage method is a far more flexible and organic process, and consequently a lot less reliable and far less fixed than we assume.

Indeed, every time you retrieve a memory, it is tweaked in some small way.

For instance, we’re all familiar with the habit of embellishing a story slightly in the telling to make ourselves sound better. This actually changes the underlying memory: slowly but surely, the brain starts to remember the event differently from what actually happened. The same process of altered recall can occur with stored memories more generally, which the brain can subtly edit to (say) make us feel better about ourselves.

Our brain makes us afraid

Over millions of years the human brain has evolved to become super sensitive to danger.

As you would appreciate though, the modern world isn’t the same as the one faced by our prehistoric ancestors, who relied heavily on this in-built threat detection system. The “fight or flight” reaction is controlled by a region of the brain known as the amygdala. There are times when this ancient part of the brain is of great assistance in responding to and avoiding imminent modern threats (eg. evading a motor vehicle while crossing a road). However, many of the “threats” we currently face that trigger the amygdala require a more measured response. Unfortunately, the accumulation of these responses over time results in the sort of heightened anxiety that disrupts rational thought.

We are desperate to be liked by others

In the same way that our brains have developed a sophisticated threat detection system, we’ve also evolved so that large portions of our brains are dedicated to engaging with others. Because being part of a community was so important to survival, we have developed a tendency to be acutely aware of how other people regard us. Our fear of criticism and need for approval mean that, at times, we can be very easily influenced by others.

Daniel Kahneman in his book “Thinking, Fast and Slow” describes the human mind as comprising two “operating systems” (again with the computer analogy!):

  • System 1: operates automatically and quickly, with little or no effort and no sense of voluntary control (eg. we are pre-programmed with a strong aversion to losses – particularly money-related!)
  • System 2: allocates attention to the effortful mental activities that demand it, and is associated with the subjective experience of agency (intentional action), choice and concentration.

An understanding of how the brain utilises System 1 and System 2 in making decisions gives rise to another issue to add to the list of irrational cerebral tendencies:

Our brains’ “rational” decision-making ability is unreliable

How this plays out is illustrated by a concept termed the “Availability Heuristic”. In essence, our perception of how often an event occurs is determined by the ease with which we can recall specific times when the event happened.

Let’s say for instance that we’re in the middle of an argument with our partner about who makes the greater contribution to the housework. This conundrum could easily be solved by System 2 (eg. taking out a piece of paper, documenting the respective tasks done over the previous month, allocating hours to each task, etc). However, System 1 is far more likely to take over by recalling an example or two from memory and then jumping to a conclusion (eg. I’m obviously pulling my weight, if not over-delivering!).

Recalling information from memory involves both System 1 and System 2. In this instance, we typically replace the “System 2” question (how much work do I do around the house?) with a “System 1” question (how easily can I recall work I’ve done around the house?).

Accordingly, events that attract our attention (for instance a recent dramatic event that has received lots of media coverage) are easily retrieved from memory and can skew our decision-making. Personal experiences are also more available than incidents that happened to others, or than statistics (for example, cohabiting partners’ estimates of their own share of the housework, or of their contribution to arguments, typically add up to more than 100%!).

Furthermore, we have a reinforcing tendency towards Confirmation Bias, where we search for, recall and interpret information in a way that confirms our pre-existing beliefs or hypotheses. The more emotionally charged the issue, or the more closely it is tied to our deeply entrenched beliefs, the stronger the tendency.

So how can all of this play out when we’re in decision-making mode?

  1. Rather than examining all the available evidence and carefully weighing it up, we often look for information that merely backs up our existing beliefs or – even worse – anchor on the last thing we heard;
  2. We focus much more on avoiding losses than making gains; and
  3. As fundamentally social animals, we herd. If we’re not trying to outdo those around us, we’re desperately trying to be just like them.

One of the most effective methods of guarding against our natural human tendencies is to seek external input, especially when undertaking essential planning and making key decisions.

I’m certainly doing just that as a key part of my preparations for the year ahead….

Peter Wilkinson – Director, Sam Wilko Advisory