
The glass isn’t half-empty or half-full. It’s just complex or simple

Using gut instincts

Are you a glass half-empty or half-full kind of person? The question paints optimism and pessimism as personality traits. Whether or not we come across as pessimists or optimists, or as risk-takers or risk-averse for that matter, the question reflects something interesting about how we characterise each other’s decisions and decision-making.

In today’s world, we still, as a society, take a lot of decisions on ‘gut instinct’. We often have data, and increasingly we use it to help us make informed decisions, yet when it comes down to it we are still led by our intuitions and our gut.

It is all about what we call ‘intuition’. When our personal view of a scenario is positive we are ‘optimists’; when it is negative we are liable to be called ‘pessimists’.

Similarly, when our intuition is to be cautious we are called ‘risk-averse’; when it calls for action we are called ‘risk-takers’. What we all seem to have in common, however, is that we see decision-making as partly a matter of personal preference, and we refer to our ‘intuition’, our ‘personal preferences’ or our ‘instincts’ as the basis for the decision. This is a problem if we want to back up our decisions with factual, checkable reasons. In other words, it is a problem for more scientific, rational decision-making.

The science of ‘gut’ decisions

One of the main reasons we can justify using our instincts instead of being guided by ‘purer’ rationality is that we assume some aspects of decisions are simply a matter of personal preference. In decision science, this personal preference is often characterised as an attitude to risk. In a given area of expertise (or ‘domain’, if you’re a scientist), some people appear more inclined to take bigger risks while others appear more inclined to be cautious.

The empirical evidence, however, is that attitudes to risk, although seemingly subjective, are not personality traits. Research published in 2002 by a team of psychologists (see notes at end) measured attitudes to risk across different domains. It shows that the same people can be risk-takers in some domains and risk-averse in others. Rather than a personality trait, attitudes to risk seem to be related to the perceived benefits and risks in a particular domain, presumably in ways that are fairly stable for that person in that domain. For instance, the researchers found that women, on average, appeared more risk-averse in most domains, with the notable exception of the social domain. The researchers argue that risk attitudes are based on perceived information about the benefits and drawbacks of courses of action, but that still implies these perceptions are somewhat subjective rather than objective.
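To make that idea a little more concrete, here is a minimal sketch (not taken from the paper itself, and with invented numbers, domains and weights) of the kind of benefit-versus-risk trade-off the researchers describe. The point is simply that the same person, applying the same rule, can look like a ‘risk-taker’ in one domain and ‘risk-averse’ in another, because their perceptions of benefit and risk differ by domain.

```python
# Illustrative sketch only: a toy benefit-vs-risk trade-off.
# All numbers, domain names and weights below are invented for illustration.

def willingness(perceived_benefit, perceived_risk, risk_weight):
    """Higher scores mean more inclined to take the risky option."""
    return perceived_benefit - risk_weight * perceived_risk

# The same person, same rule, different perceptions per domain.
domains = {
    "financial": {"benefit": 5.0, "risk": 8.0, "risk_weight": 1.0},
    "social":    {"benefit": 6.0, "risk": 3.0, "risk_weight": 0.5},
}

for name, d in domains.items():
    score = willingness(d["benefit"], d["risk"], d["risk_weight"])
    label = "risk-taking" if score > 0 else "risk-averse"
    print(f"{name}: score = {score:.1f} -> looks {label}")
```

Run as written, the toy example labels the same ‘person’ risk-averse in the financial domain and risk-taking in the social domain, without any appeal to a fixed personality trait.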

Personalities and Expensive Mistakes I: Hawks and Doves

Our inability to be objective about risk-taking causes problems when we want to assess what organisations and businesses should do when faced with big investment and strategic decisions. The most obvious examples, of course, are the political decisions taken by our leaders. One example is the decision by the then Prime Minister, David Cameron, to hold a referendum in the UK on leaving the EU. By his own lights this must be considered a disaster, since his strong preference and intuition was that the UK would vote to remain. At the time of writing, the US under Trump seems to be on the brink of war with North Korea. Again, we often characterise the basis of these decisions in terms of personality traits rather than actual evidence. In foreign relations, those favouring aggressive approaches are known as ‘Hawks’, while those favouring more conciliatory diplomacy are known as ‘Doves’.

By leaving many aspects of very big decisions up to the personality or subjective beliefs of the final decision-maker, we seem to be accepting that, at the end of the day, it really is down to someone’s gut convictions. We seem to accept that there is no ‘fact of the matter’ about the rightness or wrongness of many of the big decisions. Unfortunately for us, mistakes in these decisions can be very, very expensive, and not just in politics.

Personalities and Expensive Mistakes II: Bull markets and Bear markets

To continue the theme of expensive decisions: in the financial world, decision-making requires a careful mix of productive opportunity-taking and caution when necessary. Rather like in the politics of foreign relations, the two basic personality traits are blended together in descriptions of ‘market sentiment’: optimistic ‘Bull’ markets and more pessimistic ‘Bear’ markets. However, when risk-taking is not analysed effectively, even so-called low-risk positions can become extremely risky. An example is the famous 2008 crash, where many investments rated as extremely safe by ratings agencies turned out to be anything but. Understanding how to manage risk as a system therefore also involves analysing how we classify those risks in the first place. If risk is partly about personal preference, that is a problem for the classification of risk and for the decisions we base on those classifications.

Using the notion of ‘complexity’ instead of personal preference

My belief is that the answer can be to use the notion of ‘complexity’ to help us identify the rational, objective reasons for risk-taking versus risk-aversion. This leads back to my previous article ‘Ways to Be Rationally Complex’. If we can base the analysis more on rationality, then we can understand why people appear risk-taking in some domains and risk-averse in others as actually rational decisions. It would also help us understand the real difference between the two in terms of when decisions pan out to be good or bad. It would help to turn decision-making into a real science, rather than a data-assisted ‘art’. We would be risk-takers when it is rational to be so, and risk-averse when it is rational to be so.

NB: As a short digression, I must apologise for the use of the term ‘complexity’ in this article if you have been reading my other articles, where I use the term in a slightly different way. The meaning of complexity here is closer to the commonsense, everyday definition: complex as ‘difficult’ or ‘complicated’. (It is, of course, impossible to always stick to one simple definition, because complexity is a highly contested and over-used ‘umbrella’ term for many different things.)

To return to the topic: what is the answer? How can this concept of rational complexity help us to make better decisions? I believe that part of the answer may lie in the children’s game often known as 20 Questions, and I will be discussing this in the next article.
