Out of Time vs. Out of Touch

18 February 2016 · By Rhi Willmot

Dual-process theory describes two systems which largely dictate how we make decisions: a fast, automatic, instinctive ‘hot’ system, and a slow, deliberative, resource-dependent ‘cold’ system. The complications and complexities of modern life often mean that the cold system becomes overwhelmed – we simply don’t have the cognitive capacity to make decision after decision in a rational manner.

Cognitive biases refer to mental representations which are in some way distorted compared to objective reality. They bypass slow, rational evaluation, allowing us to make quick, instinctive decisions and conserve cognitive resources. It is thought that such biases exist as a product of evolution; at one point the ability to make quick decisions was essential for survival, and biases enable decisions to be made when speed is more important than accuracy.

However, biases can also lead us to make illogical, maladaptive decisions, which raises the question of whether cognitive biases are still adaptive in the present day.

Potential explanations for the development of bias can be segmented into the following three domains:

  • Heuristics: mental shortcuts which work in most circumstances, but not all.
  • Artefacts: bias which arises when the task being completed is not one for which the mind is designed.
  • Error management biases: bias which arises when biased response patterns to adaptive problems incur lower costs than unbiased response patterns.

Heuristics

Heuristics can be extremely useful; our time and ability to process information are limited, and the world is far too complex for us to consider every option for every decision. However, if we frequently find ourselves needing to make fast or low-effort decisions, heuristics can lead us astray.

‘Representativeness’ is one of the most common heuristics and refers to the degree to which a single object reflects the category from which it is drawn. The problem with representativeness is that we use similarity to infer probability based on category stereotypes. In reality, similarity and probability often diverge – not all objects fit the stereotype, and the representativeness heuristic is often applied much more enthusiastically than it should be. For example, one group of college students was told about a hypothetical tribe who hunted wild boar for their meat and turtles for their shells, whilst another group heard that the tribe hunted turtles for their meat and boar for their tusks. The students rated the tribe members on a variety of characteristics, and their responses revealed a belief that the food the tribe ate was representative of their personality: the boar-eaters were thought to be more aggressive, more irritable and more likely to have beards than their counterparts, whilst the turtle-eaters were described as living longer and being better swimmers. In light of this, it is interesting to consider how our judgments can be affected by unexpected influences – in this case, food choice is tied to representations of identity.
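
To see in miniature why similarity and probability diverge, here is a small sketch with invented numbers (the categories and figures are hypothetical, chosen only for illustration): judging by similarity alone ignores how common each category is, whereas weighting similarity by the base rate, as Bayes’ rule does, can reverse the verdict.

    # Invented numbers: how well a description 'fits' each category (similarity)
    # versus how common that category actually is (base rate).
    similarity = {"librarian": 0.9, "salesperson": 0.2}   # the description sounds bookish
    base_rate = {"librarian": 0.01, "salesperson": 0.10}  # salespeople are ten times more common

    # Representativeness: judge probability by fit alone -> 'librarian' wins.
    by_similarity = max(similarity, key=similarity.get)

    # Bayes' rule: the posterior is proportional to similarity * base rate.
    posterior = {k: similarity[k] * base_rate[k] for k in similarity}
    by_bayes = max(posterior, key=posterior.get)

    print(f"similarity alone picks: {by_similarity}")        # librarian
    print(f"weighted by base rate, picks: {by_bayes}")       # salesperson (0.020 > 0.009)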

Artefacts

Biases can also be explained as the result of changes in the environment which are not matched by parallel changes in cognition. For example, whilst we are naturally much better at, and more practised in, processing information in terms of frequency, “survival rates” are often described in terms of probability, which only became a commonly used method of calculation in the late 20th century. Probabilities are mathematical abstractions in which information about base rates of occurrence is lost when the sum is calculated. In contrast, frequency calculations are simpler, because the base rate information is automatically included in the frequency representation. Therefore mortality rates, which use simple frequency calculations, are far easier to interpret than survival rates, which are complicated because they don’t present information in the way we encounter it in the natural world. In a survey assessing the effectiveness of prostate cancer screening, 75% of the doctors surveyed misinterpreted probability-based information.
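
To make the contrast concrete, here is a minimal sketch using hypothetical screening counts (the numbers are invented, not taken from the survey above): the frequency framing keeps every base rate in view, whilst the probability framing divides it away.

    # Hypothetical screening counts, for illustration only.
    screened = 1000     # people screened
    diagnosed = 100     # of whom this many were diagnosed
    died_in_5y = 20     # of the diagnosed, this many died within five years

    # Frequency framing keeps the base rates visible at every step:
    # "Of 1000 screened, 100 were diagnosed; of those, 20 died within 5 years."
    survived = diagnosed - died_in_5y            # 80 survivors

    # Probability framing divides the base rate away:
    survival_rate = survived / diagnosed         # 0.80 -> "80% five-year survival"

    # The single figure 0.80 no longer says how many people, out of how many
    # screened, actually died -- that context has to be reconstructed.
    mortality = died_in_5y / screened            # 0.02 -> 20 deaths per 1000 screened

    print(f"survival rate: {survival_rate:.0%}")
    print(f"mortality: {died_in_5y} per {screened} screened ({mortality:.1%})")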

One argument against the idea of maladaptive cognitive biases is that the very techniques used to assess cognitive bias are themselves artefacts, and therefore do not represent ecological decision-making. For example, lab experiments often ask participants to assess statistical information in terms of probability rather than frequency. According to this reasoning, humans have developed decision-making systems tailored to our evolutionary history; when we ask participants to deviate from these systems, we shouldn’t be surprised when they struggle to make sound decisions.

However, regardless of the specific queries this argument presents, it is clear that maladaptive cognitive biases can occur if the environment is not structured to afford optimal decision-making. This is an important lesson for policy makers – the way in which options are presented can have a huge impact both on the choices we make and on the occurrence of cognitive bias.

Error Management Biases

Error management theory views the human brain as an orchestrator of survival – every decision we make is to ensure that we remain alive. This framework results in two possible errors:

  • False positives: taking an action that would have been better not to take.
  • False negatives: not taking an action that would have been better to take.

As the human brain is designed to ensure survival, we instinctively make decisions that minimise not the total error rate, but the total error cost. It is therefore better to make many small errors than one potentially fatal one – better to run away from a predator which doesn’t actually exist (a small inconvenience cost) than to fail to run away and be eaten (a significant health cost).
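
As a toy illustration of this cost asymmetry (the probability and costs below are invented purely for the sketch), a decision rule that minimises expected cost rather than error rate will happily flee from rustles that are almost never predators:

    # Toy error-management example: an ambiguous rustle might be a predator.
    # All numbers are invented for illustration.
    p_predator = 0.05            # most rustles are harmless
    cost_false_positive = 1      # fleeing needlessly: a small inconvenience
    cost_false_negative = 1000   # staying put when the predator is real: near-fatal

    # Expected cost of each policy under this uncertainty:
    flee = (1 - p_predator) * cost_false_positive   # pay a small cost almost every time
    stay = p_predator * cost_false_negative         # rarely pay, but pay enormously

    print(f"always flee: expected cost {flee:.2f}")  # 0.95
    print(f"always stay: expected cost {stay:.2f}")  # 50.00

    # Fleeing is 'wrong' 95% of the time, yet it minimises expected cost --
    # the bias towards false positives is simply the cheaper error policy.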

This bias towards false positives in assessing cues of disease is thought to explain a great variety of behaviours. For example, hypersensitivity to disease can lead to the avoidance of individuals who pose no current threat. It is common knowledge that HIV is not transmitted through superficial contact; however, individuals with HIV are often regarded extremely negatively. This is a significant example of how the instinctive cognitive bias of the hot system can override cold cognition. Strengthening this argument, research has shown that prejudicial thoughts are greater amongst those who feel more vulnerable to disease, and can be enhanced by making disease threat temporarily salient.

Viewed holistically, research into cognitive bias suggests that whilst these biases may have their roots in evolutionary adaptation, they can frequently be destructive in the current environment. Whilst relying on cognitive biases can save us time, they can also lead to a range of maladaptive, unhealthy behaviours. Three important conclusions follow:

  • Improving our understanding of heuristics allows us to modify them, or communicate in a way that targets cognitive misrepresentations.
  • The structure of the environment is essential for facilitating optimal decision-making.
  • Some biases are no longer adaptive; we need to adjust either our environments or our mental training to combat this.

Whilst retraining our fundamental thought processes may sound like a daunting, even impossible, task, the potential to readjust cognitive biases has been demonstrated to great effect in clinical settings. The challenge now is to develop cost-effective, globally appealing methods of retraining such cognitive biases, in order to promote and maintain public health.