Is Your Thinking Flawed?
June 17, 2019 9:25 pm
Here are three things you can do to improve your thinking.
This terrific article (linked below) summarises a number of common mental biases and describes a unique ‘research method/theatre piece’ that is helping people learn how common biases shape extreme thinking and decision-making. The research shows that these same biases can also polarise opinion and, at times, produce a complete disregard for the facts.
This is because ‘efficiency’ is built into the primal function of the human brain. As a result, we take mental shortcuts – it takes significantly more brain power to ‘hold’ two opposing ideas in our mind at once. It is also likely that when we encounter opposing information, we already hold a personal view that biases which set of data or ideas seems more plausible to us. As the authors of this article point out, it is easier to reject data (or the person providing the data) that ‘disrupts our self-view’ than to change our view of ourselves, particularly if the new evidence suggests we have done or thought something foolish or hurtful.
While this research examines more extreme situations, in our work at Learning Quest we find the same biases affect day-to-day reasoning and decision-making at work. For example, if an employee believes himself to be a fair, caring and competent employee and you suggest that he has put the profit of the business above the needs of the customer, he is likely to ‘find evidence’ that the failure was due to a process or policy, that it was someone else’s fault, or even that the customer somehow misrepresented themselves and ‘caused him’ to make this unfair decision. Similarly, if someone sees herself as a competent leader, and people in her business have ‘done the wrong thing’ or performed poorly, it is easy for her to favour the data showing that she gave all the right guidance, coaching and support, while the employees made mistakes, applied policy carelessly, or lacked competence or ethics.
It is difficult to see ourselves as flawed. However, as this article and a large body of neuroscience and psychology research demonstrate, we are far from perfect in our thinking. Our brains operate in a way that means, despite our best intentions, our thinking will be biased and we will at times make thinking errors.
The science has given us some clues as to how to mitigate the risks of naturally occurring faulty thinking.
- First, scientists have shown us that thinking and decision-making improve when multiple perspectives are included. This is where the issue of inclusion comes in. It is not enough to have a diverse workforce, although that is a good starting place. We must make it possible for diverse points of view to be ‘heard’ and not immediately dismissed. This is easier said than done, because those ‘diverse viewpoints’ always seem to be ‘out of step’ with ‘how we think and do things around here’. In our work across a range of industries and levels of leadership, we find it takes practice and feedback. We also find it essential that this feedback and practice occur in the same context as day-to-day decisions, in order to ‘shift the way leaders have conversations’.
- Although not discussed in this article, we know that when people are under pressure, or when things are changing quickly, they are even more vulnerable to these types of mistakes. Leaders must learn to regulate their thinking and emotion to reduce the sense of pressure when timeframes are tight, when there is an overload of information, or when they are confronted with a sense of urgency. While some leaders are skilled in these areas, we find many are surprised by their reactions when ‘stress tested’ in a high-pressure or fast-changing environment.
- A third way we have found for leaders to think more clearly and make better decisions is to become aware of their own thinking ‘tendencies’ and to recognise their ‘habitual patterns’ when confronted with challenges and complex or ethical decisions. It is not sufficient simply to ‘reflect’ on what we ‘think we would do’, or even on ‘what our values tell us to do’; such reflection is only the starting point. Improving thinking capability in this area requires external input to ‘see ourselves as we are’, the ability to ‘suspend’ judgement, the skills and habits to regulate emotion under pressure, and the empathy to effectively solicit and include a diversity of thought.
If you or any members of your team/organisation want to gain insight and learn how to mitigate the risk of biased thinking, connect with us.
We deliver our science-based programs in a number of ways:
- Our one-on-one leadership program, TalentFAST™
- Our team assessment and coaching program, TeamFAST™, and
- A variety of group-based culture-change programs, as well as online delivery.