Cognitive biases are systematic errors affecting our perception, memory and thinking. Such errors have caused nuclear accidents, plane crashes and business bankruptcies. It is clearly important to detect, and prevent, cognitive biases in action.
As I write this, Wikipedia lists 191 cognitive biases. Good luck spotting them when they occur. So let me offer you a more actionable way of thinking about cognitive biases: they are specific breakdowns in concern or competence.
Let’s start from the fundamentals: As organisms, we need to be able to understand and influence the world (“competence”) towards certain ends (“concerns”). Most cognitive biases have one of two effects: they make us over- (sometimes under-) estimate our competence – or they change our concerns.
Do you think your mother-in-law is an above-average driver? No? How about yourself? In a famous study, 93% of drivers considered themselves above average. We believe we master our environment even if we don’t – an early experiment showed that people believe they control a lamp even when the association between light and switch position is random. Our flattering assessment of our own competence is helped by how we explain our failures: success is due to personal mastery, while failure is due to unfair competition, a bad exchange rate or the evil colleague from another silo.
So much for influence. Is the second aspect of competence, understanding, subject to similarly serious distortions? Yes: our need for fast understanding and orientation makes us accept arbitrary numbers (“anchors”), seek only information confirming our preconceptions, believe the crowd or people wearing white coats, and – this is crazy – even change past memories to fit with present experience. We make sense of largely random events by packaging them into “meaningful” (but untrue) stories, anywhere from politics (“Axis of Evil”) and media (“… died from grief”) to personal development (“nursery messed me up”). Change stories are different; they are true, of course.
Imagine yourself out there in the wild, stuck in the middle of the food chain, looking for your next meal. The upside of venturing out of that secure bush is a little less hunger; the risk: the crown of creation bumping into the king of animals. Hundreds of thousands of years of such practice shaped our concerns: we became risk- and loss-averse. Gaining something feels less intense than ceding it. This is a ubiquitous phenomenon: we would never pay our own asking prices on eBay, and when choosing insurance we tend to opt for (exorbitant) excess-reduction fees to avoid co-pays. Think about what this means for change programs: how many people experience a net benefit, when what they cede counts more than what they gain?
Not only do unsatisfied concerns make themselves felt more strongly than satisfied ones – their intensity also depends on how much effort we have put into pursuing them. How often do we decide not to part with stuff when moving house because, uh, we feel strangely attached to that skewed Billy bookshelf it took us hours to assemble? Organizations are riddled with products and projects with poor prospects that they do not abandon because of the effort already invested.
Many or most cognitive biases can be understood as distortions of concern or competence. But what about the third factor determining successful collaboration: is coordination also affected by bias? Take the tragedy of the commons: natural resources are over-exploited because benefit is privatized while damage is socialized. This is a failure of regulation, which is a specific type of coordination (one involving rules).