Safety is an Outcome of Cognitive Bias in Organizational Decision Making
Human and Organizational Performance (H&OP)
“We see the world, not as it is, but as we are -- or, as we are conditioned to see it.”
-- Stephen R. Covey

Up to half of the people who take the 90-second video test above fail. This is because the viewer is conditioned to count the passes made by the players in white. They need to act fast to pass the test. There is no time to analyze and rationalize what they are seeing in the video. That kind of systematic, rational thinking is far too slow to pass the test.
So the viewer relies on the intuitive part of the mind: fast, efficient, and built on memory, so it requires no slow, expensive thinking. It is the same intuitive mind that has us on autopilot most of the time. The viewer focuses completely on counting the passes by the players in white. It works: the viewer correctly counts the number of passes and gets the desired result.
But at what cost? Or rather, in Human and Organizational Performance (H&OP) terms, what was the performance? Remember: Performance = Behavior + Results. If the gorilla represents an evolving hazard that could harm the players, then what was the performance when the viewer relied on intuitive, automatic thinking? We know the result: the passes were correctly counted, nothing else mattered, mission accomplished. But what about the behavior? There is a price for fast, efficient, intuitive decision making: the decisions are often irrational and biased. When an Organization makes decisions this way, it usually leads to bad outcomes.
When an Organization looks back, it can't understand how it missed something so obviously dangerous to its workforce and its customers. "We didn't know..." The truth is there was no time to know; no time was given to think, rationalize, and analyze before making a decision. Making rational, unbiased, careful decisions requires a lot of thinking energy. When it comes to assessing the risks of hazards, there can be no shortcuts and no quick thinking. When hazards are present and doubt appears, stop when unsure and take the time to think it through rationally. It's no wonder the best way to control hazards is to find them and remove them from the system.
Sounds easy, right? Just find the hazards, assess the risks, and make the decisions... but it is not that simple:
√ Hazard Identified
√ Risk Assessed
√ Decision Made
! Yet Still the Occasional Catastrophe
When presented with imperfect information, an Organization under stress will understandably take a "thinking shortcut" and rely on its biases to make decisions. The biases cause irrational thinking by distorting the Organization's situational awareness and reducing its ability to correctly assess risk. The ensuing decisions can, and often do, cause harm. Simply put: "the beliefs of the Organization were ultimately mistaken." Just like people, Organizations inevitably make mistakes. All Organizations make mistakes, but top-performing Organizations learn from their mistakes and use that knowledge to plan for failure, anticipate human error, and fail safely.
Since Organizations are in fact really just groups of humans, they are afflicted with the same set of human biases shown below in the Cognitive Bias Codex. The "Need to Act Fast" group of biases is particularly troubling for most Organizations. When an Organization is "moving fast" and a situation arises, its mistakes tend to magnify bad outcomes. The longer the Organization keeps "moving fast," the more unsure and confused it becomes. The Organization begins to routinely ignore cries for help from employees and customers. In the last steps leading up to the catastrophe, the Organization fails to seize its final opportunities to stop when unsure. A gorilla was in the room the whole time, but the Organization's senior leadership needed to act fast and never saw it. All Organizations have biases, but top-performing Organizations manage their biases and don't let biases and flawed beliefs overtake decision making.
Ref: Jm3, CC BY-SA 4.0, via Wikimedia Commons
Bonus Section for I/O Psychologists
Can artificial intelligence (AI) have cognitive biases?

Modern AI, including machine learning, which learns from large data sets, and reinforcement learning, which learns from thousands of simulations, can both carry biases. The biases can exist either in the data sets or in the simulated environments the AI is learning from. When enhancing a decision-making process with AI, an Organization has to be even more careful to ensure that cognitive biases do not flaw AI-assisted decision making, because when decision making is automated, clouded judgment and mistakes can be hugely multiplied. As AI gains traction, top-performing learning Organizations may treat AI just as they treat human error: by learning from AI's mistakes, planning for failure, anticipating AI errors, and failing safely.
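To make the data-set point concrete, here is a minimal, hypothetical sketch (not from the original article) of how a bias baked into historical records flows straight into an automated risk estimate. The scenario, site names, and reporting rates are invented for illustration: two sites have the same true hazard rate, but one site's incidents are historically under-reported, so a naive model "learns" that the site is safer than it really is.

```python
import random

random.seed(0)  # make the illustration repeatable

# Hypothetical scenario: the true incident rate is identical at two sites,
# but Site B's incidents are historically under-reported (a biased data set).
true_rate = 0.30
reporting_rate = {"A": 1.0, "B": 0.4}  # Site B records only 40% of incidents

training_data = []
for site in ("A", "B"):
    for _ in range(1000):
        incident = random.random() < true_rate
        # The bias enters here: some of Site B's incidents are never recorded.
        recorded = incident and random.random() < reporting_rate[site]
        training_data.append((site, recorded))

def learned_risk(site):
    """A naive 'AI': risk is simply the observed incident frequency per site."""
    rows = [rec for s, rec in training_data if s == site]
    return sum(rows) / len(rows)

print(f"Learned risk, Site A: {learned_risk('A'):.2f}")  # close to the true 0.30
print(f"Learned risk, Site B: {learned_risk('B'):.2f}")  # well below the true 0.30
```

The model faithfully reproduces the bias in its training data: Site B looks roughly as safe as its reporting habits, not as safe as it actually is. Automate decisions on top of that estimate and the gorilla is built into the system.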