Study finds that pilots' decisions are just as irrational as ours

Posted on May 18, 2016

In a paper in Applied Cognitive Psychology, Stephen Walmsley and Andrew Gilbey of Massey University have shown that pilots’ judgment of weather conditions, and their decisions about how to respond to them, are coloured by three classic cognitive biases. What’s more, expert flyers are often the most vulnerable to these mental errors.

The researchers first addressed the “anchoring effect”, which is when information we receive early on has an undue influence on how we subsequently think about a situation. Nearly 200 pilots (a mix of commercial, transport, student and private pilots) were given the weather forecast for the day, then shown visual displays of cloud cover and horizontal visibility as if from a cockpit; their task was to quantify these conditions by eye. The pilots tended to rate the atmospheric conditions as better – higher clouds, greater visibility – when they’d earlier been told that the forecast was favourable. In other words, old and possibly irrelevant information was biasing the judgment they were making with their own eyes.

Next, hundreds more pilots read about scenarios in which a pilot needed to make an unplanned landing. An airstrip was nearby, but the conditions for the route were uncertain. Each participant had to solve five of these landing dilemmas, deciding whether to head for the strip or re-route. For each scenario they were given two statements that were reassuring about heading for the strip (e.g. another pilot had flown the route minutes ago) and one that was problematic (e.g. the visibility was very low), and they had to say which piece of information was most important for deciding whether or not to land at the nearby airstrip. Across all scenarios, the pilots showed no preference for any type of statement. This is worrying because pilots should be prioritising the disconfirming evidence, but in fact they were just as likely to rely on the reassuring evidence – an example of what’s known as the “confirmation bias”.

In a final experiment, more pilot volunteers read about decisions other pilots had made on whether to fly or not, together with the information they’d used to make those decisions. Sometimes the flights turned out to be uneventful; other times they ended in a terrible crash. Even though the pilots in the different scenarios always made their decisions based on exactly the same pre-flight information, the participants tended to rate their decision making much more harshly when the flight ended in disaster than when all went well. The authors are concerned that pilots are also vulnerable to the “outcome bias”: pilots who decide to fly in unwise weather and get lucky could come to see their decisions as wise, and increasingly discount the risk involved.

Source material from BPS Research Digest