Few air safety investigations wrap up with a public recommendation that the aircraft’s captain ought to have his head examined.

But whatever a psychologist might discover about the poor decision-making which led a Smartwings Boeing 737 crew to fly all the way from Greece to the Czech Republic on a single engine is probably not going to be revelatory.

[Image: Smartwings-livery Boeing 737 Max. Source: Smartwings]

That some pilots’ decisions, from the outside, seem awesomely, incomprehensibly atrocious is not a new phenomenon.

The fatal crash of a Hawker Siddeley HS748 in Illinois in 1983 resulted from the captain's decision not to divert when the aircraft lost both generators just after take-off. Pressing on to the destination exhausted the batteries, and the aircraft suffered complete electrical failure.

NASA’s Ames Research Center included the accident among 37 in which crew behaviour contributed to the outcome, analysing them for tactical decision errors in a May 1998 conference paper.

“A common pattern emerged,” the researchers said. About 75% of the errors represented “plan-continuation”, decisions to stick with original intentions despite cues to change the course of action.

This manifests itself as failure to execute a go-around, failure to de-ice again after a time limit expires, and other symptoms of the condition popularly known as ‘get-there-itis’.

“Recognising that these errors occur is only the first step,” says the NASA paper. “Determining why they occur is the more difficult task.”

It suggests that “bounded rationality” – a lack of information or knowledge, or a failure to project potential risk into the future – can combine with certain cognitive limitations to produce a potentially hazardous mix.

These limitations include unrecognised ambiguity in cues, leading pilots not to question their interpretation of a situation, as well as company pressures and social factors which might set up conflicts with safety objectives.

If these cognitive limits come under stress, says the paper, decision errors could be introduced by underestimating the risks of a situation, failing to evaluate consequences, or being overconfident in the ability to cope.

Which means seniority and experience, somewhat paradoxically, can impede decision-making ability if pilots become complacent or numb to risk.

“Studies have found that people tend to be risk-averse when outcomes are framed as possible gains, but risk seeking when outcomes are framed as potential losses,” the paper adds.

“This raises the question of whether deteriorating situations that imply a change of plan… are seen as loss situations and therefore promote risk-seeking behaviour.”

While the NASA analysis is more than 20 years old, investigators probing the Smartwings shenanigans highlight the same issues: conviction of ability, downplaying of risk, failure to change the plan, all presumably in pursuit of a perceived gain.

In which case, the reasons for the incident are probably already familiar. Should a psychologist end up having a friendly chat with the captain, they would arguably do better to ask not why he pursued his course of action, but what would have encouraged him not to.