Inadequate crew knowledge of automated systems was a factor in more than 40% of accidents and 30% of serious airline incidents, Abbott says. She catalogues evidence of disharmony between crews and their highly automated aircraft, based on detailed studies of accident and incident data and line operation safety audits carried out between 2001 and 2009, so the research is recent and drawn from real operations. Abbott says the findings are raw data at this point, and there is more work to do before publication this year.
Among the handling problems pilots have repeatedly demonstrated, Abbott lists failure to recognise autopilot or autothrottle disconnect; inadequate monitoring and failure to maintain aircraft energy and speed; incorrect upset recovery; inappropriate control inputs; and dual sidestick inputs. On flight management system use, she found pilots often focus on programming the FMS to the detriment of monitoring the flightpath.
Abbott finds there are many failures for which pilots receive little or no help from checklists or from training of any kind. These include failures or malfunctions of air data computers, computer or software failures, many electrical failures, and uncommanded autopilot disconnects or pitch-up for which the reason is not known. She comments: "Failure assessment is difficult, failure recovery is difficult, and the failure modes were not anticipated by the designers."
This Aires Colombia Boeing 737-700 overran the runway in a storm at San Andres Island airport
Despite being aware of the sometimes fickle nature of automation, she observes, pilots frequently abdicate too much responsibility to automated systems. The reasons for this include a perceived lack of trust in pilot performance by the airline, policies that encourage the use of automated systems rather than manual flying, and insufficient training, experience or judgement. The result is that "pilots may not be prepared to handle non-routine situations", says Abbott. When she examined accidents in which crews had, at some stage, reverted to manual flying, she found "manual handling/flight control errors" were contributory factors in 60% of cases.
Abbott highlights particular vulnerabilities in automated systems and their interfaces, including mode confusion, and a pilot tendency to use processed information from the FMS instead of raw data. Another problem she identifies is that much of the information supplied to pilots is itself automated - what she calls "information automation".
She says: "The current focus on managing modes and automation may not always integrate well with flightpath management tasks." She found evidence that pilot knowledge is seriously lacking in many areas of automated systems, including understanding the flight director, autopilot, autothrottle/autothrust, and flight management system/computer systems and their limitations; operating procedures, mode transitions and behaviour; and unusual attitude recognition and recovery.
Abbott predicts that recommendations on pilot training are likely to say that it should focus on standard operating procedures for flightpath management, distinguish between guidance and control and encourage flightcrews to tell air traffic "unable to comply" when appropriate. Finally, each individual airline should ensure its standard operating procedures are tailored to its specific needs. Abbott says the industry as a whole needs to review practice, regulatory guidance and requirements for training in numerous areas. These include flightpath and energy management, recovery from off-path circumstances, use of alternative modes to meet air traffic clearances/requirements, operators' operational policies and managing malfunctions.
That list of tasks - a resounding indictment of the industry's inability to adjust training appropriately for the modern cockpit environment - should be enough to keep the aircraft manufacturers, airlines and regulators busy.