Preparing pilots for googlies

As flight deck automation becomes more reliable, to the point of hardly ever failing, it is becoming more of a human factors problem. The UK Air Accidents Investigation Branch makes this clear in its report on the Thomsonfly Boeing 737-300 that stalled and was momentarily out of control during its approach to Bournemouth two years ago.

The AAIB cites a Civil Aviation Authority study “Flight crew reliance on automation”, observing: “Pilots familiar with operating older aircraft which had more variable reliability are nearing the end of their careers, and there is a generation of pilots whose only experience is of operating aircraft with highly reliable automated systems.” Is the AAIB implying that younger pilots are less good than the older ones when things go wrong? It seems so. Maybe that’s because the exercises mandated in recurrent training programmes have scarcely changed since the days of the Super Constellation. So training no longer represents what crews are likely to have to deal with in today’s aeroplanes.

The Thomsonfly incident was caused by the crew’s failure to notice that the autothrottle had disconnected with the engines at idle, and their late recognition that the airspeed had dropped seriously low. This has similarities with the circumstances of the February Turkish Airlines fatal accident on approach to Schiphol; there, the autothrottle retarded the power levers to idle – uncommanded – but the crew did not notice the power reduction or the speed loss.

To use a cricketing analogy, crews today are like batsmen practised only in receiving lots of fast, straight deliveries. What they actually need is training for the occasional googly – a subtle automation failure, for example.

3 Responses to Preparing pilots for googlies

  1. Adam Johns 22 May, 2009 at 6:40 pm #

    Hi David

    I recently wrote my dissertation on the effect of flight deck automation on CRM. The Thomsonfly incident is a prime example of how CRM training needs to be tailored more towards automation related issues.

    The main argument of my thesis was that system designers need to adopt a human-centred approach to automation. I think the regulators need to re-think current CRM training, and like you suggested, throw in a few subtle automation failures.

    The accidents of Cali (1995) and the recent 737 Russian crash emphasise this.

  2. FLYER 2 June, 2009 at 10:30 pm #

    Hi Mr Learmount,
Automation has reached great heights in aviation and is a wonderful tool, but it does not take the place of airmanship. An instructor I knew would say, “Have one hand on the steering wheel, one hand on the throttles, your feet on the pedals, look out of the window and fly the ruddy thing”. He would never fly the approach with autothrottle engaged; he felt that you should be able to tell your speed by the position of the throttle levers in the gate. Not an easy thing to do, but I think that some of the things he said ring true.

  3. Mark Searle 14 December, 2011 at 1:23 am #

It was a Thomson/First Choice 767 that got bent in a hard landing in Bristol on 3 October 2010. It was announced at the time that the AAIB were investigating, but there’s no mention of the incident on their website. Why the cover-up?