What you expect to see

If there’s one question about the BA 777 short-runway take-off at St Kitts I would like to have answered, it is this: I know there is a psychological tendency for pilots to see what they expect to see. But as the crew arrived at runway 07 from the south side, turning right to line up and looking left down the runway to ensure the approach was clear (it says so in the report), the skipper definitely noticed that the take-off run available looked short. So why did he not react to the sight of the massive amount of runway on his left that was not available to him?

It’s this kind of mistake, when made by highly trained, perceptive people, that we need to understand more about if we are to become more self-aware and more able to take charge of our own human factors (=human weaknesses).



5 Responses to What you expect to see

  1. David Connolly 7 September, 2010 at 1:00 pm #

There is always a degree of reconciliation of expectation with perception of reality in all human activity. Like the golden deceiver I tried to pick up in my last layover pub. When she executed a 180, she took on the appearance of a sphinx, more than an assumed minx. At least she didn’t bark. “Dog Days Hazard Augustus-QED!” I made a tactful backtrack. Assumed-temperature (Flex Temp) take-offs assume rated VMCG, protecting off-side fouls, so to speak. There are no assumed-runway take-offs. This sort of event is what one would assume to happen in CAT I-II US/EU conditions, not CAVOK St Kitts. As SIA discovered in Taipei in October 2K, another take-off rule of thumb deserves consideration: “RAVOK?” … Runway Available, OK? Is this crew still operating?

  2. Frank Caron 10 September, 2010 at 12:17 pm #

    why had he not reacted to the sight of the massive amount of runway on his left that was not available to him?

    That is the question…

You will never get a (scientific) answer to this question. By scientific, I mean an answer that can be repeated in a similar situation, which would give us the opportunity to train operators to avoid this bias.

You have to understand a few things about human behavior (called “human factors” in this industry).

1) You see, hear and feel what you expect to see, hear or feel. Yes… but the roots of this bias go deeper.

2) Perception means collecting data from our external (or internal) world in order to use those data in our current behavior. If the perception is not in accordance with what we expect, we should be able to react to the discrepancy and adopt a new strategy. That seems logical!

3) But this forgets that our perceptions are based on, and linked to, our decisions. Here the decision was to join a runway via a taxiway and take off.

4) We may have the illusion that decisions are real-time processes leading to actions. They may be. But in aviation, where anticipation is one of the main attributes of the expert pilot, most decisions are taken well before the action. We call this a “plan of action”, because if you want to perform smooth and efficient procedures, it is better to anticipate the upcoming ones.

5) So this crew (both of them) made the decision to take off after joining the runway well before they reached the specific taxiway used to line up. Since what you are doing seems in accordance with your plan (i.e. you expect to see a runway in front of your aircraft), it is very difficult to question the situation. I do not want to say it is impossible, but the fact is that it is highly difficult…
It does not matter whether you are an experienced pilot or not; you are simply caught by what you decided previously. Here, the captain had some concern, but even so he did not reconsider his previous plan of action, and the FO did not either, for the same reason, even though they both felt something was strange…

6) This is one of the biases we have to make operators (not only pilots) aware of. Numerous errors are made early in a process (briefing, etc.); as a consequence, the erroneous result is carried forward. The more time separates the decision from the action, the more difficult it is to reconsider the decision and its possible consequence(s).

7) Yes, it is definitely a problem. But it also has one big and essential advantage for our daily behavior… Just think differently… If we had to systematically question every decision we have made (recently or not), our life would be a… pure nightmare. Just imagine permanently reconsidering what has been done or not, properly or not, controlling every single action as if it needed to be evaluated again and again.
The consequence for our performance would be a disaster, for two reasons:
First, disturbing our behavior every minute, if not every second (just think about the number of decisions you take every minute…), would mean a loss of efficiency, smoothness and results, making us totally unadapted to our operational needs.
Secondly, we would be totally exhausted by the constant consumption of the mental resources devoted to this permanent questioning.
Therefore this process (not reconsidering) is highly NECESSARY, because it PROTECTS our daily behavior from permanent questioning and doubt.

I agree it would sometimes be useful to question, but most of the time it won’t be done… Mental behavior does not always obey scientific logic… Human behavior is hugely integrated into, and adapted to, our world. Indeed, it is not perfect, and it turns out to be full of necessary… paradoxes.

The human operator: the least reliable element of an aerodyne, but the most essential…!
    Frank Caron (1991)

  3. David Connolly 12 September, 2010 at 2:22 am #

Well said Frank, excellent analysis. “The least reliable is the most essential, QED!” A noble thought to begin any day in expectation of a better tomorrow… I shall think anew and objectively of subjective OCD. I think the origin of this mindset was Tenerife on 27 March 1977 and KLM’s Capt. J. V. Van Zanten. A disaster of that magnitude demands such a moniker and nothing less. Clearly, through the foggy mist of history, I suffer from an OCD-isaster mindset myself, in an abhorrence of wasted life’s time in general and avoidance of remedial CRM in particular.
You illustrate sobering, logical CRM psychology, unlike psychiatry, which makes the organised religion of universal metrics appear scientific.

  4. Peter Bore 12 September, 2010 at 6:21 am #

1. David. The AAIB report is much more informative than the Flight report that you included a link to. Perhaps, to encourage your readers to be as well informed as possible, you could in future include links to AAIB reports. As you will be aware, they are detailed and usually very clearly written, so that they are understandable by the non-professional.

2. Frank Caron is correct. You see what you expect to see, and only see what you look for. Repeated revisitation of pre-planned decisions would often be counterproductive, if only because it would overload the modest, though extremely versatile, processing capacity of the human brain. But one must be alert to circumstances which do signal that revisiting an earlier decision is necessary. In medicine, when seeking to make a diagnosis you are trying to identify a group of features which fit together to make a coherent story consistent with a medical problem you know about. It is easy to ignore the bits that don’t fit, and often that is the appropriate thing to do. But sometimes the bit that doesn’t fit is the clue that the story you are trying to construct is the wrong story. That is high-level analytic thinking. That, if you like, is when you earn your pay.

3. Thus the mistake made by the captain, seeing what he wanted/expected to see, is normal human behaviour and will never be eradicated. The usual operational solution is multiple checks. In this case at least nine failures occurred:

    • The airport had inadequate signage.
    • British Airways staff had not inspected the airport before starting operations into St Kitts.
• Neither pilot was familiar with the airport and, because of the above, they were inadequately briefed about its limitations.
    • Despite the crew’s unfamiliarity with the airport and their recognition that the charts they had been supplied with were not clear, no taxi briefing was conducted.
    • The airport had no mechanism to collect and collate reports of misidentification of taxiways and thus failed to recognise the disastrous potential of what was a weekly event.
    • The crew used the wrong taxiway.
    • The captain recognized that the runway ahead of them looked short but did not take action to unequivocally determine the aircraft’s position.
    • The co-pilot, as well as the captain, failed to recognise their incorrect positioning.
• ATC knew that the crew wished to depart from point Alpha and knew that the aircraft had lined up at point Beta, but failed to warn the crew that they were at point Beta.

4. Had any one of these failures not occurred, the incident would have been prevented.
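Peter’s “multiple checks” argument is the familiar Swiss-cheese model, and some back-of-the-envelope arithmetic shows why it normally works. The sketch below is purely illustrative (the 10% failure rate and the assumption that the nine barriers fail independently are mine, not the report’s; in this incident the failures were plainly correlated):

```python
def p_incident(p_fail_each: float, n_barriers: int = 9) -> float:
    """Probability that all n independent barriers fail simultaneously.

    Under the (idealised) independence assumption, the chance of an
    incident is simply the product of the individual failure rates.
    """
    return p_fail_each ** n_barriers

# Even barriers that each fail 10% of the time make a simultaneous
# nine-way failure extremely rare, on the order of one in a billion.
print(p_incident(0.1, 9))
```

The sobering corollary, which Peter draws in his point 7, is that correlated failures (poor signage plus no briefing plus unfamiliarity) collapse this arithmetic: the barriers stop being independent and the effective protection is far thinner than the product suggests.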

5. However, when the crew were made aware of their error at the end of the flight, they reported it to BA, and BA reported the incident to the AAIB. The crew and BA are to be commended.

6. I am surprised at the statement in the report: “The commander stated that he has not taken off from any other runways where the intersection was approximately 50% of the overall runway length.” London Heathrow’s runways are littered with intersections. I presume he did not mean what is literally stated and that some qualifications are implied. But it seems less than rigorous on the part of the AAIB to quote this statement without seeking to resolve some of the ambiguities and uncertainties it contains, particularly in a report about an incident where ambiguities and uncertainties are the principal issues involved.

7. When nine things all fail at the same time, one gets a little concerned that the culture of safety in aviation is not what it might be. Equally worrying is the fact that if only eight failures had occurred, the end result would have been regarded as ‘normal operations’.

    Peter Bore
    September 2010

  5. Tony 16 September, 2010 at 3:04 am #

>> I know that there is a psychological tendency for pilots to see what
>> they expect to see, but when the crew were arriving onto runway 07
>> from the south side, turning right to line up and looking left down
>> the runway to ensure the approach was clear (it says so in the
>> report), the skipper definitely noticed that the take-off run
>> available looked short, but why had he not reacted to the sight of
>> the massive amount of runway on his left that was not available to him?

Runways often look shorter than they actually are, especially with a slight upslope. Every time I take off from a certain airfield it looks as if we would never stop within the runway after an RTO near V1, but I know that in reality there is sufficient room (I’ll avoid saying plenty). The more experience you have, the more this phenomenon is pushed to the back of the mind; like having the leans in IMC, you concentrate on the instruments/data and not on your perception. Therefore there weren’t any real triggers to alert the crew to this mistake, and in many respects they were probably thinking correctly and objectively by discarding their fallible perception!

At the end of the day, unless one specifically chooses to look for problems, one probably won’t find them – we already know this. Some types of checks intuitively look for issues, some for confirmation. For example, checking the approach is clear when lining up means attempting to see something that may be difficult to see and that you don’t want to see. But checking you’re on the correct runway means attempting to see something that you do want to see. Those are the checks most susceptible to confirmation bias, as in this incident. But this is old news.

FWIW, due to the tremendous amount of concentration involved, I think it is impossible for a human to think defensively all the time; it can probably be done reliably only for short spells, with a tendency to drift back into more relaxed modes of thought. Given that the fallibility lies in one’s ability to switch modes of thought in a timely and appropriate manner, it ain’t going to happen all the time.

For this particular incident, it seems that any crew operating out of that airfield was effectively set up to fail. That does not mean they are not responsible, just that if the signage was inadequate, well, what do we expect?