This interesting study by Sarter and Woods revealed that automation surprises “occur when the crew detects that automation or aircraft behaviour is deviating from their expectations” [2]. In turn, such ‘surprises’ provide a vital opportunity, and a lesson, for correcting unexpected or undesirable aircraft behaviour.
Most of the surprises reported by the pilots fall under two broad categories: “situations in which the system fails to take an expected action”, and situations in which “automation carries out an unexpected action that was not commanded”. This distinction has “implications for error detection and error recovery” [2]. In the first type of situation, pilots are actively monitoring the system’s behaviour using a confirmation strategy, so an error is detected easily and quickly. In the second, when the ‘autonomous’ automated system, acting with ‘authority’, takes an action that the crew does not expect (and, by corollary, is not monitoring for), detection may be delayed or, worse still, may never occur, impairing recovery or corrective action and potentially compromising the safety of the flight.
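As a purely conceptual sketch of this distinction (the names and structure below are hypothetical, drawn neither from the study nor from any real avionics system), the two categories can be thought of as two directions of mismatch between what the crew expects and what the automation actually does:

```python
from enum import Enum, auto
from typing import Optional

class Surprise(Enum):
    NONE = auto()
    OMISSION = auto()    # system fails to take an expected action
    COMMISSION = auto()  # system takes an action that was not commanded

def classify(expected: Optional[str], actual: Optional[str]) -> Surprise:
    """Classify a mismatch between crew expectation and automation behaviour.

    All action names are illustrative placeholders, not real system modes.
    """
    if expected == actual:
        return Surprise.NONE
    if expected is not None and actual is None:
        # The crew is actively monitoring for the expected action
        # (confirmation strategy), so this mismatch tends to be caught quickly.
        return Surprise.OMISSION
    # Uncommanded or unexpected behaviour the crew is not monitoring for:
    # detection may be delayed or missed entirely.
    return Surprise.COMMISSION

# Example: an expected level-off that never happens (omission) versus an
# uncommanded mode change (commission).
print(classify("level_off_at_FL100", None))       # Surprise.OMISSION
print(classify(None, "uncommanded_mode_change"))  # Surprise.COMMISSION
```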
It is important for the crew that detection of, and recovery from, a mismatch between their expectations and automation-induced aircraft behaviour happens reliably and quickly. Such automation surprises thus alert the operating crew (and others) to indications that they have “misunderstood, miscommunicated with, misused or mismanaged the automated systems” [2]. This helps the crew identify, revise and refine their mental model of the automation, and in turn to explore the system under varying operational circumstances, practising the management of this powerful resource of advanced automated systems for safe flight.
But the worst-case scenario, with frightful consequences, is when automated events go unnoticed by the crew, or are noticed so belatedly that the time to recover is prolonged, with the flight moving away from safety towards “an undesirable, potentially hazardous and possibly unrecoverable situation” [2]. Unfortunately, some of these fears have already come true in the tragedy of Air France flight AF 447, where the crew could not identify the approach to stall and did not react immediately, allowing the aircraft to exit the flight envelope [1].
References
1. Bureau d’Enquêtes et d’Analyses. Final Report on the accident on 1st June 2009 to the Airbus A330-203 registered F-GZCP operated by Air France flight AF 447 Rio de Janeiro – Paris. France, July 2012.
4. Sarter NB and Woods DD. “How in the world did we ever get into that mode?” Mode error and awareness in supervisory control. Human Factors. 1995, 37, 5-19.
5. Sarter NB and Woods DD. Pilot interaction with cockpit automation: II. An experimental study of pilots’ model and awareness of the flight management and guidance system. International Journal of Aviation Psychology. 1994, 4, 1-28.
6. Federal Aviation Regulations, Section 91.117.
8. Lenorovitz JM. Indian A320 crash probe data show crew improperly configured aircraft. Aviation Week and Space Technology. 1990, June 25: 84-85.
9. Sparaco P. Human factors cited in French A320 crash. Aviation Week and Space Technology. 1994, January 3: 30.
Acknowledgement: Image courtesy of Wikipedia.