The final report on the crash of Air France Flight AF 447 stated that the precipitating event of the accident was a “temporary inconsistency between the measured airspeeds… that led in particular to autopilot disconnection”, compounded by “inappropriate control inputs that destabilized the flight path”. This accident has brought the focus of the aviation community back to overdependence on automation at the cost of eroding basic piloting skills.
Automation is meant to improve operational efficiency and precision, but the enhanced autonomy and authority of advanced automation have given rise to the unexpected problem of ‘communication with machines’ rather than merely their operation. This has occurred because “advanced automation can initiate actions without immediately preceding or directly related operator input (i.e. autonomy), and it is capable of modulating or overriding user input (i.e. authority)”. The advanced automation in the glass cockpits of today’s aircraft reflects the potential of autonomy and authority in “human-automation interactions”, at times leaving the crew baffled, confused or, in the worst case, bewildered by the unfolding chain of events – aptly reflected in the final words of the right-hand (RH) co-pilot of the ill-fated flight AF 447 –
“But what’s happening.”
It has been found that automation surprises tend to occur in “situations that involve a high degree of dynamism, nonroutine elements, or both, or conjunctions that impose high demands on the human’s attentional resources and therefore involve a high risk of losing track of system behaviour”.
Before outlining the automation surprises reported in the early days of the Airbus A-320, it is important to highlight the changes expected of the human operator in technologically advanced systems. These are summarised below.
- Coordination of human actions for efficient operations places “high attentional and knowledge demands on operators to maintain awareness of automation’s status, behaviour, intentions and limitations”.
- Automation surprises occur whenever operators are “unable to anticipate and track automation activities and changes” that belie their expectations. They happen because of “misassessment and miscommunications” in human-automation interactions, resulting in a poor “understanding of what the automated systems are set up to do and how the automated systems will be handling the underlying process(es)”.
- There are “three interrelated factors” giving rise to conflict between automation responses and operator expectations:
  - Poor mental models
  - Low system observability
  - Highly dynamic and/or nonroutine situations
- Automation has increased the demand on operators to understand the capabilities and behaviour of such systems. Gaps or misconceptions in the acquired knowledge base, whether due to inadequate training or, worse still, cutbacks in training time and resource allocation, compromise the operator’s preparedness to cope with the ‘novel’ problems of the refined system at their disposal.
- A pitfall of advanced automated systems is the negligible feedback they give the operator about their current or future activities. This can be aggravated by the mental workload, or cognitive work, required to extract the relevant information from the available data. ‘Observability’ here is “the ability of available feedback to actively support operators in monitoring and staying ahead of system activities and transitions”. When observability is low or poor, learning and refinement of the operator’s mental model are impaired despite increasing experience, and the knowledge base needed to handle such systems fails to build up over time.
Acknowledgement: Image courtesy of Wikipedia.