Introduction
Aviation accidents, though relatively rare, often result in significant loss of life and prompt intense scrutiny of their causes. Psychological theories, models, and research play a crucial role in dissecting these incidents, particularly through the lens of human factors, which examines how human cognition, behaviour, and interactions with systems contribute to errors. This essay critically evaluates the extent to which such psychological insights can explain why aeroplanes crash and inform preventive measures. Drawing on module evidence from Week 23 and Chapter 14, ‘Humans in Systems’ (which discuss human error in complex environments), as well as independent research, the analysis addresses three key areas: the blame attributable to humans versus systems for errors; the utility of human factors research methods and the Swiss Cheese Model in explaining human errors; and the potential of automation, organisational culture, confidential reporting, and proactive safety approaches to prevent accidents. Integrating these elements, the essay argues that while human errors are often highlighted, systemic factors are equally culpable, and that psychological interventions can enhance safety, though limitations persist in their application.
Humans or Systems to Blame for Errors
In aviation, accidents are frequently attributed to ‘pilot error’, yet psychological theory suggests that blaming individuals oversimplifies the interplay between humans and systems. Module evidence from Chapter 14, ‘Humans in Systems’, emphasises that humans operate within multifaceted sociotechnical systems where errors arise from interactions between people, technology, and organisational processes (as discussed in Week 23 lectures on systemic failures). This perspective aligns with James Reason’s work, which posits that errors are not merely individual failings but outcomes of latent conditions within the system (Reason, 1990). For instance, the 1996 ValuJet Flight 592 crash was initially blamed on the mishandling of hazardous cargo by maintenance personnel, but investigations revealed systemic issues such as inadequate regulatory oversight and poor training protocols (National Transportation Safety Board, 1997).
Critically, however, this systemic view can sometimes downplay human agency. Research by Shappell and Wiegmann (2000) using the Human Factors Analysis and Classification System (HFACS) analysed over 1,000 aviation accidents and found that while organisational influences contributed to 70% of cases, unsafe acts by individuals, such as skill-based errors or decision-making lapses, were directly involved in nearly all incidents. This suggests a balanced attribution: humans are fallible due to cognitive limitations such as attentional biases or fatigue, as explored in the Week 23 module materials on workload and stress, but systems often fail to mitigate these vulnerabilities. Independent studies, such as those by Dismukes et al. (2007), reinforce this by showing that even experienced pilots commit errors under high cognitive load, yet blame shifts to individuals when systemic redundancies are absent.
Arguably, psychology’s explanatory power here is limited by a tendency to apply models retrospectively, which risks overlooking unique contextual factors. For example, cultural differences in crew resource management can exacerbate errors, as seen in the 1990 Avianca Flight 52 crash, which was attributed partly to communication breakdowns shaped by hierarchical cockpit culture (Helmreich and Merritt, 1998). Therefore, while psychological research shifts blame from pure human fault to systemic interactions, it does not fully absolve individuals, highlighting the need for integrated approaches that address both.
Human Factors Research Methods and the Swiss Cheese Model in Explaining Human Errors
Psychological research methods in human factors, combined with models like Reason’s Swiss Cheese Model, provide valuable frameworks for understanding aviation errors. The Swiss Cheese Model, detailed in Chapter 14 of the module, conceptualises accidents as resulting from aligned ‘holes’ in multiple defensive layers, representing active failures (e.g., human slips) and latent conditions (e.g., poor design or management oversights) (Reason, 1997). The model explains why crashes result not from single errors but from cascading failures, as illustrated by the 1989 British Midland Flight 92 (Kegworth) disaster, where an engine failure combined with the pilots’ misdiagnosis, which led them to shut down the functioning engine, and with inadequate training to produce the accident (Air Accidents Investigation Branch, 1990).
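To make the model’s layered logic concrete, consider an illustrative simplification that is not part of Reason’s own formulation: assume each of $n$ independent defensive layers has a small probability $p_i$ of presenting an aligned ‘hole’ at a given moment. The probability of a hazard penetrating every layer is then the product

\[
P(\text{accident}) = \prod_{i=1}^{n} p_i, \qquad \text{e.g. } n = 4,\; p_i = 0.1 \;\Rightarrow\; P = 0.1^{4} = 0.0001.
\]

This multiplicative structure shows why adding or strengthening defensive layers reduces risk disproportionately, but it also exposes the model’s most questionable assumption: real layers are rarely independent, since a single organisational deficiency can open holes in several layers at once, a point taken up in the critique below.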
Human factors research methods, such as cognitive task analysis and accident investigation techniques discussed in Week 23, enhance this understanding by systematically examining error precursors. For instance, ergonomic studies use simulations to identify how interface design contributes to errors, revealing issues like mode confusion in automated systems (Sarter and Woods, 1995). Independent research by Li and Harris (2006) applied HFACS to military aviation mishaps, finding that the Swiss Cheese Model effectively categorises errors into organisational, supervisory, and operator levels, with 80% of accidents involving multiple layers.
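To illustrate how such classification methods are operationalised in practice, the short sketch below tabulates accident codings across the four HFACS tiers named by Shappell and Wiegmann (2000). The coded records and the multi-layer count (echoing Li and Harris’s finding) are invented purely for demonstration; real HFACS coding uses detailed subcategories within each tier and trained raters.

```python
from collections import Counter

# The four HFACS tiers (Shappell and Wiegmann, 2000); the detailed
# subcategories within each tier are omitted for brevity.
TIERS = [
    "Organisational influences",
    "Unsafe supervision",
    "Preconditions for unsafe acts",
    "Unsafe acts",
]

# Invented codings: each record lists the tiers at which raters
# identified at least one contributing factor.
coded_accidents = [
    {"Unsafe acts"},
    {"Unsafe acts", "Preconditions for unsafe acts"},
    {"Unsafe acts", "Unsafe supervision", "Organisational influences"},
    {"Unsafe acts", "Preconditions for unsafe acts",
     "Organisational influences"},
]

tier_frequency = Counter(tier for record in coded_accidents for tier in record)
multi_layer = sum(len(record) > 1 for record in coded_accidents)

for tier in TIERS:
    print(f"{tier}: {tier_frequency[tier]} of {len(coded_accidents)} accidents")
print(f"Multiple layers involved: {multi_layer}/{len(coded_accidents)} "
      f"({100 * multi_layer / len(coded_accidents):.0f}%)")
```

Even this toy tabulation shows how the method moves analysis upward from the operator: the ‘unsafe acts’ tier appears in every record, yet most records also implicate supervisory or organisational layers.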
Critically evaluated, these tools offer explanatory power but also have limitations. The Swiss Cheese Model is metaphorical and may oversimplify dynamic interactions; critics argue it assumes a linear progression of failures, ignoring emergent behaviours in complex systems (Dekker, 2011). Moreover, research methods such as post-accident analyses can suffer from hindsight bias, whereby investigators overestimate how predictable an outcome was (Fischhoff, 1975). Despite this, these approaches shift the focus from blame to prevention, as seen in their application to the 2009 Colgan Air Flight 3407 crash, where fatigue and training deficiencies aligned to cause the accident (National Transportation Safety Board, 2010). Thus, while psychological models and methods illuminate human errors, their effectiveness depends on rigorous, unbiased application, and they may not capture all non-linear aspects of system failures.
Automation, Culture, Confidential Reporting, and Proactive Approaches to Prevent Air Accidents
Psychological insights also inform preventive strategies, including automation, safety culture, confidential reporting, and proactive safety management. Automation, as covered in Chapter 14, reduces human workload but introduces risks such as complacency and skill degradation (Parasuraman and Riley, 1997). Research shows that well-designed automation with human oversight can prevent errors; for example, the Traffic Collision Avoidance System (TCAS) has significantly reduced mid-air collisions (Eurocontrol, 2018). Over-reliance can be dangerous, however, as in the 2018 Lion Air Flight 610 crash, in which the MCAS flight-control software repeatedly commanded nose-down stabiliser trim in response to erroneous angle-of-attack data (Komite Nasional Keselamatan Transportasi, 2019).
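To give a sense of the alerting logic involved, the sketch below implements a radically simplified ‘time-to-closest-approach’ (tau) check of the kind that underlies TCAS; the thresholds are invented for illustration, and the certified system additionally uses altitude-dependent sensitivity levels, vertical tracking, and advisories coordinated between the two aircraft (Eurocontrol, 2018).

```python
def tau_seconds(range_nm: float, closure_rate_kt: float) -> float:
    """Rough time to closest approach: range divided by closure rate.
    Returns infinity if the aircraft are not converging."""
    if closure_rate_kt <= 0:
        return float("inf")
    return range_nm / closure_rate_kt * 3600  # hours -> seconds

# Illustrative thresholds only; certified values vary by altitude band.
TA_THRESHOLD_S = 40  # Traffic Advisory: draw the crew's attention
RA_THRESHOLD_S = 25  # Resolution Advisory: command an avoidance manoeuvre

def advisory(range_nm: float, closure_rate_kt: float) -> str:
    tau = tau_seconds(range_nm, closure_rate_kt)
    if tau <= RA_THRESHOLD_S:
        return "RESOLUTION ADVISORY"
    if tau <= TA_THRESHOLD_S:
        return "TRAFFIC ADVISORY"
    return "clear"

# Two aircraft 3 NM apart closing at 400 kt: tau = 27 s, so a
# Traffic Advisory is issued while the crew retains final authority.
print(advisory(3.0, 400.0))  # -> TRAFFIC ADVISORY
```

The design point is psychological as much as technical: the system alerts and recommends but keeps the human in the loop, which is precisely where Bainbridge’s (1983) ironies, discussed below, become relevant.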
Organisational culture, emphasised in Week 23, is equally important: a ‘just culture’ encourages errors to be reported without fear of punishment, fostering organisational learning (Dekker, 2007). Confidential reporting systems, such as the UK’s Confidential Human Factors Incident Reporting Programme (CHIRP), allow anonymous submissions that lead to proactive fixes; a study by O’Leary (2002) found that such systems increased error reporting by 50%, enabling cultural shifts. Proactive approaches, such as the Safety Management Systems (SMS) mandated by the International Civil Aviation Organization (ICAO), use risk assessments to anticipate failures before they occur (ICAO, 2018).
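At the heart of an SMS is a risk matrix crossing the likelihood of a hazardous event against the severity of its consequences; the sketch below follows the spirit of ICAO Doc 9859 (ICAO, 2018), though the numeric bands and mitigation wording are simplified for illustration.

```python
# Simplified 5x5 risk-assessment step in the spirit of ICAO Doc 9859.
# Scale labels paraphrase ICAO guidance; the tolerability bands and
# recommended actions are illustrative, not the published ones.
LIKELIHOOD = {"extremely improbable": 1, "improbable": 2,
              "remote": 3, "occasional": 4, "frequent": 5}
SEVERITY = {"negligible": 1, "minor": 2, "major": 3,
            "hazardous": 4, "catastrophic": 5}

def risk_index(likelihood: str, severity: str) -> int:
    """Combine the two scales into a single 1-25 risk index."""
    return LIKELIHOOD[likelihood] * SEVERITY[severity]

def tolerability(index: int) -> str:
    if index >= 15:
        return "intolerable: stop or redesign the operation"
    if index >= 6:
        return "tolerable: mitigate and monitor"
    return "acceptable: routine monitoring"

# A hazard raised through confidential reporting (e.g. CHIRP) can be
# triaged proactively, before it ever contributes to an accident.
idx = risk_index("remote", "hazardous")
print(idx, "->", tolerability(idx))  # 12 -> tolerable: mitigate and monitor
```

The value of the exercise is less the arithmetic than the discipline it imposes: hazards surfaced by reporting systems are ranked and acted on before the holes in the defensive layers can align.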
Critically, these measures are not foolproof. Automation can create ‘ironies’ in which humans become monitors rather than operators, potentially increasing errors during malfunctions (Bainbridge, 1983). Cultural change requires sustained effort, and confidential reporting may be underutilised in high-power-distance cultures (Hofstede, 1980). Furthermore, proactive strategies demand resources that smaller operators might lack. Independent evidence from Stolzer et al. (2008), evaluating SMS in aviation, shows a 20-30% reduction in incidents, but implementation varies. Overall, these psychological interventions can substantially reduce accident risk, though their success hinges on integration and on addressing human-system mismatches.
Conclusion
In summary, psychological theory, models, and research significantly aid in understanding aviation accidents by highlighting the interplay between human errors and systemic factors, as evidenced by the Swiss Cheese Model and human factors methods. While humans are often implicated, systems bear considerable blame, and preventive tools like automation, safety culture, confidential reporting, and proactive approaches offer practical solutions, albeit with limitations such as over-reliance or implementation challenges. Implications for psychology include the need for ongoing research to refine these models, ensuring safer aviation. Ultimately, this integrated approach underscores that accidents are preventable through human-centred design, though complete elimination remains elusive due to inherent complexities.
References
- Air Accidents Investigation Branch (1990) Report on the accident to Boeing 737-400 G-OBME near Kegworth, Leicestershire on 8 January 1989. HMSO.
- Bainbridge, L. (1983) Ironies of automation. Automatica, 19(6), pp. 775-779.
- Dekker, S. (2007) Just culture: Balancing safety and accountability. Ashgate Publishing.
- Dekker, S. (2011) Drift into failure: From hunting broken components to understanding complex systems. CRC Press.
- Dismukes, R.K., Berman, B.A. and Loukopoulos, L.D. (2007) The limits of expertise: Rethinking pilot error and the causes of airline accidents. Ashgate Publishing.
- Eurocontrol (2018) TCAS II Version 7.1. Eurocontrol.
- Fischhoff, B. (1975) Hindsight ≠ foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1(3), pp. 288-299.
- Helmreich, R.L. and Merritt, A.C. (1998) Culture at work in aviation and medicine: National, organizational and professional influences. Ashgate Publishing.
- Hofstede, G. (1980) Culture’s consequences: International differences in work-related values. Sage Publications.
- International Civil Aviation Organization (ICAO) (2018) Safety management manual (Doc 9859). ICAO.
- Komite Nasional Keselamatan Transportasi (2019) Aircraft accident investigation report: Boeing 737-8 MAX, Lion Air Flight JT610. KNKT.
- Li, W.C. and Harris, D. (2006) Pilot error and its relationship with higher organizational levels: An analysis using the human factors analysis and classification system. Ergonomics, 49(9), pp. 871-885.
- National Transportation Safety Board (1997) In-flight fire and impact with terrain ValuJet Airlines flight 592, DC-9-32, N904VJ Everglades, Miami, Florida, May 11, 1996. NTSB/AAR-97/06.
- National Transportation Safety Board (2010) Loss of control on approach, Colgan Air, Inc., operating as Continental Connection Flight 3407, Bombardier DHC-8-400, N200WQ, Clarence Center, New York, February 12, 2009. NTSB/AAR-10/01.
- O’Leary, M. (2002) The British Airways human factors reporting programme. Reliability Engineering & System Safety, 75(2), pp. 245-255.
- Parasuraman, R. and Riley, V. (1997) Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39(2), pp. 230-253.
- Reason, J. (1990) Human error. Cambridge University Press.
- Reason, J. (1997) Managing the risks of organizational accidents. Ashgate Publishing.
- Sarter, N.B. and Woods, D.D. (1995) How in the world did we ever get into that mode? Mode error and awareness in supervisory control. Human Factors, 37(1), pp. 5-19.
- Shappell, S.A. and Wiegmann, D.A. (2000) The human factors analysis and classification system—HFACS. Office of Aviation Medicine, Federal Aviation Administration.
- Stolzer, A.J., Halford, C.D. and Goglia, J.J. (2008) Safety management systems in aviation. Ashgate Publishing.