Aerospace Human Factors and AI [part 1/6]
Introduction to Aviation Human Factors.
AI can support humans in:
Analysing data and aiding in decision-making
Automating repetitive tasks
Enhancing memory and problem-solving abilities
Generating new ideas
Assisting with complex calculations and simulations
However, as with any emerging technology, AI can play an important role in reducing human error while also introducing new risks of its own.
Focusing particularly on the adoption of AI in business processes, this series will cover:
Part 1 - Introduction to Aviation Human Factors (HF)
Part 2 - The Relationship Between Human-Centred AI (HCAI) and Aviation Human Factors
Part 3 - AI's Positive Impact on Aviation Human Factors
Part 4 - Emerging Human Factors Challenges Due to AI Adoption in Aviation
Part 5 - Understanding EASA's Building Block #3: A Deep Dive into Regulatory Compliance
Part 6 - Future Directions and Integration Strategies for HF and AI in Aviation
This is the first, introductory part, in which I cover:
What are Human Factors in Aviation?
Understanding Types of Errors in Human Factors
The Dirty Dozen
The Swiss Cheese Model
Conclusions
Let's dive in! 🤿
What are Human Factors in Aviation?
In the aerospace industry, there is a domain known as Human Factors.
Human Factors encompass everything that affects human performance, and the discipline can be applied to any industry.
To illustrate, consider the figure below from the MEDA (Maintenance Error Decision Aid) User’s Guide, which maps the potential contributing factors that can affect an aircraft maintenance engineer:
Human Factors are considered in the aviation industry from the design stage, with the goal of ensuring that new products and the processes involved in their use are human-centric.
This consideration extends to the organisations that use and maintain these products.
All these organisations must understand Human Factors and comply with various regulatory requirements.
For instance, aerospace maintenance organisations are required to demonstrate that they have a Human Factors Programme in place.
This includes delivering mandatory initial and recurrent training to all levels of the organisation whose work could have a direct or indirect impact on airworthiness.
A typical Human Factors Programme will usually consider at least the following fundamentals:
Safety Culture and Organisational Factors
Types of Human Error and their Management
Understanding of Human Performance and Limitations
Environment; Communication; Procedures, Information, Tools, and Practices
Teamwork; Professionalism and Integrity
Reporting Errors; Disciplinary Policy; Error Investigation; Actions to Address Problems; Feedback
Understanding Types of Errors in Human Factors
In Human Factors, errors are typically categorised into three main types, each with distinct characteristics and implications for aviation safety (a short code sketch follows the list):
Slips and Lapses: Errors of execution where intended actions are not performed correctly, often due to attention or memory failures.
Mistakes:
Rule-based mistakes: Misapplication or failure to apply known rules.
Knowledge-based mistakes: Errors made in new situations where the rules are not well-known.
Violations:
Routine violations or Norms: Habitual and often overlooked deviations from standard operating practices.
Exceptional violations: Occur in response to unique situations.
Recklessness: A conscious disregard of substantial and unjustifiable risks associated with one's actions. More severe than routine or exceptional violations, it usually indicates a significant lapse in professionalism and responsibility.
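To make the taxonomy concrete, here is a minimal sketch of how these categories might be encoded in an error-reporting tool. The names, values, and the intended-action flag are illustrative assumptions of mine, not part of any standard or regulation.

```python
from enum import Enum

class ErrorType(Enum):
    """Hypothetical taxonomy mirroring the categories above."""
    SLIP_OR_LAPSE = "slip/lapse"                         # execution or memory failure
    RULE_BASED_MISTAKE = "rule-based mistake"            # known rule misapplied
    KNOWLEDGE_BASED_MISTAKE = "knowledge-based mistake"  # novel situation, no rule known
    ROUTINE_VIOLATION = "routine violation"              # habitual deviation (norm)
    EXCEPTIONAL_VIOLATION = "exceptional violation"      # response to a unique situation
    RECKLESSNESS = "recklessness"                        # conscious disregard of risk

# Slips and lapses are unintended actions; mistakes and violations
# are intended actions that either go wrong or deviate from procedure.
INTENDED_ACTION = {
    ErrorType.SLIP_OR_LAPSE: False,
    ErrorType.RULE_BASED_MISTAKE: True,
    ErrorType.KNOWLEDGE_BASED_MISTAKE: True,
    ErrorType.ROUTINE_VIOLATION: True,
    ErrorType.EXCEPTIONAL_VIOLATION: True,
    ErrorType.RECKLESSNESS: True,
}

# Example: tagging a single (invented) report.
report = {"id": "RPT-042", "error_type": ErrorType.ROUTINE_VIOLATION}
print(report["error_type"].value,
      "| intended action:", INTENDED_ACTION[report["error_type"]])
```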
The Dirty Dozen
A key concept in understanding human errors in aviation is the Dirty Dozen — twelve common contributors to human error identified by Gordon Dupont in 1993 while working for Transport Canada.
These include:
Lack of Communication
Complacency
Lack of Knowledge
Distraction
Lack of Teamwork
Fatigue
Lack of Resources
Pressure
Lack of Assertiveness
Stress
Lack of Awareness
Norms
Each factor represents a risk that can be mitigated through targeted training and system design, illustrating the vital role of HF in maintaining airworthiness and safety.
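As a toy illustration of how these factors might drive targeted training, here is a minimal sketch that tallies Dirty Dozen tags across maintenance event records; the records, IDs, and tags are all invented for the example.

```python
from collections import Counter

# Hypothetical maintenance events, each tagged with the Dirty Dozen
# factors judged to have contributed (data invented for illustration).
events = [
    {"id": "EVT-001", "factors": ["Fatigue", "Pressure"]},
    {"id": "EVT-002", "factors": ["Lack of Communication", "Norms"]},
    {"id": "EVT-003", "factors": ["Fatigue", "Distraction"]},
]

# Rank factors by frequency to help prioritise training topics.
factor_counts = Counter(f for e in events for f in e["factors"])
for factor, count in factor_counts.most_common():
    print(f"{factor}: {count}")
```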
The Swiss Cheese Model
The Swiss Cheese Model of accident causation, developed by James Reason, visually illustrates how accidents happen when multiple smaller errors line up, allowing a hazard to materialise.
In this model, each layer of cheese represents defences, barriers, and safeguards in an organisation.
The holes in the cheese slices represent weaknesses in individual parts of the system, and when holes in multiple slices momentarily align, a risk of accident or failure arises.
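The model's core intuition, that independent defences multiply down the probability of an accident, can be demonstrated with a small simulation. This is a minimal sketch using made-up layer failure probabilities, purely for illustration, not a validated safety model.

```python
import random

# Made-up probabilities that each defensive layer fails to stop a
# hazard (the "hole" in that slice of cheese): design review,
# maintenance checks, pre-flight inspection, crew monitoring.
LAYER_FAILURE_PROBS = [0.05, 0.10, 0.08, 0.02]

def hazard_becomes_accident(probs):
    """An accident occurs only if the holes in every layer align,
    i.e. every single defence fails for this hazard."""
    return all(random.random() < p for p in probs)

def simulate(n_hazards=1_000_000):
    accidents = sum(hazard_becomes_accident(LAYER_FAILURE_PROBS)
                    for _ in range(n_hazards))
    return accidents / n_hazards

if __name__ == "__main__":
    analytic = 1.0
    for p in LAYER_FAILURE_PROBS:
        analytic *= p
    print(f"Analytic accident probability: {analytic:.1e}")  # 8.0e-06
    # The simulated estimate is noisy at such small probabilities.
    print(f"Simulated accident frequency : {simulate():.1e}")
```

Even with individually weak layers (failure rates of 2-10% in this example), stacking four of them drives the accident probability down to roughly 8 in a million hazards, which is why defence in depth works, and why eroding any single layer matters.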
Conclusions
In this introductory article, I have laid the foundation for understanding Human Factors in aviation.
As I will unveil in upcoming editions, AI offers powerful tools to support decision-making, error reduction, and safety improvement.
However, AI also introduces new risks that require careful management.
Through the Dirty Dozen and the Swiss Cheese Model, we've seen how minor human errors can lead to significant safety issues, emphasising the need for robust error management systems.
As we progress in this series, we will go deeper into how AI can be integrated with Human Factors in aviation to enhance safety and efficiency.
The next article will focus on the relationship between Human-Centred AI (HCAI) and Aviation Human Factors.
Stay tuned to continue exploring.
That's all for today.
See you next week 👋
References
MEDA (Maintenance Error Decision Aid) User's Guide
The HFACS Framework, https://www.hfacs.com/hfacs-framework.html, accessed April 2024.
Stolzer, Alan J.; Halford, Carl D.; Goglia, John J. (2011). Implementing Safety Management Systems in Aviation. Ashgate.
Reason, James; Hobbs, Alan (2003). Managing Maintenance Error: A Practical Guide. Ashgate.
Disclaimer: The information provided in this newsletter and related resources is intended for informational and educational purposes only. It reflects both researched facts and my personal views. It does not constitute professional advice. Any actions taken based on the content of this newsletter are at the reader's discretion.