The impact of human factor studies on aviation accident statistics
Ever since the jet transport was introduced in the 1950s, the number of aircraft accidents resulting from mechanical failures has dropped considerably (Sexton, Thomas & Helmreich, 2000). Unfortunately, the same cannot be said of human-error-related accidents, whose overall rate has remained high and stable over the past few years (Shappell and Wiegmann, 1996, as cited in Shappell and Wiegmann, 2001).
In fact, human factors have been implicated in the majority of aviation accidents – a study conducted by the Bureau of Air Safety Investigation [BASI] (1996) involving 75 fatal aircraft accidents revealed that 72% of them involved pilot factors such as poor judgment and divided attention. Recent research in the field of human factors and aviation has recognised, however, that pilot factors are only one of the many causes of aircraft mishaps.
The organisational structure, or the system itself, of which the pilot and the other aircrew are only a part, plays an equally if not far more important role in addressing aircraft safety issues. To better understand the interrelation between the kind of system an organisation has and its ability to manage errors, and thus avoid accidents, let us look at what makes a productive system and why productive systems break down.
The aviation industry can be viewed as a complex system whose primary output is the safe conduct of flight operations, whether for transportation, recreation or national defense (Wiegmann & Shappell, 2003). One of the key elements that make the system productive is the activity of its front-line operators (pilots, in the case of aviation) – such activities should be an effective integration of human and mechanical factors within the system. In the aviation industry, productive activities can only occur if the equipment is well maintained and reliable, and the workforce well trained.
Furthermore, efficient supervision and effective management result in correct decisions being made (Wiegmann & Shappell, 2003). Such decisions are usually the result of careful assessment of the social, economic and political factors that affect the industry, and of an open feedback line between managers and workers within the system. Most of the time a productive system such as this works quite well, but at times some things are overlooked and what is thought to be a highly organised system breaks down.
A productive system includes in its structure defences, barriers and safeguards that ensure the safety of its members and its assets. In the aviation industry such defences are a blend of engineering (alarms, automation), human factors (skilled pilots and control-room operators), and effective procedures and administration (Reason, 2000). These defensive layers are not perfect, however; they can be viewed as slices of Swiss cheese, each with many holes in it. Ordinarily the holes open, close and shift location, but pose no immediate threat to the security of the system.
The hazard lies in the holes in each slice lining up, thus creating a path towards an accident (see Figure 1).

Figure 1. The Swiss cheese model of system accidents (Reason, 2000). Available at http://www.bmj.com/cgi/content/full/320/7237/768 (retrieved 2 October 2007).

The holes in the system can appear for two reasons: active and latent failures (Reason, 2000). Active failures are errors or lapses committed by people who have direct contact with the system. Such lapses may be committed consciously or unconsciously. A pilot may, for example, not be skilled enough to make correct judgments.
On the other hand, a pilot may deliberately disregard existing procedures, resulting in adverse effects. In contrast, latent failures, as the name implies, may lie inactive for some time, waiting for an opportune moment to create havoc. Examples of latent failures are errors in decisions made by designers, engineers and managers, which may not have an immediate effect on the security of the system. Rather, they may result in workplace conditions that easily trigger errors – staff and crew may be inexperienced, or understaffing may leave most members of the crew fatigued.
Furthermore, latent failures may leave gaping holes in the system’s defences. Coupled with an active failure, they can trigger accidents resulting in irreversible losses. Using Reason’s concept of active and latent failures, Wiegmann and Shappell (2001) developed a system that can be used as a basis for performing human error analysis of aviation accidents. Called the Human Factors Analysis and Classification System [HFACS], it aims to take into account the multiple factors that contribute to an aircraft accident.
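The Swiss cheese idea described above can be illustrated with a small simulation. The sketch below is a toy Monte Carlo model, not part of Reason (2000): the layer names and hole probabilities are illustrative assumptions, and an "accident trajectory" exists only when a hole is open in every defensive layer at once.

```python
import random

# Each defensive layer has some (assumed, illustrative) probability that a
# "hole" - a momentary failure of that defence - is open.
LAYERS = {
    "organisational_decisions": 0.05,  # latent failures
    "supervision": 0.05,               # latent failures
    "preconditions": 0.10,             # e.g. fatigue, poor communication
    "unsafe_acts": 0.10,               # active failures by operators
}

def holes_align(layer_probs, rng):
    """An accident trajectory exists only if every layer's hole is open."""
    return all(rng.random() < p for p in layer_probs.values())

def accident_rate(layer_probs, trials=100_000, seed=42):
    """Estimate how often the holes line up across all layers."""
    rng = random.Random(seed)
    hits = sum(holes_align(layer_probs, rng) for _ in range(trials))
    return hits / trials

if __name__ == "__main__":
    base = accident_rate(LAYERS)
    # Fixing a latent failure (tighter supervision) shrinks that layer's hole
    improved = dict(LAYERS, supervision=0.01)
    print(f"baseline alignment rate:       {base:.5f}")
    print(f"with improved supervision:     {accident_rate(improved):.5f}")
```

The point the toy model makes is the one the essay makes in prose: because an accident requires holes in *all* layers to align, closing any single latent hole (for example, improving supervision) multiplicatively reduces the chance of a complete accident path.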
HFACS categorises human error into four levels, each of which is described briefly in the following paragraphs. The first level of human error is unsafe operator practices. As mentioned earlier, such practices may be committed either unconsciously or consciously. Unconscious actions resulting in accidents are referred to as errors, while the wilful disregard of rules and procedures constitutes violations (Wiegmann and Shappell, 2001). Errors may be further classified into three types: decision errors, skill-based errors and perceptual errors.
Pilots usually make decision errors when they exercise poor judgment, misinterpreting or misusing the information available to them. Experienced pilots often rely on automatic behaviours to navigate an aircraft. The trouble with this reliance on reflex actions is that it is prone to errors caused by diversion of attention or memory failure. Omitting items on the checklist, or diverting attention to other switches while disregarding more important controls, may affect aircraft safety.
Such errors are referred to as skill-based errors. The last category of errors is the one that has received the least attention, but it is no less important. Perceptual errors occur in situations where the pilot’s awareness of, or sensitivity to, his surroundings is degraded. Examples of such situations include flying at night or in visually degraded weather. Armed with imperfect information, the pilot tends to misjudge distances, altitude and descent rates. Violations, still part of unsafe operator practices, are likewise classified into two types.
Routine violations, the first type, refer to deliberate defiance of procedures, but basically result from supervision that disregards or allows such “minor” offences (Reason, 1990, as cited by Wiegmann and Shappell, 2001). A motorist may, for example, continue to drive 5 mph above the speed limit because the police usually do not enforce the law for such a “minor” infraction. Similarly, pilots may commit comparable offences because the management “does not mind”. Exceptional violations, on the other hand, are violations that are neither typical of the individual nor condoned by the organisation.
A motorist may be tolerated by officials if he drives 5 mph above the limit, but certainly not if he drives 40 mph above it. HFACS’ second category of human error is preconditions for unsafe acts. Unsafe practices have deeper underlying causes; they do not just simply happen. They result from either substandard conditions or substandard practices of the operator. Among the substandard conditions that affect the pilot are physical or mental fatigue, loss of situational awareness, and destructive attitudes such as complacency and overconfidence.
Similarly, intoxication and illnesses that degrade the pilot’s spatial orientation can lead to unsafe acts. There are also situations in which the pilot is incapable of operating the aircraft safely. This may result from inappropriate information or, even when the information is available, the pilot may simply lack the skill or ability to react safely to an unsafe situation. Crew resource mismanagement, a substandard practice of pilots and other aircrew, can also lead to unsafe acts. It includes blocked communication lines between the management, the aircraft crew and air traffic control personnel.
Even when feedback lines are open, disregarding communication results in unsafe acts that can spawn accidents. Crew resource mismanagement also includes failing to work together as a team, disregarding orders from superiors or, conversely, disregarding feedback from junior crew members. Failure to coordinate activities before, during and after a flight also falls into this category (Wiegmann and Shappell, 2001). Aside from teamwork and effective communication, the personal readiness of the individual should be assessed.
An individual’s ability to be completely ready for any adverse situation may be hampered by unsafe practices such as not getting enough rest, consuming alcohol beyond permitted limits, or taking medicine without medical supervision. Moving higher up the chain of command, unsafe practices by the crew often result from inefficient and ineffective supervision. The third category of human error, unsafe supervision, is a latent failure which can be further subdivided into four categories: inadequate supervision, planned inappropriate operations, failure to correct known problems, and supervisory violations.
Those at the lower levels of management are expected to receive support from their supervisors and, in more tangible terms, training and skills updates. The crew should also feel well managed and able to look up to effective leadership. Inadequate supervision results when supervisors fail to make such opportunities available. As a result, the crew is isolated and at times becomes incapable of dealing with situations that might compromise safety. The risk associated with decision errors is therefore increased.
At times, supervisors also inadvertently place the crew at unacceptable risk, which affects their performance. Examples include disregarding crew rest time, failing to exercise proper judgment in crew pairing, and inappropriate crew scheduling and flight planning. The remaining two categories of unsafe supervision are quite similar but are treated separately in HFACS. Failure to correct known problems refers to situations in which inappropriate behaviours or insufficiencies in equipment and skills are known to the supervisor but are allowed to remain uncorrected (Wiegmann and Shappell, 2001).
Supervisory violations, on the other hand, are the deliberate disregard of rules and procedures when managing assets. To contrast the two: the former can include condoning inappropriate behaviours that do not actually break safety rules but that proper judgment would deem to promote an unsafe atmosphere, while the latter is reserved for behaviour or actions known to break the rules. An example would be allowing a pilot to operate an aircraft with an invalid licence – a blatant disregard of procedure that could ultimately lead to a tragic accident.
Moving up to the highest level of the chain of command, decisions made by upper-level managers invariably affect the attitudes and practices of supervisors (Wiegmann and Shappell, 2001). Failures in an organisation translate into a breakdown at the supervisory level, and even more so at the aircrew level. Organisational failure does not refer only to mismanagement by upper-level managers but also to unhealthy and unsafe practices imbibed by the organisation itself.
Resource management, a major issue in organisations, refers to the management, allocation and maintenance of the organisation’s resources, including its human resources. Decisions regarding resource management generally revolve around two considerations: safety and cost-effectiveness. When an organisation is financially healthy, it is easy to balance the two. When budgets need to be slashed, however, it is often safety that is compromised. The organisational climate is another issue that should be examined when assessing organisational failures. An organisation’s structure usually defines its climate.
When communication channels are open, when both senior and junior officials listen to each other’s feedback, when responsibilities are properly delegated, when there is proper accountability for actions, and when policies are well defined and responsive to the needs of members, the organisational climate is healthy and safety issues are less likely to arise in the organisation.
Inferior upper-level management of operational processes – formal processes (production quotas, incentive systems, schedules, etc.) and procedures (documentation, instructions about procedures, etc.) – together with neglect of other organisational factors, can have a negative impact on the performance efficiency of the crew and the safety of the system. As stated earlier, latent failures such as supervisory and organisational lapses do not immediately threaten the system’s safety.
But as these failures remain unchecked and uncorrected, they can ultimately build up to have a massive negative impact on safety. Unlike active failures, which are quite difficult to predict, latent failures can be identified and corrected before an accident occurs, precisely because they already exist within the system.
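The four HFACS levels walked through above lend themselves to being encoded as a simple taxonomy. The sketch below is a minimal illustration, not an official HFACS artefact: the level and subcategory labels follow the essay’s summary of Wiegmann and Shappell (2001), and the exact strings chosen are my assumption.

```python
# The four HFACS levels, from active failures at the sharp end up to
# latent organisational influences, encoded as a nested mapping.
HFACS = {
    "unsafe_acts": {
        "errors": ["decision", "skill-based", "perceptual"],
        "violations": ["routine", "exceptional"],
    },
    "preconditions_for_unsafe_acts": {
        "substandard_conditions": ["fatigue", "loss of situational awareness",
                                   "complacency", "intoxication"],
        "substandard_practices": ["crew resource mismanagement",
                                  "personal readiness"],
    },
    "unsafe_supervision": {
        "categories": ["inadequate supervision",
                       "planned inappropriate operations",
                       "failure to correct known problems",
                       "supervisory violations"],
    },
    "organisational_influences": {
        "categories": ["resource management", "organisational climate",
                       "organisational processes"],
    },
}

def find_category(term):
    """Return the (level, subcategory) pair whose item list contains term."""
    for level, subs in HFACS.items():
        for sub, items in subs.items():
            if term in items:
                return level, sub
    return None

print(find_category("routine"))  # a routine violation is an unsafe act
```

A structure like this is how an analyst might tag each causal factor in an accident report, making it possible to count, for a set of accidents, how many involved failures at each level rather than stopping at “pilot error”.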
University/College: University of Arkansas System
Type of paper: Thesis/Dissertation Chapter
Date: 21 April 2017