January 15, 2006

Human Factors in UAV Accidents

Increased UAV use has been accompanied by an increased accident rate. This article looks at the human elements of those accidents.

By Patricia A. Leduc, Ph.D., Clarence E. Rash, M.S. and Sharon D. Manning, M.S.




Unmanned aerial vehicles (UAVs) are rapidly coming into their own as major tactical and strategic systems on the modern battlefield. UAVs are preferred over manned aircraft for selected surveillance and reconnaissance missions and for operations in chemical and biological environments, for both economic and operational reasons. To justify increasing use of UAVs over manned aircraft, however, UAVs must be at least as successful in meeting mission requirements. Equally important, UAVs must have an acceptable accident rate in order to be cost-effective.

The increase in UAV use has been accompanied by an increased accident rate. UAVs incorporate new technologies that, not surprisingly, experience significant early failure rates. This has been substantiated by a recent survey of Army UAV accidents, in which materiel failure was cited as a sole or contributing causal factor in 45 percent of the accidents. For manned civil and military aircraft, overall accident rates have declined steadily over the past half century, primarily because of reductions in materiel failure. However, as materiel failure has decreased, human error has increased as a causal factor, now implicated in 60 percent to 80 percent of accidents.

Similarly, as mechanical failures decrease with the maturation of UAV technology, human error will naturally account for a higher percentage of UAV accidents. This trend is already evident: across the three services, human error has been found to be a significant causal factor in UAV mishaps and accidents, implicated in 28 percent to 50 percent of accidents across service branches and 21 percent to 68 percent across UAV types. Knowledge of human error causal factors is necessary for the successful formulation of countermeasures that can prevent these types of accidents.

Human Error

While the word “unmanned” might seem to imply a reduced role for human error as a causal factor in UAV accidents, this is a false assumption. Although many UAV functions are automated, humans are still deeply involved in the design, manufacture, training, maintenance and operation of UAVs. One major area of human involvement is the development of the complex software that controls automated UAV functions. Additionally, human operators still perform takeoffs and landings for some UAV types.

The causes of human error can include high (or low) workload, fatigue, poor situational awareness, inadequate training, lack of crew coordination and poor ergonomic design. Any one of these causes can occur at any phase of UAV design, construction or operation, leading to accidents. Somewhat unique to UAVs is the remote-operation environment, which introduces even greater complexity and mental workload.

UAV Accidents

While UAV technology is advancing rapidly, weight and power constraints still limit the ability to provide redundant systems, and the lack of redundancy contributes to higher accident rates. In addition, when a system fails or environmental conditions threaten disaster, the absence of a pilot on board reduces the chance of identifying and resolving the problem.

Although UAVs share inherent characteristics with manned aircraft, UAV accidents differ in important ways from manned-aviation accidents, and investigations that treat the two identically risk flawed causal conclusions. Within the U.S. Army, UAV accidents were considered ground accidents and investigated accordingly until October 1, 2003, when the Department of the Army (DA) reclassified them as aviation accidents, requiring expanded recording of accident details.

However, even with this increased emphasis, the ability to gain insight into UAV accidents is hindered by the nature of the details surrounding many of them. UAV use has expanded so dramatically, and gained such prominence, largely because of recent and ongoing military operations in Afghanistan and Iraq. Unfortunately, in such conflicts, knowledge of UAV losses, whether due to accidents or to enemy fire, is naturally of a sensitive nature. Even when some facts regarding these accidents are available, they are more likely to concern materiel failure than human error. Consequently, the training environment has been the only accessible source for investigating the role of human error in Army UAV accidents.

UAV Accident Investigation

The accident analysis approach used by the U.S. Army is defined in DA Pamphlet 385-40, “Army accident investigation and reporting.” Accidents are categorized by the presence of one or more causal factors: materiel failure, environment or human error. Human error causal factors are classified into five types: individual failure, leader failure, training failure, support failure and standards failure.

• Individual failure: when the soldier/individual knows and is trained to a standard but elects not to follow it (e.g., lack of self-discipline, or a mistake due to personal factors such as attitude, haste, overconfidence or self-induced fatigue).

• Leader failure: when the leader fails to enforce known standards, to make required corrections or to take appropriate action.

• Training failure: when the soldier/individual is not trained to a known standard (e.g., training on the task that is insufficient in content or amount, incorrect or absent).

• Support failure: when there is inadequate equipment/facilities/services in type, design, availability, condition, or when there is an insufficient number/type of personnel, and these deficiencies contribute to human error.

• Standards failure: when standards/procedures are not clear or practical, or do not exist.
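For readers who want to tally accidents programmatically, the five failure types above can be encoded as a simple enumeration. The following Python sketch is illustrative only; the class and member names are ours and do not appear in DA PAM 385-40.

from enum import Enum

class HumanErrorFailureType(Enum):
    """The five DA PAM 385-40 human error failure types, paraphrased."""
    INDIVIDUAL = "trained to the standard but elected not to follow it"
    LEADER = "leader failed to enforce standards or take appropriate action"
    TRAINING = "not trained to a known standard"
    SUPPORT = "inadequate equipment, facilities, services or personnel"
    STANDARDS = "standards/procedures unclear, impractical or nonexistent"

# An accident record can then simply carry the set of failure types cited, e.g.
# {HumanErrorFailureType.INDIVIDUAL, HumanErrorFailureType.STANDARDS}.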

UAV Accident Analysis

The only comprehensive analysis of the role of human error in U.S. Army UAV accidents was conducted using data obtained from the U.S. Army Risk Management Information System (RMIS) accident database maintained by the U.S. Army Combat Readiness Center (formerly the U.S. Army Safety Center), Fort Rucker, AL. A search covering fiscal years 1995 through 2003 found a total of 56 UAV accidents. Each accident was reviewed and classified by the presence of causal factors. Table 1 shows the distribution of the 56 accidents by causal factor category. The most common sole causal factor was materiel failure, identified in 18 of the 56 accidents (32 percent). Human error was also identified in 18 of the 56 accidents (32 percent): it was the sole causal factor in six accidents (11 percent) and a contributing factor in 12 others (21 percent).
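As a quick check of the figures above, the percentages follow directly from the raw counts. A minimal Python sketch, using only the counts quoted in the text (the variable names are ours):

# Counts quoted above for the 56 Army UAV accidents, FY95 through FY03.
total_accidents = 56
materiel_sole = 18              # materiel failure as sole causal factor
human_error_sole = 6            # human error as sole causal factor
human_error_contributing = 12   # human error as a contributing factor
human_error_total = human_error_sole + human_error_contributing  # 18

def pct(count, total=total_accidents):
    """Share of all accidents, rounded to the nearest whole percent."""
    return round(100 * count / total)

print(pct(materiel_sole))             # 32
print(pct(human_error_total))         # 32
print(pct(human_error_sole))          # 11
print(pct(human_error_contributing))  # 21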

When the accidents in which human error was cited as a causal factor were analyzed using the failure types defined in DA PAM 385-40 (Table 2), the most represented failure was individual failure (18 percent of all accidents). The second most prevalent failure was standards failure (14 percent of all accidents). When just the 18 accidents involving human error were considered, individual failure was present in 56 percent, and standards failure was present in 44 percent of these accidents. Leader failure, training failure and support failure were present in 33 percent, 22 percent and 6 percent of the 18 human error accidents, respectively.

One drawback of the DA PAM 385-40 accident analysis approach is that it provides no detail on the human error beyond a first-level identification of failure type. An alternative method, the Human Factors Analysis and Classification System (HFACS), was developed and tested for use within the U.S. military and has gained wide acceptance in both the aviation accident investigation and human factors communities. The HFACS is a more in-depth approach to investigating and analyzing the human causes of aviation accidents, capturing greater detail by further dividing basic error levels into categories and subcategories.

The HFACS captures data for four levels of human-related failure: unsafe acts, preconditions for unsafe acts, unsafe supervision and organizational influences. These four levels of human-related failure are expanded into 17 causal categories.

The unsafe acts level is divided into two categories: errors and violations. These two categories differ in intent. Errors are unintended mistakes and are further divided into skill-based errors, decision errors and perceptual errors. Examples of skill-based errors include inadvertently leaving out an item on a checklist, failure to prioritize actions and omitting a procedural step. Examples of decision errors include using the wrong procedure, misdiagnosing an emergency and performing an incorrect action. Perceptual errors are those made in the presence of visual illusions or spatial disorientation. Violations, in contrast, are willful. Examples include violating training rules, performing an overaggressive maneuver and intentionally exceeding mission constraints.

The unsafe preconditions level is divided into two major categories: substandard conditions of operators and substandard practices of operators. The substandard conditions of operators category is subdivided into three subcategories: adverse mental states, adverse physiological states and physical/mental limitations. Examples of adverse mental states include complacency, “get-home-itis” and misplaced motivation. Examples of adverse physiological states include medical illness and physical fatigue. Examples of physical/mental limitations include insufficient reaction time and incompatible intelligence/aptitude. The substandard practices of operators category is subdivided into two subcategories: crew resource management and personal readiness. Examples of crew resource management include failure to use all available resources and failure to coordinate. Examples of personal readiness are self-medication and violation of crew rest requirements.

The unsafe supervision level is divided into four categories: inadequate supervision, planned inappropriate operations, failure to correct a known problem and supervisory violations. Examples of inadequate supervision include failure to provide training, failure to provide operational doctrine and failure to provide oversight. Examples of planned inappropriate operations include failure to provide correct data, failure to provide sufficient personnel and failure to provide the opportunity for adequate crew rest. Examples of failure to correct a known problem include failure to initiate corrective action and failure to report unsafe tendencies. Examples of supervisory violations include authorizing an unnecessary hazard and failure to enforce rules and regulations.

The organizational influences level has three categories: resource/acquisition management, organizational climate and organizational process. Examples of resource/acquisition management include lack of funding, poor equipment design and insufficient manpower. Examples of organizational climate include policies on drugs and alcohol, value and belief culture, and chain-of-command structure. Examples of organizational process include quality of safety programs, influence of time pressure and the presence or absence of clearly defined objectives.
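To make the structure of the taxonomy explicit, the four levels and the categories described above can be written out as a nested mapping. This Python sketch reflects only the categories named in this article (the full HFACS breakdown subdivides some of them further, which is how it reaches 17 causal categories); the dictionary layout itself is ours.

# HFACS levels, categories and subcategories as described above.
HFACS = {
    "Unsafe acts": {
        "Errors": ["Skill-based errors", "Decision errors", "Perceptual errors"],
        "Violations": [],
    },
    "Preconditions for unsafe acts": {
        "Substandard conditions of operators": [
            "Adverse mental states",
            "Adverse physiological states",
            "Physical/mental limitations",
        ],
        "Substandard practices of operators": [
            "Crew resource management",
            "Personal readiness",
        ],
    },
    "Unsafe supervision": {
        "Inadequate supervision": [],
        "Planned inappropriate operations": [],
        "Failure to correct a known problem": [],
        "Supervisory violations": [],
    },
    "Organizational influences": {
        "Resource/acquisition management": [],
        "Organizational climate": [],
        "Organizational process": [],
    },
}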

In summary, the HFACS analysis approach can provide greater refinement of human error causal factors. This can be seen in the results of applying the HFACS to the 56 accidents discussed above. These results, reported only for causal factors found to be present, are provided in Table 3.

When just the 18 accidents involving human error were considered, unsafe acts were present in 61 percent, and unsafe supervision was present in 50 percent of these accidents. Organizational influences and preconditions for unsafe acts were present in 44 percent and 6 percent of the human error accidents, respectively.

Within the major HFACS category of unsafe acts, four subcategories were identified: skill-based errors, decision errors, perceptual errors and violations. The most common unsafe act was decision errors, present in 11 percent of all accidents and in 33 percent of the 18 human-error accidents.

Examples of decision errors included an external pilot who hurried turns using steep bank angles, preventing a proper climb rate and resulting in a crash, and an incorrect response to an emergency in which idle power was commanded after the arresting hook had already caught the arresting cable.

The single accident categorized as having preconditions for unsafe acts was further identified as a crew resource management issue. The accident report stated that poor coordination between student and instructor was present.

Three subcategories were identified under unsafe supervision: inadequate supervision, failure to correct a known problem, and supervisory violations. The most common unsafe supervision subcategory was inadequate supervision, present in 11 percent of all accidents and 33 percent of human error accidents. Incidents of inadequate supervision included failure to provide training for the UAV operator on the effects of wind, and failure to provide proper monitoring of contract personnel to ensure adequate inspections/checks.

All of the accidents identified under organizational influences fell under one category: organizational process. Incidents under this category included failure to maintain training records and lack of written guidance on inspection and replacement criteria.

Analysis Leads to Solutions

The predominant means of investigating the causal role of human error in accidents remains the analysis of post-accident data. The only such analysis to date of U.S. Army UAV accidents found a total of 56 (the majority occurring in the training environment), of which 18 (32 percent) involved human error as a sole or contributing factor. The application of two different accident analysis approaches (DA PAM 385-40 and HFACS) to these 18 accidents found that individual acts or failures were the leading human error category, present in 56 to 61 percent of human-error attributed accidents. Both methods appeared adequate for gross classification. It was clear, however, that the HFACS provided more useful and detailed information for examining individual human errors.

Patricia A. Leduc and Clarence E. Rash are with the U.S. Army Aeromedical Research Laboratory, Fort Rucker, AL. Sharon D. Manning is with the Aviation Branch Safety Office, Fort Rucker, AL.
