Talk:Pilot error

I've added the following sections to the article:

Threats

The term "threat" is defined as any event "external to flight crew influence which can increase the operational complexity of a flight."[1] Threats may further be broken down into environmental threats and airline threats. Environmental threats are ultimately out of the hands of crew members and the airline, as they hold no influence on "adverse weather conditions, hazardous , air traffic control shortcomings, bird strikes, and high terrain." [1] Conversely, airline threats are not manageable by the flight crew, but may be controlled by the airline's management. These threats include "aircraft malfunctions, cabin interruptions, operational pressure, ground/ramp errors/events, cabin events and interruptions, ground maintenance errors, and inadequacies of manuals and charts."[1]

References

  1. ^ a b c Earl, Laurie; Bates, Paul R.; Murray, Patrick S.; Glendon, A. Ian; Creed, Peter A. (January 2012). "Developing a Single-Pilot Line Operations Safety Audit". Aviation Psychology and Applied Human Factors (2): 49–61. doi:10.1027/2192-0923/a000027. ISSN 2192-0923.

Errors

The term "error" is defined as any action or inaction leading to deviation from team or organizational intentions.[1] Error stems from physiological and psychological human limitations such as illness, medication, stress, alcohol/drug abuse, fatigue, emotion etc. Error is inevitable in humans and is primarily related to operational and behavioural mishaps.[2] Errors can vary from incorrect altimeter setting and deviations from flight course to more severe errors such as exceeding maximum structural speeds or forgetting to put down landing or takeoff flaps.

References

  1. ^ Cite error: The named reference :7 was invoked but never defined (see the help page).
  2. ^ Li, Guohua; Baker, Susan P.; Grabowski, Jurek G.; Rebok, George W. (February 2001). "Factors Associated With Pilot Error in Aviation Crashes". Aviation, Space, and Environmental Medicine. 72 (1): 52–58.

Threat and Error Management (TEM)

TEM involves the effective detection of, and response to, internal or external factors that have the potential to degrade the safety of aircraft operations.[1] Methods of teaching TEM stress replicability, or reliability of performance across recurring situations.[2] TEM aims to prepare crews with the "coordinative and cognitive ability to handle both routine and unforeseen surprises and anomalies."[2] The desired outcome of threat and error management training is the development of 'resiliency': in this context, the ability to recognize and act adaptively to disruptions encountered during flight operations. TEM training occurs in various forms, with varying levels of success. These training methods include data collection using the Line Operations Safety Audit (LOSA), implementation of crew resource management (CRM) and cockpit task management (CTM), and the integrated use of checklists in both commercial and general aviation. Other resources built into most modern aircraft that help minimize risk and manage threat and error are airborne collision avoidance systems (ACAS) and ground proximity warning systems (GPWS).[3] By consolidating onboard computer systems and implementing proper pilot training, airlines and crew members look to mitigate the inherent risks associated with human factors.
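
As a rough illustration of the detect-and-respond loop that warning systems such as GPWS automate, consider the Python sketch below. The thresholds, callouts, and function name are hypothetical, chosen only to show the pattern of monitoring a parameter and escalating an alert; they are not taken from any real GPWS specification.

    # Hypothetical detect-and-respond loop in the spirit of GPWS.
    # All thresholds and callouts below are illustrative, not real GPWS values.
    def gpws_style_alert(radio_altitude_ft: float, descent_rate_fpm: float) -> str:
        """Return an advisory based on height above terrain and descent rate."""
        if radio_altitude_ft < 500 and descent_rate_fpm > 2000:
            return "PULL UP"      # immediate crew action required
        if radio_altitude_ft < 1000 and descent_rate_fpm > 1500:
            return "SINK RATE"    # caution: excessive descent near terrain
        return "NO ALERT"         # normal operation

    # Example: low over terrain while descending rapidly triggers the hard warning.
    print(gpws_style_alert(radio_altitude_ft=400, descent_rate_fpm=2500))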

Line Operations Safety Audit (LOSA)

LOSA is a structured observational program designed to collect data for the development and improvement of countermeasures to operational errors.[4] Through the audit process, trained observers collect information on the normal procedures, protocols, and decision-making processes flight crews follow when faced with threats and errors during routine operation. This data-driven component of threat and error management is useful for examining pilot behaviour in relation to situational analysis, and it provides a basis for implementing further safety procedures or training to help mitigate errors and risks.[5] Observers on flights being audited typically record the following:[4]

  • Potential threats to safety
  • How the threats are addressed by the crew members
  • The errors the threats generate
  • How crew members manage these errors (action or inaction)
  • Specific behaviours known to be associated with aviation accidents and incidents

LOSA was developed to assist crew resource management practices in reducing human error in complex flight operations.[5] The audit yields data on how many threats or errors are encountered per flight, how many errors could have resulted in a serious threat to safety, and the correctness of crew action or inaction. This data has proven useful in developing CRM techniques and identifying which issues need to be addressed in training.[5]
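
To make the kind of output such an audit produces concrete, here is a minimal Python sketch that aggregates hypothetical observer records into per-flight counts. The record fields, flight numbers, and the simple managed/mismanaged flag are invented for illustration; real LOSA observation forms are considerably richer.

    # Minimal sketch: aggregating LOSA-style observation records per flight.
    # All records and field names are invented placeholders.
    from collections import defaultdict

    observations = [
        {"flight": "AB101", "type": "threat", "managed": True},
        {"flight": "AB101", "type": "error",  "managed": False},
        {"flight": "CD202", "type": "threat", "managed": True},
    ]

    counts = defaultdict(lambda: {"threat": 0, "error": 0, "mismanaged": 0})
    for obs in observations:
        tally = counts[obs["flight"]]
        tally[obs["type"]] += 1
        if not obs["managed"]:
            tally["mismanaged"] += 1   # an event that could have become a safety risk

    for flight, tally in sorted(counts.items()):
        print(flight, tally)
    # AB101 {'threat': 1, 'error': 1, 'mismanaged': 1}
    # CD202 {'threat': 1, 'error': 0, 'mismanaged': 0}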

Crew Resource Management (CRM)

CRM is the "effective use of all available resources by individuals and crews to safely and effectively accomplish a mission or task, as well as identifying and managing the conditions that lead to error."[6] CRM training has been integrated and mandatory for most pilot training programs and has been the accepted standard for developing human factors skills for air crews and airlines. Although there is no universal CRM program, airlines usually customize their training to best suit the needs of the organization; the principles of each program are usually closely aligned. According to the U.S. Navy, there are seven critical CRM skills:[6]

  • Decision Making - the use of logic and judgement to make decisions based on available information
  • Assertiveness - willingness to participate and state a given position until convinced by facts that another option is more correct
  • Mission Analysis - ability to develop short and long term contingency plans
  • Communication - clear and accurate sending and receiving of information, instructions, commands and useful feedback
  • Leadership - ability to direct and coordinate activities of pilots & crew members
  • Adaptability/Flexibility - ability to alter course of action due to changing situations or availability of new information
  • Situational Awareness - ability to perceive the environment within time and space, and comprehend its meaning

These seven skills comprise the critical foundation for effective aircrew coordination. With the development and use of these core skills, flight crews "highlight the importance of identifying human factors and team dynamics to reduce human errors that lead to aviation mishaps."[6]

Application and Effectiveness of CRM

Since the implementation of CRM circa 1979, following NASA's call for increased research on resource management, the aviation industry has seen tremendous evolution in the application of CRM training procedures.[7] The first generation emphasized individual psychology and testing, where corrections could be made to behaviour. The second generation shifted the focus to cockpit group dynamics. The third generation diversified the scope and emphasized training crews in how they must function both in and out of the cockpit. The fourth generation integrated procedure into the training, allowing organizations to tailor training to their needs. The fifth and latest generation acknowledges that human error is inevitable and provides information to improve safety standards.[8] Today, CRM is implemented through pilot and crew training sessions, simulations, and interactions with senior-ranked personnel and flight instructors, such as flight briefings and debriefings. Although it is difficult to measure the success of CRM programs, studies by Salas et al. have found a correlation between CRM programs and better risk management.[8]

Cockpit Task Management (CTM)

Cockpit task management (CTM) is the "management level activity pilots perform as they initiate, monitor, prioritize, and terminate cockpit tasks."[9] A 'task' is defined as a process performed to achieve a goal (e.g., fly to a waypoint, descend to a desired altitude).[9] CTM training focuses on teaching crew members how to handle concurrent tasks that compete for their attention. This includes the following processes:

  • Task Initiation - when appropriate conditions exist
  • Task Monitoring - assessment of task progress and status
  • Task Prioritization - relative to the importance and urgency for safety
  • Resource Allocation - assignment of human and machine resources to tasks which need completion
  • Task Interruption - suspension of lower priority tasks for resources to be allocated to higher priority tasks
  • Task Resumption - continuing previously interrupted tasks
  • Task Termination - the completion or incompletion of tasks

The need for CTM training is a result of the limited capacity of human attention and working memory. Crew members may devote more mental or physical resources to a particular task which demands priority or concerns the immediate safety of the aircraft. CTM has been integrated into pilot training and goes hand in hand with CRM. Some aircraft operating systems have made progress in aiding CTM by combining instrument gauges into one screen. An example of this is a digital attitude indicator, which simultaneously shows the pilot the heading, airspeed, descent or ascent rate, and a plethora of other pertinent information. Implementations such as these allow crews to gather multiple sources of information quickly and accurately, freeing up mental capacity to be focused on other, more urgent tasks.
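
The seven processes above map naturally onto a priority-queue abstraction. The following Python sketch is a loose, hypothetical model of that mapping (task names and priority values are invented), showing initiation, interruption of a lower-priority task, and resumption after termination.

    # Loose model of the CTM processes as a priority queue of cockpit tasks.
    # Task names and priority values are hypothetical.
    import heapq

    class TaskManager:
        def __init__(self):
            self._waiting = []    # heap of (negated priority, name)
            self.active = None    # (name, priority) currently being worked on

        def initiate(self, name, priority):
            heapq.heappush(self._waiting, (-priority, name))
            self._dispatch()

        def _dispatch(self):
            # Interrupt the active task if a higher-priority task is waiting.
            if self._waiting and (self.active is None
                                  or -self._waiting[0][0] > self.active[1]):
                if self.active is not None:
                    heapq.heappush(self._waiting, (-self.active[1], self.active[0]))
                neg_prio, name = heapq.heappop(self._waiting)
                self.active = (name, -neg_prio)

        def terminate(self):
            # Complete the active task, then resume the top waiting task.
            self.active = None
            self._dispatch()

    tm = TaskManager()
    tm.initiate("monitor fuel", priority=1)
    tm.initiate("fly to waypoint", priority=2)
    tm.initiate("respond to GPWS warning", priority=9)  # interrupts lower tasks
    print(tm.active)   # ('respond to GPWS warning', 9)
    tm.terminate()
    print(tm.active)   # ('fly to waypoint', 2), the interrupted task resumed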

Checklists

The use of checklists before, during and after flights has established a strong presence in all types of aviation as a means of managing error and reducing the possibility of risk. Checklists are highly regulated and consist of protocols and procedures for the majority of the actions required during a flight.[10] The objectives of checklists include "memory recall, standardization and regulation of processes or methodologies."[10] The use of checklists in aviation has become an industry-standard practice, and completing checklists from memory is considered a violation of protocol and pilot error. Studies have shown that stress and fatigue, both inevitable human factors in the commercial aviation industry, impair judgement, cognitive function, and memory.[11] The use of checklists in emergency situations also contributes to troubleshooting and to reverse-examining the chain of events which may have led to a particular incident or crash. Apart from checklists issued by regulatory bodies such as the FAA or ICAO, or checklists made by aircraft manufacturers, pilots also have personal qualitative checklists aimed at ensuring their fitness and ability to fly the aircraft. An example is the IM SAFE checklist (illness, medication, stress, alcohol, fatigue/food, emotion), one of a number of qualitative assessments which pilots may perform before or during a flight to ensure the safety of the aircraft and passengers.[10] These checklists, along with a number of other redundancies integrated into most modern aircraft operation systems, help ensure the pilot remains vigilant and, in turn, reduce the risk of pilot error.
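
As a toy illustration of how such a personal checklist works, the Python sketch below walks through the IM SAFE items named above. The function name and yes/no answer format are invented for the example; a real pre-flight self-assessment is of course a matter of judgement rather than code.

    # Toy rendering of the IM SAFE personal checklist described above.
    # The pass/fail logic is purely illustrative.
    IM_SAFE = ["Illness", "Medication", "Stress", "Alcohol", "Fatigue/Food", "Emotion"]

    def run_checklist(items, concerns):
        """Return the items the pilot flagged; an empty list suggests fit to fly."""
        return [item for item in items if concerns.get(item, False)]

    # Hypothetical self-assessment: the pilot reports elevated stress.
    answers = {"Illness": False, "Medication": False, "Stress": True,
               "Alcohol": False, "Fatigue/Food": False, "Emotion": False}
    flagged = run_checklist(IM_SAFE, answers)
    print("Fit to fly" if not flagged else f"Review before flight: {flagged}")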

Khosein4 (talk) 16:17, 4 November 2015 (UTC)

References

  1. ^ Cite error: The named reference :5 was invoked but never defined (see the help page).
  2. ^ a b Dekker, Sidney; Lundström, Johan (May 2007). "From Threat and Error Management (TEM) to Resilience". Journal of Human Factors and Aerospace Safety. 260 (70): 1–10.
  3. ^ Maurino, Dan (April 2005). "Threat and Error Management (TEM)". Canadian Aviation Safety Seminar (CASS); Flight Safety and Human Factors Programme - ICAO.
  4. ^ a b "Line Operations Safety Audit (LOSA)". SKYbrary. Retrieved November 2015. {{cite web}}: Check date values in: |accessdate= (help)
  5. ^ a b c Cite error: The named reference :6 was invoked but never defined (see the help page).
  6. ^ a b c Myers, Charles; Orndorff, Denise (2013). "Crew Resource Management: Not Just for Aviators Anymore". Journal of Applied Learning Technology. 3 (3): 44–48.
  7. ^ Helmreich, Robert L.; Merritt, Ashleigh C.; Wilhelm, John A. (1999). "The Evolution of Crew Resource Management Training in Commercial Aviation". The International Journal of Aviation Psychology. 9 (1): 19–32.
  8. ^ a b Salas, Eduardo; Burke, Shawn C.; Bowers, Clint A.; Wilson, Katherine A. (2001). "Team Training in the Skies: Does Crew Resource Management (CRM) Training Work?". Human Factors. 43 (4): 641–674. ISSN 0018-7208.
  9. ^ a b Chou, Chung-Di; Madhavan, Das; Funk, Ken (1996). "Studies of Cockpit Task Management Errors". The International Journal of Aviation Psychology. 6 (4): 307–320.
  10. ^ a b c Hales, Brigette M.; Pronovost, Peter J. (2006). "The Checklist – A Tool for Error Management and Performance". Journal of Critical Care. 21: 231–235.
  11. ^ Cavanagh, James F.; Frank, Michael J.; Allen, John J.B. (April 2010). "Social Stress Reactivity Alters Reward and Punishment Learning". Social Cognitive and Affective Neuroscience. 6 (3): 311–320.

Kegworth Air Crash

I've altered the reference to a 747 as the crash did not involve one. The article on the crash describes the plane as a 737 so I've called it that. If it had four engines it probably wouldn't have crashed. Britmax 16:14, 27 September 2006 (UTC)


I've deleted the Air New Zealand crash at Mount Erebus on Antarctica. This crash is not due to pilot error, as the crew was unknowingly sent to the wrong coordinates. According to the report, the crew did not know the coordinates were wrong, and could not verify them. Also, there was no reason for the crew to doubt the coordinates. 16:41, 18 August 2007 (UTC)


I edited the definition of "pilot error" to highlight the difference between error and noncompliance. I left Garuda Flight 200 as an example of the difference between pilot error and pilot insanity. (The Captain's intentional non-compliance with EGPWS warnings is not an error, so technically this is not a "pilot error" accident.) In some ways similar to the Silk Air 185 crash. B744B763 (talk) 01:36, 10 February 2008 (UTC)

How is "intentional non-compliance with EGPWS warnings" not an error by the pilot? What authority are you using for your definition of "pilot error"? treesmill (talk) 18:05, 21 January 2012 (UTC)[reply]

Unclear

12 Nov 2001, "the co-pilot over applied the rudder pedal, turning the A300 side to side". Does anyone know what this phrase means? RASAM (talk) 19:40, 3 October 2009 (UTC)

British European Airways Flight 548

The Trident crash at Heathrow in 1972 was blamed on pilot error. 81.157.69.65 (talk) 23:55, 5 October 2012 (UTC)

Percentages and trend?

"During 2004 in the United States, pilot error was listed as the primary cause of 78.6% of fatal general aviation accidents, and as the primary cause of 75.5% of general aviation accidents overall.[1] For scheduled air transport, pilot error typically accounts for just over half of worldwide accidents with a known cause.[2]"

I suppose that percentages for one single, relatively recent, year, for one single country (admittedly the most significant) are at least indicative. But I think the article might certainly benefit from a presentation of the data at that second source, in table or graph form, and from some kind of discussion of any apparent trend, by decade. Alternatively, perhaps better or more fine-grained data are available? The current source is "Source: PlaneCrashInfo.com database". Martinevans123 (talk) 19:36, 25 July 2013 (UTC)
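
For what the suggested decade-by-decade tabulation might look like, here is a minimal Python sketch. The three records are invented placeholders, not figures from the PlaneCrashInfo.com database; the point is only the shape of the computation (share of accidents with pilot error as the primary cause, grouped by decade).

    # Sketch of a per-decade tabulation of pilot-error percentages.
    # Records below are invented placeholders, not PlaneCrashInfo.com data.
    from collections import defaultdict

    accidents = [
        {"year": 1962, "primary_cause": "pilot error"},
        {"year": 1968, "primary_cause": "mechanical"},
        {"year": 2004, "primary_cause": "pilot error"},
    ]

    by_decade = defaultdict(lambda: [0, 0])   # decade -> [pilot-error count, total]
    for accident in accidents:
        decade = (accident["year"] // 10) * 10
        by_decade[decade][1] += 1
        if accident["primary_cause"] == "pilot error":
            by_decade[decade][0] += 1

    for decade, (pe, total) in sorted(by_decade.items()):
        print(f"{decade}s: {pe}/{total} accidents ({pe / total:.0%}) pilot error")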

Threat And Error Management (TEM)

I would like to add to the section regarding pilot error and threat management. In particular, I would like to talk about some of the implementations which have been put into place in order to reduce the effects of pilot error, such as CRM and LOSA data collection. — Preceding unsigned comment added by Khosein4 (talk • contribs) 16:09, 3 November 2015 (UTC)

Peer Review: Threat and Error Management

Overall a well sourced and coherent article but there are some areas for improvement.

  • Some sentences can be split into two or more sentences to avoid run-ons
  • Inline citations should be placed one space after the period at the end of the sentence
  • The checklist section should have more references to support the article
  • If you can find studies that show real-world examples of TEM training in use, that would be a great addition to the article — Preceding unsigned comment added by Jkorb (talk • contribs) 21:25, 18 November 2015 (UTC)

The article has great information, and the information is well structured. However, it needs to be proofread to make the sections flow better; some wording could be changed to improve flow. An example is "CRM developing HF skills", which should be explained better. Good examples of instruments. You could move some of the pictures to the left by adding the word "left" after "thumb" in the picture markup. You can add more links, e.g. to the Pilot (aeronautics) page. Your page is well linked and links to a lot of others. A list of aviation accidents involving error would be beneficial to the article. Mmiddei4 (talk) 18 November 2015 (UTC)

@Mmiddei4: Please read WP:MOS. There is no space between the full stop in a sentence and the reference. Please do not advise differently. Fiddle Faddle 18:29, 1 December 2015 (UTC)

Threat and Error Management: Checklists

There needs to be a clearer connection between checklists and their mitigation of fatigue and stress. Jkorb (talk) 18:00, 1 December 2015 (UTC)

External links modified

Hello fellow Wikipedians,

I have just modified one external link on Pilot error. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 18 January 2022).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 20:27, 12 January 2018 (UTC)

How's that again?

An OR inline citation tag was placed in a paragraph in the article today. The intention obviously has merit but in my opinion there's no way such a ref can be produced. Here's the paragraph, with emphasis from me:

"Placing pilot error as a cause of an aviation accident has often been controversial. For example, the NTSB found that the crash of American Airlines Flight 587 the primary cause of the rudder failure was in fact a design flaw in the Airbus A300 aircraft, and the co-pilot's rudder inputs should not have caused the catastrophic rudder failure that resulted in the deaths of 265 people.[original research?] There was no mention of the term "pilot error" in the NTSB report.[1]"

But, that very same NTSB reference cited in the paragraph totally refutes that design flaw allegation. It says:

"Given the aerodynamic loads at the time that the vertical stabilizer separated, it can be determined that the vertical stabilizer’s structural performance was consistent with design specifications and had exceeded certification requirements."

And further down in the ref we see more refutation:

"Thus, on the basis of all of the evidence discussed in this section, the Safety Board concludes that flight 587’s vertical stabilizer performed in a manner that was consistent with its design and certification. The vertical stabilizer fractured from the fuselage in overstress, starting with the right rear lug while the vertical stabilizer was exposed to aerodynamic loads that were about twice the certified limit load design envelope and were ...more than the certified ultimate load design envelope. Because these aerodynamic loads were caused by the first officer’s rudder pedal inputs, the analysis of these rudder pedal inputs is of central importance to this investigation."

Also, the Airbus A300 article, mentioning the accident, has only this to say about cause:

"The vertical stabilizer ripped off the aircraft after the rudder was mishandled during wake turbulence. ".

Moriori (talk) 00:05, 9 December 2018 (UTC)

I have now removed that paragraph, which was truly outrageous and probably libellous as well. Moriori (talk) 23:03, 11 December 2018 (UTC)