Friday, November 12, 2010

Interfaces Between Flightcrews and Modern Flight Decks Should Be Standardized

Cockpit automation interface features should be standardized, as appropriate. Candidates include:

  • The location, shape, and direction of movement for takeoff/go-around and autothrottle quick disconnect switches;
  • Autoflight system mode selectors and selector panel layout;
  • Autoflight system modes, display symbology, and nomenclature; and
  • Flight management system interfaces, data entry conventions, and nomenclature.
How can automation mask situations that may develop into problems?


Have your crews experienced automation or flight path surprises, or mode confusion?

Can you identify any generic issues that affect crew qualification (e.g., training, checking, or recency of experience) that may need to be addressed industry-wide for all glass cockpit aircraft, or for a particular type of glass cockpit airplane?

Insufficient knowledge and skills. Designers, pilots, operators, regulators, and researchers do not always possess adequate knowledge and skills in certain areas related to human performance. It is of great concern to this team that investments in necessary levels of human expertise are being reduced in response to economic pressures when two-thirds to three-quarters of all accidents have flightcrew error cited as a major factor.

The aviation industry should update or develop new standards and evaluation criteria for information presented to the flightcrew by flight deck displays and aural advisories (e.g., primary flight displays, navigation/communication displays, synoptics showing system states).

  • Flightcrew training investments should be re-balanced to ensure appropriate coverage of automation issues.
  • Regulatory authorities should evaluate flight deck designs for human performance problems.
  • Flightcrew workload is the major human performance consideration in existing Part 25 regulations; other factors should be evaluated as well, including the potential for designs to induce human error and reduce flightcrew situation awareness.
This recommendation should apply to all accident/incident investigations involving human error, regardless of whether the error is associated with a pilot, mechanic, air traffic controller, dispatcher, or other participant in the aviation system.


Two problematic design practices are the use of multi-function knobs for flight-critical functions and the use of different autoflight controls that have a similar shape, feel, location, and display (e.g., speed and heading control knobs).
These design features are contrary to the principles of minimizing the potential for flightcrew error and providing error tolerance. These features make it too easy for a busy flightcrew member to make an error and not realize it until the airplane’s behavior becomes sufficiently different from what the flightcrew expects. For example, it is believed by some that the similarity between the display representations of flight path angle and vertical speed played a major role in the Air Inter Airbus A320 accident at Strasbourg, France in 1992, and in several similar incidents.
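Analyses of the Strasbourg accident frequently describe a unit ambiguity: the crew reportedly intended a 3.3 degree flight path angle, but with the selector in vertical speed mode the same two-digit entry commanded a descent of roughly 3,300 feet per minute. The sketch below shows how one entry maps to two very different commands depending on the active mode; it is purely illustrative, and the function and mode names are hypothetical rather than Airbus logic.

```python
# Illustrative only: the same two-digit entry on a shared vertical-mode
# selector yields radically different commands depending on the active
# mode. Mode names and scalings here are hypothetical, chosen to mirror
# the flight-path-angle vs. vertical-speed ambiguity described above.

def interpret_vertical_entry(dial_value: int, mode: str) -> str:
    """Interpret a shared-knob entry under two vertical modes."""
    if mode == "FPA":   # flight path angle: entry read as tenths of a degree
        return f"descend on a {dial_value / 10:.1f} degree path"
    if mode == "VS":    # vertical speed: entry read as hundreds of ft/min
        return f"descend at {dial_value * 100:,} ft/min"
    raise ValueError(f"unknown vertical mode: {mode}")

for mode in ("FPA", "VS"):
    print(mode, "->", interpret_vertical_entry(33, mode))
# FPA -> descend on a 3.3 degree path   (gentle descent)
# VS -> descend at 3,300 ft/min         (very steep descent)
```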

Warning and Alerting Schemes

A multitude of warnings and alerts exist in the cockpits of many modern transport category airplanes to notify the flightcrew of potentially hazardous situations. A variety of methods are employed to take advantage of most of the human senses to get the flightcrew’s attention, including voice, horns, klaxons, chimes, bells, cavalry charges, buzzers, wailers, clackers, alphanumeric messages, blinking lights, flashing displays, stick shakers, different colors, etc. Many of these warnings have been mandated as a result of safety issues brought to light by specific incidents or accidents.

The FAA should encourage appropriate standardization of automation interface features by supporting recently initiated efforts in industry technical committees and exploring incentives for standardization (and possibly disincentives for inappropriate differentiation) that would lead or assist in the development of guidelines and standards. These guidelines and standards should also address the use of multi-function controls and differentiation of controls by location, shape, and feel.

Standardization is not intended to substitute for human-centered design, but implemented correctly, it can reduce the potential for flightcrew error. It can also reduce the training burden for transitioning flightcrews and improve the reliability of proper human response, particularly when reacting instinctively in critical situations. One potential pitfall of standardization that should be avoided is to standardize on the lowest common denominator (e.g., disabling the autobrakes on airplanes that have this feature because it is not included on all airplane types). Another potential pitfall is that inappropriate standardization, rigidly applied, can be a barrier to innovation, product improvement, and product differentiation. In implementing this recommendation, these potential pitfalls should be recognized and avoided. It may be appropriate to interpret this recommendation as a request for consistency, rather than rigid standardization.

Inconsistencies in Input Data Formats

The same geographic position (50°00’00.00”N, 030°00’00.00”W) is expressed differently by nearly every source a flightcrew may encounter:

• FAA source data (FAA Order 8260.19/7100.11): 50 00 00.00/030 00 00.00 (deg-min-sec; hemisphere letters are appended only if not N or W, e.g., 50 00 00.00E/030 00 00.00S)
• FAA to National Flight Data Center (FAA Order 7900.2): 50°00’00.00”N 30°00’00.00”W
• U.S. Government Flight Information Publication Supplement: N50°00.00’W030°00.00’
• Aeronautical charts, Jeppesen: N50 00.0 W030 00.0 or N50 00.00 W030 00.0
• Aeronautical charts, NOAA/DoD: N50° 00.0’ W030° 00.0’
• NOAA ARP (Airport Reference Point): 50°00’N-30°00’W
• ARINC 424 specification: N50 00 0000 W030 00 0000
• Full FMS entry conventions: N5000.0W03000.0 or N50W030
• Abbreviated FMS entry format (waypoint format for 5-character unnamed reporting points): 5030N
• ARINC Communications Addressing and Reporting System (ACARS) entry format: 50N030W
• Flight deck communications (position report): N50 W030
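As an illustration of how little surface syntax these conventions share, the sketch below normalizes two of the listed formats to signed decimal degrees. It is a simplification for illustration: the regular expressions and function names are my own, not part of any avionics standard.

```python
import re

def parse_full_fms(entry: str) -> tuple[float, float]:
    """Parse the full FMS convention, e.g. 'N5000.0W03000.0'."""
    m = re.fullmatch(r"([NS])(\d{2})(\d{2}\.\d)([EW])(\d{3})(\d{2}\.\d)", entry)
    if not m:
        raise ValueError(f"not a full FMS entry: {entry}")
    lat = int(m.group(2)) + float(m.group(3)) / 60
    lon = int(m.group(5)) + float(m.group(6)) / 60
    return (-lat if m.group(1) == "S" else lat,
            -lon if m.group(4) == "W" else lon)

def parse_acars(entry: str) -> tuple[float, float]:
    """Parse the whole-degree ACARS-style convention, e.g. '50N030W'."""
    m = re.fullmatch(r"(\d{2})([NS])(\d{3})([EW])", entry)
    if not m:
        raise ValueError(f"not an ACARS entry: {entry}")
    lat = int(m.group(1)) * (1 if m.group(2) == "N" else -1)
    lon = int(m.group(3)) * (1 if m.group(4) == "E" else -1)
    return (lat, lon)

# Both strings denote the same point yet share almost no surface syntax:
print(parse_full_fms("N5000.0W03000.0"))  # (50.0, -30.0)
print(parse_acars("50N030W"))             # (50, -30)
```

Anyone moving data between these systems must perform exactly this kind of conversion, and every manual transcription is an opportunity for error.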
Although much progress has been made in integrating, prioritizing, and, when appropriate, inhibiting unnecessary alerts, the HF Team is concerned both that the number and complexity of warnings and alerts have grown too large and that existing warnings and alerts may not always be integrated into a consistent scheme. Multiple warnings and alerts may also mutually interfere or may interfere with flightcrew communication at critical times. Contributing to this problem are FAA regulatory standards that mandate the means by which a specific warning or alert must be implemented, regardless of whether it fits the warning or alerting philosophy adopted by the manufacturer. Examples of mandated warning systems that require distinctively different warnings include landing gear, takeoff configuration, overspeed, stall, Traffic Alert and Collision Avoidance System (TCAS), GPWS, and the predictive and reactive windshear alerting systems.


The more unique warnings there are, the more difficult it is for the flightcrew to remember what each one signifies. The result can be a confused and distracted flightcrew precisely at the time when prompt action may be necessary. Inappropriate use of color, sound, etc. may also cause confusion, as may several warnings and alerts going off in unison and perhaps conflicting with one another (e.g., the flightcrew of the Birgenair Boeing 757 that crashed into the sea shortly after takeoff from Puerto Plata, Dominican Republic may have been confused by conflicting stall and overspeed warnings coupled with erroneous airspeed information). Increasing levels of automation coupled with the evolving operational environment (e.g., data link, the Future Air Navigation System, free flight) and new safety systems (e.g., predictive windshear and enhanced GPWS) make it more critical than ever that advisories, alerts, warnings, and status information be properly integrated.
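A minimal sketch of such an integrated scheme follows. It is my own illustration, not any certificated alerting design: each alert carries a priority class, only the highest-priority active alert is granted the aural channel, and lower-priority aurals are inhibited rather than allowed to sound in unison and conflict. Classing overspeed below stall here is purely illustrative.

```python
from dataclasses import dataclass, field
from enum import IntEnum

class Priority(IntEnum):
    ADVISORY = 1
    CAUTION = 2
    WARNING = 3   # time-critical, e.g. stall or windshear

@dataclass(order=True)
class Alert:
    priority: Priority
    name: str = field(compare=False)

class AlertManager:
    """Grants the aural channel to one alert at a time, by priority."""

    def __init__(self) -> None:
        self.active: list[Alert] = []

    def raise_alert(self, alert: Alert) -> None:
        self.active.append(alert)

    def clear(self, name: str) -> None:
        self.active = [a for a in self.active if a.name != name]

    def aural_now(self) -> str | None:
        # Only the single highest-priority active alert may sound.
        return max(self.active).name if self.active else None

mgr = AlertManager()
mgr.raise_alert(Alert(Priority.CAUTION, "OVERSPEED"))
mgr.raise_alert(Alert(Priority.WARNING, "STALL"))
print(mgr.aural_now())  # STALL -- the overspeed aural is inhibited
```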

Feedback Needs

Empirical research, incidents, and accidents suggest that flightcrews tend to detect unexpected automation behavior in these highly automated airplanes from observations of unanticipated airplane behavior (e.g., speed or flight path deviations or unexpected movement of a control) rather than from displays containing information on automation status/configuration.9 Since the information needed by the flightcrew to detect the undesired automation behavior is already available on cockpit displays, this observation suggests that current feedback mechanisms may be inadequate to support timely error detection.

Several incidents and accidents point to other vulnerabilities that are associated with the autoflight system masking system failures or other causes of in-flight upsets. These vulnerabilities result when the autoflight system initially masks the in-flight upset, then suddenly disengages or is unable to maintain control when it runs out of control authority. Because of the masking effect of the autopilot, these situations may not be adequately annunciated to the flightcrew.

9 Sarter, Nadine B. and David D. Woods. ‘How in the world did we ever get into that mode?’ Mode Error and Awareness in Supervisory Control. Human Factors, 37(1), 5-19, 1995.

Examples that illustrate these vulnerabilities include:

• A China Airlines Boeing 747 in 1985 lost power on one engine during cruise in autoflight. The captain was unaware of the engine failure, in part because the autopilot compensated for the resulting yaw until control limits were reached. Upon disengagement of the autopilot, the resulting transient caused a rapid roll and steep dive angle. The captain was able to successfully regain control of the airplane.

• An American Eagle Aerospatiale ATR-72 crashed near Roselawn, Indiana in 1994 after a severe icing encounter. The autopilot disconnected shortly after the ailerons deflected, initiating an abrupt roll to the right that the flightcrew was unable to arrest.

• A number of high altitude upset incidents have occurred on the Airbus A300-600 in which FMS performance data indicated an altitude capability very near the buffet limit. When turbulence was encountered, the autopilot would disconnect, leaving the flightcrew with an airplane out of trim, near buffet, and with marginal stability. Serious turbulence or flight control-induced “airplane-pilot coupling” incidents have also been encountered on the Douglas MD-11, involving a fatality in one instance. These incidents appear to be exacerbated by high altitude stability characteristics, flightcrew unfamiliarity with these characteristics, and autopilot interactions.

The type of feedback provided to the flightcrew is changing with the evolving technology in both the flight deck interface and the flight control systems. In many areas, tactile feedback is being replaced by visual annunciations. Although the same information may be present, its form has changed. One particular example of this change is illustrated by the use of non-moving autothrottles in Airbus A320/A330/A340 airplanes. In these airplanes the thrust levers do not move in response to changes in thrust commanded by the autothrust system. The tactile cues present in other airplanes (which Airbus suggests may be misleading because the thrust lever position is only an indication of the commanded thrust level) are replaced by additional visual cues (e.g., flight mode annunciations, a speed trend symbol on the PFD, and enhanced presentation of engine parameters) augmented by envelope protection features and aural alerts (on some airplanes) for low energy state.10
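The sketch below abstracts this feedback trade; the names, threshold, and cue wording are hypothetical rather than Airbus logic. When lever position no longer tracks commanded thrust, the display must carry the information that lever motion used to convey tactually.

```python
from dataclasses import dataclass

@dataclass
class ThrustState:
    lever_position_pct: float  # where the pilot physically set the lever
    commanded_n1_pct: float    # what the autothrust system is commanding
    actual_n1_pct: float       # what the engine is currently delivering

def annunciate(state: ThrustState) -> list[str]:
    """Visual cues substituting for the absent lever motion."""
    cues = [f"commanded N1 {state.commanded_n1_pct:.0f}%",
            f"actual N1 {state.actual_n1_pct:.0f}%"]
    # The divergence a moving lever would have shown tactually:
    if abs(state.commanded_n1_pct - state.lever_position_pct) > 5:
        cues.append("autothrust active: lever position != commanded thrust")
    return cues

# Autothrust has pulled commanded thrust well below the lever setting:
for cue in annunciate(ThrustState(85.0, 62.0, 64.0)):
    print(cue)
```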

Another example of a change in the type of feedback provided in the A320/A330/A340 airplanes is the use of uncoupled sidesticks, which do not provide direct tactile feedback of a pilot’s control stick inputs to the other pilot, nor feedback as to the position or movement of the flight control surfaces. Because the uncoupled sidesticks make it more difficult for flightcrews to discern the other pilot’s inputs (and there have been cases of inadvertent conflicting flightcrew inputs), there are additional flightcrew coordination issues to address. It is difficult to determine whether the changes in the type of feedback associated with the non-moving autothrottles and uncoupled sidesticks meet (or fail to meet) the pilot’s needs, however, because of a lack of understanding and consensus about precisely what type and amount of feedback are necessary.
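With uncoupled sidesticks, the flight control computer is the only place where simultaneous inputs meet, so it must both combine them and announce a conflict that coupled controls would have made tactually obvious. The sketch below illustrates that idea only; the summing rule, threshold, and annunciation are hypothetical, not Airbus’s implementation.

```python
MAX_DEFLECTION = 1.0  # normalized full-scale deflection of a single stick

def combine_sidesticks(capt: float, fo: float) -> tuple[float, bool]:
    """Sum two sidestick inputs, clip to one stick's authority,
    and flag a dual-input condition for annunciation."""
    dual_input = abs(capt) > 0.05 and abs(fo) > 0.05  # both sticks active
    command = max(-MAX_DEFLECTION, min(MAX_DEFLECTION, capt + fo))
    return command, dual_input

cmd, dual = combine_sidesticks(capt=0.6, fo=-0.5)
print(round(cmd, 2))     # 0.1 -- the two pilots nearly cancel each other out
if dual:
    print("DUAL INPUT")  # aural/visual cue replaces absent tactile feedback
```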

10 For a discussion of the potential benefits and disadvantages of non-moving autothrottles, refer to SAE Technical Paper Series, number 912225, British Airways Airbus A320 Pilots’ Autothrust Survey, by Steve Last and Martin Alder; and National Aerospace Laboratory of the Netherlands, NLR TP 94005, Pilot Performance in Automated Cockpits: A Comparison of Moving and Non-Moving Thrust Levers, by H.H. Folkerts and P.G.A.M. Jorna.

Hazardous States of Awareness


Inattention, or decreased vigilance, is often cited in ASRS reports, and has been a contributor to operational errors, incidents, and accidents. Decreased vigilance manifests itself in several ways, which can be referred to as hazardous states of awareness. These states include:

• Absorption. Absorption is a state of being so focused on a specific task that other tasks are disregarded. Programming the FMS to the exclusion of other tasks, such as monitoring other instruments, would be an example of absorption. The potential for absorption is one reason why some operators discourage their flightcrews from programming the FMS during certain flight phases or conditions (e.g., altitudes below 10,000 feet).

• Fixation. Fixation is a state of being locked onto one task or one view of a situation even as evidence accumulates that attention is necessary elsewhere or that the particular view is incorrect. The “tunneling” that can occur during stressful situations is an example of fixation. For example, a pilot may be convinced that a high, unstabilized approach to landing is salvageable even when other flightcrew members, air traffic control, and cockpit instruments strongly suggest that the approach cannot be completed within acceptable parameters. The fixated pilot will typically be unaware of these other inputs and appear to be unresponsive until the fixation is broken. Fixation is difficult to self-diagnose, but it may be recognizable in someone else.

• Preoccupation. Preoccupation is a state where one’s attention is elsewhere (e.g., daydreaming).

Decreased vigilance can be caused or fostered by a number of factors, including:

• Fatigue. Fatigue has been the subject of extensive research and is well recognized as a cause of decreased vigilance.

• Underload. Underload is increasingly being recognized as a concern. Sustained attention is difficult to maintain when workload is very low.

• Complacency. Automated systems have become very reliable and perform most tasks extremely well. As a result, flightcrews increasingly rely on the automation. Although high system reliability is desired, this high reliability affects flightcrew monitoring strategies in a potentially troublesome way. When a failure occurs or when the automation behavior violates expectations, the flightcrew may miss the failure, misunderstand the situation, or take longer to assess the information and respond appropriately. In other words, over-reliance on automation can breed complacency, which hampers the flightcrew’s ability to recognize a failure or unexpected automation behavior.
The aviation industry should develop and implement a plan to transition to standardized instrument approaches using lateral navigation (LNAV) and vertical navigation (VNAV) path guidance for three-dimensional approaches. The use of approaches that lack vertical path guidance should be minimized and eventually eliminated.
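The geometry behind such vertical path guidance is simple, which is part of the argument for standardizing on it. The sketch below computes the continuous altitude target and deviation cue that a fixed-angle path provides, in contrast with step-down fixes flown “dive and drive”; the elevation, angle, and names are illustrative, not taken from any published procedure.

```python
import math

GLIDE_PATH_DEG = 3.0       # typical final approach path angle
THRESHOLD_ELEV_FT = 1200   # hypothetical runway threshold elevation
FT_PER_NM = 6076.12

def path_altitude_ft(distance_nm: float) -> float:
    """Target altitude on the fixed-angle path at a distance from threshold."""
    rise = distance_nm * FT_PER_NM * math.tan(math.radians(GLIDE_PATH_DEG))
    return THRESHOLD_ELEV_FT + rise

def vertical_deviation_ft(actual_alt_ft: float, distance_nm: float) -> float:
    """Continuous deviation cue: positive = above path, negative = below."""
    return actual_alt_ft - path_altitude_ft(distance_nm)

# At 5 NM the 3-degree path sits roughly 1,590 ft above the threshold:
print(round(path_altitude_ft(5.0)))             # ~2792
print(round(vertical_deviation_ft(2600, 5.0)))  # ~-192 (below path)
```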
 