On November 12, 2001, American Airlines Flight 587, an Airbus A300-605R, crashed while departing JFK for the Dominican Republic. After takeoff, the widebody jet twice encountered wake turbulence from a preceding Japan Airlines 747. The first officer of the American Airlines flight reacted to the initial encounter by moving the control wheel rapidly left and right. When the Airbus hit the second wake, it was in a 23-degree bank, and over the following 6.5 seconds the first officer made five alternating full rudder pedal inputs, at which point the vertical stabilizer separated from the airplane. All 260 people aboard were killed, along with 5 more on the ground.
The National Transportation Safety Board found that American Airlines’ pilot training program may have “reinforced the first officer’s tendency to respond aggressively to wake turbulence, encouraged the use of full rudder pedal inputs, and misrepresented the airplane’s actual response to large rudder inputs.” The report went on to refer to “a simulator exercise that provided unrealistic portrayals of an airplane response.”
In essence, a significant factor in the crash may have been pilot training based on a simulator that was improperly programmed. But think about it: how much data do we truly have on jumbo jets, fully loaded ones at that, flying at the far edge (and beyond) of their performance envelopes? Virtually none. No one is taking these big babies up and crashing them the way we crash cars filled with test dummies into concrete walls (cue the slow-motion film, please!).
The truth is that we do NOT know how a jumbo airliner reacts under ultimate stress. So when we program the simulators to train pilots on handling arcane maneuvers, we are GUESSING! If it makes you feel any better, call it a scientific projection. But no matter what you call it, until you've been there a few times and come back with good data, it really is just a professional guess.
And that’s okay. It really is often the best we can do. But we must remember not to put too much faith in a projection just because we have dressed it up with a lot of technology and surrounded it with Ph.D.s speaking scientific-sounding jargon.
Much of the sub-prime financial crisis we’re experiencing has been caused by unwarranted confidence in “complex financial models” that were based on insufficient data or on bold extrapolations of the past into a materially different future. In part it was a self-created disaster: one thing the past (and thus the model) was missing was data on a recession in which a significant percentage of recent home loans were sub-prime (in 2005, 2006, and 2007 almost 20% of loans were sub-prime, versus 6% to 8% in prior years).
The model gave investors false assurances, which led them to create a future very different from the past data the model was based on. As a result, the model’s predictions became more and more suspect as actual conditions diverged from that past data.
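To make the extrapolation trap concrete, here is a toy sketch in Python (the numbers, the linear fit, and the loss figures are illustrative assumptions of mine, not anything from the article or the actual models): a model fitted only to benign, low-sub-prime years will happily produce a prediction for a 20%-sub-prime, recession-era portfolio, but that prediction rests on no supporting data at all.

```python
import numpy as np

# Made-up "history": share of sub-prime loans vs. observed portfolio
# loss rate, all drawn from benign, non-recession years.
subprime_share = np.array([0.06, 0.07, 0.07, 0.08, 0.08])   # 6%-8% range
loss_rate      = np.array([0.010, 0.011, 0.012, 0.013, 0.013])

# Fit a simple linear model to the only data we have.
slope, intercept = np.polyfit(subprime_share, loss_rate, 1)

def predicted_loss(share):
    """Extrapolate the fitted line to any sub-prime share."""
    return slope * share + intercept

# 2006-style conditions: roughly 20% sub-prime, heading into a recession.
print(f"Predicted loss at 20% sub-prime: {predicted_loss(0.20):.1%}")
# The line extrapolates without complaint, but nothing in the historical
# sample says how sub-prime loans behave in a recession, so the output is
# a guess dressed up as a forecast -- exactly the trap described above.
```

The point of the sketch is not the particular numbers; it is that the model reports a tidy figure far outside the region where it was ever validated, and nothing in the code (or in the real models) flags that fact.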
Back to our American Airlines crash: the turbulence, while no doubt uncomfortable and perhaps even frightening to the passengers, appeared to be well within the tolerance of the aircraft. The first officer, however, secure in his simulator-induced belief that it was safe to do so, responded aggressively, leading to structural failure.
A heightened awareness of the limits of our knowledge at the outer edge of a model’s predictive ability might have led to a milder response and saved the lives of 265 people. A little less blind faith, a little more intelligent skepticism, a little more willingness to question accepted dogma.
As the bumper sticker says: “Question Authority.” Or as any sports coach will tell you, it’s dangerous to start believing your own press clippings.
(For the source of the facts and some virtually verbatim language, I’m indebted to the article “Unintended Trouble,” on page 34 of the December 2007 issue of Aviation Week’s “Business and Commercial Aviation” magazine.)