Editor's note: Robert Mark is a commercial pilot, flight instructor and writer. He also publishes the industry blog Jetwhine.com. He is the author of the forthcoming book "Loss of Control," which probes the issue of pilot confusion with aircraft automation.
(CNN) -- In an op-ed on CNN.com, Robert Goyer suggested that it is easy to accidentally land an airplane at the wrong airport. And he is correct. The problem is that it shouldn't be, especially not with the profusion of technology at a pilot's fingertips designed to help prevent just these kinds of mistakes.
No doubt Southwest's landing at the wrong Missouri airport this week will be categorized as pilot error, as it was when a 747 crew landed at the wrong airport in Kansas in November. But labeling this all as an easy mistake or as pilot error is too simplistic.
In doing so, we risk ignoring a growing threat to this nation's and the world's aviation safety: a widening disconnect between the technology created by our smartest engineers and technicians and the pilots who use it. Indeed, some pilots ignore the technology, as Goyer and others have speculated could have been the case in the Southwest landing, while others are overwhelmed when the technology unexpectedly fails, as it did aboard the doomed Air France Flight 447 that crashed into the ocean in 2009.
America is good at fixing problems, though. When an airliner flying in the clouds on approach to Washington struck the top of a hill after descending too low, U.S. industry developed a fix: the ground proximity warning system, a talking cockpit box that alerts pilots to approaching hazards they cannot easily see.
But in 2012, a Russian crew demonstrating a new jet on a flight around Indonesia received a terrain warning -- an aural cockpit signal that the airplane was too close to the ground -- and ignored it, believing it was a computer error. All 44 people aboard died when the airplane struck the side of a mountain. The computer's warning had been real.
In February 2009, the captain of a Continental Connection turboprop became confused when the autopilot of his airplane turned itself off as the airplane slowed on the approach to Buffalo, New York. The captain was so startled by the shutdown that he made a fatal flying error. He, too, believed the computer's messages were a mistake. Forty-nine people in that airplane died because he was wrong.
All these aircraft were equipped with the latest technologies available to make flight as safe as humanly possible. And yet each time, the crew managed to figure out a way not to heed the warnings.
The Southwest and the Atlas Air pilots -- and their passengers, of course -- were just lucky no one was hurt. But what about the next time? The Southwest crew averted disaster by only a few hundred feet, narrowly missing a drop-off at the end of the runway that would surely have broken the airplane into many pieces.
All these problems point to the newest threat: pilots' confusion about what their computers are telling them, when they look at them at all. This speaks to complacency to some extent. It's also well known that humans don't handle monitoring duties very well for very long. We grow bored rather easily.
But even labeling this a "human factors" problem is too easy. Like everything in our society these days, this complacency, this disconnect between operator and computer, is not a simple black-and-white problem that we can fix with another electronic box or an enthusiastic chat from the boss.
There's another overriding problem preventing us from digging deeply enough into the implications: The airline industry has become a victim of its own success, with an impressive safety record. Before last year's crashes in San Francisco and Birmingham, Alabama, there had not been a single passenger fatality aboard a U.S. airline between 2010 and 2012.
Because our record has been so good, many people inside and outside the industry, as well as legislators, regulators and certainly airline passengers, may mistakenly believe that we've solved the aviation safety problem.
But just as hospital administrators would never tell patients that losing a few people now and again to infection is an acceptable loss, we can't ignore the instances when something has come between our professional pilots and the technologies created to help save us from ourselves. And even we in the industry are only now coming to believe this threat is real.
A recent study delivered to the FAA about automation confusion highlights some of these problems but offers no timeline for solving them.
Passengers also need to advocate for their own safety by writing to their legislators, airline CEOs and regulators, demanding that the automation confusion issue be put on the front burner now.
Just as cockpit computers warn our pilots, the threats to aviation safety are sending us warning messages. But right now, we don't seem to be listening.