Neither Lieutenant Nathan Poloski’s body nor his F/A-18 Hornet was ever found in waters almost three miles deep. All that was located in the Western Pacific after his fighter jet collided with another from the same aircraft carrier was his helmet and some pieces of debris. The pilot of the other jet ejected safely and was rescued shortly after.
The Navy accident report, all of eight pages long, was obtained by the New York Times under a Freedom of Information Act request.1 What exactly caused the accident, the report suggests, remains a mystery. It was a clear afternoon with good visibility. Both pilots were healthy, properly rested and under no unusual stress. There were no mechanical problems with either aircraft.
And so what is left, you wonder?
One of the first extensive published research studies into “human error,” in 1947, put the label in quotation marks.2 Paul Fitts and Richard Jones, building on pioneering wartime work by engineering psychologists such as Alphonse Chapanis, wanted to better understand how features of pilots’ tools and tasks influenced the kinds of errors they made. Using recorded interviews and written reports, they built up a corpus of accounts of ‘pilot errors.’ They found that these ‘errors’ came from somewhere: they were assessments and actions that made sense at the time. That was 1947.
Would the Navy’s current top aviator, Vice Admiral Mike Shoemaker, himself an F/A-18 pilot, have read Fitts and Jones? The two pilots involved in the midair, he opined in reflections on the report when closing the investigation on April 20th this year, should have exercised more of what his military calls “situational awareness, or S.A.” In this case, it would have meant not relying only on cockpit instruments but looking outside “to spot a looming catastrophe.”
Perhaps the Vice Admiral did read Fitts and Jones. Practically all Army Air Force pilots, Fitts and Jones had found, regardless of experience and skill, reported that they sometimes made ‘errors.’ While the eight-page report had originally only admonished (the dead) Poloski for losing S.A., the Vice Admiral broadened it to both pilots. He didn’t call what they did (or did not do) “pilot error”; he did call it a lack of situational awareness.
Is it 1947 yet?
Fitts and Jones called their paper Analysis of factors contributing to 460 “pilot-error” experiences in operating aircraft controls. Again, “pilot error” was in quotation marks, in the very title of the paper—denoting the researchers’ suspicion of the term. And it didn’t stop there: they even used the prefix “so-called.” This is how their paper opened: “It should be possible to eliminate a large proportion of so-called ‘pilot-error’ accidents by designing equipment in accordance with human requirements.” How much clearer could they have been? And it wasn’t just equipment. Subsequent human factors research extended the study of context to operational and organizational factors.
On September 12th, 2014, Poloski had been on a practice bombing mission. With 221 hours on the Hornet, he was less experienced than the pilot of the other plane, a Navy commander. At 7,000 feet, Poloski turned west and slowed to about 300 miles an hour. Poloski’s jet caught up with the other plane and struck its lower left rear.
Did anyone ask sufficiently probing questions about the context in which his actions made sense? Poloski had told his mother shortly before deployment that he was looking forward to the mission. He was not about to go die in some accident. It turned out that he was not aware that the other pilot had chosen the same route. And that controllers on the carrier were occupied with landing aircraft. But “while there is no definitive evidence to suggest either pilot’s S.A. or lack thereof directly contributed to this incident, greater S.A. by all parties may have prevented the collision,” the Vice Admiral concluded.
Fitts and Jones did not call the episodes they studied “failures,” or talk about them in reference to any slippage from some implicit norm or standard (like “greater S.A.” Sure, greater than what?). Instead, Fitts and Jones used the neutral term “experiences” in the write-up of their research results.
Is it 1947 yet?
The point, for Fitts and Jones, was not the “pilot error.” That was just the symptom of trouble, not its cause. It was just the starting point. The remedy did not lie in telling pilots not to make errors (or telling them not to lose S.A.). Rather, Fitts and Jones argued, we should change the tools and fix the operational and organizational environment in which we make people work; by that we can eliminate the errors of the people who deal with those tools. Skill and experience, after all, had little influence on “error” rates: getting people trained better or disciplined better would not have much impact. Change the environment, rather, and you change the behavior that goes on inside it. In the Navy’s report, by contrast, the focus is firmly (to speak with Don Norman) on what might have been (or not been) in the heads of the pilots, rather than in their world. The focus was on the pilots, not on the context.
Is it 1947 yet?
Is this an instance of the U.S. military unlearning or disinheriting the key things that Fitts and Jones taught it more than half a century ago? Today’s investigations, media and others frequently come up with new labels for ‘human error’ (“lack of situational awareness”). And they stop when they have satisfied their desire to blame the frontline operator for their “failures” to do this, that, or the other thing, or for lacking something that, in hindsight, seems so obvious. The “catastrophe,” after all, was “looming.” All you needed to do was look up.
Is it 1947 yet? Then let’s start putting “human error,” by whatever name, in quotation marks. Because it is merely an attribution after the fact.
Is it 1947 yet? Then understand that this attribution is the starting point, not the conclusion of an investigation.
Is it 1947 yet? Then, like Fitts and Jones, let’s not use the word “failure” or “lack,” but rather something like “experience” to describe the episode where things went wrong.
Based on the helmet investigators found—which had a big crack in it, extending from the bottom right side up to the crown, with a hole halfway up—they concluded that Poloski must have suffered massive, fatal head trauma. We may surely hope it was swift. But to then admonish, for a “lack of S.A.,” someone whose perspective we never shared because we weren’t there, and with whom we’ll never be able to talk again?
Is it 1947 yet?
We could all learn a lot from the insights of Fitts and Jones, from their understanding, their open-mindedness and their moral maturity. Here are two suggestions for Vice Admiral Shoemaker, and for the many others who might feel tempted to blame the dead. First, read Fitts and Jones, 1947. And then don’t write in a report what you wouldn’t say to the face of a mother who has just lost her 26-year-old son.
- Schmitt E. Navy pilot’s death reflects hazards of job. International New York Times 2015 May 13;6.
- Fitts PM, Jones RE. Analysis of factors contributing to 460 “pilot-error” experiences in operating aircraft controls. Dayton, OH: Aero Medical Laboratory, Air Materiel Command, Wright-Patterson Air Force Base; 1947.