I’ve been thinking a lot about complacency since the crash of a Gulfstream IV on takeoff from Hanscom Field in Bedford, Mass., that left seven people dead. Any accident, especially one with fatalities, affects those of us who have spent our entire lives and careers in aviation. But one that occurs at an airport where we regularly work has a particular impact. The accident investigation continues, but the National Transportation Safety Board has already issued a preliminary report that raises troubling questions about whether the pilots conducted a routine flight control check. According to the NTSB, the flight data recorder readout shows no flight control movement that would indicate a check was done before takeoff. The preliminary report makes no mention of such a check recorded on the cockpit voice recorder (CVR) but does state that the CVR readout indicates crew comments regarding aircraft control before the crash. The report thus raises the possibility that no flight control check was done and that a flight control problem may have played a role in causing the accident.
While it is too early to conclude that such a check was, in fact, not done, or that a problem with the flight controls led to the accident, it is not too early to consider the possibility that a highly experienced crew somehow skipped such a routine check, and the implications that holds for the rest of us in aviation. If this crucial step was missed, could it have been because of a complacent attitude toward checklists? Checklists are the foundation of aviation standardization and, through standardization, aviation safety. They are particularly important during taxi and takeoff and during any period of high-stress activity when a pilot’s attention to detail could be diverted. But it is not unheard of for even the most experienced and highly trained among us to grow complacent about the use of written checklists, even when performing safety-critical functions, be they in maintenance, flying or other areas of aviation work.
Counteracting Cultural Complacency
If we don’t take the safety implications of complacency seriously, we won’t be able to guard against it in our own work and in our own organizations. Incident and accident reports across a broad spectrum of aviation are pointing to complacency as a safety concern, and this accident might be one that puts the focus squarely on the issue.
Complacency has always been an issue for workers who perform repetitive, routine tasks, even when those tasks are considered safety critical. The repetitive and regular nature of the work can mask its safety significance and lead to careless errors. This is particularly true when workers begin to rely on their memories to complete checklist items. The pressure to move work along to meet a schedule or performance criteria can cause people to take shortcuts. But even when there is no such pressure, the repetitive nature of the work, with no bad outcomes, can lead to taking those very same shortcuts. It has happened to me and, I suspect, to all of you. It can seem like such a chore to pull out a written checklist or list of procedures and use it step-by-step. This can be especially true in the maintenance environment, when you’re using both hands to do the work, the lighting is poor or you’re in an awkward position.
I am also concerned that aviation’s remarkable safety record can contribute to a complacent attitude, especially when it comes to basic, routine tasks performed by employees at every level of an organization. How do you emphasize the importance of attention to detail when, day after day, nothing bad happens from a lack of attention? It is difficult enough to guard against individual complacency, but it is sometimes even more difficult when the culture of an organization becomes complacent about routine compliance with safety requirements. If, for example, workers don’t routinely use checklists or other written procedures, and management doesn’t enforce their use, you can see how a culture of complacency would set in. The norm at that workplace becomes the de facto procedure, not the one specified in the manual or the regulation. This casual attitude toward strict adherence to procedures can exist even in workplaces staffed by highly experienced and trained professionals.
There is also a more insidious type of complacency that can permeate an organization, and that is complacency about routine or even flagrant misconduct. I was recently reminded of that in reading the disturbing internal report prepared for General Motors on the now infamous faulty ignition switch that led to at least 13 deaths, record fines against GM, Congressional hearings, massive recalls and surely more litigation. The report’s findings on the corporate culture that existed at GM, unfortunately, are not unique to this one automobile company. A copy of the internal report can be found on the New York Times website (www.nytimes.com/interactive/2014/06/05/business/06gm-report-doc.html). It’s a culture many of us in the accident investigation business have seen in the aftermath of airplane crashes: a culture of indifference to misconduct and lack of accountability. Yes, we like to think we’re better than that in aviation. But there are rogue operators, and even good companies can go astray. The example of General Motors is one we need to ponder long and hard.
The report identified four problem areas in GM’s corporate culture. The first was resistance or reluctance to raise safety issues, at times out of concern that doing so would delay a new product launch or, for some, out of fear of retaliation. The second was a proliferation of committees and a lack of accountability, symbolized by the “GM salute” (crossing one’s arms and pointing outward, indicating that responsibility belongs to someone else) and the “GM nod” (everyone leaving a meeting nodding in agreement but with no intention of following through). The third was a failure to gather or share knowledge and information, which led to disastrous consequences in the case of the faulty ignition switch. The fourth was the view that no action should be taken until the “root cause” was fully understood and a solution developed, which led to long delays in acting on the faulty switch.
I have seen this type of dysfunctional culture revealed in a number of organizations in the aftermath of accidents. But I have also seen it at the FAA and at other government agencies, where I had a ringside seat for many years on various rulemaking committees and later as an NTSB member. Corporate and government leaders would do well to study the GM report and ask whether their own corporate cultures could withstand such scrutiny. And, if not, do something about it before it’s too late.