My notes and other stuff


Paper: Repentance as Rebuke: Betrayal and Moral Injury in Safety Engineering

Major nerdsnipe for the week: Woods and Dekker got together and wrote a paper titled Repentance as Rebuke: Betrayal and Moral Injury in Safety Engineering, which looks at the aftermath of the 737 MAX air disasters and the patterns of moral injury and repentance observed across the industry and its observers. It's got quite an interesting structure and I'm not too sure how to summarize it, so here goes nothing.

A base assumption here is that if an engineer needs 'moral courage' to do their job, then there are already moral and systemic failures in the system that can't be fixed with front-line moral heroism alone. What the authors do is take one specific FAA safety engineer's message of repentance and rebuke, and explore it by contrasting it with concepts of moral injury, hubris and humility, decay and disaster, and suggested remedies, offering a framing for the engineer's rebuke in the process.

If you want to read the original engineer's text, it's available here, but the TL;DR is:

Early in 2021, a Federal Aviation Administration (FAA) safety engineer based in Seattle publicly denounced the federal regulator’s role in approving the Boeing 737MAX [...] What drove him to speak out while still employed was his renewed faith, deeply intertwined with an active repentance for his role in failing to prevent the disasters. [...] In a detailed letter sent to one of the families involved in the second crash, and several interviews, the safety engineer explained how he should have been, but was not, among the regulator’s specialists who assessed the MAX’s critical new flight control software. [...] Senior FAA managers had focused on fulfilling the demands of industry, while the agency itself struggled to meet its regulatory mandate under conditions of deregulation and resource limitations [...] As a result, manufacturers did much of their own certification, a process that could leave senior engineers unaware or ignorant of critical new systems or of changes and additions to existing systems. Boeing, feeling under pressure, had pushed its employees to keep the implications of design and software changes to the 737 minor or invisible, and get it through regulatory approval without extra pilot training requirements.

The Repentance

The letter is an act of repentance: the engineer had feelings of guilt and remorse as the most experienced engineer in the Seattle office. He admitted to wanting the families to heal and felt he had failed the responsibilities of his role.

The authors state that this points at a difference between backwards-looking accountability (finding who to blame) and forward-looking accountability oriented towards setting and meeting goals:

A failure of forward-looking accountability, then, can be seen here as not deploying the system’s role or knowledge in the service of risk reduction and harm avoidance.

This is an interesting point: while the engineer is blaming himself (individual backwards-looking accountability), the authors suggest re-casting this as a failure of forward-looking accountability at the system level.

The Moral Injury

The authors define moral injury as the consequence of having your ethical framework broken by the actions of others or even your own actions, or of transgressing your own moral compass. This can be accompanied by feelings of betrayal. One thing to be careful about, then, is that knowledge of the outcome and hindsight make people tend to single out their own omissions or inaction.

The issue is that in doing so, you tend to overemphasize your own role and downplay a host of other contributing factors. The authors suppose that the organization would likely have overridden whatever the person might have done to change the course of events. People tend to organize their memories counterfactually ("what if I had done differently at this time?"), and this "implies that repentance is called for because it was a supposedly rational choice to not do the right thing."

Humility and Hubris

In this section, the authors compare humility and hubris in engineering. They start by establishing that technology is unruly, and always less reliable, ordered, and controlled than imagined. Engineers always juggle many trade-off goals, and the authors point out that in recent history, it is the lack of failure that may send systems towards failure:

Success and failure in design are intertwined. Though a focus on failure can lead to success, too great a reliance on successful precedents can lead to failure. Success is not simply the absence of failure; it also masks potential modes of failure. Emulating success may be efficacious in the short term, but such behavior invariably and surprisingly leads to failure itself.

The risk, particularly, is of emulating your own past successes. In the case of the 737 MAX, the authors point out that Boeing made changes to the control software specifically so that pilots wouldn't need extra training, changes which seemed small and local. This in turn meant that the "hazard" was downplayed, which reduced the scope of tests (because earlier tests said you didn't need more tests), something they call a "self-licking ice cream cone".

The issue there is that testing has to be able to surface unexpected things, the ones you thought weren't necessary to check. The regulator's role is specifically to answer whether sufficient testing has been done and whether the system has been re-assessed enough, while the manufacturer tried to minimize disclosure and undercut engineers' ability to test their own work.

The Decay and Disaster

The authors state that the engineer's repentance does not stem from an individual lack of heroism (failing to stand up and oppose the FAA's authority), but from the erosion of the entire system upon which a regulator's forward-looking accountability rests.

Cutbacks and deregulation left the agency with few options but to rely on manufacturers for all the expertise it needed:

Erosion of the organization’s core functions continues even while people inside cling to the aesthetics of its former authority and an idealized image of its own character. The most important consequence of such decay is a condition of generalized and systemic ineffectiveness. It develops when an organization shifts its activities from coping with reality to presenting a dramatization of its own ideal character. In [such an] organization, flawed decision making of the sort that leads to disaster is normal activity, not an aberration.

When the entire structure turns into a chain of command where lower-level employees can only offer feedback when asked for it (and must "sit down and shut up" otherwise), desires for change can only be heard by those who are actually attentive and receptive.

The Remedies

A popular proposed solution is to ask engineers to try harder, to blow the whistle louder. There are lots of measures in place to make this safer or less threatening, but the authors argue that these won't necessarily be effective:

calls for engineers to speak up not only mischaracterize the relationship between them and their organization—whether private or government—but that these calls are misguided. [...] [Engineers make organizational goals as] their own goals. These are no longer decisions and trade-offs made by the organization, but problems proudly owned by individuals or teams of engineers.

Engineers take pride in showing they can do more with less and in finding solutions where others couldn't. Who would the engineers even speak up to if they share goals with the organization?

Threatening engineers with becoming villains if harm happens later, they argue, risks removing the psychological safety they need to openly discuss how to make systems safe. They in fact say that doing so is a capitulation of the system:

An engineer employed by a regulator for the explicit purpose of evaluating, assessing, checking and assuring the engineering of a manufacturer’s software or system, should not have to be told what a ‘test’ is, or rely on whistleblower protection, or on individual ethical heroics to probe further and be heard when something is uncovered. She or he should not have to be threatened with losing professional certification to feel compelled to look further and raise her or his voice. If that is what the field has come down to, then something much deeper, and wider, is amiss. Which, of course, is the argument made by some: a steady decades-long drift into disaster of extractive capitalism itself.

The Rebuke

The FAA engineer, rather than taking a self-destructive approach, turned to repentance and then constructive anger. His rebuke ("principled moral outrage") led him to a wider view of the events and its context.

The authors point out that Boeing had, since 2004, been transforming itself from a "great engineering firm" into a "company that makes money", in the words of its then-new COO. A significant portion of the executives had their long-term incentives tied to shareholder returns, and academic literature at the time had already started pointing out the risks of this. As the authors state:

Extractive financial pressure and engineering trade-offs became biased in a direction of maximal grandfathering, reduced need for oversight, less manpower allocated, accelerated approvals, no additional training requirements. [...] Nudging tradeoffs away from extractive tendencies requires a change in leadership posture—from corporate warrior to guardian spirit, listening to staff and customers, and committing not just to vague aspirations of accountability and transparency, but to the testable specificities of honesty and truth-telling. Repentance involves a commitment to tell the truth despite its costs.

They state that the aircraft would have turned a huge profit regardless of what Boeing had done, even [I assume] if it had required additional certification or training. Even then, Boeing sucked more and more resources out of the project for share buybacks, whose main beneficiaries were the decision-makers themselves.

They conclude:

The rebuke’s real subject is indignation over the kind of corporatization, deregulation and financialization that makes some shareholders wildly rich while underprioritizing engineering norms, expertise and technical know-how. It is a principled moral outrage that inspired the safety engineer to step up in the first place—in his own quoted words ‘enraged by sinful greed’ while ‘the righteous perish’.

That's about as hard of a slam as I can recall seeing in any paper.