Boeing 737 MAX is a reminder of the REAL problem with software
And that problem is almost never the software itself.
Seven months ago, I mentioned the tragedies of the Boeing 737 MAX as one example of what can go wrong “when everything is software”.
In 2018 and 2019, two Boeing 737 MAX airplanes crashed, killing hundreds of people, because of severe problems in their computerized flight-control system, the Maneuvering Characteristics Augmentation System (MCAS).
Now, the Association for Computing Machinery has just published a detailed commentary on the official investigations of those crashes. That analysis explains how the major causes of those tragedies were:
- limits, or bugs, in the MCAS software itself, due to its complexity
- lack of adequate pilot training on how to fly with MCAS
The Boeing 737 MAX operations manuals (and the corresponding training procedures) contained no adequate explanation of how MCAS worked, or of how little time pilots had to respond when it malfunctioned.
Why? To make the Boeing 737 MAX look like a mere upgrade
MCAS was a practically new, very complex piece of software. Boeing, however, wanted to downplay that fact. The reason is very simple: marketing MCAS and the MAX as an upgrade, instead of something substantially new, saved money. A lot of money.
Selling an “upgrade” allowed Boeing to avoid “detailed scrutiny of MCAS and the 737 MAX” by the American Federal Aviation Administration. It also let Boeing obtain certification for the Boeing 737 MAX from… Boeing employees, instead of independent experts. At the same time, the “incremental” upgrade saved airlines “millions of dollars on pilot training in new simulators”.
What failed with Boeing 737 MAX was management, not technology
Quoting the ACM analysis, “Executives, managers, and engineers at Boeing were not stumped by the complexity or unpredictability of a new technology”.
In a series of decisions, the same people “put profits before safety, did not think through the consequences of their actions, or did not speak out loudly enough when they knew something was wrong”.
For the technical details behind these charges, read the ACM report. What matters here is this take-home lesson:
When something goes wrong with software, in airplanes or everywhere else…
Very often, probably too often, the real fault is not in the software: it is in the human beings who decided to create, use, manage, or sell that software.
Who writes this, why, and how to help
I am Marco Fioretti, tech writer and aspiring polymath doing human-digital research and popularization.
I do it because YOUR civil rights and the quality of YOUR life depend more every year on how software is used AROUND you.
To this end, I have already shared more than a million words on this blog, without any paywall or user tracking, and am sharing the next million through a newsletter, also without any paywall.
The more direct support I get, the more I can continue to inform, for free, parents, teachers, decision makers, and everybody else who should know more about things like this. You can support me with a paid subscription to my newsletter, donations via PayPal (mfioretti@nexaima.net) or LiberaPay, or in any of the other ways listed here. THANKS for your support!