A few software engineering principles:
- Software engineering 101: validate your inputs.
- Software engineering 201: when something goes wrong, provide useful data to the human.
- Software engineering 301: for life-critical decisions, avoid a single point of failure.
Until today, I had thought that aviation was *good* at software engineering. But my faith is shaken by today's New York Times description of what went wrong with the Boeing 737 MAX. The article claims that:
- There are two “angle of attack” sensors on the plane, but the software that controls the position of the aircraft's nose, MCAS (the Maneuvering Characteristics Augmentation System), relied on just one of them.
- A previous Seattle Times article had explained that the software then fought with the pilots, refusing to put the nose where they wanted it.
- Boeing sells an optional(!) indicator to tell the pilot what the angle of attack sensor is sensing.
- Boeing sells an optional(!) “disagree light” to tell the pilots when the two sensors disagree with each other.
- And in the biggest “no shit, Sherlock” that I have ever read about software:
In the software update that Boeing says is coming soon, MCAS will be modified to take readings from both sensors. If there is a meaningful disagreement between the readings, MCAS will be disabled.
Gasp. How could the plane have been released without the two options, and without this check for disagreement?
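Purely as an illustration of what such a check might look like, here is a minimal Python sketch. Everything in it is invented for this post, including the names, the disagreement threshold, and the trigger angle. It is not Boeing's actual design, just the shape of the principle:

```python
# Illustrative sketch only: every name and number below is made up for
# this post; it is NOT Boeing's actual MCAS implementation.

DISAGREE_THRESHOLD_DEG = 5.5  # hypothetical "meaningful disagreement" cutoff
STALL_ONSET_DEG = 14.0        # hypothetical angle of attack that triggers MCAS

def mcas_nose_down_command(aoa_left_deg, aoa_right_deg):
    """Return a nose-down trim increment in degrees, or None to stand down."""
    # Principle 101: validate your inputs before acting on them.
    for reading in (aoa_left_deg, aoa_right_deg):
        if reading is None or not -90.0 <= reading <= 90.0:
            return None  # implausible reading: refuse to act on it

    # Principle 301: no single point of failure. Cross-check the two
    # sensors; if they meaningfully disagree, MCAS stands down entirely.
    if abs(aoa_left_deg - aoa_right_deg) > DISAGREE_THRESHOLD_DEG:
        return None  # the "disagree" case the coming update is said to add

    aoa_deg = (aoa_left_deg + aoa_right_deg) / 2.0
    if aoa_deg > STALL_ONSET_DEG:
        return 0.6  # hypothetical fixed trim increment
    return None

print(mcas_nose_down_command(15.0, 15.4))  # sensors agree, high AoA: trims
print(mcas_nose_down_command(15.0, 2.0))   # sensors disagree: stands down
```

That's it: a comparison and a threshold. The fact that something this simple was missing from a life-critical system is exactly what shakes my faith.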
Here is my question for the aviation experts at Daily Kos (calling Major Kong).
Was I wrong all these years to think that aviation is a field where software engineering is usually done professionally, carefully, and safely, with attention to principles like those at the top of this note?