The debacle shows that regulators should apply extra scrutiny to systems that take control away from humans when safety is at stake. Some blame the failure on the FAA’s decision to delegate parts of its safety review to the manufacturer itself. The anti-stall software apparently forced down a plane’s nose based on data from a single sensor, so one erroneous reading could trigger a catastrophic response. That kind of single point of failure is unusual in air safety design, and one a more thorough review process might have caught.
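The flaw described above can be made concrete with a small sketch. This is purely illustrative, not Boeing’s actual logic, and the threshold values are invented for the example: a system with redundant sensors can cross-check the readings and hand control back to the crew when they disagree, whereas a single-sensor design skips that check entirely.

```python
# Illustrative sketch only (hypothetical thresholds, not real avionics code):
# before acting on a safety-critical reading, cross-check redundant sensors
# and disengage, alerting the crew, when they disagree.

DISAGREE_THRESHOLD = 5.0  # hypothetical allowable disagreement, in degrees

def trim_command(aoa_left: float, aoa_right: float) -> str:
    """Decide an action from two redundant angle-of-attack readings."""
    if abs(aoa_left - aoa_right) > DISAGREE_THRESHOLD:
        # Readings conflict: do not act automatically; alert the pilots.
        return "disengage-and-alert"
    aoa = (aoa_left + aoa_right) / 2
    if aoa > 15.0:  # hypothetical stall-warning threshold
        return "nose-down-trim"
    return "no-action"
```

A design that reads only one sensor never reaches the disagreement check, so a single faulty reading flows straight through to the trim command.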
But perfecting such a process is more difficult than it sounds. Government already suffers from a dearth of qualified computer scientists, and the more advanced technology becomes, the more difficult it is to evaluate. Partnerships with engineering experts, perhaps through universities, could help, but even the designers of artificially intelligent systems sometimes cannot explain why they make the decisions they do.
These realities demand that regulators tread carefully when approving automated technologies whose bugs could kill. They also demand that humans remain as informed as possible about the systems on which they are relying. Professionals should be adequately trained, but software should also be transparent. The 737 Max anti-stall system reset itself every time pilots course-corrected while their planes plunged toward the ground. To fully disable it, pilots would have had to throw two additional switches — but to know to do that, they would also have had to know what was wrong.
Boeing charged a premium for a “disagree” light that would inform pilots when the plane’s sensors were giving contradictory readings. Airlines also had to pay extra for a display showing the reading from each sensor. The company is now changing that practice.
The 737 Max crisis has implications for industries well beyond aviation, from self-driving cars to medical care. Software has bugs. Extensive testing can preempt some problems, but it is almost impossible to anticipate every eventuality. Where the consequences of a machine’s failure are most severe, humans cannot afford to stop paying attention.