A second fatal airplane accident involving a Boeing 737 Max 8 may have been a case of man vs. machine. The fact that the machine was designed to protect passengers points to a need for rethinking safeguards in an age of increasing automation.

Investigators are probing whether the crash of an Ethiopian Airlines flight that killed 157 people this month and the crash of a Lion Air flight that killed 189 people in Indonesia last fall were caused by the same malfunctioning software. The Federal Aviation Administration is known for its stringent safety rules, but the problem Boeing says it is now patching, an anti-stall system that officials suspect activated on bad data in the Indonesian crash, sneaked through anyway.

The debacle shows that regulators should apply extra scrutiny to systems that take control away from humans when safety is at stake. Some blame the lapse on the FAA's decision to outsource parts of its safety reviews to the manufacturer. The anti-stall software apparently forced down the plane's nose based on data from a single sensor, so one erroneous reading could push the aircraft into a dive. Relying on a lone sensor is an unusual practice in aviation safety, and one a more thorough review process might have prevented.

But perfecting such a process is more difficult than it sounds. Government already suffers from a dearth of qualified computer scientists, and the more advanced technology becomes, the more difficult it is to evaluate. Partnerships with engineering experts, perhaps through universities, could help, but even the designers of artificially intelligent systems sometimes cannot explain why those systems make the decisions they do.

These realities demand that regulators tread carefully when approving automated technologies whose bugs could kill. They also demand that humans remain as informed as possible about the systems on which they are relying. Professionals should be adequately trained, but software should also be transparent. The 737 Max anti-stall system reset itself every time pilots course-corrected while their planes plunged toward the ground. To fully disable it, pilots would have had to throw two additional switches — but to know to do that, they would also have had to know what was wrong.

Boeing charged a premium for a "disagree" light to alert pilots that the plane's sensors were giving contradictory readings. Airlines also had to pay extra for a display showing each sensor's full readings. The company is now changing that practice.

The 737 Max crisis has implications far beyond aviation, from self-driving cars to medical care. Software has bugs. Extensive testing can pre-empt some problems, but it is almost impossible to anticipate every eventuality. Where the consequences of a machine's failure are the most severe, humans cannot afford to stop paying attention.

FROM AN EDITORIAL IN THE WASHINGTON POST