
System Safety News and Commentary

Boeing 737 MAX accidents - Food for Thought

John Livingston, wiu60@aol.com

May 12, 2019

I have asked the chapter president if I might post a column on the chapter web page from time to time, offering comments on current safety-related news and events. The objective is to provide summary information and related references that might be of interest to our chapter members.

For the past several weeks, there have been many news stories about the Boeing 737 MAX accidents and the design and operation of that aircraft. I would caution that the official investigation is ongoing and should provide the most accurate and complete assessment. Still, there is much "food for thought" for system safety analysts in the current information. From the newspaper perspective, the Seattle Times' investigative efforts are covered in several articles that are available on their website and offer a lot of related information. There is also an interesting perspective in an IEEE opinion article.

I would like to focus on two articles: one from the Seattle Times set, addressing the system safety assessment, and the IEEE article, addressing design failures and the role of the software developers.

The first article from the Seattle Times is "Flawed analysis, failed oversight: How Boeing, FAA certified the suspect 737 MAX flight control system" by Dominic Gates, Seattle Times aerospace reporter, March 17, 2019, updated March 21, 2019.

Based on inputs from current and former engineers who were directly involved with the evaluations or familiar with the document, and who shared details of Boeing's "System Safety Analysis" of MCAS (the Maneuvering Characteristics Augmentation System), the author identified the following shortcomings.

"The safety analysis:

"(1) Understated the power of the new flight control system, which was designed to swivel the horizontal tail to push the nose of the plane down to avert a stall. When the planes later entered service, MCAS was capable of moving the tail more than four times farther than was stated in the initial safety analysis document.

"(2) Failed to account for how the system could reset itself each time a pilot responded, thereby missing the potential impact of the system repeatedly pushing the airplane’s nose downward.

"(3) Assessed a failure of the system as one level below “catastrophic.” But even that “hazardous” danger level should have precluded activation of the system based on input from a single sensor — and yet that’s how it was designed."

The first point addresses the fact that MCAS was "improved" after test flights had raised questions about its effectiveness. Apparently the safety analysis was not properly updated.

The second point raises a concern about the analyst's basic understanding of the system and its operation.

The third point is directly related to the failure to assess the "upgraded" system. A failure of the original system was probably more easily corrected by the pilots. But even with that assumption, the analysis would need to address the steps the pilots needed to take and their training for such an event. Engineers design for success, not failure, and certainly not catastrophic failure. System safety's job is to assure that the worst-case failure effects have been identified, evaluated, and properly characterized.

The second article is "How the Boeing 737 Max Disaster Looks to a Software Developer" by Gregory Travis, IEEE Spectrum: Opinion, Aerospace/Aviation, 18 Apr 2019.

The author identifies three big strikes against Boeing's design of the 737 Max.

Strike 1 - The upgrades implemented for the 737 Max resulted in a dynamically unstable airframe.

Strike 2 - Boeing then tried to mask the 737’s dynamic instability with a software system.

Strike 3 - The software (design) relied on systems known for their propensity to fail (angle-of-attack indicators) and did not appear to include even rudimentary provisions to cross-check the outputs of the angle-of-attack sensor against other sensors, or even the other angle-of-attack sensor.
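
For illustration only, here is a minimal sketch of the kind of rudimentary cross-check Mr. Travis describes: comparing the two angle-of-attack readings and inhibiting an automatic nose-down trim command when they disagree. The sensor names, tolerance, and threshold values below are hypothetical and are not drawn from Boeing's design.

# Illustrative sketch only -- hypothetical names and thresholds, not Boeing's design.
# Shows a rudimentary cross-check: compare the two angle-of-attack (AoA) readings
# and refuse to issue an automatic nose-down trim command if they disagree.

AOA_DISAGREE_LIMIT_DEG = 5.0    # hypothetical tolerance between the two AoA vanes
AOA_STALL_THRESHOLD_DEG = 14.0  # hypothetical AoA above which nose-down trim would trigger

def cross_checked_aoa(left_aoa_deg, right_aoa_deg):
    """Return a usable AoA value, or None if the two sensors disagree."""
    if abs(left_aoa_deg - right_aoa_deg) > AOA_DISAGREE_LIMIT_DEG:
        return None  # sensors disagree: do not trust either one for automatic trim
    return (left_aoa_deg + right_aoa_deg) / 2.0

def should_command_nose_down(left_aoa_deg, right_aoa_deg):
    """Decide whether an automatic nose-down trim command is permitted."""
    aoa = cross_checked_aoa(left_aoa_deg, right_aoa_deg)
    if aoa is None:
        return False  # disagreement: inhibit the command and annunciate to the crew
    return aoa > AOA_STALL_THRESHOLD_DEG

# Example: one vane reading an implausibly high AoA while the other is normal
# would inhibit the automatic command rather than drive the stabilizer nose down.
print(should_command_nose_down(2.5, 21.0))   # False -- sensors disagree
print(should_command_nose_down(15.0, 15.4))  # True  -- both agree, above threshold

The point of the sketch is only that a disagreement check is a small amount of logic; the harder engineering and safety questions are what the system should do, and what the crew should be told, when the check fails.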

Mr. Travis faults the designers for ignoring basic aircraft design principles, and he is critical of the software developers' knowledge of, and insight into, the hardware functions being controlled by the software. He noted the impact of cost and schedule pressures on Boeing corporate and program management. His is also a cautionary tale for those responsible for government oversight.