IMPLICATIONS OF THE BOEING 737 MAX PROBLEM FOR AUTONOMOUS VEHICLE DESIGN - RSL Holdings

Page created by Frances Miller
 
IMPLICATIONS OF THE BOEING 737 MAX PROBLEM
FOR AUTONOMOUS VEHICLE DESIGN
Attached are two excellent Seattle Times articles that capture the current thinking on the Boeing 737 MAX design issue that very likely led to two separate air tragedies, Lion Air and Ethiopian Airlines, roughly six months apart. In the Lion Air case, the finding is based on black-box analysis; in the Ethiopian Airlines case, the black-box analysis is ongoing, but several other kinds of data point to the same cause. After reading these articles and hearing the views of experienced 737 MAX pilots, it struck me that these findings also have design implications for autonomous vehicles.
In the Boeing case, although there is a long tradition of giving the pilot complete control of the aircraft, the 737 MAX introduced a new automatic flight-control system designed to act in the background, without pilot input. This new system, called MCAS (Maneuvering Characteristics Augmentation System), was believed necessary because the MAX's much larger engines had to be placed farther forward on the wing than on earlier 737s, which changed the airframe's aerodynamic lift profile. MCAS is designed to activate automatically in the flight situation of a high-speed stall and push the airplane's nose further downward. This is shown in the following graphic.

In the Lion Air case, the finding is that one of the two angle-of-attack sensors, shown in frames 1 and 2 above, had failed and was feeding bad data to the MCAS control system.
               NOTICE: THIS DOCUMENT IS RSL HOLDINGS INC-COPYRIGHT 2019
MCAS in turn caused excessive movements of the horizontal stabilizer, which made the plane oscillate and ultimately led to its crash. One of the key lessons is that the integrity of these sensors must be guaranteed through both design redundancy and failure warnings. In the MAX case, there was an attempt at redundancy in the form of two sensors, but apparently no failure warning was provided to the pilots.
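The principle above — redundancy plus an explicit warning when redundant channels disagree — can be sketched in a few lines. This is an illustrative example, not Boeing's actual logic; the threshold value and function names are assumptions.

```python
# Sketch of a dual-sensor integrity check with an explicit failure warning.
# The threshold is illustrative, not taken from any Boeing specification.

AOA_DISAGREE_THRESHOLD_DEG = 5.5  # hypothetical alert threshold

def check_aoa_integrity(left_aoa_deg: float, right_aoa_deg: float) -> dict:
    """Compare the two angle-of-attack readings and flag a disagreement.

    Redundancy alone is not enough: the crew (or the control system)
    must be told when the redundant channels no longer agree, so a
    faulty channel is not silently trusted.
    """
    disagreement = abs(left_aoa_deg - right_aoa_deg)
    return {
        "disagreement_deg": disagreement,
        "aoa_disagree_warning": disagreement > AOA_DISAGREE_THRESHOLD_DEG,
    }

# The Lion Air jet's two sensors reportedly differed by about 20 degrees
# even during taxi -- a disagreement a check like this would have flagged.
status = check_aoa_integrity(left_aoa_deg=22.0, right_aoa_deg=2.0)
assert status["aoa_disagree_warning"]
```

The point of the sketch is that the comparison is cheap; the design question is what the system does with the warning, and whether the operator ever sees it.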
The fact that a single faulty AOA sensor could bring down an airplane is absolutely unacceptable in aviation safety, but there are also implications for a world engaged in developing and deploying many kinds of autonomous vehicles. This is especially important when one considers the difference in knowledge and training between a 737 MAX pilot and the operator of a car or long-haul truck designed to operate autonomously.
Most cars on the road today require the driver to do practically everything—signaling, steering,
accelerating, braking, watching the traffic ahead, to the sides and to the rear. This is Level 0
motoring on the scale of autonomous vehicles devised by America’s Society of Automotive
Engineers (SAE). Vehicles equipped with rudimentary forms of driver-assistance, such as cruise
control or reversing sensors, are classified as Level 1.
Fitted with wide-angle cameras, GPS sensors and short-range radars, Level 2 vehicles adapt their
speed to the surrounding traffic automatically, maintain a safe distance from the vehicle ahead,
keep within their own lane, and even park themselves occasionally. For short stretches of time,
the driver may be separated from the steering wheel and pedals but must be ready to take full
control of the vehicle at any instant. Tesla’s Autopilot system is classed as Level 2 technology—
or was until it was rolled back recently to Level 1 for safety reasons.
Level 3 autonomous driving means that, although the driver must remain vigilant and ready to intervene, the car is responsible for all critical safety functions. This has a lot of engineers worried. Experience has not been good with control systems that relegate the operator to a managerial role whose only job is to intercede in an emergency.
Over-reliance on automation and lack of understanding by human operators about when to
intervene have been cited as important factors contributing to problems. Some believe it might
be better to skip Level 3 altogether, and go straight to Level 4, even if it takes longer.
In theory, Level 4 technology should be safer. Such vehicles will carry out all critical driving
functions by themselves, from the start of the journey to the end, although they will be restricted
to particular roads. This means roads which are mapped in three dimensions and "geofenced" by
GPS signals.
In fully autonomous Level 5 motoring, the vehicles have to perform in all respects at least as well
as human drivers—in short, they must be capable of going anywhere, in every conceivable
weather condition, and be able to cope with the most unpredictable of situations, such as animals
bursting out of bushes and crazy people doing crazy things.
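The six levels described above can be summarized as a small taxonomy. The enum names below are informal paraphrases of the descriptions in the text, not SAE's official wording:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels, as summarized in the text."""
    NO_AUTOMATION = 0           # driver does everything
    DRIVER_ASSISTANCE = 1       # cruise control, reversing sensors
    PARTIAL_AUTOMATION = 2      # lane keeping + adaptive speed; driver supervises
    CONDITIONAL_AUTOMATION = 3  # car drives; human is the emergency fallback
    HIGH_AUTOMATION = 4         # no human fallback, but geofenced roads only
    FULL_AUTOMATION = 5         # anywhere, any conditions

def human_is_fallback(level: SAELevel) -> bool:
    """At Levels 0-3 a human must be ready to take over; at 4-5 they need not be."""
    return level <= SAELevel.CONDITIONAL_AUTOMATION

assert human_is_fallback(SAELevel.CONDITIONAL_AUTOMATION)
assert not human_is_fallback(SAELevel.HIGH_AUTOMATION)
```

The `human_is_fallback` boundary is exactly the one that worries the engineers quoted above: Level 3 is the last level at which the design still depends on a human intervening correctly.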
Autonomous vehicles use a variety of techniques to detect their surroundings, such
as radar, laser, GPS, odometry and computer vision. Advanced control systems interpret sensory
information to identify appropriate navigation paths, as well as identify obstacles and
relevant signage. Autonomous cars have control systems capable of analyzing sensory data to
distinguish among different cars on the road and then ultimately plan a path to the desired
destination. The most crucial piece of technology needed to make that happen is LiDAR, which
uses pulses of laser light flashed from a rotating mirror on a vehicle’s roof to scan the
surroundings for potential obstacles. LiDAR provides an image in three dimensions and cannot
be dazzled by bright light or blinded by darkness. Clever algorithms enable LiDAR sensors to tell
whether an object is another vehicle or a wayward pedestrian.
RSL Holdings Inc. currently has a patent sale offering related to autonomous vehicles; associated with the offering are several EOU (evidence-of-use) charts showing probable use of the patent claims. One of the patents in the offering, US 8,446,267, provides for continuity of sensors and a sensor array. The specific commercial sensor product mapped to ‘267 is a proximity sensor widely used in transportation and aerospace applications. This EOU chart is part of the RSL sale offering and is available under NDA.
Vehicle manufacturers have been installing proximity sensors for some time, typically in the rear of the vehicle and configured to activate when the vehicle is placed in reverse. Some manufacturers have also installed proximity sensors in the front of vehicles, configured to detect objects within a predetermined range of distances ahead of the vehicle.
Many drivers have come to rely on proximity sensors to the point of taking their proper functioning for granted, and this presents a considerable danger to pedestrians, especially small children, who may not be visible from any vantage point of the driver. If the driver relies on the proximity sensors to avoid small objects not visible from the driver's perspective, the result can be catastrophic should the sensors, unbeknownst to the driver, fail to detect them. There is an unfulfilled need for notifying drivers of the functional integrity of the proximity sensors mounted on a vehicle.
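One simple way to monitor that functional integrity is a heartbeat check: warn the driver whenever a sensor stops reporting. The class below is a hypothetical sketch in the spirit of the problem described above, not the mechanism claimed in the ‘267 patent; the timeout value and names are assumptions.

```python
from typing import Optional

class ProximitySensorMonitor:
    """Sketch of a continuity monitor for a vehicle proximity sensor.

    The driver should be actively warned when a sensor stops reporting,
    rather than silently losing coverage they have come to rely on.
    """

    def __init__(self, timeout_s: float = 0.5):
        self.timeout_s = timeout_s                  # max silence before a warning
        self.last_report_t: Optional[float] = None  # timestamp of last reading

    def on_sensor_report(self, t: float) -> None:
        """Record that the sensor produced a reading at time t (seconds)."""
        self.last_report_t = t

    def is_healthy(self, now: float) -> bool:
        """True while the sensor has reported recently enough to be trusted."""
        if self.last_report_t is None:
            return False  # never heard from the sensor at all
        return (now - self.last_report_t) <= self.timeout_s

monitor = ProximitySensorMonitor(timeout_s=0.5)
monitor.on_sensor_report(t=10.0)
assert monitor.is_healthy(now=10.3)      # recent report: sensor considered live
assert not monitor.is_healthy(now=11.0)  # report overdue: warn the driver
```

A real implementation would also cross-check readings for plausibility, but even this minimal liveness check converts a silent failure into a visible one.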
As previously noted, patent US 8,446,267 in the RSL offering provides for continuity of sensors. Implementation of autonomous vehicles will clearly involve many, many sensors, and the integrity of these sensors and their networks will be critical to success. Applying the lessons of the Boeing 737 MAX's automation design and execution to autonomous vehicles will be a huge challenge, and in both cases many human lives hang in the balance.

Pilots struggled against Boeing’s 737 MAX
control system on doomed Lion Air flight
Seattle Times
Dominic Gates
November 28, 2018

The recovered flight-data recorder, the so-called “black box,” of the Lion Air jet that crashed
into the sea Oct. 29 is displayed during a... (Tatan Syuflana / The Associated Press)
Data from the fatal Oct. 29 flight that killed 189 people, and from the prior day's flight of
the same jet, raises questions about three factors that seem to have contributed to the
crash.
A key instrument reading on Lion Air flight JT610 was faulty even as the pilots taxied out for
takeoff. As soon as the Boeing 737 MAX was airborne, the captain’s control column began
to shake as a stall warning.

And from the moment they retracted the wing flaps at about 3,000 feet, the two pilots
struggled — in a 10-minute tug of war — against a new anti-stall flight-control system that
relentlessly pushed the jet’s nose down 26 times before they lost control.
Though the pilots responded to each nose-down movement by pulling the nose up again,
mysteriously they didn’t do what the pilots on the previous day’s flight had done: simply
switched off that flight-control system.

The detail is revealed in the data from the so-called “black box” flight recorder (it’s actually
orange in color) from the fatal Oct. 29 flight that killed 189 people and the prior day’s flight
of the same jet, presented last Thursday to the Indonesian Parliament by the country’s
National Transportation Safety Committee (NTSC).

This data is the major basis for the preliminary crash-investigation report that was made
public Wednesday in Indonesia, Tuesday evening in Seattle.

The flight-recorder data is presented as a series of line graphs that give a clear picture of
what was going on with the aircraft systems as the plane taxied on the ground, took off and
flew for just 11 minutes.

The data points to three factors that seem to have contributed to the disaster:

       •   A potential design flaw in Boeing’s new anti-stall addition to the MAX’s flight-
           control system and a lack of communication to airlines about the system.
       •   The baffling failure of the Lion Air pilots to recognize what was happening and
           execute a standard procedure to shut off the faulty system.
       •   And a Lion Air maintenance shortfall that allowed the plane to fly repeatedly
           without fixing the key sensor that was feeding false information to the flight
           computer on previous flights.
Anti-stall system triggered
Peter Lemme, a former Boeing flight-controls engineer who is now an avionics and satellite-
communications consultant, analyzed the graphs minute by minute.

He said the data shows Boeing’s new system — called MCAS (Maneuvering Characteristics
Augmentation System) — “was triggered persistently” as soon as the wing flaps retracted.

The data confirms that a sensor that measures the plane’s angle of attack, the angle
between the wings and the air flow, was feeding a faulty reading to the flight computer. The
two angle-of-attack sensors on either side of the jet’s nose differed by about 20 degrees in
their measurements even during the ground taxi phase when the plane’s pitch was level.
One of those readings was clearly completely wrong.

On any given flight, the flight computer takes data from only one of the angle-of-attack
(AOA) sensors, apparently for simplicity of design. In this case, the computer interpreted the
AOA reading as much too high an angle, suggesting an imminent stall that required MCAS to
kick in and save the airplane.

When the MCAS system pushed the nose down, the captain repeatedly pulled it back up,
probably by using thumb switches on the control column. But each time, the MCAS system,
as designed, kicked in to swivel the horizontal tail and push the nose back down again.

The data shows that after this cycle repeated 21 times, the captain ceded control to the first
officer and MCAS then pushed the nose down twice more, this time without a pilot response.

After a few more cycles of this struggle, with the horizontal tail now close to the limit of its
movement, the captain resumed control and pulled back on the control column with high
force.

It was too late. The plane dived into the sea at more than 500 miles per hour.

Previous crew handled similar situation
Remarkably, the corresponding black-box-data charts from the same plane’s flight the
previous day show that the pilots on that earlier flight encountered more or less exactly the
same situation.

Again the AOA sensors were out of sync from the start. Again, the captain’s control column
began shaking, a stall warning, at the moment of takeoff. Again, MCAS kicked in to push the
nose down as soon as the flaps retracted.

Initially that crew reacted like the pilots of JT610, but after a dozen cycles of the nose going
down and pushing it back up, they turned off MCAS using two standard cutoff switches on
the control pedestal “within minutes of experiencing the automatic nose down”
movements, according to the NTSC preliminary investigation report.

There were no further uncommanded nose-down movements. For the rest of the flight, they
controlled the jet’s pitch manually and everything was normal. The jet continued to its
destination and landed safely.

Because the cockpit voice recorder has not yet been recovered from the sea bed, it’s a
mystery why the JT610 pilots didn’t recognize that it was the uncommanded horizontal tail
movements pushing the nose down.

Beside their seats a large wheel, called the stabilizer trim wheel, which rotates as the
horizontal tail swivels, would have been spinning fast and noisily. Such an uncommanded
movement, which could be triggered by other faults besides MCAS, is called a “runaway
stabilizer” and pilots are trained to deal with it in a short, straightforward procedure that’s
in the flight manual. Flicking two cutoff switches stops the movement completely.

Somehow, the pilots ignored the spinning stabilizer wheel, perhaps distracted by the shaking
of the control column — called a “stick shaker” — and the warning lights on their display
which would have indicated disagreement between the AOA sensors and consequent faults
in the readings of airspeed and altitude.

The NTSC preliminary report confirms that, shortly after takeoff, the pilots experienced
issues with altitude and airspeed data.

Still, their failure to shut off the automated tail movements is baffling.

“No one would expect a pilot to sit there and play tag with the system 25 times” before the
system won out, said Lemme. “This airplane should not have crashed. There are human
factors involved.”

Boeing design flaw?
However, even if the flight crew is found partly culpable, the sequence of this tragedy also
points to a potential design flaw in Boeing’s MCAS system.

The sequence was triggered by a single faulty AOA sensor. A so-called “single point of
failure” that could bring down an airplane is absolutely anathema in aviation safety
protocols.

Lemme, who designed flight controls at Boeing, said that although the AOA malfunction is a
single point of failure of the equipment — something airplanes are rigorously designed to
avoid — in the safety categories used for certification it represents a “hazardous” failure,
“not a single point catastrophic failure.”

The difference is when the pilots have at their disposal a straightforward way out of the
danger. For example, if one engine fails on an airplane, trained pilots know exactly what to
do to divert and land safely. If they don’t do it, of course the engine failure will bring down
the plane. But the proper pilot reaction is an expected part of the safety system.

Lemme said that, in adding MCAS to the MAX, the Boeing system design engineers must
have “made the judgment that a malfunction of the AOA sensor would be a ‘hazardous’
failure mode, not catastrophic, because the pilots can throw the cutoff switches.”

In aviation systems analysis for certification purposes, a hazardous failure must have a
probability of no more than one in 10 million. A catastrophic failure must have a probability
of less than one in a billion, which means it should never occur in the life of an airplane.
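The two probability budgets quoted above can be written down directly, which makes clear why the hazardous-versus-catastrophic judgment matters so much. The mapping function below is an illustrative sketch, not FAA tooling:

```python
# Certification probability budgets as quoted in the article
# (per-flight-hour probability bounds; the checker itself is a sketch).

HAZARDOUS_MAX_PROB = 1e-7     # "no more than one in 10 million"
CATASTROPHIC_MAX_PROB = 1e-9  # "less than one in a billion"

def meets_budget(failure_prob: float, classification: str) -> bool:
    """Return True if a failure mode's probability fits its assigned category."""
    if classification == "catastrophic":
        return failure_prob < CATASTROPHIC_MAX_PROB
    if classification == "hazardous":
        return failure_prob <= HAZARDOUS_MAX_PROB
    raise ValueError(f"unknown classification: {classification}")

# A failure mode with a 1-in-100-million probability passes if judged
# "hazardous", but would fail certification if judged "catastrophic":
assert meets_budget(1e-8, "hazardous")
assert not meets_budget(1e-8, "catastrophic")
```

The same physical failure thus either passes or fails certification depending entirely on the category the designers assign it, which is the judgment Lemme describes above.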

However, aside from the system design, Boeing must also answer questions about how
much information it gave to pilots about the new system for which they are assumed to
provide a safety backstop.

Capt. Dennis Tajer, chairman of the communications committee of the Allied Pilots
Association (APA), the union representing American Airlines pilots, said that airline pilots
“proudly stand as one of the layers of safety system success,” but he’s troubled that there
was nothing in the flight manual about the MCAS system.

“We are part of the safety system, yes. But you haven’t provided knowledge of the aircraft
system,” Tajer said. “Boeing is counting on the pilots as a second line of safety. But to not
inform them is to undermine your own philosophy.”

He contrasted the malfunction of MCAS on the Lion Air flight and the lack of knowledge
about the system before the accident to what happens when an engine fails in flight.

“I have an entire engine section in my manual. I know all about the system,” Tajer said. “We
have to have the information.”

He said that following the accident and the FAA airworthiness directive, “every 737 pilot in
the world is now aware this system is out there.” But the crew of JT610 lacked full
information.

A software fix
Lemme said the Lion Air crash will inevitably lead to a re-evaluation of the MCAS system
design.

In his view, it wasn’t a case of Boeing’s design engineers ignoring the consequences of a
single sensor failure. “It’s a case of overvaluing the pilots’ response.”

“I’m sure the systems designers that approved this assumed the pilot would hit the cutout
switches and move on,” Lemme added.

With hindsight, he said, when a calm assessment is done by engineers, they’ll probably
conclude that a single input shouldn’t be allowed to trigger the system.
He said MCAS is designed to kick in only in extreme circumstances that an airliner should
basically never face, something like a high-bank, high-stress turn, experiencing many times
the ordinary force of gravity and approaching stall.

It should only engage when the sensors are certain that’s the situation. “You need a second
input to make that judgment,” Lemme said. Some logic could also be inserted to consider
the reliability of the AOA readings when the plane is still on the ground.
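The gating logic Lemme proposes — a second agreeing input, plus a basic on-ground sanity check, before the system is allowed to fire — can be sketched as follows. All thresholds and names here are illustrative assumptions, not the actual MCAS fix:

```python
# Sketch of two-input activation gating for an MCAS-like system.
# Thresholds are illustrative assumptions, not Boeing's values.

AGREEMENT_TOLERANCE_DEG = 5.0  # max allowed disagreement between sensors
STALL_AOA_DEG = 15.0           # hypothetical approaching-stall threshold

def mcas_may_activate(aoa_left: float, aoa_right: float, on_ground: bool) -> bool:
    """Allow activation only when both sensors independently indicate a stall."""
    if on_ground:
        return False  # never command nose-down trim on the ground
    if abs(aoa_left - aoa_right) > AGREEMENT_TOLERANCE_DEG:
        return False  # sensors disagree: at least one reading is invalid
    # Both sensors must independently indicate an approaching stall.
    return min(aoa_left, aoa_right) > STALL_AOA_DEG

# One wildly high sensor (as on the Lion Air flight) no longer triggers the system:
assert not mcas_may_activate(aoa_left=22.0, aoa_right=2.0, on_ground=False)
# A genuine high-AOA condition seen by both sensors still does:
assert mcas_may_activate(aoa_left=18.0, aoa_right=17.0, on_ground=False)
```

As the article notes, a change of this kind is comparatively easy to deploy because it touches only software, not the sensors or airframe.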

Such a fix is relatively easy to install, since it will involve only software changes, he said.

Boeing in a statement said it is “taking every measure to fully understand all aspects of this
accident.”

“We will analyze any additional information as it becomes available,” Boeing said.

Ineffective maintenance
A third area of intense scrutiny as a result of the flight data is Lion Air’s maintenance
procedures.

The preliminary NTSC report states that the maintenance logs for the accident aircraft
recorded problems related to airspeed and altitude on each of the four flights that occurred
over the three days prior to Flight 610.

The logs indicate that various maintenance procedures were performed, but issues related
to airspeed and altitude continued on each successive flight. The logs indicate that, among
other procedures, on Oct. 27, two days prior to the accident flight, one of the airplane’s AOA
sensors was replaced.

On Oct. 28, the flight immediately prior to Flight JT610, the pilot in command and the
maintenance engineer discussed the maintenance that had been performed on the aircraft.
The engineer informed the pilot that the AOA sensor had been replaced and tested.

However, the issue clearly wasn’t fixed. As noted above, the same problems recurred during
that flight. The report also states that, after landing, the pilot on this prior flight reported
some of the experienced issues both on the aircraft maintenance log and to engineering.

Lion Air has a very poor safety record and has been accused of skimping on maintenance to
cut costs.

Lemme said that in an aviation safety analysis, timely maintenance to fix faults is required
to reduce a crew’s exposure.

“This plane flew repeatedly with faults that should have been repaired,” Lemme said. “That
increased the exposure of the faults to more flight crews, until it found a flight crew that
wasn’t able to handle the situation.”
Flawed analysis, failed oversight: How
Boeing, FAA certified the suspect 737
MAX flight control system
Seattle Times
Dominic Gates
March 17, 2019

A worker is seen inside a Boeing 737 MAX 9 at the Renton plant. The circular sensor seen at bottom
right measures the plane’s angle of attack, the angle between the airflow and the wing. This sensor
on 737 MAX planes... (Mike Siegel / The Seattle Times)
Federal Aviation Administration managers pushed its engineers to delegate wide
responsibility for assessing the safety of the 737 MAX to Boeing itself. But safety engineers
familiar with the documents shared details that show the analysis included crucial flaws.
As Boeing hustled in 2015 to catch up to Airbus and certify its new 737 MAX, Federal Aviation
Administration (FAA) managers pushed the agency’s safety engineers to delegate safety
assessments to Boeing itself, and to speedily approve the resulting analysis.
But the original safety analysis that Boeing delivered to the FAA for a new flight control
system on the MAX — a report used to certify the plane as safe to fly — had several crucial
flaws.

That flight control system, called MCAS (Maneuvering Characteristics Augmentation
System), is now under scrutiny after two crashes of the jet in less than five months resulted
in Wednesday’s FAA order to ground the plane.

Current and former engineers directly involved with the evaluations or familiar with the
document shared details of Boeing’s “System Safety Analysis” of MCAS, which The Seattle
Times confirmed.

The safety analysis:

   •   Understated the power of the new flight control system, which was designed to
       swivel the horizontal tail to push the nose of the plane down to avert a stall. When
       the planes later entered service, MCAS was capable of moving the tail more than
       four times farther than was stated in the initial safety analysis document.
   •   Failed to account for how the system could reset itself each time a pilot responded,
       thereby missing the potential impact of the system repeatedly pushing the airplane’s
       nose downward.
   •   Assessed a failure of the system as one level below “catastrophic.” But even that
       “hazardous” danger level should have precluded activation of the system based on
       input from a single sensor — and yet that’s how it was designed.
The people who spoke to The Seattle Times and shared details of the safety analysis all spoke
on condition of anonymity to protect their jobs at the FAA and other aviation organizations.

Both Boeing and the FAA were informed of the specifics of this story and were asked for
responses 11 days ago, before the second crash of a 737 MAX last Sunday.

Late Friday, the FAA said it followed its standard certification process on the MAX. Citing a
busy week, a spokesman said the agency was “unable to delve into any detailed inquiries.”

Boeing responded Saturday with a statement that “the FAA considered the final
configuration and operating parameters of MCAS during MAX certification and concluded
that it met all certification and regulatory requirements.”

Adding that it is “unable to comment … because of the ongoing investigation” into the
crashes, Boeing did not respond directly to the detailed description of the flaws in MCAS
certification, beyond saying that “there are some significant mischaracterizations.”
Several technical experts inside the FAA said October’s Lion Air crash, where the MCAS has
been clearly implicated by investigators in Indonesia, is only the latest indicator that the
agency’s delegation of airplane certification has gone too far, and that it’s inappropriate for
Boeing employees to have so much authority over safety analyses of Boeing jets.

“We need to make sure the FAA is much more engaged in failure assessments and the
assumptions that go into them,” said one FAA safety engineer.

Certifying a new flight control system
Going against a long Boeing tradition of giving the pilot complete control of the aircraft, the
MAX’s new MCAS automatic flight control system was designed to act in the background,
without pilot input.

It was needed because the MAX’s much larger engines had to be placed farther forward on
the wing, changing the airframe’s aerodynamic lift.

Designed to activate automatically only in the extreme flight situation of a high-speed stall,
this extra kick downward of the nose would make the plane feel the same to a pilot as the
older-model 737s.
Boeing engineers authorized to work on behalf of the FAA developed the System Safety
Analysis for MCAS, a document which in turn was shared with foreign air-safety regulators
in Europe, Canada and elsewhere in the world.

The document, “developed to ensure the safe operation of the 737 MAX,” concluded that
the system complied with all applicable FAA regulations.

Yet black box data retrieved after the Lion Air crash indicates that a single faulty sensor — a
vane on the outside of the fuselage that measures the plane’s “angle of attack,” the angle
between the airflow and the wing — triggered MCAS multiple times during the deadly flight,
initiating a tug of war as the system repeatedly pushed the nose of the plane down and the
pilots wrestled with the controls to pull it back up, before the final crash.

On Wednesday, when announcing the grounding of the 737 MAX, the FAA cited similarities
in the flight trajectory of the Lion Air flight and the crash of Ethiopian Airlines Flight 302 last
Sunday.

Investigators also found the Ethiopian plane’s jackscrew, a part that moves the horizontal
tail of the aircraft, and it indicated that the jet’s horizontal tail was in an unusual position —
with MCAS as one possible reason for that.

Investigators are working to determine if MCAS could be the cause of both crashes.

Delegated to Boeing
The FAA, citing lack of funding and resources, has over the years delegated increasing
authority to Boeing to take on more of the work of certifying the safety of its own airplanes.

Early on in certification of the 737 MAX, the FAA safety engineering team divided up the
technical assessments that would be delegated to Boeing versus those they considered
more critical and would be retained within the FAA.

But several FAA technical experts said in interviews that as certification proceeded,
managers prodded them to speed the process. Development of the MAX was lagging nine
months behind the rival Airbus A320neo. Time was of the essence for Boeing.

A former FAA safety engineer who was directly involved in certifying the MAX said that
halfway through the certification process, “we were asked by management to re-evaluate
what would be delegated. Management thought we had retained too much at the FAA.”

“There was constant pressure to re-evaluate our initial decisions,” the former engineer said.
“And even after we had reassessed it … there was continued discussion by management
about delegating even more items down to the Boeing Company.”

Even the work that was retained, such as reviewing technical documents provided by
Boeing, was sometimes curtailed.
“There wasn’t a complete and proper review of the documents,” the former engineer added.
“Review was rushed to reach certain certification dates.”

When time was too short for FAA technical staff to complete a review, sometimes managers
either signed off on the documents themselves or delegated their review back to Boeing.

“The FAA managers, not the agency technical experts, have final authority on delegation,”
the engineer said.

Inaccurate limit
In this atmosphere, the System Safety Analysis on MCAS, just one piece of the mountain of
documents needed for certification, was delegated to Boeing.

The original Boeing document provided to the FAA included a description specifying a limit
to how much the system could move the horizontal tail — a limit of 0.6 degrees, out of a
physical maximum of just less than 5 degrees of nose-down movement.

That limit was later increased after flight tests showed that a more powerful movement of
the tail was required to avert a high-speed stall, when the plane is in danger of losing lift and
spiraling down.

The behavior of a plane in a high angle-of-attack stall is difficult to model in advance purely
by analysis and so, as test pilots work through stall-recovery routines during flight tests on
a new airplane, it’s not uncommon to tweak the control software to refine the jet’s
performance.

After the Lion Air Flight 610 crash, Boeing for the first time provided to airlines details about
MCAS. Boeing’s bulletin to the airlines stated that the limit of MCAS’s command was 2.5
degrees.

That number was new to FAA engineers who had seen 0.6 degrees in the safety assessment.

“The FAA believed the airplane was designed to the 0.6 limit, and that’s what the foreign
regulatory authorities thought, too,” said an FAA engineer. “It makes a difference in your
assessment of the hazard involved.”

The higher limit meant that each time MCAS was triggered, it caused a much greater
movement of the tail than was specified in that original safety analysis document.

The former FAA safety engineer who worked on the MAX certification, and a former Boeing
flight controls engineer who worked on the MAX as an authorized representative of the FAA,
both said that such safety analyses are required to be updated to reflect the most accurate
aircraft information following flight tests.

“The numbers should match whatever design was tested and fielded,” said the former FAA
engineer.
But both said that sometimes agreements were made to update documents only at some
later date.

“It’s possible the latest numbers wouldn’t be in there, as long as it was reviewed and they
concluded the differences wouldn’t change the conclusions or the severity of the hazard
assessment,” said the former Boeing flight controls engineer.

If the final safety analysis document was updated in parts, it certainly still contained the 0.6
limit in some places and the update was not widely communicated within the FAA technical
evaluation team.

“None of the engineers were aware of a higher limit,” said a second current FAA engineer.

The discrepancy over this number is magnified by another element in the System Safety
Analysis: The limit of the system’s authority to move the tail applies each time MCAS is
triggered. And it can be triggered multiple times, as it was on the Lion Air flight.

One current FAA safety engineer said that every time the pilots on the Lion Air flight reset
the switches on their control columns to pull the nose back up, MCAS would have kicked in
again and “allowed new increments of 2.5 degrees.”

“So once they pushed a couple of times, they were at full stop,” meaning at the full extent
of the tail swivel, he said.

Peter Lemme, a former Boeing flight controls engineer who is now an avionics and satellite-
communications consultant, said that because MCAS reset each time it was used, “it
effectively has unlimited authority.”

Swiveling the horizontal tail, which is technically called the stabilizer, to the end stop gives
the airplane’s nose the maximum possible push downward.

“It had full authority to move the stabilizer the full amount,” Lemme said. “There was no
need for that. Nobody should have agreed to giving it unlimited authority.”
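The reset behavior Lemme describes can be sketched in a few lines. This is a hypothetical illustration, not Boeing's implementation: the 2.5-degree increment comes from the article, while the stabilizer end stop available to MCAS is an assumed figure chosen so that two activations reach it, consistent with the article's account.

```python
# Hypothetical sketch of the MCAS reset behavior described above -- not
# actual flight-control code. The 2.5-degree increment per activation is
# from the article; the end-stop value is an illustrative assumption.

STAB_FULL_STOP_DEG = 4.7  # assumed stabilizer travel available to MCAS


def mcas_cycles_to_full_stop(increment_deg=2.5, full_stop_deg=STAB_FULL_STOP_DEG):
    """Count activations until the stabilizer hits its end stop, assuming
    the commanded authority resets after every pilot trim input."""
    position = 0.0
    cycles = 0
    while position < full_stop_deg:
        position = min(position + increment_deg, full_stop_deg)
        cycles += 1
    return cycles


print(mcas_cycles_to_full_stop())                   # 2 activations reach the stop
print(mcas_cycles_to_full_stop(increment_deg=0.6))  # 8 activations at the documented limit
```

Because the increment resets rather than accumulating against a total budget, any per-activation limit, 0.6 or 2.5 degrees, eventually reaches the stop; only the number of cycles changes. That is the sense in which the authority was "effectively unlimited."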

On the Lion Air flight, when the MCAS pushed the jet’s nose down, the captain pulled it back
up, using thumb switches on the control column. Still operating under the false angle-of-
attack reading, MCAS kicked in each time to swivel the horizontal tail and push the nose
down again.

The black box data released in the preliminary investigation report shows that after this
cycle repeated 21 times, the plane’s captain ceded control to the first officer. As MCAS
pushed the nose down two or three times more, the first officer responded with only two
short flicks of the thumb switches.

At a limit of 2.5 degrees, two cycles of MCAS without correction would have been enough
to reach the maximum nose-down effect.

In the final seconds, the black box data shows the captain resumed control and pulled back
up with high force. But it was too late. The plane dived into the sea at more than 500 miles
per hour.

Recovery work continues around the crater where the Ethiopian Airlines plane crashed shortly
after takeoff last week near Bishoftu, southeast of Addis Ababa. Flight data analysis is yielding
clues about the cause of the crash. (Yidnek Kirubel / The Associated Press)
System failed on a single sensor
The bottom line of Boeing’s System Safety Analysis with regard to MCAS was that, in normal
flight, an activation of MCAS to the maximum assumed authority of 0.6 degrees was
classified as only a “major failure,” meaning that it could cause physical distress to people
on the plane, but not death.

In the case of an extreme maneuver, specifically when the plane is in a banked descending
spiral, an activation of MCAS was classified as a “hazardous failure,” meaning that it could
cause serious or fatal injuries to a small number of passengers. That’s still one level below a
“catastrophic failure,” which represents the loss of the plane with multiple fatalities.

The former Boeing flight controls engineer who worked on the MAX’s certification on behalf
of the FAA said that whether a system on a jet can rely on one sensor input, or must have
two, is driven by the failure classification in the system safety analysis.

He said virtually all equipment on any commercial airplane, including the various sensors, is
reliable enough to meet the “major failure” requirement, which is that the probability of a
failure must be less than one in 100,000. Such systems are therefore typically allowed to rely
on a single input sensor.

But when the consequences are assessed to be more severe, with a “hazardous failure”
requirement demanding a more stringent probability of one in 10 million, then a system
typically must have at least two separate input channels in case one goes wrong.
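The certification logic in the preceding paragraphs can be expressed as a small calculation. The classification names and probability budgets (less than one in 100,000 for "major," one in 10 million for "hazardous") come from the article; the function itself, the assumed per-sensor failure probability, and the "catastrophic" budget are illustrative assumptions, not FAA or Boeing material.

```python
# Illustrative mapping from failure classification to required sensor
# redundancy. Budgets for "major" and "hazardous" are from the article;
# the "catastrophic" budget and the per-sensor reliability are assumptions.

FAILURE_BUDGETS = {
    "major": 1e-5,         # less than one in 100,000
    "hazardous": 1e-7,     # less than one in 10 million
    "catastrophic": 1e-9,  # assumed budget one level up
}


def min_independent_sensors(classification, sensor_failure_prob=1e-6):
    """Smallest number of independent inputs whose combined failure
    probability fits the budget, assuming failures are independent."""
    budget = FAILURE_BUDGETS[classification]
    n, combined = 1, sensor_failure_prob
    while combined >= budget:
        n += 1
        combined *= sensor_failure_prob
    return n


print(min_independent_sensors("major"))      # 1 -- a single sensor can suffice
print(min_independent_sensors("hazardous"))  # 2 -- a second channel is needed
```

This is the arithmetic behind Lemme's objection: a "hazardous" classification and a single-sensor trigger are, under this reasoning, mutually inconsistent.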

Boeing’s System Safety Analysis assessment that the MCAS failure would be “hazardous”
troubles former flight controls engineer Lemme because the system is triggered by the
reading from a single angle-of-attack sensor.

“A hazardous failure mode depending on a single sensor, I don’t think passes muster,” said
Lemme.

Like all 737s, the MAX actually has two of the sensors, one on each side of the fuselage near
the cockpit. But the MCAS was designed to take a reading from only one of them.

Lemme said Boeing could have designed the system to compare the readings from the two
vanes, which would have indicated if one of them was way off.

Alternatively, the system could have been designed to check that the angle-of-attack reading
was accurate while the plane was taxiing on the ground before takeoff, when the angle of
attack should read zero.

“They could have designed a two-channel system. Or they could have tested the value of
angle of attack on the ground,” said Lemme. “I don’t know why they didn’t.”

The black box data provided in the preliminary investigation report shows that readings from
the two sensors differed by some 20 degrees not only throughout the flight but also while
the airplane taxied on the ground before takeoff.

No training, no information
After the Lion Air crash, 737 MAX pilots around the world were notified about the existence
of MCAS and what to do if the system is triggered inappropriately.

Boeing insists that the pilots on the Lion Air flight should have recognized that the horizontal
stabilizer was moving uncommanded and should have responded with a standard pilot
checklist procedure to handle what’s called “stabilizer runaway.”

If they’d done so, the pilots would have hit cutoff switches and deactivated the automatic
stabilizer movement.

Boeing has pointed out that the pilots flying the same plane on the day before the crash
experienced similar behavior to Flight 610 and did exactly that: They threw the stabilizer
cutoff switches, regained control and continued with the rest of the flight.

However, pilots and aviation experts say that what happened on the Lion Air flight doesn’t
look like a standard stabilizer runaway, because that is defined as continuous uncommanded
movement of the tail.

On the accident flight, the tail movement wasn’t continuous; the pilots were able to counter
the nose-down movement multiple times.

In addition, the MCAS altered the control column response to the stabilizer movement.
Pulling back on the column normally interrupts any stabilizer nose-down movement, but
with MCAS operating that control column function was disabled.

These differences certainly could have confused the Lion Air pilots as to what was going on.

Since MCAS was supposed to activate only in extreme circumstances far outside the normal
flight envelope, Boeing decided that 737 pilots needed no extra training on the system —
and indeed that they didn’t even need to know about it. It was not mentioned in their flight
manuals.

That stance allowed the new jet to earn a common “type rating” with existing 737 models,
letting airlines minimize the training of pilots moving to the MAX.

Dennis Tajer, a spokesman for the Allied Pilots Association at American Airlines, said his
training on moving from the old 737 NG model cockpit to the new 737 MAX consisted of
little more than a one-hour session on an iPad, with no simulator training.

Minimizing MAX pilot transition training was an important cost saving for Boeing’s airline
customers, a key selling point for the jet, which has racked up more than 5,000 orders.

The company’s website pitched the jet to airlines with a promise that “as you build your 737
MAX fleet, millions of dollars will be saved because of its commonality with the Next-
Generation 737.”

In the aftermath of the crash, officials at the unions for both American and Southwest
Airlines pilots criticized Boeing for providing no information about MCAS, or its possible
malfunction, in the 737 MAX pilot manuals.

An FAA safety engineer said the lack of prior information could have been crucial in the Lion
Air crash.

Boeing’s safety analysis of the system assumed that “the pilots would recognize what was
happening as a runaway and cut off the switches,” said the engineer. “The assumptions in
here are incorrect. The human factors were not properly evaluated.”

The cockpit of a grounded Lion Air 737 MAX 8 jet is seen at Soekarno-Hatta International Airport
in Cengkareng, Indonesia, last week. The crash of an Ethiopian Airlines plane bore similarities to
the Oct. 29... (Dimas Ardian / Bloomberg)

On Monday, before the grounding of the 737 MAX, Boeing outlined “a flight control software
enhancement for the 737 MAX” that it has been developing since soon after the Lion Air crash.

According to a detailed FAA briefing to legislators, Boeing will change the MCAS software to
give the system input from both angle-of-attack sensors.

It will also limit how much MCAS can move the horizontal tail in response to an erroneous
signal. And when activated, the system will kick in only for one cycle, rather than multiple
times.
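Combining the three announced changes, both sensors as input, a capped increment, and a single activation per event, gives roughly the following shape. This is a hypothetical sketch of the fix as the FAA briefing describes it; the thresholds, cap, and one-shot latch are illustrative assumptions, not Boeing's actual software.

```python
# Hypothetical sketch of the announced fix: dual-sensor input, limited
# authority, one activation per event. All numeric values are assumed
# for illustration only.

def mcas_command(left_aoa, right_aoa, already_fired,
                 disagree_limit=5.5, trigger_aoa=12.0, max_increment=1.0):
    """Return (stabilizer_increment_deg, fired_flag) for one control step."""
    if abs(left_aoa - right_aoa) > disagree_limit:
        return 0.0, already_fired      # sensors disagree: stay inactive
    if already_fired:
        return 0.0, True               # only one cycle per high-AoA event
    if min(left_aoa, right_aoa) > trigger_aoa:
        return max_increment, True     # limited, single activation
    return 0.0, False


print(mcas_command(15.0, 14.0, already_fired=False))  # (1.0, True)
print(mcas_command(15.0, 14.0, already_fired=True))   # (0.0, True)
print(mcas_command(21.0, 1.0, already_fired=False))   # (0.0, False)
```

Note how each branch answers one of the failure modes in the Lion Air sequence: the disagreement check catches the bad vane, and the latch prevents the repeated nose-down cycles.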

Boeing also plans to update pilot training requirements and flight crew manuals to include
MCAS.

These proposed changes mirror the critique made by the safety engineers in this story. They
had spoken to The Seattle Times before the Ethiopian crash.

The FAA said it will mandate Boeing’s software fix in an airworthiness directive no later than
April.

Facing legal actions brought by the families of those killed, Boeing will have to explain why
those fixes were not part of the original system design. And the FAA will have to defend its
certification of the system as safe.

Seven weeks after it rolled out of the paint hangar, Boeing’s first 737 MAX, the Spirit of Renton,
flies for the first time Jan. 29, 2016, from Renton Municipal Airport. (Mike Siegel / The Seattle
Times)