The perils of poor UI

Sunday 15th January 2012

You might remember that a couple of years ago, an Air France plane crashed into the Atlantic. Recently, Popular Mechanics ran an article which explained the causes of the accident using data from the aircraft’s black box. In the immediate aftermath of the accident, it was assumed that something on the aircraft must have failed as it passed through a storm. In fact, that turned out to be wrong; the aircraft was mostly fine, and the pilots “flew a perfectly good plane into the ocean”.

According to the article (which is fascinating – I really urge you to read it), the pitot tubes on the surface of the aircraft (its airspeed sensors) became iced over, which meant that the pilots lost their airspeed indication. Without this data the autopilot couldn’t fully function, and so it partially disengaged. While this was going on, one of the pilots put the plane into a steep climb, which caused it to stall (a loss of lift that occurs when the angle of attack of the plane’s aerofoils becomes too great). When this happened, the pilots tried to keep climbing – the opposite of the correct response, which is to lower the nose and regain airspeed – and the plane ultimately lost altitude.

The pilots on commercial aircraft such as this are highly trained, so why, when the plane started to stall, did one of them do precisely the opposite of what he should have done?

‘…the reason may be that they believe it is impossible for them to stall the airplane. It’s not an entirely unreasonable idea: The Airbus is a fly-by-wire plane; the control inputs are not fed directly to the control surfaces, but to a computer, which then in turn commands actuators that move the ailerons, rudder, elevator, and flaps. The vast majority of the time, the computer operates within what’s known as normal law, which means that the computer will not enact any control movements that would cause the plane to leave its flight envelope. “You can’t stall the airplane in normal law,” says Godfrey Camilleri, a flight instructor who teaches Airbus 330 systems to US Airways pilots.

But once the computer lost its airspeed data, it disconnected the autopilot and switched from normal law to “alternate law,” a regime with far fewer restrictions on what a pilot can do. “Once you’re in alternate law, you can stall the airplane,” Camilleri says.

It’s quite possible that Bonin had never flown an airplane in alternate law, or understood its lack of restrictions. According to Camilleri, not one of US Airways’ 17 Airbus 330s has ever been in alternate law. Therefore, Bonin may have assumed that the stall warning was spurious because he didn’t realize that the plane could remove its own restrictions against stalling and, indeed, had done so.’

In normal flight, the computer systems try to make the plane easier to fly. But once the computers stopped getting inputs from some sensors, those systems disengaged and so altered the behaviour of the aircraft. It’s conceivable, then, that efforts to make the plane safer by making it easier to pilot – by simplifying the controls and handing some responsibility to the computers – actually contributed to this accident. There could be a number of reasons for that: the change from normal to alternate law may have been unintuitive or non-obvious to the pilots. Or perhaps it’s simply that taking some of the responsibility for flying the plane away from the pilots most of the time makes them complacent – they come to think that the plane can’t stall – or leaves them unsure how to react when the computers can’t help them. How sensible is it to introduce inconsistent behaviour into any control system, let alone that of a commercial aircraft?
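To make that concrete, here’s a toy sketch of the kind of logic that produces this sort of mode confusion. It’s entirely hypothetical – the names and numbers are made up and bear no relation to Airbus’s actual software – but it shows the essential problem: the safety limits quietly disappear as a side effect of a sensor failure, so the same input suddenly does something different.

```python
# Hypothetical sketch of mode confusion in a control system.
# Not Airbus's actual logic; all names and limits are invented.

class FlightControl:
    def __init__(self):
        self.law = "normal"

    def update_sensors(self, airspeed_valid):
        # The mode change happens as a side effect of sensor health --
        # nothing forces the operator to acknowledge it.
        if not airspeed_valid:
            self.law = "alternate"

    def command_pitch(self, requested_pitch_deg):
        if self.law == "normal":
            # Envelope protection: clamp the input to a safe range.
            return max(-15.0, min(15.0, requested_pitch_deg))
        # Alternate law: the same input now passes straight through.
        return requested_pitch_deg


fc = FlightControl()
print(fc.command_pitch(30.0))   # normal law: clamped to 15.0
fc.update_sensors(airspeed_valid=False)
print(fc.command_pitch(30.0))   # alternate law: 30.0 -- same input, new behaviour
```

The pilot’s mental model (“I can’t stall this plane”) stops matching reality, and in this sketch the only clue is an internal flag they may never have seen change before.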

Hang on though, there’s more than one pilot flying the plane. When the aircraft began to stall, one of them reacted incorrectly – but catching exactly this kind of mistake is part of why there’s more than one pilot. Why didn’t the other pilot spot the error and correct it? Well, the Popular Mechanics article also picks up on another part of the plane’s control mechanisms which may have contributed to this:

‘Unlike the control yokes of a Boeing jetliner, the side sticks on an Airbus are “asynchronous”—that is, they move independently. “If the person in the right seat is pulling back on the joystick, the person in the left seat doesn’t feel it,” says Dr. David Esser, a professor of aeronautical science at Embry-Riddle Aeronautical University. “Their stick doesn’t move just because the other one does, unlike the old-fashioned mechanical systems like you find in small planes, where if you turn one, the [other] one turns the same way.” Robert has no idea that, despite their conversation about descending, Bonin has continued to pull back on the side stick.’

Neither pilot knew what the other was doing. So one pilot was pulling back on the controls – the wrong thing to do in a stall – and the other one had no idea.
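As an illustration of why this matters, here’s a deliberately simplified sketch – not Airbus’s actual dual-input handling, although reports suggest the real system does sum simultaneous stick inputs. When two independent sticks are simply summed, neither stick physically reflects what the other pilot is commanding, so the only way to notice a conflict is to already suspect one.

```python
# Simplified, hypothetical model of "asynchronous" side sticks.
# The real Airbus system is more involved (it also sounds a
# "DUAL INPUT" warning), but the core idea is the same.

def combined_command(left_stick, right_stick, limit=1.0):
    """Sum the two independent stick inputs and clip to the travel limit."""
    total = left_stick + right_stick
    return max(-limit, min(limit, total))

# One pilot pushes gently forward (-0.25), the other pulls fully back (+1.0).
# The aircraft receives a nose-up command, but nothing in the first pilot's
# stick tells him that his input is being fought.
print(combined_command(-0.25, 1.0))   # 0.75 -- still nose up
```

With mechanically linked yokes, the conflicting input would be felt directly through the controls; here it has to be deduced from the aircraft’s behaviour, under stress, with alarms sounding.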

On one level, I’m simply amazed that this can happen – that the pilots can be unaware of what each other is doing. But then, I also imagine that this is a fairly stressful situation – being thrown around in an aircraft during a storm, with all sorts of alarms sounding – and that within that situation, irrespective of your training, it’s kind of easy to make a mistake.

What I mostly find interesting about this accident is that it was essentially caused by human error, and by the way that humans interact with the aircraft. By that, I mean that the pilots made several mistakes; they shouldn’t have been near the storm in the first place, and they should have acted differently once they reached the storm. But those human errors were, in part, brought about or exacerbated by the aircraft’s control systems.

In other words, this accident was in no small part caused by poor user interface design. The built-in inconsistency between normal and alternate law quite possibly confused the pilots at a time when they didn’t have the capacity to deal with confusion, and the asynchronous controls hindered communication between them. Because of these things, competent pilots flew a perfectly operable aircraft into the Atlantic Ocean.

These are things that most engineers probably wouldn’t think about. We’re technical people – that’s why we’re engineers – so we think about numbers, about science, about the basic mechanics that underlie how something works. But that’s not the only thing that’s important about a design; it’s also important to consider how people are going to use the thing you’re making. This is applicable to most designs, whether you’re making an aircraft or a building or a phone.

In this case we’ve seen something particularly interesting happen, since the designers have tried to make the plane easier to fly by delegating some control to the computers in normal law. But it appears that simplifying the controls might actually have contributed to the accident by confusing one of the pilots. This seems counterintuitive: if we make something easier to use, it seems fair to reason that we also reduce the likelihood of someone using it wrongly, and so make it safer. But it’s human nature to be lazy: when you tell someone that they ordinarily don’t need to think about a particular variable, they probably won’t think about it at all.

Now I don’t point this out to make an argument against any of the control systems that Airbus build into their aircraft (although, really, asynchronous controls? Isn’t that just obviously a bad idea?), or against making things simpler. Airbus probably know what they’re doing (“probably” being the operative word*). The point is that working out how someone will use something is just as important as figuring out how to make something work; it’s something that should be obvious, but that I suspect is often seen as a secondary consideration.

The designers of the aircraft obviously have thought about this, and their solution was to make the plane simpler to use by hiding some of the complexity from the pilots in normal law. But does that really make it simpler to fly the plane? Perhaps in normal flight, but I suspect that we’d like our aircraft to be designed with abnormal flight in mind as well. And in that situation, perhaps what would really make things simpler is a way of helping the pilots to deal with the complexity, rather than trying to shield them from it and then confronting them with unfamiliar behaviour when the shielding fails.

* The Airbus A380 is big, and so Airbus tried to make it as light as possible. To do that, they’ve used carbon fibre reinforced plastic in certain parts of the structure, notably the wings. Carbon fibre: light! strong! stiff! notoriously brittle! Er, hang on a minute…

When I read that they’d used composites, I wondered whether it was such a great idea. My main concern was fatigue: would the material start to crack after a certain number of load cycles? Imagine my absolute lack of surprise when it was reported recently that Qantas and Singapore Airlines have discovered cracks on the wings of some of their A380s… Airbus say that it’s not important, that the cracks are on non-critical parts of the aircraft (although… really? I highly doubt that it’s designed to crack). I’m sure they’re right, but it’ll be an interesting one to watch.

Posted at 2:49 am | Posted in: Engineering



Sunday 15th January 2012, 4:27 am

This sort of story actually makes me think that Airbus did the right thing in taking a lot of the decision-making out of human hands. Human beings are panicky and irrational, as can be seen from this guy totally losing his shit.

Personally, I’m more surprised there was no fall-back for the airspeed systems; naturally you’d lose the ability to measure the actual speed of the air, but it should still be able to measure velocity w.r.t. the earth using inertial navigation and GPS. Unless the wind is blowing very quickly, I’d imagine that would still do you some good.

That said, I can’t say I know much about aeronautical engineering. Will have to ask someone who does…


Sunday 15th January 2012, 11:10 am

This is a fascinating and well-written post :-)

And actually, I’d 100% agree with your analysis of the situation. The analogy which springs to my mind is the friend of ours (I’m sure you know whom!) who reached Further Maths A Level without being able to do times tables in her head. She knew she ought to be able to do it; she knew that it would make her life easier. But because with the exception of C1, the calculator was always there, she still didn’t.

And the number of kids in schools/tutorials I have seen with exactly the same approach is quite mind-boggling (for one who has always been good at and enjoyed mental maths!)


Sunday 15th January 2012, 7:20 pm

@ Andy:

Yeah, I agree that it’s not necessarily a bad thing to remove control from people. But doing so in a manner which is confusing or inconsistent might not be beneficial, as we’ve seen here..

Btw I think you’re right, you can use the GPS to get some sort of figure for the airspeed. Before GPS became prevalent, indicated airspeed was used to calculate ground speed for navigation purposes, and this is just that calculation in reverse. But airspeed is fairly critical to the aerodynamics of flight and so is (I would assume) an important piece of information for the autopilot to have in order for it to work correctly. Would a back-calculated value be accurate enough for the autopilot to use to help fly the plane? I don’t know; if you do ever ask someone who knows more about aeronautics I’d be interested to hear the answer.
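For what it’s worth, the arithmetic of that reverse calculation is simple enough to sketch (all the numbers below are invented for illustration): subtract an estimated wind vector from the GPS ground velocity and you get a rough airspeed.

```python
import math

# Back-of-the-envelope sketch of estimating airspeed from GPS ground
# velocity and a wind estimate. Figures are made up for illustration;
# a real system would also have to cope with the wind estimate itself
# being uncertain.

def estimate_airspeed(ground_vel, wind_vel):
    """Airspeed vector = ground velocity - wind velocity (2D, in knots)."""
    ax = ground_vel[0] - wind_vel[0]
    ay = ground_vel[1] - wind_vel[1]
    return math.hypot(ax, ay)

# Flying east at 450 kt over the ground with a 50 kt tailwind:
# the air mass carries the plane along, so airspeed is only 400 kt.
print(estimate_airspeed((450.0, 0.0), (50.0, 0.0)))  # 400.0
```

The catch, I’d guess, is that this gives a speed relative to the air mass, whereas what the pitot tubes measure (and what the aerodynamics care about) is indicated airspeed, which also depends on air density and so differs from true airspeed at altitude – another reason a GPS-derived figure is probably only a rough substitute.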

@ Lucy:

Glad you liked it :-) Good analogy, although I have to admit that my mental arithmetic is fairly shocking as well…


Thursday 19th January 2012, 12:18 am

I suppose the other way of considering it is: how many planes would have been ditched if the computer was usually less involved? It’s possible that no absolutely perfect solution exists.

I often quite enjoy doing mental arithmetic; it wouldn’t hurt to teach it, as you really need to build up techniques to be effective with it. For instance if I was going to work out, say, 8 * 96, I’d say that’s the same as 8 * (100 – 4) = 800 – 32 = 770 – 2 = 768, which is not a set of steps you’d even think of doing with a calculator, or working it out longhand.

There’s a time and a place for both doing it yourself and letting it be done by a machine, though. I hardly ever bother doing my own integrations, for instance, unless I’m fairly convinced off the bat that it won’t be difficult. I justify this to myself by saying that often there isn’t an answer at all (even for common functions, like a Gaussian) or it’s so awful that it would take forever to do.


Thursday 19th January 2012, 12:45 am

Oh, I’m certain that an absolutely perfect solution doesn’t exist. They never do; engineering is all about trade-offs. In this case, I think a pretty good solution/improvement would be to make the controls synchronous; it wouldn’t have stopped the first pilot from doing something silly, but it might have made the second pilot realise what was happening a bit sooner.
