Impact of Automation and Design in Aviation: Human Factors

More dials, more knobs, more speed, and more stress. There's no doubt there have been significant advances in technology: we've gone from steam-driven dials to glass cockpits, and the technology just keeps streaming in. So, it's easy for pilots to be overwhelmed. A pilot's job has changed: they now must monitor and understand complex systems, which takes them away from flying the aircraft. On top of that, makers of technology such as GPS units keep adding bits and bobs because they can, which means manuals have got thicker and menus longer and more convoluted. So, it's vital pilots know and understand their aircraft and systems.

Automation has done wonders for safety, but if not managed properly, it can impinge on safety. So, one of the risks that we see emerging currently is skill loss. The more you have automation do the task, the less you do it yourself. But, more so than that, you see that automation starts wrapping pilots in a particular cocoon in which they feel protected, or it gives the sense of being protected, where even the manufacturers will tell you, "Oh, with the automation, you won't be able to stall the aeroplane", and the pilot will go, "Well, I, myself, won't stall the aeroplane either". But what we see is that in combination, in a so-called "going sour" scenario, an aeroplane's automation, together with the pilot, is quite capable of, in fact, stalling the aeroplane or bringing it way outside of the envelope.

Photo / Marius Maasewerd

There are so many examples of how automation's made our life easier, but there are plenty of examples out there where automation has led to tears: crews got too dependent, too complacent, and too reliant on it. I think so much of that comes back to the culture of the company, the training of the crew, and the situational awareness of the crew: understanding the limits of the automation and the limits of the human being sitting behind it and watching it.

Auto-pilots are fantastic, but if you get too dependent on them and find yourself up in some seriously bumpy weather, and the auto-pilot spits the dummy and doesn't want to fly anymore, now it's in your hands, and it's night, it's pouring rain, and you're at the top end of a cell. If you aren't current, you've found yourself in a big heap of trouble. It's incumbent on you to say, "Well, I need to stay current to stay confident; I need to be able to hand-fly this thing in tough conditions." Set some standards to hold yourself to, but also don't take the aeroplane into a place where, if you lose the automation, you won't be up to the task.

Long-term, I think we're going to see remarkable automation; we're seeing it now. Pilotless aircraft, drones everywhere. So, automation, whether we like it or not, is a thing of the future. Design in automation is never going to end, and I think the technology is embraced by everybody. But to ensure there's no degradation of manual flying skills, where technology hasn't taken over but has distracted the crew, we must always be able to come back to those manual skills. We should always allow for those skills to continue to evolve, and practise them. When it comes to flying, we need to continue to embrace automation, but also ensure that training and, indeed, recency processes are focused on the maintenance of those manual skills.

What's standing out from all the research on automation is that automation in the cockpit doesn't make human work go away; it doesn't replace human work. It changes human work. All of a sudden the pilot has different tasks, tasks for which he or she, perhaps, is not even trained that well. So, one of the research insights is that rather than reducing workload, automation actually redistributes it. It makes pilots busier in times that were traditionally already busy, like an approach or setting up for an approach, because now, all of a sudden, the automation needs input as well, or needs to be set up, or can in fact spring surprises on pilots that then need to be managed. And it gives pilots even less to do in those periods in which they already had little to do, like cruise.

Now, we know that people aren't very well-suited to long-term monitoring tasks; it is a very difficult thing for them to do. What research is also showing is that the workload transition from the low, in which you don't do a lot and you're just passively monitoring the automation, to all of a sudden having to spring into action is a very difficult transition, because it calls on very different cognitive mechanisms in the human being.

So, for an automation surprise to happen, a couple of ingredients need to be present. First, the automation needs to be doing something that the pilot didn't program right there and then. It may have been programmed 20 minutes ago or a few hours ago, or it may in fact be a mode transition related to something else that's happening in the automation. So, that's one. The second is that this typically happens in periods where workload has gone up and pilots are already busy trying to manage other things and other changes. And thirdly, what we've seen is that automation that springs surprises isn't very clear about communicating its intentions. It just says, "I'm in this mode", and that's all you get to see. You don't know what it's going to do or what being in that mode really means.
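That "I'm in this mode" problem can be pictured in a few lines of code. The following is a minimal sketch, not any real autoflight system: the mode names and intent strings are invented for illustration, and the point is simply that an annunciator showing only a mode label leaves the pilot to recall what the mode will actually do next.

```python
# Hypothetical sketch of mode annunciation: the display shows a mode
# NAME, while the behaviour behind that name stays implicit.
MODE_INTENT = {
    # invented examples, not any manufacturer's actual mode logic
    "ALT*": "capturing the selected altitude; pitch and power will change soon",
    "VNAV": "flying a pre-programmed profile; may command an idle descent",
}

def annunciate(mode: str, show_intent: bool = False) -> str:
    """Return what the crew gets to see for a given active mode."""
    if show_intent:
        return f"{mode}: {MODE_INTENT[mode]}"
    return mode  # the typical case: a label with no stated intention

print(annunciate("VNAV"))                    # 'VNAV' and nothing more
print(annunciate("VNAV", show_intent=True))  # the surprise-reducing version
```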

One of the problems with the technology can be the fact that the functionality is so huge. Not everything is critical to what you're doing on the flight, and that functionality can be good, but it also means the menus can become incredibly complex and layered. It's not unusual to get lost in those menu systems. Knowing your system is vital, so that if you suddenly have to change an element of the data you've loaded, or go and check some information, you know quickly how to get to that particular function.

But if we base all of our countermeasures in the future on training alone, we're in trouble. We need to take it back to design. We need to get good human factors thinking into the design of these systems; if the system is easy to use, half of our problems go away. We're going beyond the point where we're just strapping technological tools into the cockpit. All the systems, even in general aviation aircraft, are becoming incredibly highly integrated. Even pre-flight planning is integrated: we can now plan a flight on an iPad, walk out to the aeroplane, and download that flight plan into our avionics suite. Think about the interface; think about the junctures where there's potential for data to be communicated wrongly, not through the technical side of the system but through the human side. We need to make sure that at each point we're doing that in the best way for the human.

Design becomes very important: we have to think about the pilot who's actually going to be flying that aeroplane. What information do we present to the pilot? How do we present that information? With more complexity in glass cockpit aeroplanes, or with advancements in design, we can put in more information. But more information means more information processing is required at times. That level of complexity, if we don't teach people how to use it, obviously introduces threats and opportunities for error as well.

If we go back to the early INS, or Inertial Navigation Systems, which were really a precursor to FMS, or Flight Management Systems, data input errors have been around for donkey's years, as we know. Whether you're writing down numbers, putting them into a computer or putting them into a flight management computer, the error is the same; the consequence is different. Think of putting in an incorrect lat and long, or an incorrect waypoint. There was a very well-known accident with an American Airlines 757 going into Colombia, where the pilots entered a waypoint, spelt correctly; it just turned out that there was more than one waypoint with the same spelling. The one they entered happened to be behind the aeroplane, so the aeroplane started to turn towards the mountains. We were talking about that complexity: they got drawn into really looking at "What have we done? Is that the correct spelling?" rather than going back to basics and flying the aeroplane. You know, disconnect the autopilot, or go to a heading mode or an altitude mode if you need to. Fly the aeroplane first, work out the navigation second. The old Aviate, Navigate, Communicate. They got focused on the error, focused on the flight management system, and the aeroplane ended up flying into a mountain.
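The ambiguity described here is easy to see in code. The sketch below uses entirely hypothetical waypoints and coordinates (not the actual navigation data from that accident): when one identifier matches several database entries, a lookup that silently takes the first hit can hand the crew a waypoint behind the aircraft, so matches should at least be ordered and consciously disambiguated.

```python
# Hypothetical waypoint database with a duplicated identifier; the
# names and coordinates are invented for illustration only.
import math
from dataclasses import dataclass

@dataclass
class Waypoint:
    ident: str
    lat: float  # degrees
    lon: float  # degrees

DATABASE = [
    Waypoint("ROMEO", 4.0, -76.5),  # ahead of the aircraft
    Waypoint("ROMEO", 4.7, -74.1),  # same identifier, far behind it
]

def distance_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles (haversine formula)."""
    p1, l1, p2, l2 = map(math.radians, (lat1, lon1, lat2, lon2))
    h = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin((l2 - l1) / 2) ** 2)
    return 2 * 3440.065 * math.asin(math.sqrt(h))

def lookup(ident, own_lat, own_lon):
    """Return ALL matches, nearest first, instead of silently picking one."""
    matches = [w for w in DATABASE if w.ident == ident]
    return sorted(matches, key=lambda w: distance_nm(own_lat, own_lon, w.lat, w.lon))

# From the aircraft's present position, both candidates are listed:
for w in lookup("ROMEO", 4.2, -76.2):
    print(w.ident, (w.lat, w.lon), f"{distance_nm(4.2, -76.2, w.lat, w.lon):.0f} NM")
```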

Automation becomes dangerous when you're using it and you're not familiar with it: when the system starts to do something it's programmed to do that you're not aware of. If you have one expectation of how the aeroplane is going to perform or behave, and the aeroplane does something completely different, that's where trouble starts. If you're up in the cruise and you've got spare time to sort it out, that's fine, but if it's somewhere critical, like coming in on an instrument approach, approaching minima, then you don't have a lot of time to sort out the problem. Programming the fuel is a fantastic bit of situational awareness, because you start getting alerts when your fuel runs low. But if you go flying and you haven't told the aeroplane that you're full of fuel, it thinks you've only got a third of a tank, and it'll start giving you warnings and alarms. You've actually got heaps of fuel, but the system thinks you don't. So, again, if you're up in the cruise and it starts bleating at you, that's fine, you've got time to figure that out. But if you start getting a whole bunch of warnings on short final and that distracts you from your primary task of flying the aeroplane, then the system becomes a distraction and a danger, rather than something that supports your operation.
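The fuel scenario follows directly from how totalizer-style logic works. Here is a minimal sketch under stated assumptions (the class name and warning threshold are hypothetical, not any particular avionics suite): the system never measures fuel directly, it only subtracts measured flow from whatever quantity the pilot entered, so a stale entry makes every later warning wrong.

```python
# Minimal fuel-totalizer sketch; names and threshold are hypothetical.
# Remaining fuel is COMPUTED from the pilot's entry, not measured, so a
# wrong entry produces false low-fuel warnings even with full tanks.
class FuelTotalizer:
    LOW_FUEL_L = 40.0  # invented warning threshold, litres

    def __init__(self, entered_fuel_l: float):
        self.remaining_l = entered_fuel_l  # trusted as typed, never verified

    def update(self, fuel_flow_lph: float, dt_hours: float) -> None:
        self.remaining_l -= fuel_flow_lph * dt_hours
        if self.remaining_l < self.LOW_FUEL_L:
            print(f"LOW FUEL warning: {self.remaining_l:.0f} L computed remaining")

# Tanks actually hold 120 L, but the pilot left last flight's 50 L entry:
totalizer = FuelTotalizer(entered_fuel_l=50.0)
for _ in range(2):
    totalizer.update(fuel_flow_lph=30.0, dt_hours=0.5)  # warns despite full tanks
```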

We need to develop two streams of expertise. I think we need to be experts in working with the automation, but we need to be experts in working without it as well. It's naive to think we can just get rid of the automation and function as an expert without having practised; that's where we build expertise. We need to have well-developed backup options and find ways to practise them, so that they're available to you and you're not suddenly out of your depth.

The pilot needs a foundation of knowledge of the system, because these complex systems fail, and they fail from the top down. The aircraft systems are glass half-full; in other words, the aircraft will tell you everything that goes wrong. But when you have 1,235 checklists in an A380, that can overwhelm you. So the glass half-full approach, when things go seriously wrong, doesn't work. All that does is overload your mental model, and your situation awareness collapses. You always need to know some of the basic rules of flight. You need to be able to multiply and divide in case your calculator doesn't work. Know your position in case the GPS fails. Have a chronic unease and a healthy scepticism for computers and automation. Have a sense of reasonableness: does this seem right? And if it doesn't seem right, question it. Particularly in a crisis, where many things are going wrong, the whole logic of the system might collapse: one parameter feeds into another system, and if that parameter goes skew-whiff, the whole system might collapse.

On QF32, we had an ECAM saying "Low pressure over the hydraulic oil reservoir"; then we saw the oil level of the reservoir decreasing; then we saw the level of the oil get to zero; then we saw an ECAM system loss for that hydraulic system. So, we turned off the hydraulic pumps. All those indications were wrong. The A380, like a lot of aircraft, has a thing called the CAN, or CAN bus, and it's liable to errors where you cut both wires. What's the difference between no data and a zero? When you pull the contact off the pressure transducer in your car, the oil pressure gauge goes to zero. Does that mean you have zero oil pressure? Maybe not. So, be sceptical of automation. Always have a foundation of knowledge, so that when things go seriously wrong, if you can't understand it from the top down, invert your logic and bring it up from the bottom up. Build your aeroplane from the ground up, a bit like the Junkyard Wars we see on TV: you just need a few wheels. Keep it simple. An aeroplane is just a flying lawnmower; it's not that much more complicated.
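That "no data versus zero" distinction can be made concrete with a few lines. This is a sketch only, assuming nothing about the A380's actual CAN implementation; it uses Python's Optional to keep "the sensor said zero" and "the sensor said nothing" as distinct states instead of collapsing both to 0.

```python
# Sketch: a lost signal (None) must not be read as a measured zero.
from typing import Optional

def oil_quantity_status(reading_l: Optional[float]) -> str:
    """Distinguish 'no data' from a genuine zero-quantity reading."""
    if reading_l is None:
        return "DATA INVALID: indication unreliable, verify before acting"
    if reading_l <= 0.0:
        return "ZERO QUANTITY (measured): system loss likely"
    return f"{reading_l:.0f} L"

print(oil_quantity_status(None))  # cut wires / dead bus: not the same as zero
print(oil_quantity_status(0.0))   # an actual measured empty reservoir
```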

Having a simulator with that level of automation means you can practise and get used to the automation before you actually get into the aircraft. And so the automation, I think, is important. You've got to know what your whole system is capable of doing, so that you can stay ahead of the aircraft too. You don't want to press a button and have it do something you're not prepared for it to do. Automation is important, knowing how to set up the automation is important, and so practising it becomes important as well.

I'll get a lot of people my age who'll say, "The next generation, they can't stay focused". The reality is the next generation is exactly what we need. They are tech savvy. They can move information rapidly through multiple sources, including apps. That is a great thing. And if you look at the technology, the automation today within our modern aircraft, it's exactly what we want.

The advantage of VR is cost-effective training that's experiential. A lot of what we want to do is give you training that is more representative of what you can expect in the workplace, and what you can build into VR, similar to what you get in high-end simulators, are scenarios that make you feel some of the pressures and tensions and so forth. So, VR is a great opportunity to make training more cost-effective, to significantly enhance its quality, and to make it more accessible to many, for big benefits. It's still emerging, and we've still got a long way to go, but I'm aware of a number of organisations that have significantly enhanced the quality of their training through the use of VR.

Content and Video by Civil Aviation Safety Authority

CASA resource booklet 10: Design and automation