The Paradox of Automation

This post from Marc Rubinstein (which was shortlisted for this month’s Post of the Month vote) on the lessons of the tragic crash of Air France Flight 447 ten years ago brought together a number of interesting concepts. One of these was the so-called ‘paradox of automation’, the idea that:

‘…the more efficient the automated system, the more crucial the human contribution of the operators…If an automated system has an error, it will multiply that error until it’s fixed or shut down. This is where human operators come in. Efficient Automation makes humans more important, not less.’

Marc draws on William Langewiesche’s account of the crash in Vanity Fair, which noted how the growing capabilities of autopilot and fly-by-wire systems had dramatically improved airline safety records. But those same systems had also shielded pilots so effectively from hands-on flying, and from the need to deal with unexpected challenges, that when difficult situations did occur pilots had little direct experience to draw on.

In his account of the same crash, Tim Harford describes the paradox of automation as having three strands (paraphrased here):

  1. Since automated systems can be so easy to operate, automatically correcting mistakes as they go, they can accommodate incompetence: an operator’s lack of skill can remain hidden for a long time
  2. Automatic systems can erode the skills of even expert operators over time by removing the need for practice
  3. When automated systems fail, they often fail in atypical circumstances or in ways that produce unusual situations, amplifying the need for a particularly skillful response. The more capable the automated system, the worse the potential situation might be.

The fly-by-wire systems in the Airbus A330, the type involved in the Air France crash, were so effective that they had actually created a new type of (potentially hidden) risk, one with the further potential to be amplified by existing human biases and frailties such as cognitive tunnelling. Harford refers to one of Earl Wiener’s laws of aviation and computerised flight: ‘Digital devices tune out small errors while creating opportunities for large errors’.
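Wiener’s law is abstract enough that a toy simulation may help make it concrete. The sketch below is a hypothetical illustration in Python, not a model of any real flight system; the function name and every number in it are invented. It shows an automated corrector silently absorbing both routine noise and a slowly growing fault, so the operator sees a perfectly flat line right up until the automation runs out of control authority and the accumulated error surfaces all at once.

```python
import random

def toy_autopilot_demo(steps=500, authority=5.0, drift=0.02, seed=1):
    """Toy illustration of Wiener's law: an automated corrector absorbs
    small errors silently, masking a slowly growing fault, then confronts
    the operator with a large error the moment its authority runs out.
    All names and numbers here are invented; this models no real system."""
    random.seed(seed)
    bias = 0.0  # a hidden, slowly growing fault (e.g. a drifting sensor)
    for step in range(steps):
        noise = random.gauss(0.0, 1.0)  # ordinary small disturbances
        error = bias + noise            # true deviation this step
        # The automation corrects everything within its authority...
        correction = max(-authority, min(authority, error))
        residual = error - correction   # ...so this is all the operator sees
        if residual != 0.0:
            # Until this step the operator saw nothing but a flat line.
            print(f"step {step}: automation saturated; the hidden fault "
                  f"had grown to {bias:.2f} with no visible warning")
            return
        bias += drift                   # the fault keeps growing unnoticed
    print("the fault never exceeded the automation's authority")

toy_autopilot_demo()
```

The point is not the specific numbers but the shape of the failure: the better the corrector, the longer the silence, and the larger and less familiar the problem that finally lands in human hands.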

As automation becomes ever more pervasive, this paradox has the potential to create more unexpected failures like the Air France crash, which is perhaps why level-five automation in self-driving cars will take many years to arrive, if it arrives at all.
