Last week was the 50th anniversary of John Glenn’s flight, in which he became the first American to orbit the earth. The problems the flight had with automated controls make me think of how automation-related issues affect the world today.
The automated attitude control system used a control loop that would fire thrusters to correct the orientation of the capsule. The system used small thrusters for minor adjustments and large thrusters intended primarily for changing the capsule’s orbit to return to earth. Glenn found the control loop was at times using the large thrusters for orientation adjustments that could have been done with the smaller jets. This wasted fuel that would be needed for the return to earth. (See Chapter 6 of this NASA document for further reading.)
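The failure mode can be pictured as a simple bang-bang control loop with a badly tuned threshold. Here is a minimal sketch; the thruster names, fuel costs, and the threshold value are all hypothetical illustrations, not actual Mercury capsule figures.

```python
# Hypothetical bang-bang attitude control loop that chooses between
# small and large thrusters. All numbers are illustrative only.

SMALL_THRUST_FUEL = 0.1   # fuel units burned per small-jet correction
LARGE_THRUST_FUEL = 1.0   # fuel units burned per large-thruster correction
LARGE_THRESHOLD = 10.0    # error (degrees) above which the big jets fire

def correct_attitude(error_deg, threshold=LARGE_THRESHOLD):
    """Return (correction applied, fuel burned) for one control cycle."""
    if abs(error_deg) < 0.5:            # inside the deadband: do nothing
        return 0.0, 0.0
    if abs(error_deg) >= threshold:     # large error: use the big thrusters
        return -error_deg, LARGE_THRUST_FUEL
    return -error_deg, SMALL_THRUST_FUEL  # small error: small jets suffice

# A misbehaving loop acts as if the threshold were set far too low:
# small errors now trigger the large thrusters and waste re-entry fuel.
errors = [2.0, -3.5, 1.0, 12.0, -2.0]
fuel_ok = sum(correct_attitude(e)[1] for e in errors)
fuel_bad = sum(correct_attitude(e, threshold=1.5)[1] for e in errors)
```

With the sensible threshold, only the one large error costs a full fuel unit; with the too-low threshold, nearly every small correction does, which is roughly the kind of waste Glenn observed.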
This situation reminds me of driving on a hilly highway using cruise control and automatic transmission. When the road exceeds a certain grade, the cruise control opens the throttle which causes the transmission to downshift out of overdrive to get more power. If I find this is causing needless shifts on small hills, I may disengage the cruise control and wish I could override the automatic transmission too.
Glenn took this approach and turned off the autopilot system. He experimented with controlling the thrusters completely manually and with a semi-automated fly-by-wire mode. The fly-by-wire mode allowed Glenn to control the orientation manually, with the computer working out how much fuel to burn in each thruster. This semi-automated mode turned out to be the best choice for attitude control. He used the fully-automated mode for re-entry but stayed ready to switch the autopilot off if necessary.
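The division of labor in that fly-by-wire mode can be sketched as a function: the human supplies the intent (a desired rotation rate) and the computer handles the mechanics (which jet to fire, and for how long). The jet names and the rate constant below are made-up illustrations, not the actual Mercury figures.

```python
# Hypothetical fly-by-wire sketch: the pilot commands a yaw rate and a
# duration; the computer picks the jet and computes the burn time.
# SMALL_JET_RATE is an illustrative constant, not a real capsule value.

SMALL_JET_RATE = 2.0  # deg/sec of rotation produced per second of burn

def fly_by_wire(commanded_rate_deg_s, duration_s):
    """Translate a pilot's rate command into a (jet, burn_time) pair."""
    jet = "yaw_left" if commanded_rate_deg_s > 0 else "yaw_right"
    burn_time = abs(commanded_rate_deg_s) * duration_s / SMALL_JET_RATE
    return jet, burn_time
```

The pilot never reasons about burn times directly, which is exactly the part of the task the computer is better at; the judgment about *where* to point stays with the human.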
The paradox of automation is that once an automated system is nearly perfected, the combined human-machine system often performs worse than one built on less reliable automation, because human operators become dependent on the automation. This appears to be what happened in the Air France Flight 447 crash in 2009. A loss of airspeed indication caused the plane to switch from a fully automated mode to a semi-automated one. The fully-automated mode would not have allowed the pilots to put the plane into an aerodynamic stall. This may have been why the pilots ignored an audible stall warning and never even discussed the possibility that the plane was in a stall.
It is especially hard to improve safety in commercial aviation because it is already so safe. Maybe as automation improves further, we will simply accept that humans become less skilled, because most of the time computers are more reliable. On the other hand, systems might be made more reliable through psychological techniques in which the automated system periodically requests input from human operators, keeping them sharp and engaged in the details of the system.