MADISON, Wis.—In designing an autonomous car, leading developers disagree on whether the vehicle should be driverless or driver “optional.”
Toyota, for example, advocates what the company calls the “Mobility Teammate Concept,” which Toyota says makes the car a partner, or “teammate,” for the driver. In contrast, Google believes in a self-driving car with no steering wheel and no human driver. Tesla, meanwhile, pushes “autopilot” features said to be inspired by aviation. Tesla’s autopilot, however, still requires a driver sitting in the driver’s seat.
This disparity in self-driving approaches illustrates the nearly impossible tasks automakers face today: making infallible autonomous cars and predicting human behavior in interactions with highly automated systems.
Comparing flying with driving
Fortunately, automakers aren’t charting a new course here. The aviation industry has fully embraced automation in flight control and navigation systems since the mid-1970s.
So, what have we learned? And how applicable are those lessons to self-driving cars? Or, setting cockpit automation aside, is a direct comparison between flying and driving a matter of apples and oranges?
One thing we know is that automation has made flights safer.
The yearly fatal accident rate (per million departures) in commercial jet operations has drastically declined, from 4 in the mid-1960s to well below 1 in 2014, according to the latest Airbus statistics shown above.
Safety isn’t the only outcome of automation, though. “Automation reduces our workload [for pilots], it makes flights a hell of a lot smoother for passengers, and it saves fuel,” Richard Hartman, a retired commercial and military pilot, told EE Times.
Indeed, technology developers hit all these points when they advocate autonomous cars.
But human factors engineers argue that once a human gets involved, the outcome of adding more automation to a system proves not only complicated but also unpredictable.
Michael Clamann, senior scientist at Duke, who heads up the university’s driverless car research, told us, “Adding lots of automation does not make it [driving] easier. Instead, it changes it.”
Before joining Duke, Clamann had worked since 2002 as a human factors engineer in industry, supporting government and private clients in domains that included aerospace, defense, and telecommunications. He explained that, first, automation shifts the driver’s task from a physical one to a mental one. Second, once detached from manual control, the driver enters a “supervisory” control mode. “That changes a few things.”
Failures of human-automation interaction
Look no further than past airplane accidents. Automation experts say that many accidents labeled as human error by the Federal Aviation Administration and the National Transportation Safety Board might be better categorized as “failures of human-automation interaction.”
To read the rest of this article, visit EBN sister site EE Times.