Study Highlights Driver Confusion About Tesla's Autopilot System
Automakers including Tesla are using names for their automated-driving systems that can mislead drivers about what the technology can actually do.
Vehicles are getting increasingly sophisticated, with more and more of them able to stay in a lane and maintain a set speed and following distance with minimal driver input. But this kind of automation has limitations that many drivers don't seem to recognize, according to a study by the Insurance Institute for Highway Safety (IIHS).
The study revealed how the names manufacturers use for these systems can send the wrong messages to drivers regarding how attentive they should be.
Under the classification system developed by SAE International, the automation available in vehicles for purchase today is considered Level 1 or 2, which covers systems that can perform one or more parts of the driving task under the driver's supervision. Examples of Level 1 systems include lane centering, in which lateral control of the vehicle is automated, and adaptive cruise control, in which longitudinal control (speed and following distance) is automated. Systems that can perform both of those functions simultaneously are Level 2 systems.
These systems are a far cry from Level 5 automation, in which the entire driving task can be performed without input from a human under all conditions.
Despite the limitations of today's systems, some of their names seem to overpromise when it comes to the degree to which the driver can shift their attention away from the road. One name in particular — Tesla's Autopilot — signals to drivers that they can turn their thoughts and their eyes elsewhere, an IIHS survey found.
For the survey, more than 2,000 drivers were asked about five Level 2 system names currently on the market. The names were Autopilot (used by Tesla), Traffic Jam Assist (Audi and Acura), Super Cruise (Cadillac), Driving Assistant Plus (BMW) and ProPilot Assist (Nissan). Participants were told the names of the systems but not the vehicle brands associated with them and weren't given any other information about the systems.
None of these systems reliably manage lane-keeping and speed control in all situations. All of them require drivers to remain attentive, and all but Super Cruise warn the driver if hands aren't detected on the wheel. Super Cruise instead uses a camera to monitor the driver's gaze and will issue a warning if the driver isn't looking forward.
Each participant answered questions about two of the systems chosen at random. They were asked whether particular behaviors were safe while using that technology.
When asked whether it would be safe to take one's hands off the wheel while using the technology, 48 percent of people asked about Autopilot said they thought it would be, compared with 33 percent or fewer for the other systems. Autopilot also had substantially greater proportions of people who thought it would be safe to look at scenery, read a book, talk on a cellphone or text. Six percent thought it would be OK to take a nap while using Autopilot, compared with 3 percent for the other systems.
At least a few Tesla owners have been misusing Autopilot in this way, with fatal results.
In March, a Tesla driver crashed into the side of a tractor-trailer in Florida. The Model 3 went completely under the truck, which sheared off its roof and killed the driver. A preliminary investigation by the National Transportation Safety Board found that Autopilot was engaged at the time of the crash and that the driver's hands were not detected on the steering wheel.
The same was true in a California crash of a Tesla Model X a year earlier and in a 2016 Florida crash of a Model S that also involved the side of a tractor-trailer.
"Tesla's user manual says clearly that the Autopilot's steering function is a ‘hands-on feature,' but that message clearly hasn't reached everybody," said IIHS President David Harkey. "Manufacturers should consider what message the names of their systems send to people."