Self-driving cars have now seen their first fatality, with the death of a Tesla Model S driver on a rural highway in Florida.
Automakers are racing each other to deploy self-driving features, urged on by a federal government that sees autonomous cars as a big advance in safety. Many analysts don’t expect the Tesla accident, which happened while the car was steering itself in autopilot mode, to slow down the technology’s rapid advance – even if a few wish it would.
And while the public appears split on robot cars – some welcoming the idea, others viewing it with deep suspicion – drivers have a history of forgiving problems in new auto technology. Airbags, after all, are widely considered an important safety feature, even though they have on occasion killed people.
“It’s a black eye for the technology,” said Ron Montoya, consumer advice editor for Edmunds.com, the online car guide. “I think we’ll get past this. People die in cars every day, unfortunately, and we don’t stop driving. To this day, you can break your nose when an airbag deploys, but they’re still a preferred safety technology.”
With the National Highway Traffic Safety Administration investigating the Tesla accident, the Palo Alto company may come under pressure to temporarily deactivate the autopilot system it introduced last year.
“They absolutely need to shut it down until they get a better handle on this,” said John Simpson with the nonprofit Consumer Watchdog. “It’s not appropriate to use your customers as human guinea pigs.”
In response, a Tesla spokeswoman said the company is constantly refining its technology but still expects drivers to pay attention. Some reports indicate the driver killed in the May 7 crash – Joshua Brown, of Canton, Ohio – may have been watching a Harry Potter film when his car slammed into a big-rig truck. (Tesla said the Model S’s touchscreen does not play videos. A DVD player was found in the car.)
“Autopilot is by far the most advanced such system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility,” the spokeswoman said.
The tendency of drivers to let their eyes and minds wander has been one of the key issues bedeviling self-driving technology.
Google, which has been testing its own autonomous fleet on the streets of Silicon Valley for years, at one point ran an experiment in which it let employees take self-driving cars on their daily commutes, with their behavior behind the wheel recorded on camera.
Footage showed drivers at times rummaging through items in the backseat, their eyes nowhere near the road. That experience led Google to design a prototype autonomous car that didn’t even include a steering wheel – ensuring the car would do all the work.
Some Tesla owners have posted online videos of themselves making questionable use of autopilot, including a driver in the Netherlands who shot the video while sitting in the backseat, with no one behind the wheel. Earlier this week, an online video surfaced that appears to show a Model S driver sleeping while his car navigates stop-and-go traffic.
Tesla reminds its customers that autopilot is still in testing and needs an alert human backup at all times. But clearly, some drivers ignore those warnings. The fatal crash may serve as a reminder.
“If there are less YouTube videos of guys sticking their heads and arms out the windows with their feet on the dash, that will be a good thing,” said Karl Brauer, senior analyst at the Kelley Blue Book auto information service. “We’ve seen far too much of that already, and all of it suggests an accident like this was inevitable.”
According to Tesla, the crash happened while the car was driving down a divided highway. A tractor trailer tried to drive across the highway, perpendicular to the Model S. While the autopilot’s sensors and software are designed to spot obstacles, the system could not distinguish the truck’s white side against the backdrop of a bright sky, according to the company. Neither the driver nor the car applied the brakes.
(Tesla’s cars constantly record information about their trips and performance and relay that information to the company, data that can be used to help reconstruct accidents.)
Brauer noted that the crash appeared to have two components – human inattention and technical failure. Autopilot uses both radar and a camera, and neither seems to have picked up the truck.
“I don’t think they rolled out autopilot too early,” he said. “I do think they didn’t emphasize its limitations enough.”
Indeed, Tesla CEO Elon Musk in January even said the system was “probably better than a person right now.”
Tesla is used to its cars facing scrutiny.
The company is the first new U.S. automaker to reach full-scale production in decades. And its expensive, fast, sumptuous cars have made it a standard-bearer for the electric vehicle movement.
So any possible problem with those cars can draw attention.
Over a six-week span in 2013, for example, the battery packs of three Model S sedans caught fire after traffic accidents. Two of those accidents involved large pieces of metal debris striking the cars, while in the third, the driver raced through a roundabout in Mexico, smashed through two walls and hit a tree. The fires made national news, even though roughly 152,300 cars catch on fire in the United States in a typical year.
Feeling the company had been smeared, angry Tesla fans took to Twitter, posting photos of gasoline-powered cars engulfed in flames with the hashtag #notaTesla.
None of the Model S fires, however, injured drivers, much less ended in death.
To date, the vast majority of known accidents involving self-driving cars have been minor. Google, for example, last year reported that its vehicles had been hit by other cars 14 times since 2009, and in each case the human driver of the other car was at fault. Most of the collisions involved Google's cars being rear-ended at low speed at traffic lights.
In February, a Google car did, however, veer into a public bus in Mountain View, again at low speed. No one was injured.
Should the autopilot fatality prove to be an isolated incident, most analysts don't expect it to slow the introduction of self-driving technology. Truly autonomous cars – those that can handle every driving task themselves – are still years from hitting the market, after all.
“Consumer memory is pretty short on things like this,” Brauer said. “But if this is the first in a series, or if there’s a really dramatic incident with multiple vehicles … that will certainly negatively impact public acceptance of this technology.”