BUSINESS

NHTSA probes Tesla self-driving cars after fatal crash

Melissa Burden, Keith Laing and Michael Wayland
The Detroit News

Federal regulators are investigating self-driving technology used in Tesla cars after a fatal crash involving a 2015 Tesla Model S that was operating with its automated driving system activated.


It is believed to be the first U.S. death in a vehicle being driven with a semi-autonomous feature engaged, though the National Highway Traffic Safety Administration would not confirm that.

NHTSA said Thursday it received reports from Tesla about a May 7 crash in Williston, Florida, near Gainesville, in which the vehicle was operating in its “Autopilot” mode.

NHTSA said preliminary reports indicate the crash happened when a semi-trailer rig turned left in front of the Tesla at a highway intersection. Police said the roof of the car struck the underside of the trailer and the car passed beneath it. The car went off the road, striking two wire fences and a power pole before coming to rest about 100 feet away. The driver died at the scene.

“Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” Tesla said in a blog post Thursday.

“The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.”

As a result of the crash, NHTSA is investigating about 25,000 Tesla Model S cars and will “examine the design and performance of any automated driving systems in use at the time of the crash.” The safety regulator said it will gather more data about the Florida crash as well as information about automated driving systems.

“The opening of the preliminary evaluation should not be construed as a finding that the Office of Defects Investigation believes there is either a presence or absence of a defect in the subject vehicles,” NHTSA said.


Driver an ex-Navy SEAL

The Florida Department of Highway Safety and Motor Vehicles, which operates the state’s highway patrol, identified the driver who was killed as Joshua Brown of Canton, Ohio. Brown, a former Navy SEAL, was 40.

Brown had posted several YouTube videos about Tesla’s Autopilot feature, including one in April that showed a near-miss when a truck almost side-swiped his car on the freeway. Tesla CEO Elon Musk tweeted a link to the video later that month, writing: “owner video of Autopilot steering to avoid collision with a truck.” The video has generated more than 1.7 million views.

Tesla said in its blog post that Brown’s death is the “first known fatality in just over 130 million miles where Autopilot was activated,” and noted by comparison that there is a traffic fatality every 94 million vehicle miles in the United States and approximately every 60 million miles worldwide.

Analysts say the fatality could hurt Tesla’s safety reputation, and the company may opt for a voluntary recall or a stop-sale of vehicles with the Autopilot feature. The incident may also deepen public apprehension about self-driving cars.

“I’d like to say I didn’t see this coming, but it was inevitable based on the documented abuses of driver-assist technology we’ve been seeing on sites like YouTube,” said Karl Brauer, a senior analyst for Kelley Blue Book. “We do not yet have fully autonomous cars, though all the headlines about them might lead some people to think we do. This will be a big hit to Tesla’s reputation because the automaker has been seen as a leader in both passenger safety and advanced technology.”

Tesla declined to say whether it will disable Autopilot on its vehicles as a result of the accident, pointing instead to a statement that “Autopilot is designed to add a layer of safety to Model S and to ensure this, it’s important that customers use Autopilot responsibly and as intended.”

The automaker has made Autopilot hardware standard on every Model S (and now Model X) since September 2014, but the software wasn’t ready until later. The first Autopilot software update was released in late 2015 and sent wirelessly to all cars; owners could download the new features in a couple of hours.

The feature allows the car to steer within a lane, change lanes with the tap of a turn signal, and manage speed using active traffic-aware cruise control. The system relies on a camera, radar, ultrasonic sensors and GPS.

Tesla said Autopilot is turned off every time the car is shut down and “requires explicit acknowledgment that the system is new technology and still in a public beta phase before it can be enabled.” When drivers activate Autopilot, the car reminds them that it is an “assist feature” and that they should keep their hands on the steering wheel.

The company said the system makes “frequent checks” to make sure a driver’s hands are on the wheel, providing visual and audible alerts if hand contact isn’t detected. The car will gradually slow down until hands are detected, according to the blog.

Tesla said it was saddened by the news of Brown’s death.

“Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” the blog said.

Crash a ‘wake-up call’

The crash raises questions about whether NHTSA will announce guidelines for autonomous vehicles in July as it had planned. Consumer advocates say the crash should be investigated and understood before rules are issued.

NHTSA Administrator Mark Rosekind said in June the guidelines would be more flexible than existing rules. Automakers have pushed NHTSA to release nationwide standards that states could follow for the testing and deployment of self-driving vehicles.

John Simpson, privacy project director for the California-based nonprofit Consumer Watchdog, said: “The fatal crash of a Tesla while on so-called Autopilot should serve as a wake-up call for all who are rushing to deploy autonomous vehicles. The race to put robot cars with inadequate safety protections on the road is both foolhardy and dangerous.”

Missy Cummings, director of Duke University’s Humans and Autonomy Lab, said the Tesla crash shows the limitations of “quasi-automated cars.”

“This is one of those situations where the system didn’t know what it couldn’t see and neither did the human,” she said.

Cummings, who testified before Congress in March that boosters of self-driving technology are exaggerating its readiness for widespread deployment, said research shows “humans simply do not pay attention when they think the system is good enough.”

She said Thursday: “I hope this accident re-centers the debate on quasi-automated modes, because that’s where the biggest problems are.”

Although she has pushed for a longer testing period for self-driving cars, Cummings said she did not think proponents should give up on the technology because of the fatal accident: “If we stopped advancing aircraft every time we had an accident, we’d still be taking trains.”


mburden@detroitnews.com

(313) 222-2319

Staff writer Michael Martinez contributed.