The fatal crash of a Tesla with no one apparently behind the wheel has cast a new light on the safety of semi-autonomous vehicles and the nebulous U.S. regulatory terrain they navigate.
Police in Harris County, Texas, said a Tesla Model S smashed into a tree on Saturday at high speed after failing to negotiate a bend and burst into flames, killing one occupant found in the front passenger seat and the owner in the back seat.
Tesla Chief Executive Elon Musk tweeted on Monday that preliminary data downloaded by Tesla indicated the vehicle was not operating on Autopilot and was not equipped with the automaker’s “Full Self-Driving” (FSD) system.
Tesla’s Autopilot and FSD, as well as the growing number of similar semi-autonomous driving functions in cars made by other automakers, present a challenge to officials responsible for motor vehicle and highway safety.
The U.S. federal road safety authority, the National Highway Traffic Safety Administration (NHTSA), has yet to issue specific regulations or performance standards for semi-autonomous systems such as Autopilot, or for fully autonomous vehicles (AVs).
There are no NHTSA rules requiring carmakers to ensure the systems are used as intended or to stop drivers from misusing them. The only significant federal requirement is that vehicles have the steering wheels and human controls mandated under existing rules.
With no performance or technical standards, systems such as Autopilot inhabit a regulatory grey area.
The Texas crash follows a string of crashes involving Tesla cars being driven on Autopilot, its partially automated driving system which performs a range of functions such as helping drivers stay in lanes and steer on highways.
Tesla has also rolled out what it describes as a “beta” version of its FSD system to about 2,000 customers since October, effectively allowing them to test how well it works on public roads.
Harris County police are now seeking a search warrant for the Tesla data and said witnesses told them the victims intended to test the car’s automated driving.
Adding to the regulatory confusion is that traditionally NHTSA regulates vehicle safety while departments of motor vehicles (DMVs) in individual states oversee drivers.
When it comes to semi-autonomous functions, it may not be apparent whether the onboard computer or the driver is controlling the car, or whether the supervision is shared, says the U.S. National Transportation Safety Board (NTSB).
California has introduced AV regulations but they only apply to cars equipped with technology that can perform the dynamic driving task without the active physical control or monitoring of a human operator, the state’s DMV told Reuters.
It said Tesla’s full self-driving system does not yet meet those standards and is considered a type of Advanced Driver Assistance System that it does not regulate.
That leaves Tesla’s Autopilot and its FSD system operating in regulatory limbo in California as the automaker rolls out new versions of the systems for its customers to test.
NHTSA, the federal body responsible for vehicle safety, said this week it has opened 28 investigations into crashes of Tesla vehicles, 24 of which remain active, and at least four, including the fatal Texas accident, occurred since March.
NHTSA has repeatedly argued that its broad authority to demand automakers recall any vehicle that poses an unreasonable safety risk is sufficient to address driver assistance systems.
So far, NHTSA has not taken any enforcement action against Tesla’s advanced driving systems.
White House spokeswoman Jen Psaki said NHTSA is “actively engaged with Tesla and local law enforcement” on the Texas crash.
The NTSB, a U.S. government agency charged with investigating road accidents, has criticized NHTSA’s hands-off approach to regulating cars with self-driving features and AVs.
“NHTSA refuses to take action for vehicles termed as having partial, or lower level, automation, and continues to wait for higher levels of automation before requiring that AV systems meet minimum national standards,” NTSB Chairman Robert Sumwalt wrote in a Feb. 1 letter to NHTSA.
“Because NHTSA has put in place no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV control system’s limitations,” the letter said.
REVIEWING REGULATIONS
NHTSA told Reuters that with a new administration in place, it was reviewing regulations around AVs and welcomed the NTSB’s input as it advanced policies on automated driving systems.
It said the most advanced vehicle technologies on sale required a fully attentive human driver at all times.
“Abusing these technologies is, at a minimum, distracted driving. Every State in the nation holds the driver responsible for the safe operation of the vehicle,” NHTSA told Reuters.
NTSB also says NHTSA does not have any method to verify whether carmakers have adopted system safeguards. For example, there are no federal regulations requiring drivers to touch the steering wheel within a specific time frame.
“NHTSA is drafting rules on autonomous vehicles, but it has been slow to regulate semi-autonomous vehicles,” said Bryant Walker Smith, a law professor at the University of South Carolina. “There is a growing awareness that they deserve more scrutiny and regulatory action.”
New York has a law requiring drivers to keep at least one hand on the wheel at all times but no other states have legislation that could prevent the use of semi-autonomous cars.
When it comes to AVs, 35 states have enacted legislation or state governors have signed executive orders covering AVs, according to the National Conference of State Legislatures.
Such rules allow companies such as Alphabet’s (GOOGL.O) Waymo and General Motors’ (GM.N) Cruise, among others, to test their vehicles on public roads.
But regulations differ by state.
AV regulations in Texas state that vehicles must comply with NHTSA processes, though there are no such federal regulations. The Texas Department of Public Safety, the regulator charged with overseeing AVs, did not respond to a request for comment.
Arizona’s transport department requires companies to submit regular filings to verify, among other things, that vehicles can operate safely if the autonomous technology fails.
While most automakers offer vehicles with various forms of assisted driving, there are no fully autonomous vehicles for sale to customers in the United States.
RED FLAGS
In February 2020, Tesla’s director of autonomous driving technology, Andrej Karpathy, identified a challenge for its Autopilot system: how to recognize when a parked police car’s emergency flashing lights are turned on.
“This is an example of a new task we would like to know about,” Karpathy said at a conference during a talk about Tesla’s effort to deliver FSD technology.
In just over a year since then, Tesla vehicles have crashed into police cars parked on roads on four separate occasions, and since 2016 at least three Tesla vehicles operating on Autopilot have been involved in fatal crashes.
U.S. safety regulators, police and local government have investigated all four incidents, officials told Reuters.
At least three of the cars were on Autopilot, police said. In one of the cases, a doctor was watching a movie on a phone when his vehicle rammed into a trooper’s parked car in North Carolina.
Tesla did not immediately respond to a request for comment.
Accidents and investigations have not slowed Musk’s drive to promote Tesla cars as capable of driving themselves.
In a recent tweet, Musk said Tesla is “almost ready with FSD Beta V9.0. Step change improvement is massive, especially for weird corner cases & bad weather. Pure vision, no radar.”
Tesla also says it has used 1 million cars on the road to collect image data and improve Autopilot, using machine learning and artificial intelligence.
Tesla’s Karpathy said he has ridden in his Tesla for 20 minutes to get coffee in Palo Alto with no intervention.
“It is not a perfect system but it is getting there,” he said in a “Robot Brains” podcast in March. “I definitely keep my hands on the wheel.”