LOS ANGELES – Federal safety regulators have sent a team to California to investigate a fatal freeway crash involving a Tesla, just after authorities near Oakland arrested a man who was riding in another Tesla traveling on a freeway with no one behind the wheel.
Experts say both cases put pressure on the National Highway Traffic Safety Administration to take action against Tesla’s partially automated driving system, called Autopilot, which has been involved in multiple crashes that have caused at least three deaths in the United States.
“Details of whether the Tesla was in autonomous mode are still under investigation,” Officer Stephen Rawls, a spokesperson for the California Highway Patrol, said in an email Wednesday.
The Tesla driver, a 35-year-old man whose name has not been released, was killed and another man was seriously injured when the electric car struck an overturned semi-trailer on a freeway. The injured man, a 30-year-old passing motorist, was struck by the Tesla while helping the semi’s driver out of the wreckage.
“We have launched a special crash investigation for this crash. NHTSA remains vigilant in monitoring the safety of all vehicles and motor equipment, including automated technologies,” the agency said in a statement Wednesday.
The investigation comes just after the California Highway Patrol arrested another man who authorities said was in the back seat of a Tesla traveling on Interstate 80 with no one behind the wheel.
Param Sharma, 25, is charged with reckless driving and disobeying a peace officer, the CHP said in a statement on Tuesday.
The statement did not say whether officials had determined if the Tesla was operating on Autopilot, which can keep a car centered in its lane and a safe distance behind vehicles in front of it.
But it is likely that either Autopilot or “Full Self-Driving” was in operation for the driver to be in the back seat. Tesla is allowing a limited number of owners to test its self-driving system.
Tesla, which disbanded its public relations department, did not respond to messages seeking comment on Wednesday.
The Fontana investigation, in addition to probes of two Michigan crashes earlier this year, shows NHTSA is taking a closer look at Tesla’s systems. Experts say the agency needs to get a handle on these systems because people tend to trust them too much, even though the cars cannot drive themselves.
“I think they’re very likely getting serious about this, and we might actually start to see some actions in the not-too-distant future,” said Sam Abuelsamid, senior mobility analyst for Guidehouse Insights, who tracks automated systems.
“I really think the growing number of incidents are adding more fuel to the fire to get NHTSA to do more,” said Missy Cummings, a professor of electrical and computer engineering at Duke University who studies automated vehicles. “I think they’re going to be stronger about it.”
Tesla says on its website and in owner’s manuals that for both driver-assistance systems, drivers must be ready to intervene at all times. But drivers have repeatedly zoned out with Autopilot in use, resulting in crashes in which neither the system nor the driver stopped for obstacles in the road.
The federal agency could declare Autopilot defective and demand that it be recalled, or it could force Tesla to limit the areas where Autopilot can be used to limited-access highways. It could also require the company to install a more robust system to ensure that drivers are paying attention.
The auto industry, with the exception of Tesla, is already doing a good job of limiting where these systems can operate and of monitoring drivers, Cummings said. Tesla appears to be heading in that direction: it now installs driver-facing cameras on recent models, she said.
Tesla has a system that monitors drivers to make sure they are paying attention by sensing hand pressure on the steering wheel. The system will issue warnings and eventually stop the car if it does not detect hands. But critics say Tesla’s system is easy to fool and can take up to a minute to shut the car down. Consumer Reports said in April that it was able to trick a Tesla into driving on Autopilot with no one behind the wheel.
In March, a Tesla official also told California regulators that “Full Self-Driving” is a driver-assistance system that requires human monitoring. In memos released by the state’s Department of Motor Vehicles, the company could not say whether Tesla’s technology would improve enough to become fully autonomous by the end of the year, contrary to statements by the company’s CEO, Elon Musk.
In the back-seat driving case, authorities received several 911 calls Monday evening reporting that a person was in the back of a Tesla Model 3 as the vehicle traveled on Interstate 80 across the San Francisco–Oakland Bay Bridge.
A motorcycle officer spotted the Tesla, confirmed the lone occupant was in the back seat, took steps to stop the car, and saw the occupant move into the driver’s seat before the car came to a stop, the California Highway Patrol statement said.
Authorities said they cited Sharma on April 27 for similar behavior.
In an interview with The Associated Press on Wednesday, Sharma said he had done nothing wrong and would continue to ride in the backseat with no one behind the wheel.
Musk wants him to keep doing this, he said. “It was actually designed to be ridden in from the back seat,” Sharma said. “I feel safer in the back seat than in the driver’s seat, and I feel safer with my car on Autopilot. I trust my Autopilot more than I trust everyone else on the road.”
He thinks his Model 3 can handle itself and doesn’t understand why he had to spend a night in jail.
“The way it is now, I can take an autonomous Tesla from Emeryville to downtown San Francisco from the back seat,” he said, adding that he had traveled about 40,000 miles in Tesla vehicles without being in the driver’s seat.
Sharma’s comments suggest he is one of a number of Tesla drivers who rely too heavily on the company’s driving systems, Duke’s Cummings said.
“It shows you the thought process of people who place too much trust in untested technology,” she said.