A Tesla car involved in a fatal crash on a southern California freeway last week may have been running on autopilot, according to the California Highway Patrol.
The May 5 accident in Fontana, a town 50 miles east of Los Angeles, is also under investigation by the National Highway Traffic Safety Administration (NHTSA). This is the 29th case involving a Tesla that the federal agency has investigated.
In the Fontana crash, a 35-year-old man was killed when his Tesla Model 3 hit an overturned semi-trailer on a highway around 2:30 a.m. The driver’s name has not been made public. Another man was seriously injured when the electric vehicle struck him as he was helping the semi’s driver out of the wreckage.
The California Highway Patrol (CHP) announced Thursday that its preliminary investigation had determined that the Tesla’s partially automated driving system “was on.”
But on Friday, the agency walked back its earlier statement. “To clarify,” a new statement said, “there has been no final determination as to what driving mode the Tesla was in or whether it contributed to the crash.”
At least three people have died in previous U.S. crashes involving the system.
The CHP initially said it was commenting on the Fontana crash because of the “high level of interest” in Tesla crashes and because it was “an opportunity to remind the public that driving is a complex task that requires the driver’s full attention.”
The federal investigation comes after the CHP arrested a man who authorities said was in the back seat of a Tesla driving on Interstate 80 near Oakland with no one behind the wheel.
The CHP did not say whether officials had determined that the Tesla in the I-80 incident was on autopilot, which can keep a car centered in its lane and a safe distance behind vehicles in front of it. But it is likely that either autopilot or “full self-driving” was operating for the driver to be in the back seat. Tesla allows a limited number of owners to test its full self-driving software.
Tesla, which has disbanded its public relations department, did not respond to an email seeking comment. The company states in owner’s manuals and on its website that autopilot and full self-driving are not fully autonomous and that drivers must pay attention and be ready to intervene at all times.
Autopilot has struggled to deal with stationary objects and traffic crossing in front of Teslas. In two Florida crashes, in 2016 and 2019, cars with autopilot engaged drove under crossing semi-trailers, killing the men behind the wheel of the Teslas. In a 2018 crash in Mountain View, California, an Apple engineer whose Tesla was on autopilot was killed when the car struck a freeway barrier.
Tesla’s system, which uses cameras, radar and short-range sonar, also has trouble handling stopped emergency vehicles. Teslas have struck several fire trucks and police vehicles that were stopped on freeways with their flashing emergency lights on.
In March, NHTSA sent a team to investigate after a Tesla on autopilot struck a Michigan State Police vehicle on I-96 near Lansing. Neither the trooper nor the 22-year-old Tesla driver was injured, police said.
After the fatal crashes in Florida and California, the National Transportation Safety Board (NTSB) recommended that Tesla develop a more robust system to ensure that drivers are paying attention, and that it limit the use of autopilot to highways where it can work effectively. Neither Tesla nor the safety agency acted on the recommendations.