Tesla delivers “Full Self-Driving” beta 9


Tesla has started sending out over-the-air software updates for its long-awaited “Full Self-Driving” beta 9, a driver assistance system that is certainly not autonomous, but is certainly advanced.

As promised by Elon Musk, the software update (2021.4.18.12) began downloading after midnight Friday, giving thousands of Tesla owners who have purchased the FSD option access to the feature, which allows drivers to use many of Autopilot’s advanced driver assistance features on local, non-highway streets.

Musk has been promising software v9 for, well, a while now. He said in 2018 that the “long-awaited” version of FSD would start rolling out that August. He promised it again in 2019, proclaiming that “in a year” there would be “over a million cars with full self-driving, software, everything.” Earlier this month, he claimed that the FSD 9 beta would be “available soon.” So to say that Tesla fans have been anticipating this update for a while would be an understatement.

The real question is whether it is ready for prime time. To this, Musk gave a typically mixed response, tweeting that “Beta 9 addresses most known issues, but there will be unknown issues, so please be paranoid,” and adding that “Safety is always top priority at Tesla.” Release notes included with the update warn testers that the software “may do the wrong thing at the worst time” and urge them to avoid complacency. They also mention improvements to the in-cabin driver-facing camera used to monitor attention, along with updated, larger visualizations on the car’s screen (as seen above).

There is no doubt that Tesla is more willing than its competitors to beta test its Autopilot driver assistance features on its customers in the interest of collecting data and working out the system’s bugs. And Tesla customers are generally on board with that, regularly flooding Musk’s mentions with pleas for admission into the company’s early access program for beta testers. This has contributed to Tesla’s public reputation as a leader in autonomous driving, although its vehicles consistently fall short of what most experts agree defines a self-driving car.

Tesla warns that drivers should keep their eyes on the road and their hands on the wheel at all times, although the automaker has so far declined to include a more robust driver monitoring system (such as infrared eye tracking) to ensure that its customers follow safety protocols (though this may change). Autopilot is considered a “partially automated” Level 2 system under Society of Automotive Engineers standards (and by Tesla’s attorneys), which requires drivers to keep their hands on the wheel and their eyes on the road.

However, consumer advocates have demonstrated that Tesla’s system can easily be tricked into thinking someone is in the driver’s seat, which drew renewed attention following a fatal crash in Texas involving a Tesla, in which authorities said there was no one behind the wheel.

But that hasn’t stopped some Tesla owners from abusing Autopilot, sometimes going so far as to film and publish the results. Drivers have been caught sleeping in the passenger seat or in the back seat of their Tesla while the vehicle traveled down a crowded highway. A Canadian man was charged with reckless driving last year after being pulled over for sleeping while traveling at 93 mph.

Since Tesla introduced Autopilot in 2015, there have been at least 11 deaths in nine crashes in the United States involving the driver assistance system. Internationally, there have been at least nine more deaths in seven additional crashes.

Meanwhile, the US government now requires automakers to report crashes involving autonomous vehicles or advanced driver assistance systems, in some cases within a day of the incident. It is a major change that signals a tougher stance by regulators on these partially automated systems.

