This week, Tesla began offering its Full Self-Driving (FSD) update to a select group of customers, and the first reactions are now starting to appear. The software, which allows drivers to use many of Autopilot’s advanced driver assistance features on local, non-highway streets, is still in beta. As such, it requires constant monitoring while in operation. Or as Tesla warns in its introductory language, it “may do the wrong thing at the worst time.”
Frankly, it sounds terrifying – not because the software seems erratic or flawed, but because of how it will inevitably be misused.
Early reactions to the software update range from “it was a little scary” to outright enthusiasm for CEO Elon Musk’s willingness to let his customers test out features that aren’t ready for wide release. That willingness has helped Tesla maintain its position at the cutting edge of electric and autonomous vehicle technology, but it also poses a huge risk to the company, especially if those early tests go wrong.
A Tesla owner who goes by the handle “Tesla Raj” posted a 10-minute video on Thursday that claims to show his experience with FSD. He says he used the feature while driving on “a residential street … without lane markers” – something Tesla’s Autopilot was previously unable to handle.
From the outset, there are marked differences in how FSD is presented to the driver. The visuals displayed on the dashboard look more like an autonomous vehicle’s training imagery, with transparent orange boxes depicting parked cars and other vehicles on the road, and icons representing traffic signs. The car’s trajectory is shown as a string of blue dots extending in front of the vehicle. And various messages appear telling the driver what the car is about to do, such as “Stopping for traffic control in 75 feet.”
The car also made several left and right turns on its own, which Raj described as “a little scary because we’re not used to it.” He also said the turns felt “human,” as the vehicle moved into the opposite lane to position itself before entering the turn.
Another Tesla owner who lives in Sacramento, Calif., and tweets under the handle @brandonee916 has posted a series of short videos claiming to show a Tesla vehicle using FSD to navigate a host of challenging driving scenarios, including intersections and a roundabout. These videos were first reported by Electrek.
Too cautious at a roundabout … I didn’t have to intervene until the end of the process. Not bad for the first attempt! Go FSD BETA! pic.twitter.com/3gPkztUWgY
– Brandonee916 (@brandonee916) October 22, 2020
The Tesla Raj and @brandonee916 test vehicles are driving at moderate speeds, between 25 and 35 mph, a range that has proved very difficult for Tesla. Musk has said Tesla’s Autopilot can handle high-speed driving with its Navigate on Autopilot feature and low speeds with its Smart Summon parking feature. (How well Smart Summon works is subject to debate, given the number of Tesla owners reporting bugs in the system.) The company has yet to allow its customers to drive hands-free on highways, as Cadillac has with its Autopilot competitor, Super Cruise. But it’s at those moderate speeds, where the vehicle is more likely to encounter traffic lights, intersections, and other complexities, that Tesla has run into the most trouble.
For now, FSD is only available to Tesla owners in the company’s early access beta testing program, but Musk said he expects a “wide release” before the end of 2020. The risk, of course, is that Tesla’s customers will ignore the company’s FSD warnings and abuse the feature by recording themselves performing dangerous stunts – much as they have done for years and continue to do regularly. This kind of rule-breaking is to be expected, especially in a society where clout chasing has become a way of life for many people.
Tesla says Autopilot should only be used by attentive drivers with both hands on the wheel. But the feature is designed to assist a driver, and it’s not foolproof: there have been several high-profile incidents in which drivers engaged Autopilot, crashed, and died.
“Public road testing is a serious responsibility, and using untrained consumers to validate beta-level software on public roads is dangerous and inconsistent with existing guidance and industry norms,” said Ed Niedermeyer, communications director for Partners for Automated Vehicle Education, a group that includes nonprofits and autonomous vehicle operators like Waymo, Argo, Cruise, and Zoox. “Moreover, it is extremely important to clarify the line between driver assistance and autonomy. Systems requiring human oversight by a driver are not autonomous and should not be described as autonomous.”