Tesla driver, 20, is found asleep driving his self-driving car at 93 mph



A Tesla driver has been charged after being found asleep at the wheel of his autonomous car traveling at 93 mph while using the “autopilot system” in Canada.

The man’s Tesla Model S was pictured with its seats fully reclined while speeding down a highway near the town of Ponoka, about 60 miles south of Edmonton, Alberta.

The driver and a passenger were said to be asleep in the two front seats of the car.

When the cops discovered the car traveling at around 86 mph, they turned on their emergency flashing lights – only to have the Tesla “automatically start accelerating” to 93 mph. The speed limit on this stretch of highway is around 68 mph.

“The car appeared to be self-driving, traveling at over 140 km/h, with both front seats fully reclined and both occupants appearing to be asleep,” said a statement from the Royal Canadian Mounted Police (RCMP).

It is not known why the car accelerated to 93 mph (exactly 150 km/h) as it pulled away when the police gave chase.

The 20-year-old driver from British Columbia was charged with speeding and handed a 24-hour license suspension for fatigue.


He was also later charged with dangerous driving and received a summons to appear in court in December.

RCMP Sgt. Darrin Turnbull told CBC News: “Nobody was looking out the windshield to see where the car was going. I’ve been in policing for over 23 years, the majority of it in traffic enforcement, and I’m speechless.

“I’ve never, ever seen anything like this before, but of course the technology wasn’t there.”

Tesla Model S sedans have Autopilot functions that include automatic steering as well as traffic-aware cruise control. In this case, both functions appear to have been in use.

Turnbull added: “We believe the vehicle was operating on the Autopilot system, which is really just an advanced driver safety system, a driver assistance program. You still need to be driving the vehicle.

“But of course, there are aftermarket modifications that can be made to a vehicle, against the manufacturer’s recommendations, to alter or bypass the safety system.”

Pictured: a Tesla Model S, the same model that was caught hurtling down a Canadian highway as its driver used autopilot functions to take a nap (stock image)


The Autopilot feature will steer, accelerate and brake the car within its lane, according to Tesla’s website, which notes that the driver still needs to pay attention. The function “does not make the vehicle autonomous,” it says.

RCMP Superintendent Gary Graham said in the statement: “Although manufacturers of newer vehicles have built-in safeguards to prevent drivers from taking advantage of the new safety systems in vehicles, those systems are just that – supplemental safety systems.

“They are not self-driving systems. The driver still has the responsibility to drive.”

In all Canadian provinces, use of self-driving features is illegal without the presence of an alert driver, with the Insurance Corporation of British Columbia (ICBC) stating that the driver is responsible for the actions of the vehicle when driver-assistance features are activated.

In July, Tesla CEO Elon Musk said he expected his company’s vehicles to be fully autonomous by the end of the year, saying he was already “very close” to meeting “level five” autonomy requirements, which do not require any input from a driver.

Tesla CEO Elon Musk has boldly claimed that Tesla cars could have “level five” autonomy by the end of the year. Level five autonomy means the vehicle does not require driver intervention


Autonomous vehicle laws in Canada, the United States and the United Kingdom

Laws on autonomous, or self-driving, vehicles are currently at different stages from country to country.

Canada has yet to pass comprehensive federal legislation governing their use or the liability issues they raise.

Although the use of autonomous vehicles is legal, a human driver is currently required to be able to take control of the vehicle at all times.

Drivers must therefore be alert and ready to take over the car at any point, and all existing laws – such as those on cell phone use or staying awake – still apply.

In the United States, the legality of using an autonomous vehicle varies from state to state, although no state categorically prohibits them.

As of 2020, Connecticut, District of Columbia, Illinois, Massachusetts, New Hampshire, New York, and Vermont all require a human operator in the vehicle.

Some states (Florida, Georgia, Nebraska, Nevada, North Carolina, North Dakota, Pennsylvania and Washington) require a human operator to be present depending on the level of automation of the vehicle.

Some states do not have autonomous vehicle laws or decrees.

These are Alaska, Kansas, Maryland, Missouri, Montana, New Hampshire, New Jersey, New Mexico, Rhode Island, South Carolina, South Dakota, West Virginia, Wyoming.

Some, like California, actively encouraged self-driving on-road testing as early as 2012.

In the UK, legislation to regulate the use of automated vehicles is currently under discussion, although this is specifically related to the use of certain safety devices.

Using a fully autonomous driving system on UK roads is illegal, but hands-free driving on some roads could become legal by next year.

This would mean that a driver would still have to be prepared to take control of the vehicle, but could do things like check their phone while automated steering and cruise control are engaged on motorways.

Currently, Teslas have “level two” autonomy, but Musk has claimed that Teslas already on the road can be upgraded to “level five” with a simple software update.

“I remain confident that we will have the basic functionality for level five autonomy complete this year. There are no fundamental challenges remaining. There are many small problems,” he said in July.

“And then there’s the challenge of solving all those small problems and putting the whole system together.”

But IHS Markit analyst Tim Urquhart said that while level five autonomous driving is the industry’s “holy grail,” “Even if Tesla can deploy the technology reliably in a production environment, the regulatory environment in all the major markets is way behind allowing fully autonomous vehicles on the road.”

The incident in Canada in July is not the only example of drivers being caught over-relying on Tesla’s self-driving functions, some with accidents as a result.

In January, an Ontario driver was charged with careless driving after police caught him flossing his teeth with both hands while his vehicle traveled down the highway at 83 mph.

Two months earlier, a Tesla had been filmed in Richmond driving the wrong way down a road out of a parking lot – without a driver.

In the United States, a number of fatal crashes involving the Autopilot function are currently under investigation by authorities, including one in which the driver was using the function while playing a game on his smartphone.

Last week, a TikTok video emerged of a Tesla car driving down a California highway on autopilot with no one in the driver’s seat as four passengers drink cans of seltzer and sing along to Justin Bieber.

The shocking footage, posted on the TikTok account @BlurrBlake, showed three young men – alongside a fourth person behind the camera – partying inside the vehicle as it flew down the freeway.

The car is said to have reached speeds of 60 mph, all without a human driver ready to take control of the vehicle, TMZ reported.

According to Tesla’s website, the vehicle’s Autopilot system should be used “with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.”

The technology has been linked to four fatal crashes in the United States

In March 2019, Florida driver Jeremy Banner, 50, died when his Tesla Model 3 hit a trailer.

National Transportation Safety Board investigators said Banner had activated the Autopilot function about 10 seconds before the crash, and that the system did not perform any evasive maneuvers to avoid the collision.

Three other fatal crashes date back to 2016.
