It's horribly easy to fool Tesla's Autopilot.

Consumer Reports says Tesla's driver-assistance system, Autopilot, is easy to fool and can be made to drive with no one in the driver's seat.

This comes just as concerns were being raised about the safety of Autopilot following a Tesla accident in Texas that killed two passengers. According to local authorities, no one was behind the wheel at the time of the accident.

Consumer Reports researchers were able to drive the 2020 Tesla Model Y from a complete stop and then "several miles" around a closed test track.

Researchers used weighted chains to simulate hand pressure around the steering wheel, tightened the driver's seatbelt, and prevented the doors from opening during the test.

With Autopilot engaged, one of the researchers sat in the driver's seat, then moved over to the passenger seat without disengaging the system. From there he could adjust the car's speed using a dial on the steering wheel.

According to the report, the car gave no warning and never indicated that the driver's seat was empty. Because the "driver" kept the seatbelt buckled and sat on top of it, the car did not register that he had moved; unbuckling the belt would have deactivated Autopilot.

Jake Fisher, Consumer Reports' senior director of automotive testing, says the test shows that Tesla is falling behind other automakers. Ford, GM, and others are developing driver-assistance technologies that verify drivers are actively watching the road.

"The car drove up and down the half-mile lane of our track, over and over, never noting that no one was in the driver's seat, that no one was touching the steering wheel, that there was no weight on the seat," says Fisher.

"It was a little frightening to realize how easy it was to defeat the safeguards, which we proved to be clearly inadequate."

After the Texas accident, Elon Musk tweeted that data logs "so far show that Autopilot was not enabled at the time of the accident." He further claimed that Autopilot could not have been engaged because no lane lines were painted on that stretch of road.

Musk has also staunchly defended Autopilot, even in the face of criticism that its name is misleading. He has repeatedly insisted that Autopilot is a perfectly accurate name because, like the autopilot in a commercial aircraft, it is meant to reduce the driver's workload, not provide fully automated driving.

However, it could be argued that the general public is unaware of this and associates the term autopilot with fully automated driving. Indeed, a German court ruled just last year that Tesla's claims about Autopilot are misleading. The reason was that marketing materials and the term "Autopilot" suggested fully autonomous driving. Musk has called similar criticisms "ridiculous."

Consumer Reports has made it clear that no one should attempt to drive a Tesla this way. Its tests were conducted on a closed course, never exceeding 30 mph, with a safety crew standing by. Doing this on a public road would be incredibly dangerous to yourself, your passengers, and everyone around you.

There are no fully autonomous vehicles available for the public to purchase. In other words, all the smart driver assistance features are just that: they are meant to assist the driver, not be an excuse to take your eyes off the road.
