US safety agency opens probe into Tesla autopilot after a series of crashes


The National Highway Traffic Safety Administration (NHTSA) is launching an investigation into Tesla's Autopilot driver assistance system following a series of crashes.

According to the Associated Press, NHTSA is looking into 11 crashes that have occurred since 2018 that have injured a total of 17 people and killed one.

In these cases, the Tesla vehicles were in Autopilot mode, a driver-assistance feature in which the car handles part of the driving while monitoring surrounding traffic, lane markings, and other potential hazards.

It is not a futuristic autopilot that lets the driver fall asleep at the wheel or leave the car to drive itself entirely; the feature's name may suggest more autonomy than it actually delivers.

Most of these Tesla crashes have occurred after dark and involved emergency vehicle flashing lights, lit road flares, or illuminated arrow boards that could confuse a high-tech car's navigation system.

"The National Highway Traffic Safety Administration is committed to ensuring the highest standards of safety on our nation's roads," an NHTSA spokesperson told Tom's Guide.

"Consistent with the agency's core safety mission, and to better understand the causes of certain Tesla crashes, NHTSA has begun a preliminary evaluation of Tesla's Autopilot system and the technologies and methods used to monitor, assist, and enforce the driver's engagement while using Autopilot."

According to the Associated Press, there are 765,000 Teslas on U.S. roads equipped with the Autopilot feature. (Autopilot comes standard on all new Teslas, and according to Tesla's website, owners of cars built after September 2014 can pay to have it activated.) The investigation covers Tesla Model S, 3, X, and Y vehicles.

"NHTSA would like to remind you that no vehicle currently on the market is capable of fully automated operation," an NHTSA spokesperson said. "Every vehicle on the market must be controlled by a human driver at all times, and all state laws place responsibility for operating the vehicle on the human driver."

At this time, the National Transportation Safety Board (NTSB), another government agency that investigates traffic accidents, recommends that Tesla drivers limit their use of Autopilot to areas where they know it can operate safely.

The NTSB also urges Tesla to build better systems to ensure that drivers are paying attention. Earlier this year, Consumer Reports found that it is very easy to fool Tesla's Autopilot. Tesla later announced that it would use a camera in the rearview mirror to make sure the driver is awake.

In 2019, in Delray Beach, Florida, a Tesla Model 3 on Autopilot struck a semi-truck crossing the road, killing the Tesla's driver. Neither the car nor the driver had applied the brakes. Another crash earlier this year, also involving a semi-truck crossing the road, was likewise blamed on Tesla's Autopilot.

Earlier this year, the NTSB blamed the NHTSA for lax rules regarding autonomous driving technology; the NTSB said the agency failed to put safeguards in place or pressure automakers to ensure that these systems function properly.

"The Department of Transportation (DOT) and NHTSA believe that they must first act to develop a strong safety foundation to support a framework for future automated vehicles (AVs)," the NTSB said in its letter.

"That foundation should include sensible safeguards, protocols, and minimum performance standards to ensure the safety of motorists and other vulnerable road users."

The Associated Press reached out to Tesla for comment, but the company disbanded its public affairs department late last year.

Tesla and CEO Elon Musk have stated that those who activate Autopilot must still be fully engaged, as the software is still under development and there are many real-world variables that the system may not fully account for.

Still, there is no shortage of people posting videos online of irresponsible behavior with Autopilot engaged. In one video, a TikTok star sleeps comfortably on a blanket and pillow while the system drives.

Among the cases being investigated by the NHTSA is one in which a Tesla crashed into an emergency vehicle parked on the side of the road.

It is unclear whether NHTSA will also investigate Tesla's newer Full Self-Driving mode, which takes over more duties from the driver, such as changing lanes, navigating turns, and parking.

