New US National Transportation Safety Board head Jennifer Homendy pointed out that despite the Full Self Driving package, Tesla drivers are required to pay attention to the road at all times. (REUTERS)

Tesla gets flak from US agency for 'basic safety issues' on Autopilot, FSD

  • The US National Transportation Safety Board head isn't happy about Tesla beta-testing Autopilot upgrades on public streets.

  • She pointed out that despite the name, the current FSD package on Tesla vehicles only allows limited autonomy in certain situations.

Tesla's Autopilot system is already under the scanner of US regulators and safety agencies, and now the new US National Transportation Safety Board head, Jennifer Homendy, has said in an interview with The Wall Street Journal that the EV maker should tackle "basic safety issues" before expanding the Autopilot and Full Self Driving (FSD) features to more parts of the country.

She also said she was not thrilled to learn that the EV giant is beta-testing Autopilot upgrades on public streets.


Like other critics of the Tesla Autopilot system, Homendy also took issue with the way the EV manufacturer has named its driver-assistance systems. "(The Full Self Driving label is) misleading and irresponsible, (leading some to) misuse and abuse it," she said.

(Also read | US safety agency seeks driver-assistance data from 12 automakers for Tesla probe)

She pointed out that despite the name, the current FSD package on Tesla vehicles only allows limited autonomy in certain situations, and still requires drivers to pay attention to the road at all times.

Tesla uses FSD betas as a way to enhance its semi-autonomous features through testing in real-world conditions. The company and its CEO Elon Musk have repeatedly argued that the Autopilot system and the FSD package are, overall, safer than full manual control.

(Also read | From Ford to Toyota, Tesla and more, US opens probe into 30 million vehicles)

Currently, the US National Highway Traffic Safety Administration (NHTSA) is probing twelve crashes involving Tesla vehicles with activated Autopilot systems. These crashes have occurred since 2018 across San Diego, Miami and Massachusetts, and have resulted in 17 injuries and one death. Most of these crashes also involved first-responder vehicles that were parked at an emergency scene. The investigation covers 765,000 Tesla electric vehicles.

A recent study based on MIT Advanced Vehicle Technology data showed that Tesla Autopilot leads to a significant dip in driver attention when activated. The study reinforces the view of many who believe the feature leads to less careful driving.

  • First Published Date : 20 Sep 2021, 04:44 PM IST