News · Auto · 04-26-2024 at 19:48

Tesla Autopilot and Full Self-Driving involved in hundreds of accidents and dozens of fatalities – NHTSA


Vadym Karpus

News writer

The U.S. National Highway Traffic Safety Administration (NHTSA) investigated 956 crashes involving Tesla electric vehicles with the Autopilot and Full Self-Driving (FSD) features. The investigation covered only incidents that occurred between January 2018 and August 2023; the total number of accidents is higher.

The NHTSA launched the investigation after several incidents in which Tesla cars crashed into stationary ambulances parked on the side of the road. Most of these incidents occurred after dark, when the cars’ software failed to respond to warning measures, including warning lights, flashing beacons, cones, and illuminated arrow boards.

In these accidents (some of which also involved other vehicles), 29 people died. There were also 211 accidents in which «the frontal plane of the Tesla struck a vehicle or obstacle in its path». These accidents, which were often the most severe, resulted in 14 deaths and 49 injuries.

In its investigation, the agency found that Autopilot, and in some cases FSD, was not designed to keep the driver engaged. Tesla says it warns customers that they need to stay attentive when using Autopilot and FSD, keeping their hands on the wheel and eyes on the road. But the NHTSA says that in many cases drivers become overly complacent and lose focus, and when it comes time to react, it is often too late.

The agency found that in 59 crashes, Tesla drivers had enough time, «5 seconds or more», to react before hitting another object. In 19 of these accidents, the hazard was visible for 10 seconds or more before the collision. In reviewing crash logs and data provided by Tesla, the NHTSA found that in most of the crashes analyzed, drivers did not brake or steer to avoid the hazard.

The NHTSA also compared Tesla’s Level 2 (L2) automation features to similar products available in other companies’ vehicles. Unlike other systems, Autopilot disengages rather than letting the driver adjust the steering, which «discourages» drivers from staying involved in the task of driving. Tesla is an outlier in the industry in its approach to L2 technology, pairing a weak driver-engagement system with Autopilot’s permissive operating capabilities. Even the brand name «Autopilot» misleads consumers: it encourages drivers to think the system is more capable than it actually is, whereas other manufacturers use words like «assist».

The NHTSA concludes that drivers using Autopilot or the more advanced Full Self-Driving system «were not sufficiently engaged in the driving task», and that Tesla’s technology «did not adequately ensure that drivers maintained their attention on the driving task».

The NHTSA acknowledges that its study may be incomplete due to «gaps» in Tesla’s telemetry data. This could mean that there are many more accidents involving Autopilot and FSD than the Administration was able to identify.

Source: The Verge

