Report into Florida Tesla crash that claimed Autopilot may have cut crash rates is flawed, says new analysis


New analysis indicates that Tesla's Autopilot system may be linked to an increase in crashes

A US government report that exonerated car maker Tesla from blame over an accident involving its Autopilot technology has been criticised in a new report from a risk-management consultancy. 

The US government report was commissioned following a fatal accident in 2016. It found the driver at fault for failing to appreciate the shortcomings of Tesla's advanced driver-assistance system. 

According to the new analysis by Quality Control Systems Corporation (QCSC), Tesla's Autopilot system is not error-free; in fact, the Autosteer component of Autopilot, which keeps the vehicle in its lane, may even increase the risk of crashes.

Following the fatal 2016 crash in Florida, which involved a Tesla Model S and a tractor trailer, the National Highway Traffic Safety Administration (NHTSA) investigated the safety of Tesla's Autopilot driving system.

In its report, submitted in 2017, NHTSA said that the Model S's Autopilot system did not detect the tractor trailer, which was making a left-hand turn in front of the car from a cross street. However, the report held the driver responsible for the crash, stating that he had failed to pay proper attention to the traffic.

The report added that the Autosteer system installed on Tesla vehicles between 2014 and 2016 was not just safe, but may have helped cut crash rates by up to 40 per cent.

However, the new investigation, carried out by QCSC using data obtained under the Freedom of Information Act, concludes that the NHTSA analysis was wrong, and indicates that Tesla's Autosteer system may actually be linked to more crashes.

According to QCSC, the NHTSA analysis failed to account for the full mileage driven by the 43,781 cars it studied. In fact, NHTSA used mileage data for just 14,791 vehicles, which inflated the crash rate before the introduction of Autopilot and led to NHTSA's erroneous conclusion.

NHTSA's analysis of mileage and airbag deployments claimed that there were 1.3 airbag deployments per million miles before the introduction of Autopilot and 0.8 per million miles afterwards.

QCSC, however, found that airbag deployments actually rose from 0.76 per million miles (before Autopilot) to 1.21 per million miles (afterwards), an increase of 59 per cent.
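The percentage figures in both analyses follow directly from the quoted per-million-mile rates. The short Python sketch below is purely illustrative (it is not taken from either report, and the underlying mileage and deployment counts are not given in this article); it simply reproduces the arithmetic behind NHTSA's roughly 40 per cent reduction and QCSC's 59 per cent increase.

```python
# Illustrative only: reproduces the percentage changes implied by the
# airbag-deployment rates quoted in the article (not code from either report).

def pct_change(before: float, after: float) -> float:
    """Relative change between two rates, e.g. 0.59 for a 59 per cent rise."""
    return (after - before) / before

# NHTSA figures: 1.3 -> 0.8 deployments per million miles
print(f"NHTSA: {pct_change(1.3, 0.8):+.0%}")    # about -38%, i.e. 'up to 40 per cent'

# QCSC figures: 0.76 -> 1.21 deployments per million miles
print(f"QCSC:  {pct_change(0.76, 1.21):+.0%}")  # about +59%
```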

Autosteer is the lane-keeping component of Tesla's Autopilot system. It was first introduced in 2014 and was upgraded in 2016.

Earlier this year, Elon Musk claimed that Tesla would introduce autonomous driving technology in the near future - a development that some specialists believe is more likely decades away.

Indeed, experts believe that even if Tesla were to roll out self-driving capabilities in the near future, regulations would still require drivers to be fully responsible on the road, and to be able to take control of the vehicle when needed.

 
