Road sign hack enables researchers to trick Tesla vehicle into speeding up by 50 mph

And the hack was done using simple black electrical tape

Researchers at McAfee recently tricked a Tesla vehicle into accelerating well beyond the speed limit simply by placing a strip of black tape on a speed limit sign.

The findings highlight new challenges for autonomous driving systems as researchers continue to disclose fresh security vulnerabilities in them.

In the study, McAfee researchers explored how they could trick a Tesla vehicle into misjudging a speed limit sign. To do so, they placed a strip of black tape across the middle of the digit '3' on a 35 miles per hour (mph) speed limit sign.

According to the researchers, the deception fooled the car's camera system into misreading the speed limit as 85 mph because the tape made the '3' resemble an '8'. This caused the vehicle's cruise control system to automatically accelerate to 85 mph.

The tests were carried out on a 2016 Tesla Model X and Model S, both of which used camera systems from Mobileye.

According to the researchers, Tesla's latest models no longer use Mobileye camera systems and "don't currently appear to support traffic sign recognition at all".

"McAfee Advanced Threat Research has been studying what we call 'Model Hacking,' also known in the industry as adversarial machine learning," the researchers said.

"Model Hacking is the concept of exploiting weaknesses universally present in machine learning algorithms to achieve adverse results. We do this to identify the upcoming problems in an industry that is evolving technology at a pace that security has not kept up with," they added.

McAfee says it disclosed its findings to both Tesla and Mobileye last year, and both companies expressed interest in the research. However, Tesla has so far not indicated any plans to address the issue on existing models in the field.

Mobileye, one of the leading vendors of advanced driver-assistance systems, argued that the tape and stickers could confuse a human driver too, and that the scenario therefore didn't qualify as an "adversarial attack".

This is not the first time researchers have fooled a Tesla's systems. Last April, researchers from Tencent disclosed that they had used similar small stickers, placed on the road surface, to cause a Tesla to swerve dangerously from one lane to another.

Professor Dawn Song of the University of California, Berkeley has also claimed to have used stickers to trick a self-driving car into treating a stop sign as a 45 mph speed limit sign.

Earlier this month, researchers from Ben-Gurion University of the Negev in Israel said they performed a series of experiments to demonstrate that driverless cars can be fooled into perceiving projected images as real.

The researchers said attackers could exploit this design flaw to launch "phantom" attacks, projecting depthless objects that the cars treat as real, without needing to be physically present at the scene.