Hackers could use billboards to trick self-driving cars

Hackers could use billboards and pictures of stop signs to disrupt self-driving cars.
Image: Unsplash | Will Porada

Self-driving cars are a hot topic right now thanks to innovative automakers like Tesla and other industry players like Waymo. As fully autonomous vehicles inch closer to reality, safety remains a key priority. Recent findings from a team of Israeli researchers could have concerning implications for cars that use computer vision systems to see the world.

They found that flickering images of stop signs on a digital billboard can cause self-driving cars to slam on the brakes. That could create serious problems on the road, including the risk of dangerous crashes.

Seeing the World Too Well

Plenty of human drivers have blown past a stop sign. Maybe it was hidden behind a pole, or perhaps it was obscured by a box truck in the other lane. Sign visibility isn't perfect. On the other hand, humans wouldn't hit the brakes because they saw a picture of a stop sign on a highway billboard.


Recently, self-driving car companies have introduced computer vision systems that let their vehicles see the world in stunning clarity. Tesla is one of them. Throughout 2020, the automaker has been working to enable its Autopilot system to identify and respond to things like changing traffic lights and stop signs.

Unfortunately, that tech seems to have a serious flaw. The research team explained how the billboard issue affects self-driving cars in a statement to Wired. Yisroel Mirsky, a researcher from Ben Gurion University, says, “The attacker just shines an image of something on the road or injects a few frames into a digital billboard, and the car will apply the brakes or possibly swerve, and that’s dangerous.”

“The driver won’t even notice at all. So somebody’s car will just react, and they won’t understand why,” he adds.

Interestingly, the team wasn’t even researching billboards and digital signage when they began the project. They were focused on whether people could shine images onto the road to affect autonomous vehicles. Ultimately, however, connected billboards seemed like a more realistic threat.

Ghost Crime

The ease with which a hacker could pull off this stunt is alarming. Perhaps the hardest part would be finding an internet-connected billboard close enough to the road to sit within a car's field of vision. In today's ad-saturated world, that's less of a hurdle than it sounds.

From there, an attacker could simply create an ad that flashes a stop sign for a fraction of a second. That alone would hypothetically be enough to stop a self-driving car in its tracks in the middle of the road.
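
To make the mechanism concrete, here is a minimal, purely illustrative Python sketch. It is not any automaker's actual pipeline; the controller classes, the ten-frame persistence threshold, and the detection format are all assumptions for the sake of the example. It contrasts a naive controller that reacts to a single detected frame with one that requires a detection to persist before acting:

```python
# Illustrative only: why a one-frame "phantom" stop sign can trigger braking,
# and how a temporal persistence check could filter it out.
from collections import deque

PERSISTENCE_FRAMES = 10  # assumed threshold: roughly 1/3 second at 30 fps


class NaiveController:
    """Brakes the instant any single frame contains a stop sign."""

    def step(self, detections):
        return "BRAKE" if "stop_sign" in detections else "CRUISE"


class FilteredController:
    """Brakes only if the sign persists across consecutive frames,
    which a momentary billboard flash would fail to do."""

    def __init__(self):
        self.history = deque(maxlen=PERSISTENCE_FRAMES)

    def step(self, detections):
        self.history.append("stop_sign" in detections)
        if len(self.history) == self.history.maxlen and all(self.history):
            return "BRAKE"
        return "CRUISE"


# Simulate a digital billboard flashing a stop sign for a single frame.
frames = [set(), set(), {"stop_sign"}, set(), set(), set()]
naive, filtered = NaiveController(), FilteredController()
for detections in frames:
    print(naive.step(detections), filtered.step(detections))
# The naive controller brakes on the one-frame flash; the filtered
# controller ignores it because the detection never persists.
```

In this toy simulation, the naive controller slams on the brakes the moment the phantom appears, while the persistence check ignores it. Real perception stacks are far more sophisticated, but some form of temporal or contextual sanity check along these lines is one plausible mitigation.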

It’s worth noting that this flaw seems to affect even the most advanced self-driving systems on the road today. The research team was reportedly able to fool a Tesla running the latest version of Autopilot.

What makes this issue more concerning is the fact that it leaves almost no evidence behind. Unless a dash cam captures the billboard and the ensuing events, it would be practically impossible to trace.

“Previous methods leave forensic evidence and require complicated preparation. Phantom attacks can be done purely remotely, and they don’t require any special expertise,” says Ben Gurion researcher Ben Nassi.

This is an issue that self-driving car companies will need to address in the immediate future. Before cars can become fully autonomous, they need to be able to separate fact from fiction. An incorrect judgment could have life-and-death consequences.
