Wednesday, 18 October 2017
Auto

Self-driving cars can be fooled by fake cars, pedestrians and other bogus signals

A recent study has revealed that the self-driving cars being developed by Uber, Google and Apple can be misdirected by fake laser signals.

Jonathan Petit, a scientist at the software security firm Security Innovation, has highlighted a serious vulnerability in self-driving cars: the LIDAR system, which effectively serves as the car's eyes.

Petit will present his research paper at the Black Hat Europe security conference in November, where he will demonstrate how a laser and a pulse generator can trick a self-driving car into believing there is other traffic around it.

It is surprisingly easy to fool a LIDAR (the laser-ranging sensor of an autonomous car) by creating echoes that appear to be vehicles in the surroundings. All an attacker needs is a low-power laser and a simple pulse-generating device (such as an Arduino or Raspberry Pi).
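The core trick behind such an echo attack is timing: LIDAR infers distance from the round-trip time of its laser pulse, so a spoofing device that fires its own pulse back after a chosen delay can make a phantom object appear at any distance it likes. As a rough illustration (this is a hypothetical sketch of the timing math, not Petit's actual tooling), the delay needed to fake an object at a given range works out to:

```python
# Sketch: delay a spoofing device would need to fake a LIDAR echo.
# LIDAR measures distance d from the pulse round-trip time t = 2d / c,
# so replaying a pulse after that delay simulates an object at range d.
# Illustrative only -- names and the helper below are hypothetical.

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def spoof_delay_ns(fake_distance_m: float) -> float:
    """Delay (in nanoseconds) after the real pulse at which a spoofed
    return must arrive to simulate an object fake_distance_m away."""
    round_trip_s = 2 * fake_distance_m / SPEED_OF_LIGHT_M_S
    return round_trip_s * 1e9  # convert seconds to nanoseconds

if __name__ == "__main__":
    for d in (5, 10, 50):
        print(f"phantom object at {d:>3} m -> delay {spoof_delay_ns(d):.1f} ns")
```

The nanosecond-scale delays this yields explain why cheap hardware suffices: a microcontroller driving a laser diode only has to hit a timing window, not reproduce a complex signal.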

Petit managed to attack cars from a distance of up to 100 meters (around 330 feet), and his results were convincing. The effect is rather like playing a prank on the car, causing it to accelerate or stop suddenly. He said that:

Poor inputs for an autonomous car will provide poor driving decisions.

LIDAR technology costs tens of thousands of dollars; the LiDAR sensing unit Google uses on its self-driving vehicle is priced at almost $70,000 / €62,800.

The car can be attacked from any angle, from the sides, front or back, without alerting the passengers to the threat. Petit says that the fake echoes can convince the car that something is dangerously close to it, forcing it to make a decision very quickly.

This leaves the car little time even to consult its GPS, so its reaction would be quite abrupt.

Fiat Chrysler has remarked that vulnerabilities like this could give hackers complete control over a car through its media services.

Autonomous cars have always been a subject of controversy, and after this recent research, the reliability of these cars has once again been called into question.