About two months ago, Uber suspended its self-driving program after one of its autonomous cars hit and killed a pedestrian in Arizona, a state in the southwestern USA.
After looking into the incident, Uber says the software used in the self-driving car was likely to blame for the collision and, ultimately, the death. The software had one job: detect objects and obstacles, then react to them by stopping or swerving. It failed.
The Information revealed that the car's sensors actually detected the pedestrian, but the software decided no evasive action was needed, so the car kept moving and struck her.
Uber's self-driving vehicles are programmed to detect and ignore "false positives", i.e. objects that don't warrant stopping the car. For example, Uber's self-driving software will ignore a paper bag blowing across the road as a false positive.
Perhaps the software classified the pedestrian as a false positive, like a paper bag, and decided immediate evasive action wasn't necessary.
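To make the idea concrete, here is a minimal, purely hypothetical sketch of how a perception pipeline might dismiss low-confidence detections as false positives. None of this reflects Uber's actual code; the names, the threshold, and the logic are all illustrative assumptions. The point is the failure mode: if a real obstacle is assigned low confidence, the planner never reacts to it.

```python
# Hypothetical sketch, NOT Uber's actual software: a planner that ignores
# detections below a confidence threshold, treating them as false positives.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str         # classifier's best guess, e.g. "pedestrian", "paper_bag"
    confidence: float  # how sure the system is that this object needs a reaction


IGNORE_THRESHOLD = 0.5  # assumed tuning parameter, chosen here for illustration


def plan_action(detections):
    """Return 'brake' if any detection is worth reacting to, else 'continue'."""
    for d in detections:
        if d.confidence >= IGNORE_THRESHOLD:
            return "brake"
    # Every detection was dismissed as a false positive.
    return "continue"


# A misclassified pedestrian with low confidence gets ignored,
# exactly the kind of failure described above:
print(plan_action([Detection("pedestrian", 0.2)]))   # continue
print(plan_action([Detection("pedestrian", 0.9)]))   # brake
```

The design tension is visible even in this toy version: set the threshold too low and the car brakes for every plastic bag; set it too high and it can drive through a real obstacle.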
In addition, the reduction in the number of LIDAR sensors (crucial hardware on self-driving cars) from seven to one on Uber's newer prototypes is also believed to have contributed to the vehicle's failure to evade the pedestrian.
After the incident, Uber's partners and suppliers on the self-driving project (Nvidia, Intel, Velodyne, etc.) all distanced themselves from the accident and blamed it on Uber's software.
Uber is all alone in this, it seems.