Uber’s self-driving car might have ‘decided’ to ignore passerby in accident


Back in March, ride-hailing company Uber’s self-driving car was involved in a fatal crash in Arizona, killing a pedestrian. New reports suggest the crash happened because the car ‘decided’ not to swerve to avoid her.

Sources told The Information that the accident occurred because of how the self-driving car’s software was tuned to ignore certain objects on the road. According to the report, the vehicle’s software reportedly ‘decided’ not to take evasive action and flagged the detection as a ‘false positive’, leading to the crash. It ‘saw’ the woman but decided that ‘it didn’t need to react right away’.

As per the report, a system might behave this way because the computers powering an autonomous car can sometimes detect objects that pose no real threat, such as a piece of cardboard on the road. Uber reportedly set that sensitivity threshold so low that when the system saw a person crossing the road with a bicycle, it concluded that immediate action wasn’t required.

While the car had a safety driver on board who is supposed to intervene in such situations, the released footage shows the operator apparently looking down just before the crash.

An Uber spokesperson said in a statement, “We’re actively cooperating with the National Transportation Safety Board (NTSB) in their investigation. Out of respect for that process and the trust we’ve built with NTSB, we can’t comment on the specifics of the incident. In the meantime, we have initiated a top-to-bottom safety review of our self-driving vehicles program, and we have brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture. Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon.”

Uber’s car hit a 49-year-old pedestrian, Elaine Herzberg, in Arizona on March 18, resulting in her death. This led the company to suspend all of its self-driving tests.