Uber self-driving car ‘failed to recognise’ pedestrian


The likely cause of the incident in which an Uber autonomous car struck and killed a pedestrian in the US state of Arizona was a failure in the software that decides how the car should react to detected objects, according to reports.

The car was travelling at 40mph in autonomous mode when it collided with the woman in March. She later died from her injuries.

The Information has reported that the car’s sensors detected the woman as she pushed her bicycle across the road, but that the software governing the car’s reactions was tuned too far toward dismissing detections as false positives, that is, objects such as litter that can safely be ignored.

Uber suspended its autonomous vehicle testing programme in Arizona following the incident, and reportedly settled with the victim’s family out of court.

Uber and the US National Transportation Safety Board (NTSB) are investigating the incident.

In a statement, Uber said: “We’re actively cooperating with the NTSB in their investigation. Out of respect for that process and the trust we’ve built with NTSB, we can’t comment on the specifics of the incident.”

“In the meantime, we have initiated a top-to-bottom safety review of our self-driving vehicles programme, and we have brought on former NTSB chair Christopher Hart to advise us on our overall safety culture. Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon.”



About Author

Mark Dugdale is the editor of Claims Media. Mark welcomes articles, letters or feedback from readers and can be reached via mark.dugdale@barkerbrooks.co.uk