Did Uber’s Driverless Car Decide Not to Avoid a Fatal Accident?

In March, a self-driving Uber SUV in suburban Tempe struck a pedestrian, sending shock waves through the auto and technology industries. Uber, which had been road-testing fully autonomous vehicles in and around Phoenix, Pittsburgh, San Francisco and Toronto, abruptly halted the testing. The victim, 49-year-old Elaine Herzberg, who was walking a bicycle outside the lines of a crosswalk, later died of her injuries. At the time it was not clear whether the Uber vehicle, which had a backup driver aboard, or the victim was at fault. It now appears that the vehicle “saw” Ms. Herzberg but “decided” not to avoid hitting her.

According to a report on theverge.com, the vehicle’s software was “tuned in such a way” that it treated the data its sensors reported as a “false positive,” so the computer did not “think” evasive action was necessary. The next question is why the human backup driver did not react. Video footage released by the Tempe Police Department shows the backup driver glancing down moments before the crash. Uber has been cooperating with the National Transportation Safety Board and the National Highway Traffic Safety Administration in the investigation.
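For readers curious how a perception system can be “tuned” to ignore a real obstacle, the following is a minimal, purely illustrative Python sketch. It is not Uber’s software, and every name, class and threshold in it (Detection, should_brake, FALSE_POSITIVE_THRESHOLD, the confidence values) is hypothetical. It only shows the general idea the reporting describes: if a detection’s confidence falls below a tuned cutoff, the system dismisses it as a false positive and takes no evasive action.

```python
# Purely illustrative sketch (not Uber's actual code): how a confidence
# threshold can cause a real obstacle to be discarded as a "false positive".
from dataclasses import dataclass


@dataclass
class Detection:
    label: str         # e.g. "pedestrian", "bicycle", "unknown"
    confidence: float   # classifier confidence in the range [0, 1]


# Hypothetical tuning parameter: detections below this confidence are
# treated as sensor noise and ignored, so no evasive action is triggered.
FALSE_POSITIVE_THRESHOLD = 0.8


def should_brake(detections: list[Detection]) -> bool:
    """Return True only if some detection clears the confidence threshold."""
    return any(d.confidence >= FALSE_POSITIVE_THRESHOLD for d in detections)


# A person walking a bicycle may be classified inconsistently and with low
# confidence, so in this sketch the system dismisses it and does not brake.
frame = [Detection("bicycle", 0.55), Detection("unknown", 0.40)]
print(should_brake(frame))  # False -> no evasive action
```

The trade-off is generic to any such filter: set the threshold too high and real hazards get filtered out; set it too low and the vehicle brakes constantly for harmless objects. Where a particular system drew that line is exactly what investigators would examine.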

Ms. Herzberg was not the first person killed in a self-driving crash, but hers was the first death involving a fully autonomous vehicle. In 2016, a Tesla Model S with Autopilot technology crashed into the side of a tractor-trailer, killing the driver. There are similarities and differences between the crashes. Whereas the Uber SUV seems to have detected Ms. Herzberg, the Tesla failed to read the white side of the trailer as an obstacle, perceiving it as open sky. The Tesla owner should have been acting as his own backup driver, but was reportedly looking down at an entertainment video prior to the crash.

Safety experts have been cautiously optimistic about the potential for self-driving cars to make our roads safer and help reduce the number of annual traffic deaths, which hovers around 40,000 nationally. Supporters point out that technology does not consume alcohol or drugs, does not get drowsy, and does not take its mind off the road. As Bryant Walker Smith, a University of South Carolina law professor who studies self-driving vehicles, has said, “We should be concerned about automated driving; we should be terrified about human driving.”

Uber CEO Dara Khosrowshahi has reportedly said the company is “absolutely committed to self-driving cars.” At Marcari, Russotto, Spencer & Balaban, we applaud any technology that can make the roads safer, but we question whether companies have put the public at risk unnecessarily by rushing to road-test autonomous vehicles before they’re ready. It’s great that Uber can build a car that detects a human crossing the street. But that car shouldn’t be on the road until it sees a human being as a reason to stop.

Marcari, Russotto, Spencer & Balaban represents clients in traffic accident cases throughout Virginia, North Carolina, South Carolina and West Virginia. Our attorneys have more than 200 years of combined experience. Call us at (888) 351-1038 or contact us online to schedule a free consultation.
