General Discussion
Uber reportedly thinks its self-driving car killed someone because it 'decided' not to swerve
The car's sensors saw her, but may have flagged the detection as a false positive
https://www.theverge.com/2018/5/7/17327682/uber-self-driving-car-decision-kill-swerve
By Sean O'Kane | May 7, 2018, 2:41pm EDT
Uber has discovered the reason why one of the test cars in its fledgling self-driving car fleet struck and killed a pedestrian earlier this year, according to The Information. While the company believes the car's suite of sensors spotted 49-year-old Elaine Herzberg as she crossed the road in front of the modified Volvo XC90 on March 18th, two sources tell the publication that the software was tuned in such a way that it decided it didn't need to take evasive action, and possibly flagged the detection as a false positive.
The reason a system would do this, according to the report, is that there are a number of situations where the computers that power an autonomous car might see something they think is a human or some other obstacle. Uber reportedly tuned that threshold so aggressively, though, that the system saw a person crossing the road with a bicycle and determined that immediate evasive action wasn't necessary. While Uber had an operator, or safety driver, in the car who was supposed to be able to take control in a failure like this, the employee was seen glancing down in the moments before the crash in footage released by the Tempe Police Department.
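The article describes this tuning only loosely, but the behavior can be sketched as a confidence cutoff applied to obstacle detections. Everything below is a hypothetical illustration: the names, the cutoff value, and the logic are assumptions for explanation, not Uber's actual software.

```python
# Hypothetical sketch of a false-positive filter in a detection pipeline.
# All names, thresholds, and values here are illustrative assumptions;
# this is NOT Uber's actual code or logic.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str          # e.g. "pedestrian", "bicycle", "plastic bag"
    confidence: float   # classifier confidence in [0, 1]


# If this cutoff is tuned too aggressively, real obstacles get
# dismissed as sensor noise (false positives).
FALSE_POSITIVE_CUTOFF = 0.95  # an unsafely high example value


def needs_evasive_action(d: Detection) -> bool:
    """Treat detections below the cutoff as false positives and ignore them."""
    return d.confidence >= FALSE_POSITIVE_CUTOFF


pedestrian = Detection("pedestrian with bicycle", confidence=0.90)
print(needs_evasive_action(pedestrian))  # False: a real hazard is filtered out
```

The point of the sketch is that a single scalar cutoff trades false alarms (phantom braking) against missed real obstacles; tuning it to suppress one failure mode necessarily worsens the other.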
~ snip ~
In the wake of the crash, signs have emerged that Uber's self-driving program was potentially fraught with risk. For one thing, Uber had reduced the number of safety drivers in its test cars from two to one, according to a New York Times report. This explained why the driver who was in the car that killed Herzberg was alone.
Then in late March, Reuters discovered that Uber had reduced the number of LIDAR sensors on its test cars. (LIDAR is considered by most to be critical hardware for autonomous driving.) All this was happening in an environment with little oversight from the government in Arizona. Emails obtained by The Guardian in the weeks after the crash detailed a cozy relationship between Uber and Arizona Governor Doug Ducey that may have allowed the company's test cars to hit the road even earlier than previously thought.
~ snip ~
grantcart
(53,061 posts)I believe that I would not have been able to stop.
Last Friday, two blocks from my home, I passed a bicyclist who was struck in the bright morning sun. The body was still in the street.
Cyclist and pedestrian fatality rates are so high in AZ that I am sure technology can improve safety here.
FrodosNewPet
(495 posts)That is NOT a poorly lit area. And even if it was, LiDAR and RADAR can work in pitch black conditions.
pnwmom
(108,976 posts)farther than a human being in low light conditions, and a human would have at least tried to swerve, or hit the break. This car did neither.
One problem is that they had many fewer sensors than other vehicles, and the positioning of the existing ones wasn't optimal for that crash.
mr_lebowski
(33,643 posts)Globally, they probably do so on the order of 1,000,000,000,000 times every single day. That's a quadrillion, for those of you counting zeros. I could be off a bit, but ... it's probably on the low side if anything.
It's a bit hilarious to see an article trying to make a Luddite-based joke about the concept of 'computers deciding something', as if they're incapable of doing so ... what, because they're not 'people'? Is that the gist? If so ... LOLOLOLOL.
In fact, that's damn near ALL THEY DO is 'decide shit', roughly a bajillion times, every single day.
SWBTATTReg
(22,112 posts)programming anything as complicated as driving a vehicle through traffic in a 360-degree environment is very daunting, as Uber found out. There's a saying in coding that 90% of the work is easy; it's the 10% devoted to logic that's the hardest, prone to error, and takes numerous trials, if it can even be fixed or logically defined at all.
Takket
(21,560 posts)pnwmom
(108,976 posts)for hours under these conditions, where the car is doing the driving. That's why other manufacturers are working on driverless vehicles.
NCTraveler
(30,481 posts)Thats solid. Lol
Amir Efrati doesnt have a clear history or anything. So much of our opinion is derived from thinly veiled propoganda.
FrodosNewPet
(495 posts)But getting back to the original topic...
Neither Uber, nor anyone else, should be shortcutting the technological development and sensor redundancy needed to put the safest possible cars on the road.
Considering the massive disruption, the incredible loss of jobs that this tech will bring, "a little bit safer" is not enough. Perfection in safety is not achievable, but it should always be the goal.