Preliminary Report Released for Crash Involving Pedestrian, Uber Technologies, Inc., Test Vehicle
Source: National Transportation Safety Board
WASHINGTON (May 24, 2018) The National Transportation Safety Board released Thursday its preliminary report for the ongoing investigation of a fatal crash involving a pedestrian and an Uber Technologies, Inc., test vehicle in Tempe, Arizona.
The modified 2017 Volvo XC90, occupied by one vehicle operator and operating with a self-driving system in computer control mode, struck a pedestrian March 18, 2018. The pedestrian suffered fatal injuries; the vehicle operator was not injured.
The NTSB's preliminary report, which by its nature does not contain probable cause, states the pedestrian was dressed in dark clothing, did not look in the direction of the vehicle until just before impact, and crossed the road in a section not directly illuminated by lighting. The pedestrian was pushing a bicycle that did not have side reflectors, and the front and rear reflectors, along with the forward headlamp, were perpendicular to the path of the oncoming vehicle. The pedestrian entered the roadway from a brick median, where signs facing toward the roadway warn pedestrians to use a crosswalk, which is located 360 feet north of the Mill Avenue crash site. The report also notes the pedestrian's post-accident toxicology test results were positive for methamphetamine and marijuana.
In its report the NTSB said Uber equipped the test vehicle with a developmental, self-driving system, consisting of forward- and side-facing cameras, radars, Light Detection and Ranging, navigation sensors and a computing and data storage unit integrated into the vehicle. The vehicle was factory equipped with several advanced driver assistance functions by the original manufacturer Volvo Cars, including a collision avoidance function with automatic emergency braking as well as functions for detecting driver alertness and road sign information. The Volvo functions are disabled only when the test vehicle is operated in computer control mode.
The report states data obtained from the self-driving system shows the system first registered radar and LIDAR observations of the pedestrian about six seconds before impact, when the vehicle was traveling 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.
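(As a rough sanity check on the timeline above, not part of the NTSB release: the speed and timings come from the report, but the 0.7 g deceleration is an assumed figure for hard braking on dry pavement.)

```python
# Rough arithmetic on the NTSB timeline: 43 mph, pedestrian first
# registered ~6.0 s before impact, braking deemed necessary at 1.3 s.
# The 0.7 g deceleration is an assumption, not a value from the report.

MPH_TO_MS = 0.44704   # miles per hour -> meters per second
G = 9.81              # gravitational acceleration, m/s^2

v = 43 * MPH_TO_MS            # vehicle speed, ~19.2 m/s
decel = 0.7 * G               # assumed emergency deceleration, ~6.9 m/s^2

dist_6s = v * 6.0             # distance to pedestrian at first detection, ~115 m
dist_13s = v * 1.3            # distance remaining at the 1.3 s mark, ~25 m
brake_dist = v**2 / (2 * decel)   # distance needed to stop from 43 mph, ~27 m
brake_time = v / decel            # time needed to stop under hard braking, ~2.8 s

print(f"speed: {v:.1f} m/s")
print(f"distance covered in 6.0 s: {dist_6s:.0f} m (stopping needs ~{brake_dist:.0f} m)")
print(f"distance covered in 1.3 s: {dist_13s:.0f} m")
print(f"time to stop under hard braking: {brake_time:.1f} s")
```

On these assumed numbers, at first detection the car was over 100 m from the pedestrian, roughly four times the distance needed to stop; by the 1.3-second mark only about 25 m remained, which is marginal even with immediate full braking.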
In the report the NTSB said the self-driving system data showed the vehicle operator engaged the steering wheel less than a second before impact and began braking less than a second after impact. The vehicle operator said in an NTSB interview that she had been monitoring the self-driving interface and that while her personal and business phones were in the vehicle, neither was in use until after the crash.
All aspects of the self-driving system were operating normally at the time of the crash, and there were no faults or diagnostic messages.
(This Uber self-driving system data playback from the fatal, March 18, 2018, crash of an Uber Technologies, Inc., test vehicle in Tempe, Arizona, shows when, at 1.3 seconds before impact, the system determined emergency braking was needed to mitigate a collision. The yellow bands depict meters ahead of the vehicle, the orange lines show the center of mapped travel lanes, the purple area shows the path of the vehicle and the green line depicts the center of that path.)
The NTSB continues to gather information on the Uber self-driving system, the vehicle interface, the vehicle operator's personal and business cell phones, the vehicle operator, the pedestrian and the roadway.
The preliminary report contains no analysis and does not discuss probable cause. The information in the report is preliminary and subject to change as the NTSB's ongoing investigation progresses. As such, no conclusions about probable cause should be drawn from the information in the preliminary report.
The preliminary report is available online at https://goo.gl/2C6ZCH.
Read more: https://www.ntsb.gov/news/press-releases/Pages/NR20180524.aspx
crazylikafox
(2,752 posts)The system was dependent on the operator manually braking, but the system was not designed to alert the operator?
WTF???
sdfernando
(4,925 posts)I can understand there could be a conflict between the built-in auto systems and the Uber system...but not alerting the operator when there is a problem is just plain stupid. Was this done on purpose or was it just missed? Either way, it is a drastic failure!
PoliticAverse
(26,366 posts)braking ability? If true that seems like an accident waiting to happen if the operator takes their eyes off the road.
NutmegYankee
(16,199 posts)People seem to think these systems are perfect. As someone who works in engineering, I can tell you that is not the case.
cstanleytech
(26,236 posts)fast enough, as the system itself only determined emergency braking was needed 1.3 seconds before impact.
Kablooie
(18,610 posts)That was plenty of time to stop.
The fact that nothing in the system was enabled to stop the car when a hazard is detected is simply criminal.
The car should not have been on a public street if it had to rely solely on the driver to avoid an obstacle when everything else was automated.
Jedi Guy
(3,175 posts)The operator was apparently watching the displays. It seems to me that going over the data would be the job of the engineers and programmers. The operator was there as a backup to the autonomous system. In that case, the system wouldn't need to warn the operator, who should've had eyes on the road the entire time, and who should be ready to intervene if required.
Like I said below, I think the pedestrian was most at fault here, but there's a share of blame for the operator, too.
Jedi Guy
(3,175 posts)Given the conditions described, I doubt a human would have been able to react fast enough. It sounds harsh, but the lady who was hit seems to be most at fault: she had no business crossing the road at that location, had taken no precautions to make herself more visible in low lighting conditions, and clearly wasn't paying attention to her surroundings.
I'd be curious to learn how many other pedestrians have been struck at or near that location. I'm willing to bet quite a few, given the presence of the sign warning people to use the crosswalk a short distance away.
If anyone else is at fault, it'd be the person in the car, who would have known automatic braking wasn't enabled, and who should have been watching the road instead of the displays. All the same, that person most likely couldn't have reacted quickly enough.
Hopefully this doesn't deter development of autonomous cars.
honest.abe
(8,614 posts)This vehicle did neither. It ran over the pedestrian at full speed with no avoidance maneuvers whatsoever. Even a drunk driver would have done better than that.
Jedi Guy
(3,175 posts)So that explains the lack of a braking attempt. And if there was not enough time to brake, I don't think there would've been enough time to swerve. Given the circumstances, I think the only way this could have been avoided was clairvoyance, which neither people nor machines possess.
Alternately, the pedestrian could have decided not to jaywalk. I wouldn't hold a human driver's feet to the fire over this incident, so I won't hold a computer's feet to the fire, either, lack of feet notwithstanding.
honest.abe
(8,614 posts)Whoever decided to disable auto braking is more to blame than anyone or anything.
Jedi Guy
(3,175 posts)It makes perfect sense. That the outcome was less than perfect doesn't change the reasoning.
honest.abe
(8,614 posts)Indeed.
honest.abe
(8,614 posts).. while the vehicle is under computer control to reduce the potential for erratic vehicle behavior."
That makes zero sense.
Jedi Guy
(3,175 posts)If the computer isn't fully tested yet, you don't want it slamming on the brakes in 60-mph traffic because of a false-positive obstacle reading. That's how you get multiple car pileups.
honest.abe
(8,614 posts)it shouldn't be out on the road.
Jedi Guy
(3,175 posts)Also, testing on actual roads has to start somewhere. Lastly, I'd be willing to bet that autonomous cars have better safety records than human drivers on a road-hours basis.
Autonomous cars are coming, and probably sooner rather than later. Expecting flawless performance is not reasonable, particularly in a situation like this one, where a human driver would have had the same outcome.
Jim__
(14,063 posts)Does that mean she was under the influence of methamphetamine at the time? Is 49 very old to be doing methamphetamine? I would have thought that it is, but I don't actually know.
cannabis_flower
(3,764 posts)use meth.