Uber self-driving test car involved in crash in Arizona
Source: TechCrunch
Posted 6 hours ago by Natasha Lomas
More bad news for Uber: one of the ride-hailing giant's self-driving Volvo SUVs has been involved in a crash in Arizona, apparently leaving the vehicle flipped onto its side and damaging at least two other human-driven cars in the vicinity.
~ snip ~
Uber has also confirmed the accident and the veracity of the photos to Bloomberg. We've reached out to the company with questions and will update this story with any response. Update: Uber has now provided us with the following statement: "We are continuing to look into this incident and can confirm we had no backseat passengers in the vehicle."
TechCrunch understands Uber's self-driving fleet in Arizona has been grounded following the incident, while an investigation is undertaken. The company has confirmed the vehicle involved in the incident was in self-driving mode. We're told no one was seriously injured.
Local newspaper reports suggest another car failed to yield to Uber's SUV, hitting it and causing the autonomous vehicle to flip onto its side. Presumably the Uber safety driver was unable to take over the controls in time to prevent the accident.
~ snip ~
Read more: https://techcrunch.com/2017/03/25/uber-self-driving-test-car-involved-in-crash-in-arizona/
PatrickforO
(14,558 posts)the possibility of hacking. Because, face it, we really can't have anything nice without someone screwing it up. I mean, I'd dearly love to go to work in a driverless car, but I can do without having a hacker take control and drive me off a cliff.
OnlinePoker
(5,716 posts)It doesn't have to be an SDC and you have no control.
http://www.cbsnews.com/news/car-hacked-on-60-minutes/
bucolic_frolic
(43,027 posts)I do not believe they will be virtually crash proof. Ever.
They may be as safe as bridges, and we know they collapse once in a while.
As usual the public and the media are buying into the hype.
The Edsel was great. Enron was safe. The economy is at a permanently high plateau.
paleotn
(17,876 posts)fully aware that humans are most certainly NOT crash proof and far, far more dangerous than software. I don't have to worry about software checking its iPhone, or flipping radio stations, or arguing with the kids in the back seat, or getting sleepy, or spilling coffee in its lap, or putting on makeup, or......on and on it goes. In this particular case, it was a human failing to yield that caused the crash.
bucolic_frolic
(43,027 posts)I cannot do so in a driverless car that relies on computers that can be hacked or subject to outages from various problems: electronic, satellite, mechanical/electrical, roads, storms, wireless interference, weather. I know, many of those things affect normal vehicles. I would bet the snafus, when they happen, will involve massive pileups along the lines of the worst black ice/blizzard incidents on major highways. Far more dangerous than escalators or elevators.
This will never be foolproof, and will prove far more dangerous than today's driving.
Humans like to put the pedal to the metal. Nothing like a throttle.
paleotn
(17,876 posts)Truth is, systems are far more predictable, controllable, and safer than humans. Many current automobiles are just as hackable and can be made unsafe even with a human operator. Both conventional and autonomous vehicles can be air gapped, greatly reducing the hacking threat. The technology is being developed to adapt autonomous vehicles to adverse weather and......they won't go waaaaay too fast for weather conditions like idiot humans do, causing multi-car pileups. Personally, I've got better things to do on my drive to work than go too fast and end up in an accident. I'll let the onboard system take care of the throttle. It's got nothing to prove except getting me where I'm going safely.
bucolic_frolic
(43,027 posts)I am blocking you.
mainer
(12,017 posts)the humans are always the weak link
FrodosNewPet
(495 posts)In the meantime, though, we are going to have to share the roads. So SDCs need to be designed with that fact in mind.
As a driver, I consider that, at any split second, anyone on the road is going to do the most dangerous thing possible, and try to be aware of my options. Knowing how to crash is just as important as knowing how to drive.
Egnever
(21,506 posts)No one knows how to be t-boned well.
FrodosNewPet
(495 posts)What's important is knowing when to run off the right side of the road versus changing lanes.
When to brake versus when to floor it.
When to run the red light to keep from getting rear-ended.
It's not JUST about awareness. It is about the actions you should take.
JustABozoOnThisBus
(23,315 posts)But I wonder if a half-awake Uber driver would have seen the other driver's mistake and taken an intelligent or instinctive action to avoid the crash.
Self-driving cars won't be able to avoid all accidents. I hope to have a self-driving car some day, before I'm too old to drive. My car will have selective "modes", from "Driving Miss Daisy", to "Late for work", to "Road Rage Mad Max", and ultimately, "BMW Owner".
FrodosNewPet
(495 posts)According to internal Uber reports, humans have to take control on average once every 0.8 miles.
[hr]
Uber's Self-Driving Cars Are Off to a Rocky Start
http://www.popularmechanics.com/cars/technology/a25734/ubers-self-driving-cars-rocky-start/
By David Grossman | Mar 17, 2017
Uber's self-driving cars are having a little trouble getting down the road on their own. During the week of March 8, the 43 active self-driving Uber cars on the road only drove an average of about 0.8 miles before the safety driver had to take the wheel, according to internal documents acquired by Recode.
Uber uses a metric called "miles per intervention," and according to Recode, it records every single time a driver has to take control of a self-driving car for any reason. "Critical interventions" are also tracked, which count only the times a driver takes control to avoid causing harm: basically, whenever they have to grab the wheel to avoid hitting something. Most often, however, the driver has to assume control of the vehicle while it's navigating unclear lane markings, overshooting a turn, or driving in inclement weather.
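The "miles per intervention" metric described above is just total autonomous miles divided by the number of takeovers in the same period. A minimal sketch, in Python; the figures and variable names are illustrative (loosely modeled on the averages reported in the article), not Uber's actual data or schema:

```python
def miles_per_intervention(total_miles, interventions):
    """Average autonomous miles driven per driver takeover."""
    if interventions == 0:
        return float("inf")  # no takeovers recorded in this period
    return total_miles / interventions

# Hypothetical weekly fleet totals, chosen so the ratios match the
# reported averages of 0.8 miles (all takeovers) and 2 miles (critical).
week_miles = 20_000          # total autonomous miles across the fleet
all_takeovers = 25_000       # every takeover, for any reason
critical_takeovers = 10_000  # takeovers to avoid hitting something

print(miles_per_intervention(week_miles, all_takeovers))       # 0.8
print(miles_per_intervention(week_miles, critical_takeovers))  # 2.0
```

Note the two ratios use the same mileage with different denominators, which is why the "critical" figure is always at least as large as the overall one.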
The average of 0.8 miles before a driver has to take control is a minor decrease in performance from earlier in the year; the cars were driving 0.9 miles in January. Uber has been testing self-driving cars in a variety of locations. In Pittsburgh, cars are driving semi-autonomously with drivers ready to take the wheel at all times, which is where the new metrics came from.
Uber is also testing autonomous car technology in Arizona. The rider experience there has been described as "not great." Cars in Arizona are only getting 0.67 miles on average before a human needs to take control and only two miles between "critical" events.
~ snip ~
Plucketeer
(12,882 posts)I can vaguely remember the last time I used a pay phone, but I cannot recall the last time someone pumped my gas for me.
OnlinePoker
(5,716 posts)It was in Oregon 6 years ago and I got out of my car and started the process. All of a sudden I'm getting yelled at to get back in my car...I'm not authorized to touch the pump.
Plucketeer
(12,882 posts)when I first came to Calif in '82. I'm not sure tho.
not fooled
(5,801 posts)Error-zona is letting Ubug test-drive these on AZ roads and highways, exposing AZ's citizens to this hazard, after CA had the good sense to kick them out.
Let AZ drivers suffer the consequences of this dangerous experiment because freedumb, gotta kiss corporate ass, etc. dont'cha know in this puke-infested state.