Tesla driver in fatal 'Autopilot' crash got numerous warnings - U.S. government
Source: Reuters
A man killed in a crash last year while using the semi-autonomous driving system on his Tesla Model S sedan kept his hands off the wheel for extended periods of time despite repeated automated warnings not to do so, a U.S. government report said on Monday.
The National Transportation Safety Board (NTSB) released 500 pages of findings into the May 2016 death of Joshua Brown, a former Navy SEAL, near Williston, Florida. Brown's Model S collided with a truck while it was engaged in the "Autopilot" mode and he was killed.
Tesla Inc spokeswoman Keely Sulprizio declined to comment on the NTSB report. In 2016, however, the company said Autopilot "does not allow the driver to abdicate responsibility."
Brown family lawyer Jack Landskroner said in an email the NTSB's findings should put to rest previous media reports that Brown was watching a movie at the time of the crash, which he called "unequivocally false."
[font size=1]-snip-[/font]
Read more: http://www.reuters.com/article/us-tesla-crash-idUSKBN19A2XC?il=0
TECHNOLOGY NEWS | Mon Jun 19, 2017 | 7:33pm EDT
By David Shepardson | WASHINGTON
tinrobot
(10,916 posts)
That name feels like false advertising.
It is nothing more than a fancy cruise control/lane assist.
bitterross
(4,066 posts)
Nor stop ignoring the messages, for that matter. This idiot won't be around to breed more idiots.
I do software support for a living. Do you know how many people will call in and admit they paid no attention at all to the warning messages in the software? That they just clicked "okay" or "yes" to get to the next screen?
Then they get really upset when I can't magically undo the damage they've done by ignoring the messages the software gives them. It is the same principle here.
And, you don't have to be a rocket scientist to know that there is no such thing as true and complete auto-pilot on cars yet.
I have no sympathy whatsoever for him. For his loved ones who lost someone - yes. Not for him.
canetoad
(17,190 posts)
What is the software that damages computers if you fail to heed the installation dialogue boxes?
eggplant
(3,913 posts)
Formatting a drive, emptying your recycling bin, saving over a file with the same name.
canetoad
(17,190 posts)
This is damage.
Hassin Bin Sober
(26,343 posts)
You just made the case that the system is not safe for the general public.
bitterross
(4,066 posts)
What you are saying, whether you realize it or not, is that we have to dumb down the whole world to accommodate the least capable persons. Deprive capable people of things because others are too stupid to handle them.
I don't think that is a good solution.
Hassin Bin Sober
(26,343 posts)
We give a driver's license to everyone from a 16-year-old with two functioning limbs and a weekend course in a strip mall, to a 105-year-old who can make out shapes at twenty paces.
The entire auto industry, the traffic safety industry, and consumer product laws are all built around just about the lowest common denominator.
LanternWaste
(37,748 posts)
Your premise then applies to traffic lights as well.
Zorro
(15,749 posts)
The major car manufacturers are playing catch-up to that technology.
PoindexterOglethorpe
(25,902 posts)
I recall reading that the driver had a history of reckless driving, maybe speeding. He rather sounded like a classic Darwin Award kind of guy.
I'm sorry he crashed, but he seems to have been totally the one responsible, not the car.
JI7
(89,271 posts)
Because people are careless, and because once they see the car is "self-driving" they will keep taking chances.
If humans were different and all still paid complete attention as they should, then it would work. But people are not like that. Just look at people still texting while driving: they know it can be dangerous, but when they are actually out there on the road they feel like they can get away with it.
So you need 100 percent self-driving for it to work.