
sl8

(13,665 posts)
Sat Mar 31, 2018, 07:44 AM Mar 2018

Tesla says Autopilot was engaged during fatal Model X crash

Source: The Verge

'Tesla Autopilot does not prevent all accidents'

By Andrew J. Hawkins (@andyjayhawk)
Mar 30, 2018, 10:05pm EDT

Tesla says Autopilot was engaged at the time of a deadly Model X crash that occurred March 23rd in Mountain View, California. The company posted a statement online late Friday, after local news reported that the victim had made several complaints to Tesla about the vehicle's Autopilot technology prior to the crash in which he died.

After recovering the logs from the crash site, Tesla acknowledged that Autopilot was on, with the adaptive cruise control follow distance set to a minimum. The company also said that the driver, identified as Apple engineer Wei "Walter" Huang, had his hands off the steering wheel and was not responding to warnings to re-take control.

The driver had received several visual and one audible hands-on warning earlier in the drive, and the driver's hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.


Read more: https://www.theverge.com/2018/3/30/17182824/tesla-model-x-crash-autopilot-statement
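For a rough sense of scale, the figures Tesla cites above (about 150 meters of unobstructed view, about five seconds before impact) imply a speed near 30 m/s, roughly 67 mph. A quick back-of-the-envelope check in Python using only the numbers quoted above - an estimate, not an official reconstruction:

# Rough speed implied by the figures in Tesla's statement above:
# ~150 m of unobstructed view covered in ~5 s before the collision.
distance_m = 150                     # per Tesla's statement
time_s = 5                           # per Tesla's statement

speed_ms = distance_m / time_s       # ~30 m/s
speed_mph = speed_ms * 2.23694       # ~67 mph
speed_kmh = speed_ms * 3.6           # ~108 km/h

print(f"~{speed_ms:.0f} m/s (~{speed_mph:.0f} mph, ~{speed_kmh:.0f} km/h)")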



From http://abc7news.com/automotive/i-team-exclusive-victim-who-died-in-tesla-crash-had-complained-about-auto-pilot/3275600/

[div class"excerpt"]I-TEAM EXCLUSIVE: Victim who died in Tesla crash had complained about Autopilot

By Dan Noyes
Wednesday, March 28, 2018 06:48PM

MOUNTAIN VIEW, Calif. (KGO) -- "Walter was just a straight up, caring guy," Shawn Price told the ABC7 News I-Team. Friends and family are mourning the death of Apple engineer Walter Huang after he crashed in his Tesla in Mountain View Friday.

The ABC7 I-Team has word of a major development in the investigation. His family says Huang complained about the Tesla's auto-pilot "before" the accident. Dan Noyes has an exclusive report.

Walter Huang's family tells Dan Noyes he took his Tesla to the dealer, complaining that -- on multiple occasions -- the auto-pilot veered toward that same barrier -- the one his Model X hit on Friday when he died.

...


22 replies
Tesla says Autopilot was engaged during fatal Model X crash (Original Post) sl8 Mar 2018 OP
self driving cars are about as stupid an idea as... samnsara Mar 2018 #1
If man were meant to fly, he'd grow wings! Dennis Donovan Mar 2018 #2
And don't most commercial airliners... agtcovert Mar 2018 #3
Tesla's system is sadly deficient FrodosNewPet Mar 2018 #4
On the threshold of this technology Plucketeer Mar 2018 #7
Apples and oranges LeftInTX Mar 2018 #10
Aircraft autopilots are nothing like the Tesla A/P rickford66 Mar 2018 #18
That may well be the case... agtcovert Apr 2018 #19
Where did I say to stop ? rickford66 Apr 2018 #20
And don't try to do it on the cheap. FrodosNewPet Apr 2018 #21
No one is using that as an excuse or an argument. LanternWaste Apr 2018 #22
This message was self-deleted by its author paleotn Mar 2018 #5
As a programmer for 20 years I can state firmly our roads will be much safer with computers harun Mar 2018 #11
As God is my witness, I thought cattle could follow traffic signals. JustABozoOnThisBus Mar 2018 #12
Autopilot is an assistive technology... Adrahil Mar 2018 #14
I don't blame the company gyroscope Mar 2018 #17
Ah yes.... Plucketeer Mar 2018 #6
Fake news, they replaced the horse with steam power! Let's go back to steam engines! Canoe52 Mar 2018 #9
" on multiple occasions" SonofDonald Mar 2018 #8
Exactly... Adrahil Mar 2018 #15
In a completely unrelated incident, three women died in Detroit JustABozoOnThisBus Mar 2018 #13
Darwin Award gyroscope Mar 2018 #16

samnsara

(17,604 posts)
1. self driving cars are about as stupid an idea as...
Sat Mar 31, 2018, 07:50 AM
Mar 2018

...when our local Rodeo Board decided to have a cattle drive down the middle of Main Street.

agtcovert

(238 posts)
3. And don't most commercial airliners...
Sat Mar 31, 2018, 08:27 AM
Mar 2018

have autopilot to boot? Just sayin'.

I believe it will take some time, but autonomy will ultimately prove much safer in concert with a human in the vehicle. I commute ~70 miles a day round trip. I see accidents and...well, crazy every day. There has to be a better way.

(Of course, if we had something resembling public transport infrastructure - trains, etc. - I'd jump at that in a heartbeat.)

FrodosNewPet

(495 posts)
4. Tesla's system is sadly deficient
Sat Mar 31, 2018, 08:54 AM
Mar 2018

Elon Musk is opposed to LiDAR, preferring a system based strictly on optical cameras and radar. That is a serious shortcoming of Tesla's current, limited system, let alone its future autonomous plans. Radar is good for ranging but sadly lacking in resolution. Optical is subject to a range of issues, including light conditions (either too dark, or blinded by the light), lens flaring, dirty lenses, fog, rain, snow, etc.

The best project out there (Waymo) still averages only about 5,600 miles between human interventions (disengagements). And that is in a small number of carefully controlled places and situations. Uber, before the SDC testing shutdown, averaged only about 13 miles.

99% of driving is fairly easy for an SDC. But that last 1% is the most critical, and it is a chaotic BEAR to get right.
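To put those figures side by side, here is a quick Python comparison of how often a human would expect to intervene per 10,000 miles, assuming the 5,600-mile (Waymo) and 13-mile (Uber) averages quoted above hold - rough numbers for illustration, not official fleet statistics:

# Interventions per 10,000 miles implied by the averages quoted above
# (illustrative assumption: the quoted averages hold across all driving).
miles_per_intervention = {"Waymo": 5600, "Uber": 13}

for company, miles in miles_per_intervention.items():
    per_10k = 10_000 / miles
    print(f"{company}: ~{per_10k:.0f} human interventions per 10,000 miles")

# Prints roughly: Waymo ~2, Uber ~769 interventions per 10,000 miles.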

 

Plucketeer

(12,882 posts)
7. On the threshold of this technology
Sat Mar 31, 2018, 11:29 AM
Mar 2018

We're gonna see wrinkles. Anyone remember the de Havilland Comet - the first commercial jet airliner to go into regular service? Ill-designed passenger windows had them breaking up in flight, and it took several such tragedies before that flaw was understood. Did it forever doom the future of jet airliners??? And what about that big new era of ocean liner that was "unsinkable"?

While I know nothing of what this drive was like - if he had complained that the autopilot system was quirky AT THIS VERY SPOT in times past, WHY was he trusting it there again? Too bad there isn't a camera aimed at the driver to help determine what the driver was actually doing at the time of a crash.

LeftInTX

(25,117 posts)
10. Apples and oranges
Sat Mar 31, 2018, 11:53 AM
Mar 2018

When autopilot is engaged the skies are not crowded. Commercial pilots are trained professionals who have all sorts of instruments, so if autopilot gets them off course, they can intervene. They are also "on the job" and focused on their flight more than their destination. Commuters are focused on their destination etc.

Yeah, I agree public transport is the way to go.

I think with "self driving" cars, the "self driving" needs to be secondary to an attentive driver.

rickford66

(5,521 posts)
18. Aircraft autopilots are nothing like the Tesla A/P
Sat Mar 31, 2018, 05:41 PM
Mar 2018

Aircraft A/Ps hold an altitude, climb or descend at a specified rate of climb, and maintain straight flight between waypoints (artificially specified points on the Earth). At a waypoint, the A/P may turn the aircraft to a new heading according to a programmed flight plan. All of this happens in three-dimensional, largely "empty" space. TCAS is a system that warns of a probable collision with another aircraft; it directs the pilot to manually climb or descend, at specified rates of climb, to avoid the possible collision. The present systems only work in vertical modes. Comparing any of this to the infinite combinations of driving conditions isn't realistic. Self-driving cars are many years away from being safe.
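To illustrate how constrained that problem is, a basic altitude-hold mode boils down to a small feedback loop: command a climb or descent rate proportional to the altitude error, capped at a specified rate of climb. A toy Python sketch of that idea (illustrative only, not actual avionics code; the gain and limit are made-up values):

# Toy altitude-hold loop: vertical speed command proportional to altitude error,
# clamped to a specified maximum rate of climb/descent. The gain and limit here
# are illustrative assumptions, not values from any real autopilot.
def altitude_hold_step(current_alt_ft, target_alt_ft, max_rate_fpm=1000, gain=5.0):
    """Return the commanded vertical speed (ft/min) for one control step."""
    error_ft = target_alt_ft - current_alt_ft
    commanded_fpm = gain * error_ft                              # proportional response
    return max(-max_rate_fpm, min(max_rate_fpm, commanded_fpm))  # respect the rate limit

# Example: 500 ft below the assigned altitude -> climb at the 1,000 fpm limit.
print(altitude_hold_step(current_alt_ft=34500, target_alt_ft=35000))

The whole task reduces to a handful of well-defined inputs; contrast that with everything a car must perceive and classify just to stay in its lane on a surface street.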

agtcovert

(238 posts)
19. That may well be the case...
Sun Apr 1, 2018, 11:26 PM
Apr 2018

How long did it take to get the system you described to be safe?

I agree that the complexity and the situations are different on many levels. My point here is that an underlying principle runs through all of these efforts (the first car, the first flight, collision warning, autopilot): the undertaking may have been dangerous, it almost certainly cost human lives, and yet we continued. Merely saying something isn't safe and (understanding that tone is lost without face-to-face communication) intimating that we should stop because the task is perceived as too complex, impossible, or too costly is not borne out when I look at our history. We keep trying, we adapt.

It may well be a decade or more before these systems are fully safe. The only way from the current state to that future state is to continue the endeavor.

rickford66

(5,521 posts)
20. Where did I say to stop ?
Mon Apr 2, 2018, 12:07 AM
Apr 2018

I was responding to the "aircraft have autopilots" comment. Aircraft A/Ps are nowhere near what's needed for a self-driving car. Aircraft A/Ps operate within a limited range of very well-known inputs in practically empty space, while self-driving cars will have to respond to an infinitely varied, unpredictable, crowded environment.

I'm sure they will get safer as time goes on, but I doubt they will replace human drivers on public roads. The car will certainly react much quicker than a human, but only in situations predicted by the s/w designer. For an infinite combination of situations, I'd prefer a human behind the wheel. Can the car tell whether a tree limb is about to fall on it, or if a puddle is really a hole filled with water, or if a kid 30 feet to the side is going to throw a rock at the windshield, or if someone yells a warning, etc., etc.?

"It may well be a decade or more before these systems are fully safe"? Even though many aircraft can take off, fly, and land without a pilot, the A/Ps do malfunction occasionally. They will never be pilotless, and driverless cars will never be "fully safe" - but I never said to stop developing them. As a real-time simulation engineer, I think it would be fun and a challenge. I pretty much avoided flying motion-base simulators after experiencing a few really bad glitches. With all the hours of testing, it still happens.

FrodosNewPet

(495 posts)
21. And don't try to do it on the cheap.
Mon Apr 2, 2018, 12:22 AM
Apr 2018

There needs to be transparency. There needs to be performance standards under the law. There needs to be security. And there needs to be clear definitions of liability.

Right now, the SDC industry does not have nearly enough of any of those.

Yes, keep working on it. For all the challenges and danger, there ARE some amazing benefits.

But we cannot use "humans suck at driving" as an excuse to let an insufficient technology control multi-ton machines at high speed in an unrestricted environment.

 

LanternWaste

(37,748 posts)
22. No one is using that as an excuse or an argument.
Mon Apr 2, 2018, 09:29 AM
Apr 2018

"But we cannot use the excuse of "humans suck at driving" as an excuse..."

No one is using that as an excuse or an argument.

No one is claiming that the system is 100% ready for implementation.

Yet you consistently act as though those arguments are the default position. They are not.

Response to Dennis Donovan (Reply #2)

harun

(11,348 posts)
11. As a programmer for 20 years I can state firmly our roads will be much safer with computers
Sat Mar 31, 2018, 12:43 PM
Mar 2018

doing the driving than people.

Will take some time though.

 

Adrahil

(13,340 posts)
14. Autopilot is an assistive technology...
Sat Mar 31, 2018, 04:31 PM
Mar 2018

My new car has adaptive cruise control and lane-keeping assist. You CAN drive hands-off reliably most of the time. But it is a driver relief mode, not a driver replacement mode. Don't blame the technology for its misuse.
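That division of labor is also why these systems nag the driver. Here is a minimal Python sketch of the kind of hands-off escalation described in Tesla's statement above (visual warnings first, then an audible one); the time thresholds are made-up illustrative values, not Tesla's actual parameters:

# Minimal hands-off warning escalation, in the spirit of the visual-then-audible
# warnings described in Tesla's statement. The 10 s / 20 s thresholds are
# illustrative assumptions, not any manufacturer's real values.
def warning_level(seconds_hands_off, visual_after_s=10, audible_after_s=20):
    """Return the escalation stage for a continuous hands-off duration."""
    if seconds_hands_off >= audible_after_s:
        return "audible warning"     # chime, escalate toward slowing/disengaging
    if seconds_hands_off >= visual_after_s:
        return "visual warning"      # flash a reminder on the instrument cluster
    return "none"

for t in (5, 12, 25):
    print(t, warning_level(t))       # none, visual warning, audible warning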

 

gyroscope

(1,443 posts)
17. I don't blame the company
Sat Mar 31, 2018, 05:32 PM
Mar 2018

I blame the people who are dumb enough to put their lives on the line using experimental, unproven autonomous tech. The sad thing is they are not only putting their own lives in danger but also everyone else's on the road, which makes us all little more than human guinea pigs for the auto industry (and corporate America), whether we choose to be or not.

Canoe52

(2,948 posts)
9. Fake news, they replaced the horse with steam power! Let's go back to steam engines!
Sat Mar 31, 2018, 11:39 AM
Mar 2018

Just think, we'll bring back all those steam locomotive jobs - it'll be hugely great for the economy.

SonofDonald

(2,050 posts)
8. " on multiple occasions"
Sat Mar 31, 2018, 11:39 AM
Mar 2018

"The autopilot veered toward that same barrier"

And yet he continued to rely on it.

A person has died but that statement makes me wonder.

I drive a high-performance car that's built to handle. I can take my hands off the wheel at 80 mph and it tracks dead straight in my lane, although I've only done it twice and kept both hands less than an inch away.

I know it will go straight, but I don't trust it whatsoever. I'm driving the car, and after 43 safe years of operating a vehicle I still check all the lights, including the flashers, brakes, and emergency brake every time I leave the house.

Because anything can fail at any time.

He knew something was wrong but kept operating the car as if nothing was.

An accident waiting to happen.

 

Adrahil

(13,340 posts)
15. Exactly...
Sat Mar 31, 2018, 04:34 PM
Mar 2018

For some reason the LKAS system in my new car always wants to veer off a local road at the same spot. But I use the system like it is intended to be used... as a relief mode that I monitor. When it veers, I grab it and stay on the road.

JustABozoOnThisBus

(23,321 posts)
13. In a completely unrelated incident, three women died in Detroit
Sat Mar 31, 2018, 01:57 PM
Mar 2018

2:00 a.m., 100 mph in a 45 zone; three women died in the rollover. Seatbelts were irrelevant.
If these women had been in a Tesla or Uber in auto-drive mode, they would likely be alive.
Tens of thousands die in human-driven auto crashes each year in the US alone.

While tragic, one or two deaths is no reason to stop research on self-driving vehicles.
The toll from human-driven crashes should provide impetus to this development.

I'd say the development should be government-driven, if we had a responsible government at this time, but alas ...

https://www.freep.com/story/news/local/michigan/wayne/2018/03/29/redford-crash-pole-killed-accident/468891002/

 

gyroscope

(1,443 posts)
16. Darwin Award
Sat Mar 31, 2018, 05:09 PM
Mar 2018

So the driver was aware there was a problem with the autopilot, and yet he kept using it?? Why would you risk your life using something you know is faulty??

For an Apple engineer, this guy doesn't seem too bright. Engineers may have book smarts but perhaps not enough common sense. Another recent example of this is the pedestrian bridge collapse in Miami: the engineers knew there was a problem with the bridge but failed to shut down traffic under it until it was too late.
