Tesla on Autopilot slams into parked fire truck on freeway
Source: The Mercury News
A Tesla Model S reportedly on Autopilot smashed into the back of a fire truck parked at a freeway accident scene Monday morning, authorities said.
The union representing Culver City firefighters whose truck was hit around 8:30 a.m. on Interstate 405 in Culver City tweeted that the Tesla driver said he had been using Tesla's Autopilot system, which performs automated driving tasks.
The California Highway Patrol and Culver City Fire Department confirmed the southbound Tesla had struck the fire truck, but could not immediately confirm whether the vehicle had been on Autopilot.
The fire truck had been parked in the left emergency lane and carpool lane, blocking off the scene of a previous accident, with a CHP vehicle behind it and to the side, said Culver City Fire Department battalion chief Ken Powell.
~ snip ~
Read more: https://www.mercurynews.com/2018/01/22/tesla-on-autopilot-slams-into-parked-fire-truck-on-freeway/
The term "AUTOPILOT" is a misnomer that leads people into believing it has far more capabilities than it really does. Even though there are warnings that the driver should stay attentive, people are lulled into a false sense of security and begin to zone out if they are not an active participant in at least the majority of the car's operation.
Beyond that, Elon Musk wants fully autonomous operation based on visual cameras and RADAR alone (no LiDAR), which is truly inadequate for safe operation in a fairly standard range of operating conditions - never mind extremes.
FrodosNewPet
(495 posts)
http://ktla.com/2018/01/21/man-arrested-on-suspicion-of-dui-on-bay-bridge-told-chp-his-tesla-was-on-autopilot/
Posted 10:26 PM, January 21, 2018, by Erika Martin
A man who was found passed out behind the wheel of his Tesla on the Bay Bridge in San Francisco on Friday was arrested on suspicion of drunken driving, officials said.
The man told officers his Tesla had been set on autopilot, the California Highway Patrol said in a tweet.
Authorities determined his blood-alcohol concentration was more than two times the legal limit, according to CHP. He was subsequently taken into custody.
~ snip ~
"No, it didn't drive itself to the tow yard," the CHP joked on Twitter.
tymorial
(3,433 posts)
still_one
(92,115 posts)the owner's manual says that the driver MUST always be in control of the car, and NOT depend on the automatic systems
harun
(11,348 posts)Long live Tesla.
Blue_Tires
(55,445 posts)the public's orgasmic anticipation of self-driving cars...
Adrahil
(13,340 posts)But even if he isn't, do you expect automated systems to never have an accident? I mean, how many people die in auto accidents every year with vehicles driven by humans?
Igel
(35,293 posts)Accidents happen because we're distracted or don't pay attention; perhaps we're sleepy or inebriated or upset.
They happen because we drive at speeds that are too high--whether above the speed limit or too fast for the road conditions.
They happen because we drive unsafely: swerving in and out of traffic, or tailgating (which isn't even a concern for most of the under-20s I know; some actually leave more distance stopped at a red light than they keep between cars at 70 mph on the freeway).
In the event of a car wreck right in front of you, if a car swerves into you faster than you can safely steer away (either because the turn is too sharp or because another car is in the way), the accident will happen, fully autonomous and with all the bells and whistles imagined these days or not. On the other hand, the claim is that once all cars are like that, some of these conditions won't arise.
Puzzledtraveller
(5,937 posts)but I think it follows a particular philosophy about what the future of human existence should be. I am not as pro-machine, but perhaps it's because of my love of Frank Herbert's Dune series, where humans eventually rebelled against the computers and machines that replaced and eventually enslaved us. The irony is that I am using one right now to convey this.
LanternWaste
(37,748 posts)I'm certain many thought the same in regards to the internal combustion engine.
Adrahil
(13,340 posts)We can choose to ignore it, or be overwhelmed by it, but we cannot stop it. The only real option is to help direct it.
rlegro
(338 posts)...who might become too enfeebled to drive on the highway due to declining vision or reflexes see auto-automobiles as a godsend.
Historic NY
(37,449 posts)the idiot blames the car. Sailing through life until you go over the cliff.
William Seger
(10,778 posts)I'd guess that the system's sensors can't detect a stopped vehicle soon enough to completely stop from 65 mph. That requires quite a distance.
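A rough kinematics sketch supports that point. The figures below are illustrative assumptions, not from the article: roughly 0.7 g of braking deceleration for a passenger car on dry pavement, and a 1.5-second detection/reaction delay before braking begins.

```python
# Rough stopping-distance estimate for a car at 65 mph.
# Assumed (hypothetical) figures: 0.7 g braking deceleration,
# 1.5 s detection/reaction delay before the brakes engage.

G = 9.81             # gravitational acceleration, m/s^2
MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def stopping_distance_m(speed_mph, decel_g=0.7, reaction_s=1.5):
    """Reaction distance plus braking distance, in meters."""
    v = speed_mph * MPH_TO_MS
    reaction = v * reaction_s             # distance covered before braking
    braking = v ** 2 / (2 * decel_g * G)  # d = v^2 / (2a)
    return reaction + braking

d = stopping_distance_m(65)
print(f"{d:.0f} m (~{d * 3.281:.0f} ft)")  # about 105 m (~345 ft)
```

Under those assumptions a 65 mph car needs on the order of a football field to come to a complete stop, so a sensor would have to positively identify a stationary truck well over 100 meters out, which is consistent with the poster's guess.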
Hugin
(33,112 posts)Where the Highway Patrol are questioning the driver of an RV which had been involved in an accident...
"I don't know what happened, Officers. I set the Cruise Control then went into the back to have a beer and I came to in the wreckage."
Wisdom is always trumped (yes, I used that word on purpose) by stupidity.
Cold War Spook
(1,279 posts)and we will all be safe because computers never crash.
TexasBushwhacker
(20,164 posts)
El Mimbreno
(777 posts)Than fully automatic cars is fully automatic weapons. Or maybe the other way around. Let's take a drive with Google Maps: A lap around the ball diamond warning track, off a 10 foot embankment, up a dry creek bed and into someone's garage. All things I've seen marked as streets or roads by Google.
And in this case, I'd bet the driver was texting.
Adrahil
(13,340 posts)
El Mimbreno
(777 posts)
LanternWaste
(37,748 posts)Your summation is inaccurate. From the article.
"Tesla, after the incident, said Autopilot is intended for use only with a fully attentive driver. The Model S owner's manual has numerous warnings that attention to the road is vital while using Autopilot and other Tesla semi-autonomous driving functions."
Additionally, you appear to be the only one making the argument that people are being duped into believing the cars are fully autonomous, and therefore dangerous -- but you offer no hard data to support the allegation.
alarimer
(16,245 posts)Because it isn't.
Zorro
(15,733 posts)and it remains the most sophisticated traffic-aware driving assistant on the road today.
alarimer
(16,245 posts)Regardless of sophistication, Tesla needs to take responsibility and either disable it or call it something else.
Because, if they do not, they are responsible for deaths that occur as a result of misuse.
Hassin Bin Sober
(26,319 posts)Even a non-lawyer like me knows calling it an autopilot will be used against them in court regardless of how many disclaimers they can produce.
I suspect the lawyers are overruled by Musk and the marketing department-- supported by the bean counters who say their liability will be less than potential gains.
It's a business decision. Just like Ford and the Pinto.
Woe betide if some documentation shows up like it did in the Pinto case.