Two dead in Tesla crash in Texas that was believed to be driverless
Source: Reuters
April 19, 2021
4:22 AM CDT
2 minute read
Two men died after a Tesla (TSLA.O) vehicle, which was believed to be operating without anyone in the driver's seat, crashed into a tree on Saturday night north of Houston, authorities said.
"There was no one in the driver's seat," Sgt. Cinthya Umanzor of the Harris County Constable Precinct 4 said.
The 2019 Tesla Model S was traveling at a high rate of speed when it failed to negotiate a curve, went off the roadway, crashed into a tree and burst into flames, local television station KHOU-TV said.
After the fire was extinguished, authorities located two occupants in the vehicle, one in the front passenger seat and the other in the back seat, the report said, citing Harris County Precinct 4 Constable Mark Herman.
Read more: https://www.reuters.com/business/autos-transportation/two-dead-tesla-crash-texas-that-was-believed-be-driverless-wsj-2021-04-18/
JI7
(89,237 posts)From what I understand, people are still supposed to be in the front seat and paying attention as if they WERE driving.
But people think they can get away without paying attention, and do things like this.
Miguelito Loveless
(4,451 posts)The system specifically tells you that you cannot do this. When you engage the system, it checks every 30 seconds for the driver's hands on the wheel and warns you when it does not detect driver input. To wind up in this situation, you must actively set out to defeat the safety system.
This was either drunken stupidity, or someone trying to make a viral video.
I have been driving a Model 3 for almost three years, and it is the safest car I have ever owned. It has saved me from other careless drivers on numerous occasions.
aggiesal
(8,906 posts)joshcryer
(62,265 posts)The technology is so far behind what people think it's not even funny. I can believe cars covering tens of thousands, if not millions, of highway miles without an accident, but urban driving is a freaking difficult problem. And I think people get a false sense of security.
I mean, from what I hear, the two deceased were older individuals with very smart backgrounds, ages 59 and 69. Whoever was driving climbed into the freaking back seat, presumably to take a nap or some other such demonstration, and it got them both killed. The car disengages without someone in the driver's seat, so the seat was certainly weighted to keep it bearing load, and the steering wheel needs a hand on it, so he rigged that too. It was a totally stupid thing that he did. All because, in my opinion, he was misled by the car's performance on the highway.
The video I was talking about shows just how utterly awful its sensing is. It is not ready. It is probably fine for a long cross-country trip and has likely saved a few drowsy drivers' lives, but it is not ready at all for actual self-driving.
oldsoftie
(12,485 posts)Let the accident lawsuits begin. Maybe 50 years from now the tech can handle it, who knows. But I doubt it.
Miguelito Loveless
(4,451 posts)And doing a stunt where you leave the driver seat while the car is in motion requires deliberately circumventing safety systems and ignoring warnings.
Auggie
(31,130 posts)LittleGirl
(8,277 posts)You can't possibly allow driven and driverless cars to use the same lane. Safety!
Miguelito Loveless
(4,451 posts)there are NO "driverless" vehicles on the roads in the US. The Tesla system is NOT a driverless system; it is a driver-assist system (lane keeping, adaptive cruise control, emergency braking, collision warning). It can be circumvented and misused by a determined idiot. People get annoyed because the system is called "Autopilot," but if you read the FAA rules on the definition of autopilot systems, you will see that they are not "pilotless" systems and the pilots are NOT allowed to leave the flight deck while the system is engaged.
No system is "foolproof," as fools are so ingenious.
LittleGirl
(8,277 posts)in the future, I suggest that they have driver-less vehicles in one lane so as not to interfere with driven vehicles.
Miguelito Loveless
(4,451 posts)despite some folks' optimistic predictions. But a dedicated lane makes sense.
LittleGirl
(8,277 posts)at least 5-10 years before we get there.
Miguelito Loveless
(4,451 posts)a barrier (though it is a big one) than simply re-writing traffic law.
LittleGirl
(8,277 posts)Passengers in driverless vehicles will lobby for safe conditions for their journeys. Getting killed in a driverless car would be a death sentence for the business model, so companies will have to protect their passengers or have no business. Right?
I really haven't spent a lot of time thinking about safety for this industry, but we need to start considering the implications and, like bike lanes, make sure it's safe. Otherwise, it will not survive.
Miguelito Loveless
(4,451 posts)no matter how safe it is, compared to the current status quo, any fatality gets HUGE headlines, and Wall Street gets upset, and people will file lawsuits. It will take time for Tesla to accumulate enough data for any meaningful analysis, but in the meantime "X People Die in Tesla" generates lots of clicks.
Major Nikon
(36,818 posts)Miguelito Loveless
(4,451 posts)along with Johnny Dangerously, Top Secret, Love At First Bite, and Young Doctors in Love.
Polybius
(15,328 posts)I still have never seen it. Good movie?
Miguelito Loveless
(4,451 posts)Dabney Coleman, Michael McKean, Sean Young, Harry Dean Stanton, Taylor Negron, Hector Elizondo, and a ton of soap opera cameos. It is more of a riff on "General Hospital" and "The Doctors" type soaps.
Another movie that got clobbered by coming out at the same time as a blockbuster was "Used Cars" with Tim Matheson, his first non-Disney R-rated part (PG-13 these days). VERY funny, with a nice cameo by Al Lewis (Grandpa Munster). It had the misfortune of coming out the same week as "Airplane!".
oneshooter
(8,614 posts)LittleGirl
(8,277 posts)Response to LittleGirl (Reply #6)
oneshooter This message was self-deleted by its author.
bucolic_frolic
(43,027 posts)I was always reprimanded for criticizing driverless vehicles on DU. "Safer than humans." And I would say, you know: unforeseen events, weather, and, Americans being what they are, people playing chicken with the AI machines. Imagine the payout for having an accident with an Elon Zombie! So I'm not glad to see it, but it was predictable: flaws do exist.
By the way, there are mutual funds traded with AI and without human oversight. Just saying.
Miguelito Loveless
(4,451 posts)bucolic_frolic
(43,027 posts)If it can be used in an improper manner, the public will do it, and manufacturers must design with that in mind.
Miguelito Loveless
(4,451 posts)where I work, punch presses have sensors to keep people from mangling limbs. They are designed to inhibit operation of the system if the sensors detect an object (like a hand) in a danger zone. People have circumvented these systems, and it is NOT the fault of the manufacturer when people get hurt as a result.
Pilots have crashed planes on autopilot by deliberately disobeying rules in their use. You can't fix stupid.
mac2766
(658 posts)While a driver is able to remove their hands from the steering wheel for a brief period, the car senses that there are no hands on the wheel and requires the driver to put them back. Also, with the technology, the car would not be speeding.
I'm completely skeptical of the report.
Miguelito Loveless
(4,451 posts)this has been done. I work in a shop that has sensors to prevent a punch press from operating if your hands are in a defined danger zone. I have caught people defeating this system for convenience.
I have been driving a Model 3 for almost three years, and the system gets better every few months. It recognizes stop signs, stop lights, and speed limit signs, and reacts appropriately to them. At any time I can override the system and make it run stop signs/lights, or exceed the posted speed limit. The choice is mine. This seems to me a "Hey, hold my beer and watch this" moment.
mac2766
(658 posts)More a user issue. The article should have specified that.
People are having a difficult time adapting to the change in technology as it is. An article that leads a reader to mistrust the technology even more isn't going to help. We need this technology, and fast. The more negativity is presented about it, the slower Americans are going to adopt it. Europe and Asia don't seem to be having that problem.
Just my opinion.
Miguelito Loveless
(4,451 posts)is that the driver was showing off to a friend and punched it to demonstrate the phenomenal acceleration (0-60 in under 5 seconds, for the basic models) and lost control of the car, striking the tree. The impact damaged the car to such an extent that the battery pack was compromised and the frame deformed to the point that the front doors would not open. Injured, the driver climbed into the back seat but could not open the back doors (due to injury, smoke inhalation, and/or structural damage), so the driver and passenger died, one in the front passenger seat, the driver in the back seat.
More evidence is needed to confirm or disprove this theory.
rickyhall
(4,889 posts)MineralMan
(146,248 posts)"It's good for the environment and OK for you."
NBachers
(17,080 posts)pimpbot
(939 posts)Idiots gonna idiot. They will just put a bowling ball or something in the seat. Ideally the inside facing camera could be used to detect a body and eye motion, but people were already complaining about privacy with the cameras.
I compare this accident to a DUI accident. We will never completely stop people from being dumbasses and doing stupid things.
AllaN01Bear
(17,944 posts)hands on steering wheel before you hit accept?
JohnnyRingo
(18,614 posts)The Tesla is a Level Two autonomous car and requires driver input and oversight.
People should know that, but it doesn't help that Tesla implies it has some kind of autopilot.
csziggy
(34,131 posts)Because that is all it is.
Autopilot on planes is advanced enough that some systems can even land the plane. A significant difference between planes and cars is that planes do not have to negotiate roads; they just have to avoid other planes and stay in the air.
MichMan
(11,864 posts)PatrickforB
(14,557 posts)I've been reading about cybersecurity, and the reality here is that it is a real battle for us to keep up with hackers, and our system is currently vulnerable in a number of places. These driverless cars are part of the Internet of Things (IoT), and as such, vulnerable to cyberattack. Be mindful that Russia shut down Ukraine in 2017 with the Petya hack, and we set Iran's nuclear capability back at least a decade with the Stuxnet hack.
So information security is a BIG deal across all industries, and until these driverless vehicles are somehow rendered hack-proof, I'm not going to be jumping up and down to get a driverless car, even though that would be wonderful because I dislike the head games other drivers play, and would much rather use commute time in reading or other work.
csziggy
(34,131 posts)Or defeat the sensors that remind a driver to keep their hands on the wheel. Both of which seem to have happened in this case.
Mr. Sparkle
(2,927 posts)Kablooie
(18,605 posts)FSD (Full Self Driving)
It's unlikely that the crashed car had this feature.
FSD is much safer than regular autopilot particularly on surface streets.
Duppers
(28,117 posts)SunSeeker
(51,502 posts)If all it takes is buckling the seat belt, seems like that is pretty easy to abuse.
Mr. Sparkle
(2,927 posts)Kablooie
(18,605 posts)The owner's brother in law was driving the car and only drove a few yards on a neighborhood street before crashing at high speed.
The most likely scenario was that the brother in law wanted to try out Tesla's famous super fast takeoff which really pushes you back into your seat. He did not put on his seatbelt, floored the car and lost control. The crash threw him into the back seat.
Reasons for this scenario:
A driver without a seatbelt can be thrown into the back seat in a severe crash. There is a video online of this happening.
To start the car there must be weight on the seat and the brake must be depressed. Only someone in the driver's seat can do this.
Once you start driving, one press on the right stalk starts cruise control. Two presses start Autopilot, if it's available. It is not available on many streets, and not available when you first start the car. A driver unfamiliar with the system may press the stalk once and think he is in Autopilot mode when he is only in cruise mode, which does not steer.
If you are on a neighborhood street, cruise control starts at 10 mph and can pick up speed, but always at a reasonable rate.
None of the automatic features allow you to drive from 0 to 60 in a short distance.
The car will automatically apply brakes just before an imminent crash but driver's input will override automatic features.
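The interlock and stalk-press behavior described in the points above can be sketched as a toy model. This is purely illustrative: the function names, the weight threshold, and the mode names are my assumptions, not Tesla's actual software.

```python
# Toy sketch of the driver-monitoring interlocks described above.
# All names and thresholds here are illustrative assumptions, not Tesla's code.

def can_start(seat_weight_kg: float, brake_pressed: bool) -> bool:
    """The car only shifts into drive with weight on the seat and the brake down."""
    return seat_weight_kg > 20 and brake_pressed  # 20 kg is a made-up threshold

def stalk_mode(presses: int, autopilot_available: bool) -> str:
    """One stalk press -> cruise control; two presses -> Autopilot, if available."""
    if presses >= 2 and autopilot_available:
        return "autopilot"  # maintains speed AND steers
    if presses >= 1:
        return "cruise"     # maintains speed only; does NOT steer
    return "manual"

print(can_start(seat_weight_kg=70, brake_pressed=True))   # True
print(stalk_mode(presses=2, autopilot_available=False))   # cruise
```

The second call illustrates the failure mode described above: pressing the stalk twice where Autopilot is unavailable silently leaves you in a mode that does not steer.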
The fire is concerning though.
Tesla is one of the highest rated cars for accident safety and it is specifically designed to prevent battery fires but in severe accidents it obviously can catch fire.
SunSeeker
(51,502 posts)I agree, the fire is more concerning than what these two idiots did.
ToxMarz
(2,162 posts)Buckeye_Democrat
(14,852 posts)... that I'd use that feature in some situations.
There's a big truck stop in my town now, close to the grocery store where I usually shop, and I often have to start hitting the brakes as soon as I see yet another semi-truck pulling out. If I hadn't reacted immediately after a truck started lurching forward, I doubt I could've slowed down in time to avoid hitting it.
Never see any cops around there, of course! They can pull into traffic repeatedly because of their added revenue to the city, I suppose.