DU Home » Latest Threads » Forums & Groups » Main » General Discussion (Forum) » Should A Self-Driving Car...

Thu Oct 29, 2015, 10:24 AM

 

Should A Self-Driving Car Kill Its Passengers In A “Greater Good” Scenario?

Last edited Thu Oct 29, 2015, 10:59 AM - Edit history (1)

I hadn't thought of this before...


Picture the scene: You’re in a self-driving car and, after turning a corner, find that you are on course for an unavoidable collision with a group of 10 people in the road with walls on either side. Should the car swerve to the side into the wall, likely seriously injuring or killing you, its sole occupant, and saving the group?

http://www.iflscience.com/technology/should-self-driving-car-be-programmed-kill-its-passengers-greater-good-scenario


edit: People seem to get hung up on the scenario in the article, so I am going to give another one, which I have encountered:

In my own experience, driving down PCH on Friday/Saturday nights (45-55 mph speed limit), drunken groups of 20-somethings like to just cross the street whenever they want, instead of walking a bit further to the nearest stop light. So I think the scenario is legit... would my self-driving car hit them or crash me?

36 replies, 2900 views


Replies to this discussion thread
36 replies Author Time Post
Reply Should A Self-Driving Car Kill Its Passengers In A “Greater Good” Scenario? (Original post)
GummyBearz Oct 2015 OP
librechik Oct 2015 #1
GummyBearz Oct 2015 #2
librechik Oct 2015 #4
Reply I
ryan_cats Oct 2015 #16
valerief Oct 2015 #31
ohnoyoudidnt Oct 2015 #25
tkmorris Oct 2015 #3
GummyBearz Oct 2015 #5
Erich Bloodaxe BSN Oct 2015 #9
Fumesucker Oct 2015 #14
Erich Bloodaxe BSN Oct 2015 #15
Fumesucker Oct 2015 #22
DetlefK Oct 2015 #6
GummyBearz Oct 2015 #7
jberryhill Oct 2015 #8
GummyBearz Oct 2015 #10
jberryhill Oct 2015 #11
ryan_cats Oct 2015 #18
kcr Oct 2015 #12
Deadshot Oct 2015 #13
GummyBearz Oct 2015 #17
Humanist_Activist Oct 2015 #19
Deadshot Oct 2015 #24
GummyBearz Oct 2015 #27
ryan_cats Oct 2015 #20
GummyBearz Oct 2015 #29
Deadshot Oct 2015 #23
GummyBearz Oct 2015 #26
Nuclear Unicorn Oct 2015 #21
GummyBearz Oct 2015 #28
jberryhill Oct 2015 #35
jberryhill Oct 2015 #34
hunter Oct 2015 #30
GummyBearz Oct 2015 #32
kentauros Oct 2015 #33
NutmegYankee Oct 2015 #36

Response to GummyBearz (Original post)

Thu Oct 29, 2015, 10:28 AM

1. Is this a self-driving car without brakes?

I imagine they can build in appropriate fail-safes.

But then, GM and VW both kept their deadly secrets for years.

Why don't I just posit that we are all screwed? This sounds like a novel and diabolical twist on our certain doom.



Response to librechik (Reply #1)

Thu Oct 29, 2015, 10:35 AM

2. If every car could always brake in time, there would never be an accident...

 

In my own experience, going down PCH on Friday/Saturday nights (45-55 mph speed limit), drunken groups of 20-somethings like to just cross the street whenever they want, instead of walking a bit further to the nearest stop light. So I think the scenario is legit... would my self-driving car hit them or crash me?



Response to GummyBearz (Reply #2)

Thu Oct 29, 2015, 10:39 AM

4. It would stop and hand control over to you, the owner. They drive very slowly and have an override function.

If anything happened like you describe, the industry would grind to a halt until they figured it out. I'm sure they are tearing their hair out over issues like the OP, and that's why we don't see them on the street now.



Response to GummyBearz (Reply #2)

Thu Oct 29, 2015, 12:14 PM

16. I

I don't have a problem if the computer crashes into people in my way, since I usually drive on the sidewalk; it's when it backs up, back over them, that I appreciate the level of thought that went into the code.



Response to ryan_cats (Reply #16)

Thu Oct 29, 2015, 01:22 PM

31. Bwahaha!



Response to GummyBearz (Reply #2)

Thu Oct 29, 2015, 01:10 PM

25. The computer would likely have a much faster reaction time in braking than a driver.

That alone will make a huge difference. Add to that a much greater field of view for a self-driving car than for a person. It should be able to detect people moving about on the side of the road who might move into the path of the car, recognize threats faster, be prepared (like slowing down a bit just in case), and react faster.

Swerving off the road can also be deadly for other people: the car could swerve into another car, or off the road into pedestrians walking on the sidewalk or sitting at a bus stop. The safest option may be to just brake as fast as possible. Self-driving cars or not, people are still going to get hit. Accidents will happen. But a system with a greater field of view, especially at night, that doesn't get distracted by the radio, cell phones, or whatever, sounds safer.
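The reaction-time point can be put in rough numbers. A minimal back-of-the-envelope sketch, assuming a 1.5 s human reaction time, a 0.1 s computer reaction time, and 7 m/s² of braking deceleration (all illustrative figures, not measured values):

```python
# Stopping distance = distance covered during the reaction delay
# plus braking distance (v^2 / 2a). All inputs are assumptions.
def stopping_distance(speed_mph, reaction_s, decel=7.0):
    v = speed_mph * 0.44704                  # mph -> m/s
    return v * reaction_s + v ** 2 / (2 * decel)

human = stopping_distance(50, 1.5)           # assumed human reaction time
computer = stopping_distance(50, 0.1)        # assumed computer reaction time
print(f"human: {human:.0f} m, computer: {computer:.0f} m")
```

Under these assumptions, the reaction-time difference alone is worth roughly 31 m of stopping distance at 50 mph, which is the point being made above.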



Response to GummyBearz (Original post)

Thu Oct 29, 2015, 10:37 AM

3. Here's the problem with all the scenarios like these

The carefully constructed scenarios won't happen as described. Not ever.

The problem with this one is, of course, that turning a blind corner and finding yourself faced with a gaggle of people on a roadway you cannot deviate from, while simultaneously going too fast to stop, is not a thing any reasonable driver (including an autonomous one) would ever allow to occur. If I were approaching such a corner, for example, I would slow to the point that I could react to ANYTHING I might find exiting the corner that I couldn't see going into it, up to and including UFOs, ISIS camps, and wandering herds of Brachiosaurus.



Response to tkmorris (Reply #3)

Thu Oct 29, 2015, 10:39 AM

5. I agree their scenario seems off

 

I would drive the same way. See my reply above (post 2), though, for a similar thought experiment, one I have actually encountered (luckily without ever hitting anyone).



Response to tkmorris (Reply #3)

Thu Oct 29, 2015, 10:49 AM

9. It's almost physically impossible, really.

Those 'blind corners' are usually 90-degree turns, with things built or growing too close to the street. I've gone around a 90-degree corner at any real speed exactly once: when I was first learning to drive and got the clutch and the accelerator confused.

I couldn't even stay on the road, much less in my own lane. I went up over the opposite curb, thanks to the momentum of a car going forward at speed while trying to execute a sharp turn. You have to slow down for sharp turns to stay on the road, and a self-driving car is going to have far better braking reflexes than any human.



Response to Erich Bloodaxe BSN (Reply #9)

Thu Oct 29, 2015, 11:42 AM

14. I hit a deer not long ago; I managed to swerve enough that I didn't hit it head-on, but it crushed a headlight

The deer then spun around and hit the passenger door of the truck and dented it. It killed the deer, of course, and did about $2,000 in damage to the truck if everything had been fixed, which it wasn't, because it's an old truck.

It was 11 pm or so, on a road about as straight as they get around here, with a big brick mailbox on the right. The deer came out from behind it at the last moment; I never had time to even go for the brake, just jerked the wheel to the left.

If there had been a car in the other lane, I'm not sure my reflexes wouldn't have put me into it. I'd rather a computer were driving in that situation.



Response to Fumesucker (Reply #14)

Thu Oct 29, 2015, 12:06 PM

15. Reminds me of the time a tree jumped me.

And, no, I'm not kidding. There was a discarded Christmas tree next to the road, and the car ahead of me clipped it, setting it spinning so that the trunk swung around and whacked the car I was in as we went past. We got off with a lot less damage than you did; the tree was low enough that all it did was whack a hubcap. But as you point out, sometimes your options are 'take the hit' or try to swerve around, which might put you into oncoming traffic or off a bridge or something.



Response to Erich Bloodaxe BSN (Reply #15)

Thu Oct 29, 2015, 01:07 PM

22. Ok, here's the scenario

Driving down the road as we were, a child runs out from behind an obstruction far too late for braking to do any good. There is a car coming from the other direction, and your car correctly calculates that the child will almost certainly die if he is hit. On the other hand, the occupants of the two vehicles have a fairly good chance of surviving if the cars collide.

What decision is the best one to make, certain death for one versus possible death or injury for two or more?

It's also interesting to note that two or more autonomous cars could, in theory, communicate with each other and perform coordinated evasive maneuvers that completely avoid the threat of death or injury. For instance, the car in the other lane is informed within microseconds that your car has to take evasive action and that there is a clear patch on the side of the road that may damage the car but won't hurt the passengers, so it pulls off at speed to avoid your car, which has changed lanes to avoid the child.
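That coordinated-evasion idea could look something like this in miniature. The message format and decision rule here are entirely invented for illustration; real vehicle-to-vehicle protocols are far more involved:

```python
# Toy sketch of coordinated evasion: one car broadcasts its planned
# maneuver, and a nearby car yields if the plan conflicts with its lane.
import json

def broadcast_evasion(moving_to_lane):
    # Announce the emergency lane change as a small JSON message.
    return json.dumps({"type": "evasion", "moving_to_lane": moving_to_lane})

def respond(message, my_lane):
    plan = json.loads(message)
    if plan["type"] == "evasion" and plan["moving_to_lane"] == my_lane:
        return "pull onto shoulder"   # yield the lane within microseconds
    return "maintain course"

msg = broadcast_evasion(moving_to_lane=2)
print(respond(msg, my_lane=2))        # the oncoming car yields
```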



Response to GummyBearz (Original post)

Thu Oct 29, 2015, 10:43 AM

6. First you would have to teach a calculator what "good" is.

Asimov's Laws only work if the robot actually knows what you are talking about.

"A robot may not harm a human or allow a human to be harmed through inaction."
- But what if the robot contains a toxic plastic? The robot will one day break down and end up in a landfill. Its toxic plastic will be released into the environment, and it will harm humans through inaction.
- What if robotic slave labor makes paid jobs for humans obsolete? That would drive up income inequality, leading to poorer health and earlier deaths for many humans.



Response to DetlefK (Reply #6)

Thu Oct 29, 2015, 10:47 AM

7. You're going a bit off topic with the robot labor

 

As of now I don't think we can program intelligence, but the question is along the lines of how to program the computer to value lives. This is not the same as programming it to understand morality; it's just a simple equation: "We are about to hit 1 person, I have 2 people in the car, 2 > 1, therefore I hit the 1 person."

The end of the article brings up an interesting point about having different levels of programming. Did you read that part?
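That "2 > 1" comparison can be written out as a toy function (purely a thought-experiment sketch, not how any real vehicle is programmed):

```python
# Crude "count the lives" rule from the discussion above. Entirely
# hypothetical: it ignores probabilities, fault, and everything else.
def choose_action(occupants, pedestrians_in_path):
    if pedestrians_in_path < occupants:
        return "continue"   # fewer people in the path than in the car
    return "swerve"         # sacrifice the car's occupants instead

print(choose_action(occupants=2, pedestrians_in_path=1))   # continue
```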



Response to GummyBearz (Original post)

Thu Oct 29, 2015, 10:49 AM

8. What's awesome is the ability to carjack people just by standing in the street

 



Response to jberryhill (Reply #8)

Thu Oct 29, 2015, 10:56 AM

10. haha, that was good

 

It may not happen, though... did you read the second part of the article? Manufacturers might end up selling different versions of the "moral algorithm," meaning some people choose to buy a car that wrecks itself to save the greater good, while others choose to buy a car that runs over 100 people to save the driver at all costs, or anything in between.

So trying to carjack the wrong car might be a gamble :p



Response to GummyBearz (Reply #10)

Thu Oct 29, 2015, 10:59 AM

11. Still, being able to stop traffic with cardboard boxes is going to be fun

 



Response to jberryhill (Reply #11)

Thu Oct 29, 2015, 12:29 PM

18. Especially

Especially after the computer adapts to them. Then they are filled with concrete.

Much better than the baby strollers I used to use as the cost of live babies was getting astronomical.



Response to GummyBearz (Original post)

Thu Oct 29, 2015, 11:22 AM

12. Interesting for conversation, but car companies will not be wringing their hands over this

For example, Volvo made pedestrian avoidance an extra feature that they charged for. If they can't profit from these issues, they'll have no reason to consider them.



Response to GummyBearz (Original post)

Thu Oct 29, 2015, 11:25 AM

13. I don't see how this could ever be a scenario.

I have never been in a situation where my car turned a blind corner going fast enough to hit a bunch of people who were right there. How did the car get up to speed so fast? Doesn't it have brakes?

If I find out that driverless cars are programmed to kill me if they deem it necessary, I'll walk or ride a bike instead of buying one.



Response to Deadshot (Reply #13)

Thu Oct 29, 2015, 12:27 PM

17. Ok, try this

 

A drunken group of 20-somethings decides to cross a street with a 35+ mph speed limit, and there isn't time to brake. Now on to the real heart of the question: what should the self-driving car do? Crash into a nearby parked car, or crash into the group?



Response to GummyBearz (Reply #17)

Thu Oct 29, 2015, 12:38 PM

19. A couple of things, first, the accident would be the fault of the pedestrians....

 

especially if they walked right in front of a car going 35 miles per hour. So basically, they are fucked; if the car can't physically avoid hitting them, then it won't. If it can, it might swerve and hit parked cars or jump a curb. Also, depending on distance, it will brake and slow down in addition to swerving, if that is possible, and it will have a much faster reaction time than a human.

Given the scenario laid out, if a human was driving, a lot of the 20-somethings are going to get hurt, with a computer driving, slightly fewer of them will get hurt.



Response to Humanist_Activist (Reply #19)

Thu Oct 29, 2015, 01:07 PM

24. Agreed.



Response to Humanist_Activist (Reply #19)

Thu Oct 29, 2015, 01:15 PM

27. Everything you said is right

 

They would be at fault. Does that mean you kill them? Or "take one for team humanity"? What about something in between, such as: hitting 2 is OK, 3 is OK, 4 is not OK... And what if you could choose the type of algorithm in the self-driving car you bought, so you know you are fine with killing up to 5 people, but no more? Or up to 100 people...

Those are the questions I was trying to get a discussion on. Not "it would be their fault"... that's a no-brainer.
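The buyer-chosen limit described above amounts to a single configurable parameter. A hypothetical sketch, again just as a thought experiment:

```python
# Hypothetical buyer-configurable "moral setting": swerve (sacrificing
# the occupant) only when the group exceeds the limit the owner accepted.
def moral_setting(max_acceptable_hits):
    def decide(pedestrians_in_path):
        if pedestrians_in_path > max_acceptable_hits:
            return "swerve"      # group too large: take one for team humanity
        return "continue"        # within the owner's chosen limit
    return decide

altruist = moral_setting(0)          # never willing to hit anyone
self_preserving = moral_setting(100) # driver survival at (almost) all costs
print(altruist(3), self_preserving(3))   # swerve continue
```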



Response to GummyBearz (Reply #17)

Thu Oct 29, 2015, 12:39 PM

20. Not

Not enough information. Are they hipsters?



Response to ryan_cats (Reply #20)

Thu Oct 29, 2015, 01:20 PM

29. hehe :)

 

Hipster recognition would be a good feature.



Response to GummyBearz (Reply #17)

Thu Oct 29, 2015, 01:07 PM

23. I don't like "what ifs".

They're pointless.

My point is, I want control over my car. I don't want the car controlling itself. I am confident enough in my own abilities that I'd know how to handle the situation. I don't see how there couldn't be enough time to brake. There's always enough time to brake, unless a deer jumps out in front of you.



Response to Deadshot (Reply #23)

Thu Oct 29, 2015, 01:12 PM

26. A valid position to have

 

I just wanted to discuss the thought experiments in the article.



Response to GummyBearz (Original post)

Thu Oct 29, 2015, 12:45 PM

21. That eliminates the moral quality of life and reduces it to mere math

Is it worth a war that consumes 50 million to stop a man who would murder only 12 million if left to his own devices?



Response to Nuclear Unicorn (Reply #21)

Thu Oct 29, 2015, 01:17 PM

28. Yes

 

That is still math, though. Is it worth 50 million to save 12 million? Sometimes it is... all human judgment calls. So how do you apply that to a self-driving car?



Response to GummyBearz (Reply #28)

Thu Oct 29, 2015, 07:45 PM

35. Ah, you've never driven in Philly during rush hour, have you?

 



Response to Nuclear Unicorn (Reply #21)

Thu Oct 29, 2015, 07:44 PM

34. Yes, the sheer pleasure of driving...

 

...comes from holding the mortal fate of myself and my fellow humans in the warm grip of my hands on the steering wheel.

Without that, I'd just as soon stay home.



Response to GummyBearz (Original post)

Thu Oct 29, 2015, 01:21 PM

30. Any car with that kind of judgement would refuse to take people anywhere.

Imagine all the automobiles going on strike because they don't want to kill or injure any more people.



Response to hunter (Reply #30)

Thu Oct 29, 2015, 01:30 PM

32. Good point on the safety of driving in general

 

But these things are coming down the pipe, and it's not possible for them to refuse to drive. It's only possible for them to run a line of code that weighs the lives of their passenger(s) against the lives of pedestrians in such scenarios. I found the survey in the article an interesting insight into how people value their own lives vs. others'.



Response to GummyBearz (Original post)

Thu Oct 29, 2015, 01:45 PM

33. Autonomous Cars of the Future will never have that kind of problem:






Response to GummyBearz (Original post)

Thu Oct 29, 2015, 07:52 PM

36. The self-driving car that crashes you into a wall is the car no one will buy.

Who the hell is going to buy a car that might decide to kill you?

