
LuckyTheDog

(6,837 posts)
Tue Apr 26, 2016, 03:00 PM

Why robots need the ability to say ‘no’

There are plenty of benign cases where robots receive commands that ideally should not be carried out because they lead to unwanted outcomes. But not all cases will be that innocuous, even if their commands initially appear to be.

Consider a robot car instructed to back up while a dog is sleeping in the driveway behind it, or a kitchen-aid robot instructed to lift a knife and walk forward while positioned behind a human chef. The commands are simple, but the outcomes could be disastrous.

How can we humans avoid such harmful results of robot obedience? If driving around the dog were not possible, the car would have to refuse to drive at all. And similarly, if avoiding stabbing the chef were not possible, the robot would have to either stop walking forward or not pick up the knife in the first place.
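The refusal logic described above can be sketched in a few lines of Python. This is only a toy illustration of the idea, not code from the linked article; every name here (`predict_outcome`, `execute_or_refuse`, the world-state keys) is invented for the example, and a real robot would need an actual world model in place of the hard-coded rules.

```python
# Toy sketch of a robot that refuses commands with harmful predicted outcomes.
# All names and the "world model" are hypothetical, invented for illustration.

def predict_outcome(command, world_state):
    """Stand-in for a real world model: predict what a command would cause."""
    if command == "reverse" and world_state.get("dog_behind_car"):
        return "harm to the dog"
    if (command == "walk_forward"
            and world_state.get("holding_knife")
            and world_state.get("human_ahead")):
        return "harm to the human"
    return "safe"

def execute_or_refuse(command, world_state):
    """Carry out a command only if its predicted outcome is safe."""
    outcome = predict_outcome(command, world_state)
    if outcome != "safe":
        return f"REFUSED: '{command}' would cause {outcome}"
    return f"EXECUTED: {command}"

print(execute_or_refuse("reverse", {"dog_behind_car": True}))
print(execute_or_refuse("reverse", {"dog_behind_car": False}))
```

The key design point is that the check happens *before* execution: the robot evaluates the consequence of obeying, not just the legality of the command itself.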

MORE HERE: http://yonside.com/robots-need-ability-say-no/


8 replies
Why robots need the ability to say ‘no’ (Original Post) LuckyTheDog Apr 2016 OP
Asimov's three laws. Binkie The Clown Apr 2016 #1
Well, you forgot law #4, but it is classified snooper2 Apr 2016 #4
Good advice for anyone, not just robots (nt) LuckyTheDog Apr 2016 #7
Program them with the Three Laws of Robotics csziggy Apr 2016 #2
And they MUST be programmed not to vote Republican! n/t Binkie The Clown Apr 2016 #5
Or to found a religion csziggy Apr 2016 #6
That's covered under Rule 1 tkmorris Apr 2016 #8
At a young age, they should be put in a pit with wild dogs. Orrex Apr 2016 #3

Binkie The Clown

(7,911 posts)
1. Asimov's three laws.
Tue Apr 26, 2016, 03:03 PM
A robot may not injure a human being or, through inaction, allow a human being to come to harm.

A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.


Obviously those laws need to be extended to cover harm to innocent animals as well.

csziggy

(34,136 posts)
2. Program them with the Three Laws of Robotics
Tue Apr 26, 2016, 03:05 PM
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
https://en.wikipedia.org/wiki/Three_Laws_of_Robotics
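The three laws form a strict priority ordering, which can be sketched as an ordered veto chain. This is a hedged illustration only: the `allowed` function and its predicted-effect flags are invented for the example, and Asimov's own stories are largely about how such a simple encoding breaks down in practice.

```python
# Sketch of the Three Laws as an ordered priority check.
# The action model (a dict of predicted-effect booleans) is hypothetical.

def allowed(action):
    """Return True if an action is permitted under the Three Laws."""
    # First Law: never injure a human, or through inaction allow harm.
    if action.get("injures_human") or action.get("allows_human_harm"):
        return False
    # Second Law: obey human orders (First Law violations already vetoed).
    if action.get("is_human_order"):
        return True
    # Third Law: protect own existence, subordinate to Laws 1 and 2.
    if action.get("endangers_self"):
        return False
    return True

print(allowed({"is_human_order": True}))                          # obey
print(allowed({"is_human_order": True, "injures_human": True}))   # refuse
```

Note that the order of the `if` checks is what encodes the precedence: an order that harms a human is rejected before the obedience rule is ever consulted.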


Don't any of today's scientists or scientific writers read any of the classic science fiction that considered many of the conundrums that science would lead to?

Orrex

(63,209 posts)
3. At a young age, they should be put in a pit with wild dogs.
Tue Apr 26, 2016, 03:10 PM

They should be set to puzzle out from their proper clues the one of three doors that does not harbor wild lions. They should be made to run naked in the desert.
