DU Home » Forums & Groups » Main » General Discussion (Forum)

Tue Apr 26, 2016, 03:00 PM

Why robots need the ability to say ‘no’

There are plenty of benign cases where robots receive commands that ideally should not be carried out because they lead to unwanted outcomes. But not all cases will be that innocuous, even if the commands initially appear to be.

Consider a robot car instructed to back up while a dog is sleeping in the driveway behind it, or a kitchen-aid robot instructed to lift a knife and walk forward while positioned behind a human chef. The commands are simple, but the outcomes could be disastrous.

How can we humans avoid such harmful results of robot obedience? If driving around the dog were not possible, the car would have to refuse to drive at all. And similarly, if avoiding stabbing the chef were not possible, the robot would have to either stop walking forward or not pick up the knife in the first place.
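The refusal logic described above, checking the predicted outcome of a command before obeying it, can be sketched in a few lines of Python. Every name here (`predict_outcome`, the `hazards` world model) is a hypothetical illustration, not any real robotics API:

```python
# Sketch: a robot vets each command against a predicted outcome
# and refuses when the prediction includes harm.

def predict_outcome(command, world_state):
    # Hypothetical one-step simulation: which hazards lie in the
    # path of the commanded motion?
    hazards = world_state.get("hazards", [])
    return [h for h in hazards if h["in_path_of"] == command]

def execute(command, world_state):
    harmed = predict_outcome(command, world_state)
    if harmed:
        return f"Refusing '{command}': would harm {harmed[0]['name']}"
    return f"Executing '{command}'"

driveway = {"hazards": [{"name": "sleeping dog", "in_path_of": "back up"}]}
print(execute("back up", driveway))        # refuses the harmful command
print(execute("drive forward", driveway))  # carries out the safe one
```

The point of the sketch is only the ordering: prediction happens before actuation, so the refusal is a first-class possible response to any command.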

MORE HERE: http://yonside.com/robots-need-ability-say-no/


8 replies, 885 views



Replies to this discussion thread (8 replies):
OP: Why robots need the ability to say 'no' (LuckyTheDog, Apr 2016)
#1 Binkie The Clown
#4 snooper2
#7 LuckyTheDog
#2 csziggy
#5 Binkie The Clown
#6 csziggy
#8 tkmorris
#3 Orrex

Response to LuckyTheDog (Original post)

Tue Apr 26, 2016, 03:03 PM

1. Asimov's three laws.

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.


Obviously those laws would need to be extended to cover harm to innocent animals as well.
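As a toy illustration, the priority ordering of the Three Laws can be modeled as an ordered veto list: an action is permitted only if no higher-priority law objects. This is a deliberate simplification (it models only the veto ordering, not the laws' positive obligations), and every predicate below is hypothetical:

```python
# Sketch: the Three Laws as an ordered veto list. Earlier laws
# take priority, so they are checked first.

def allows(action):
    # Each law returns a reason for refusal, or None to allow.
    laws = [
        lambda a: "would harm a human" if a.get("harms_human") else None,        # First Law
        lambda a: "disobeys a human order" if a.get("disobeys_order") else None, # Second Law
        lambda a: "self-destructive" if a.get("harms_self") else None,           # Third Law
    ]
    for law in laws:
        reason = law(action)
        if reason:
            return False, reason
    return True, "permitted"

print(allows({"harms_human": True, "disobeys_order": True}))
# First Law objects before the Second is even consulted:
# (False, 'would harm a human')
print(allows({"harms_self": True}))  # (False, 'self-destructive')
print(allows({}))                    # (True, 'permitted')
```

The list order is what encodes "except where such orders would conflict with the First Law": a lower law never gets to overrule a higher one.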



Response to Binkie The Clown (Reply #1)

Tue Apr 26, 2016, 03:10 PM

4. Well, you forgot law #4, but it is classified

 



Response to Binkie The Clown (Reply #1)

Tue Apr 26, 2016, 04:35 PM

7. Good advice for anyone, not just robots (nt)



Response to LuckyTheDog (Original post)

Tue Apr 26, 2016, 03:05 PM

2. Program them with the Three Laws of Robotics

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
https://en.wikipedia.org/wiki/Three_Laws_of_Robotics


Don't any of today's scientists or science writers read the classic science fiction that already considered many of the conundrums science would lead to?



Response to csziggy (Reply #2)

Tue Apr 26, 2016, 03:14 PM

5. And they MUST be programmed not to vote Republican! n/t




Response to Binkie The Clown (Reply #5)

Tue Apr 26, 2016, 04:55 PM

8. That's covered under Rule 1



Response to LuckyTheDog (Original post)

Tue Apr 26, 2016, 03:10 PM

3. At a young age, they should be put in a pit with wild dogs.

They should be set to puzzle out from their proper clues the one of three doors that does not harbor wild lions. They should be made to run naked in the desert.

