Why robots need the ability to say 'no'
There are plenty of benign cases where robots receive commands that ideally should not be carried out because they lead to unwanted outcomes. But not all cases will be that innocuous, even if the commands initially appear to be.
Consider a robot car instructed to back up while the dog is sleeping in the driveway behind it, or a kitchen aid robot instructed to lift a knife and walk forward when positioned behind a human chef. The commands are simple, but the outcomes are significantly worse.
How can we humans avoid such harmful results of robot obedience? If driving around the dog were not possible, the car would have to refuse to drive at all. And similarly, if avoiding stabbing the chef were not possible, the robot would have to either stop walking forward or not pick up the knife in the first place.
MORE HERE: http://yonside.com/robots-need-ability-say-no/
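The refusal logic the excerpt describes, simulate each way of carrying out a command, and say 'no' only when every one of them causes harm, can be sketched roughly as follows. This is a hypothetical illustration, not anything from the linked article; all names (`choose_action`, `simulate`, the `causes_harm` flag) are made up for the example.

```python
# Hypothetical sketch: a robot checks a command's projected outcome
# before acting, refusing only when no available plan is harmless.

def choose_action(command, plans, simulate_outcome):
    """Return the first plan whose simulated outcome is harmless,
    or None to signal refusal ("the robot says no")."""
    for plan in plans:
        outcome = simulate_outcome(command, plan)
        if not outcome.get("causes_harm", True):
            return plan
    return None  # no safe way to obey the command

# Toy world model for the car-and-sleeping-dog example:
# only driving around the dog avoids harm.
plans = ["reverse_straight", "drive_around"]

def simulate(command, plan):
    return {"causes_harm": plan != "drive_around"}

print(choose_action("back_up", plans, simulate))  # -> drive_around
```

If `drive_around` were not among the available plans, `choose_action` would return `None`, which matches the excerpt's point that the car would have to refuse to drive at all.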
8 replies, 1000 views
Why robots need the ability to say 'no' (Original Post), LuckyTheDog, Apr 2016
Binkie The Clown (7,911 posts)
1. Asimov's three laws.
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Obviously those laws need to be extended to include harming innocent animals.
snooper2 (30,151 posts)
4. Well, you forgot law #4, but it is classified
LuckyTheDog (6,837 posts)
7. Good advice for anyone, not just robots (nt)
csziggy (34,136 posts)
2. Program them with the Three Laws of Robotics
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
https://en.wikipedia.org/wiki/Three_Laws_of_Robotics
Don't any of today's scientists or scientific writers read any of the classic science fiction that considered many of the conundrums that science would lead to?
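The Three Laws quoted above form a strict priority order: a lower-numbered law always overrides a higher-numbered one. As an editorial aside, that ordering can be sketched in code. Everything here is illustrative; the field names are invented flags a robot's world model would hypothetically set, not any real API.

```python
# Illustrative sketch of Asimov's Three Laws as priority-ordered checks.
# 'action' is a dict of hypothetical boolean flags from a world model.

def permitted(action):
    """Return True if the action is allowed under the Three Laws."""
    # First Law: inviolable, checked before anything else.
    if action.get("injures_human") or action.get("allows_human_harm"):
        return False
    # Second Law: must obey orders, unless obeying breaks the First Law.
    if action.get("disobeys_order") and not action.get("order_conflicts_first_law"):
        return False
    # Third Law: self-preservation, lowest priority; yields to the other two.
    if action.get("destroys_self") and not (
        action.get("order_requires_it") or action.get("protects_human")
    ):
        return False
    return True

print(permitted({"injures_human": True}))   # -> False
print(permitted({"disobeys_order": True,
                 "order_conflicts_first_law": True}))  # -> True
```

The second call shows the Second Law's escape clause: disobeying is permitted when the order itself conflicts with the First Law, which is exactly the kind of refusal the original post argues for.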
Binkie The Clown (7,911 posts)
5. And they MUST be programmed not to vote Republican! n/t
csziggy (34,136 posts)
6. Or to found a religion
tkmorris (11,138 posts)
8. That's covered under Rule 1
Orrex (63,209 posts)
3. At a young age, they should be put in a pit with wild dogs.
They should be set to puzzle out from their proper clues the one of three doors that does not harbor wild lions. They should be made to run naked in the desert.