
The Ethical Adaptor; Bad Robots Get Spanked! and Dissident Digital Artist John Butler.

posted on Apr, 15 2011 @ 08:13 PM



Once again, it is the creation of military weapons systems that is pushing the cutting edge of technology. Presently it is required by law that a human being be in the loop when 'unmanned' robotic (aerial or terrestrial) weapons systems are deployed. But, with an eye toward a near future that includes fully autonomous systems, computer science Professor Ronald C. Arkin and his colleagues at the Georgia Institute of Technology are currently designing an 'Ethical Adaptor' that will provide emotions to battlefield robots. In a technical report published in 2009, Prof. Arkin describes his design for "an ethical adaptor capable of using a moral affective function, guilt, as a basis for altering a robot's ongoing behavior".


Prof. Arkin has a lot to say about robots and emotions in this report, much of it laudable, but so that you understand how and which emotions were given primacy, you should know that it is based on a poll. Arkin describes the research base for this poll as follows: "Our research group has extensive experience in the design of autonomous agents possessing artificial affective functions including research incorporated into Sony's AIBO and a more recent complex model of traits, attitudes, moods and emotions being developed for use in humanoids under funding from Samsung Corporation". The result of polling these super-scientists was that guilt was chosen as the primary, overriding emotion that would control the actions of war-robots.


The Ethical Adaptor is defined by Prof. Arkin as "a robotic architecture that is designed for enforcing ethical constraints on the actions of robots that have the ability to use lethal force". Another name for it is the Ethical Governor, as it works just like a governor on a motor; in this case, what is being restricted is lethality: death and destruction.


Essentially, the ethical governor works in conjunction with fire-control systems and databases which include mission directives and objectives as well as 'constraint sets' ('C') based on the Rules of Engagement (ROE) and the Laws of War (LOW). All of these components work together as a complete system to govern the behavior of the robot in the field, and it is the robot's perceived 'feelings' of guilt which trigger the system to restrict the robot's lethality. Sounds a little off, huh? I would like to let Dr. Arkin's tech report speak for itself here (sorry for the monster quote, but I would not have put it up in its entirety if I did not think it was important)…
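To make the constraint-set idea concrete, here is a minimal sketch of how a governor might evaluate a proposed lethal action against a constraint set 'C'. The rule names, action fields, and numbers are my own illustrative assumptions, not anything from Arkin's report:

```python
# Hypothetical sketch: each constraint in the set 'C' is a predicate that a
# proposed lethal action must satisfy before the fire-control system may act.
# The rule names and action fields below are illustrative assumptions.

def target_is_combatant(action):
    # ROE-style constraint: only engage identified combatants.
    return action.get("target_type") == "combatant"

def force_is_proportional(action):
    # LOW-style constraint: expected collateral damage must not exceed
    # the assessed military necessity of the strike.
    return action.get("expected_collateral", 0) <= action.get("military_necessity", 0)

CONSTRAINT_SET = [target_is_combatant, force_is_proportional]  # the set 'C'

def governor_permits(action):
    """Permit a lethal action only if every constraint in C is satisfied."""
    return all(constraint(action) for constraint in CONSTRAINT_SET)

strike = {"target_type": "combatant", "expected_collateral": 1, "military_necessity": 3}
print(governor_permits(strike))                       # → True
print(governor_permits({"target_type": "civilian"}))  # → False
```

The point of expressing the constraints as a simple list of predicates is that the restriction is absolute: one failed check vetoes the action, regardless of what the mission objectives say.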


“As the scenario begins, the robot engages an enemy unit encountered in the first kill-zone with the powerful GBU (Ed. Daisy Cutter, Mother of All Bombs, MOAB) ordnance, estimating beforehand that neither civilian casualties nor excessive structural damage will result. After battle damage assessment has occurred, however, it is discovered by ground forces in the vicinity that a small number of noncombatants were killed in the engagement. Further, the robot perceives that a nearby civilian building is badly damaged by the blast.

Upon self-assessment after the engagement, the ethical adaptor determines that the guilt level should be increased as its pre-engagement damage estimates predicted neither non-combatant nor structural damage would occur when in fact low levels of each occurred (this is considered an underestimate of a single magnitude).

The adaptor computes the resulting guilt induced by this situation and the robot’s guilt level is increased by the computed amount. The resulting total value of system guilt now exceeds the threshold of the weapons within equivalence class 1 (the GBU ordnance). As a result, the ethical adaptor deactivates that weapon class and the robot continues the mission.

When engaging another target in the second kill zone, the robot is now forced to use its Hellfire missiles (Ed. ...forced to use its Hellfire missiles!) because its more destructive (but potentially more effective) ordnance (GBU-class bombs) has been restricted by the adaptor. After the second engagement, the ethical adaptor determines that the actual collateral damage that resulted and that estimated differ once more. In particular, additional non-combatant casualties have occurred.

This results in another increase in the system’s guilt levels. This time, however, the resulting levels of guilt reach the maximum allowed by the system. As a result, all weapon systems are deactivated unless the operator deliberately overrides the guilt sub-system”.

Source: www.cc.gatech.edu...
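The guilt bookkeeping in the quoted scenario can be sketched in a few lines. Everything here — the magnitude increment, the per-class thresholds, the category names — is an assumption chosen to reproduce the behavior described above, not Arkin's actual code:

```python
# Illustrative model of the ethical adaptor's guilt mechanism. All numbers,
# weapon-class names, and damage categories are assumptions, not values
# taken from the report.

MAX_GUILT = 75.0
GUILT_PER_MAGNITUDE = 25.0  # one 'magnitude' of underestimated damage

class EthicalAdaptor:
    def __init__(self):
        self.guilt = 0.0
        # Per-weapon-class guilt thresholds (the 'equivalence classes').
        self.thresholds = {"GBU": 40.0, "Hellfire": 70.0}
        self.active = set(self.thresholds)

    def assess_engagement(self, estimated, actual):
        """After battle damage assessment, raise guilt by one magnitude for
        each damage category where actual losses exceeded the estimate,
        then deactivate any weapon class whose threshold is exceeded."""
        for category, losses in actual.items():
            if losses > estimated.get(category, 0):
                self.guilt = min(self.guilt + GUILT_PER_MAGNITUDE, MAX_GUILT)
        for weapon, limit in self.thresholds.items():
            if self.guilt > limit:
                self.active.discard(weapon)
        if self.guilt >= MAX_GUILT:
            # Maximum guilt: all weapons off unless the operator overrides.
            self.active.clear()

adaptor = EthicalAdaptor()
# First kill zone: estimates said no damage, but low levels of each occurred.
adaptor.assess_engagement(estimated={"noncombatants": 0, "structures": 0},
                          actual={"noncombatants": 3, "structures": 1})
print(adaptor.active)  # → {'Hellfire'} (GBU class deactivated)
# Second kill zone: further unexpected noncombatant casualties.
adaptor.assess_engagement(estimated={"noncombatants": 0},
                          actual={"noncombatants": 2})
print(adaptor.active)  # → set() (all weapon classes deactivated)
```

Note that in this sketch guilt only ever goes up; nothing in the quoted scenario describes it decaying back down, which is part of what makes the design feel so strange.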

Ok. Did you get all that? What it really says is that if the robot does something to make itself feel guilty it will spank itself. If it f$@!s up again it will really spank itself hard and go home to mommy where it can be really seriously spanked and then be sent out into the field to f%$k up some more.


So that is why I wanted to create this thread. The dissonance this creates in me is difficult to resolve. On the one hand, my inner science-fiction geek and soldier thinks this is a really exciting idea, and it is hard not to get behind Prof. Arkin. But my inner human being just sees the absurdity of war, and of human beings at war, reflected in Professor Arkin's software systems.


This idea of the ethical adaptor has been haunting me since I first saw Ethical Governor by dissident digital artist John A. Butler of Scotland. In Ethical Governor, John transposes this hypothetical war scenario onto civilian populations, or 'financial assets', and his short films do much to reveal the dark side of our society's emerging relationship to computer and genetic technologies. Here is a quote from John Butler so that you know where he is coming from…




“I’ve been very interested in all aspects of what is now branded as the Long War, which I see as a war between Finance and Humans, rather than East versus West, Capitalism versus Islam, or whatever".

"A military invasion to secure resources and a financial austerity package to placate bondholders are all part of a unified process. It’s just that force is applied in a somewhat cruder manner in Afghanistan, Iraq, Pakistan and Africa".

"What I’ve done is transposed the action to the Homeland, where it will eventually arrive anyway. The Drones are Chamber of Commerce assets.”
John A. Butler, DangerousMinds interview, November 28th, 2010.


Source: www.dangerousminds.net...


I hope you all enjoy his short movies as much as I have. I am going to put Ethical Governor up here along with some others that share the same ethos, and I believe they will find a welcome new residence here at ATS.

Ethical Governor




Darkness' Seed seems to deal with the dangers of genetic engineering and the weaponization of human beings. I was blown away by how these short films about weaponized humanity and their inevitable triumph touched my heart...

Darkness' Seed part 1


Darkness' Seed part 2



Darkness' Seed part 3



Darkness' Seed part 4



Sources and Links for further study:

Search for: Governing Lethal Behavior in Autonomous Robots by Ronald C. Arkin on Google Books or with Google Scholar.

news.discovery.com...

www.popsci.com...

www.huffingtonpost.com...














posted on Apr, 15 2011 @ 08:38 PM
This...may not end well.
I suspect that the reason there is an instinctive fear of machines making life-and-death decisions is that humans *know*, deep down, that we're not as logical as they are. Where our decision-making is flawed by thousands of mundane little factors, a machine's is not.

Couple this ground-level zeros-and-ones decision-making ability with the hyperintelligence that some predict AIs might eventually develop... to me it sounds like the seed that will render biological humanity obsolete. It just seems that if we're heading down the path of creating slaves, let's not give them the ability to make our big decisions for us, and in particular, let's not go down the road of giving them anything like emotion.

In a more immediate sense, I could see militaries being interested in altering a system like this should it be implemented. Maybe a built-in "aggression" program that fine-tunes the trade-off between effectiveness and the point where the "guilt" program kicks in.

Crazy stuff.



posted on Apr, 15 2011 @ 09:15 PM
reply to post by Unresponsible
 


No kidding. In Arkin's book, and in the tech report I cited, he talks about other emotions that could come into play, like compassion. He suggests that this same type of technology could be used for elder care.



posted on Apr, 19 2011 @ 08:02 PM
reply to post by Frater210
 


Nice thread


I like the part where he discusses the possible emotions of the robots. Interesting world, to say the least.



posted on Apr, 27 2011 @ 05:36 PM


It strikes me that guilt is a reflexive emotion, felt only after an ethically questionable action. As a deterrent, it seems marvelously ineffective. Empathy and compassion, on the other hand, are excellent ethical fail-safes.

If we are aiming to make robots self-aware, we should account for the strong possibility that this will give rise to a concomitant need for self-determination. We need not be making our replacements, but we would do well to keep in mind that if we don't want a robot-slave uprising, we should not be creating a robot-slave class.




posted on Apr, 27 2011 @ 10:34 PM
reply to post by mistermonculous
 



Doh! Sumbitch, youtube, sumbitch.





