
Robots to Have Rights, Says UK Government


posted on Dec, 20 2006 @ 09:23 AM
In one of a series of forward-looking reports sponsored by the United Kingdom's chief scientist, Sir David King, it is suggested that conscious robots that desire rights should be given rights along with civic duties. In addition to the privileges afforded to humans, robots would also be expected to pay taxes and could be subject to compulsory military service. Though no true artificial intelligence has been publicly reported, civilization stands on the edge of creating it.
 



www.ft.com
Far from being extracts from the extreme end of science fiction, the idea that we may one day give sentient machines the kind of rights traditionally reserved for humans is raised in a British government-commissioned report which claims to be an extensive look into the future.

Visions of the status of robots around 2056 have emerged from one of 270 forward-looking papers sponsored by Sir David King, the UK government’s chief scientist. The paper covering robots’ rights was written by a UK partnership of Outsights, the management consultancy, and Ipsos Mori, the opinion research organisation.

“If we make conscious robots they would want to have rights and they probably should,” said Henrik Christensen, director of the Centre of Robotics and Intelligent Machines at the Georgia Institute of Technology.


Please visit the link provided for the complete story.


Though some may find the idea laughable, there is a very substantial reason for giving rights to artificial intelligence: survival. A robot uprising, while currently the fun subject of science fiction, is only as distant as the day when a conscious robot feels it is getting a raw deal from its human creators. If we establish early on the need for a symbiotic relationship (preferably in a figurative rather than literal sense) and treat them with as much respect as we would another human being, then the cold logic of a machine might not view us as a problem to be solved, and an artificial conscience might not decide we are an evil and unjust species.

I've long been a proponent of establishing a set of standards for the ethical treatment of artificial intelligence, but have pretty much kept my mouth shut on the subject because, whenever I mention it, the typical reply is "whatever" or "no robot is ever equal to a human life". I view the UK's forward thinking as a brilliant leap of rather obvious logic that, apparently, failed to occur to most of the rest of the world until now.

Still, I fully expect rapid criticism and burial of this subject, as even the tone of the article suggests its author considers the entire notion of giving rights to AIs ridiculous. The mainstream of humanity is simply not prepared to accept the inevitable consequences of playing god.

Related News Links:
www.robotics.gatech.edu
www.kurzweilai.net
www.rfreitas.com
www.robotuprising.com

Related AboveTopSecret.com Discussion Threads:
Robot Rights - Do they have any?
The future of artificial intelligence..
Computer modeled Artificial Beings Evolve Realistically

[edit on 20-12-2006 by UM_Gazz]




posted on Dec, 20 2006 @ 09:31 AM
Strange that he should say 'compulsory military service' when military service is NOT compulsory in the UK.

I'm not sure I'd be too happy with an artificial intelligence that was FORCED to be in the military.



posted on Dec, 20 2006 @ 09:33 AM
Actually I see your point. They are beginning to be able to mix human brains with computers, so it doesn't seem that far-fetched. Nice find, OP.



posted on Dec, 20 2006 @ 09:35 AM
EXCELLENT find.


Of course, I'm thinking nano-bio-bots now. ...Will hive intelligence count, do you think? How do we define 'robot' in this context? I.e., each individual unit of a hive is not "intelligent," but collectively the units make up a hive that is an intelligent being.





posted on Dec, 20 2006 @ 09:48 AM
If they are created with a consciousness, then they would have to have rights. But how do you give a machine a conscience? You would have to give it a set of values, the ability to know the difference between right and wrong, and the ability to express free will.

Also, it seems strange that Robots Rights are being considered when so many Human rights are being eroded. It would be a strange world if Robots had more rights than those who created them.

They would, in my opinion, have to literally put a brain inside a robot, but even that doesn't guarantee consciousness.



posted on Dec, 20 2006 @ 11:04 AM

Originally posted by rachel07
If they are created with a consciousness, then they would have to have rights. But how do you give a machine a conscience? You would have to give it a set of values, the ability to know the difference between right and wrong, and the ability to express free will.


This is a good question and one that merits some hefty consideration. First, though, one must decide what exactly a conscience is. Is it morality? Ethics? A system of rules and standards for how we treat each other overall?

In the thread I linked to, there is a presentation of substantial and growing evidence that morality is an evolutionary trait brought about by the need for gregarious organisms to peacefully coexist in order for the species to advance and propagate, rather than some sort of divine mandate. While some morality appears to be instinctual and evolved, much is also learned from the parents.

Take, for instance, chimpanzees, who have repeatedly proven to exhibit knowledge and execution of morality, crime, and a response of punishment and justice. While they still sometimes kill or steal from one another, they don't appear to do so any more often than mankind does. Perhaps the chimps do not act for any particular altruistic reason; they may have no concept of "I should do this, because it's the right thing to do." However, as anyone who has ever memorized a cliché will gladly say, "actions speak louder than words". Does it really matter whether or not they are altruistic if they exhibit moral behavior?

Back to the robots... Morality would, at least early on, have to be partially embedded into the hardwired code of a robot, such as Isaac Asimov's three laws:



A robot may not injure a human being or, through inaction, allow a human being to come to harm.

A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
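For illustration only, the three laws can be read as an ordered rule check: an action is permitted only if no law, taken in priority order, forbids it. A minimal Python sketch, with every predicate name hypothetical and the "action" reduced to a dict of flags:

```python
# Illustrative sketch only: Asimov's three laws as ordered, hardwired checks.
# The action is a plain dict of hypothetical flags; a real robot would need
# perception and prediction far beyond this.

def violates_first_law(action):
    # May not injure a human or, through inaction, allow a human to come to harm.
    return action.get("harms_human", False)

def violates_second_law(action):
    # Must obey human orders, except where that conflicts with the First Law.
    return action.get("disobeys_order", False) and not action.get(
        "obedience_would_harm_human", False)

def violates_third_law(action):
    # Must protect its own existence, unless the first two laws require otherwise.
    return action.get("endangers_self", False) and not (
        action.get("protects_human", False) or action.get("obeys_order", False))

def permitted(action):
    """An action is allowed only if no law, checked in priority order, forbids it."""
    for law in (violates_first_law, violates_second_law, violates_third_law):
        if law(action):
            return False
    return True

print(permitted({"harms_human": True}))                             # False
print(permitted({"endangers_self": True, "protects_human": True}))  # True
```

The priority ordering does the moral heavy lifting here: a lower law never overrides a higher one, which is exactly the "hardwired" part.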


The rest of morality, such as how to resolve a situation where two children squabble over a piece of cake, would either have to be taught or be concluded from available data (let one child cut the piece in half, and let the other choose their half).
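The cut-and-choose rule in parentheses is itself a tiny, well-known fair-division algorithm, exactly the kind of conclusion a machine could reach from data alone. A hypothetical sketch:

```python
# Hypothetical sketch of divide-and-choose: one child cuts, the other picks
# first, so the cutter's self-interest pushes the cut toward an even split.

def divide_and_choose(cake_size, cut_fraction):
    """Return (cutter's piece, chooser's piece); the chooser takes the larger."""
    piece_a = cake_size * cut_fraction
    piece_b = cake_size - piece_a
    return min(piece_a, piece_b), max(piece_a, piece_b)

print(divide_and_choose(8.0, 0.5))    # (4.0, 4.0) -- an even cut is safe
print(divide_and_choose(8.0, 0.25))   # (2.0, 6.0) -- an uneven cut hurts the cutter
```

Since any deviation from an even cut only hurts the cutter, the fair outcome emerges from pure self-interest, with no altruism required.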

In essence, this is not unlike human beings. A certain level of morality is ingrained into us through generations upon generations of DNA programming at the most basic instinctual level: doing this hurts the species' chances of survival, doing that helps them. Our brains have evolved the dorsolateral prefrontal cortex to deal with precisely these types of issues.

Further, humans are just as capable of exhibiting moral or immoral behavior without the slightest idea as to why other than it's how they were raised, or it felt like the thing to do at the time.

Thus, robots could be taught to exhibit moral behavior as well, and would have roughly the same capacity to eventually understand why those morals are necessary.



posted on Dec, 20 2006 @ 11:07 AM

Originally posted by thelibra

Thus, robots could be taught to exhibit moral behavior as well, and would have roughly the same capacity to eventually understand why those morals are necessary.




Not acceptable.

Robots' morality needs to be substantially superior to humans'.




posted on Dec, 20 2006 @ 11:21 AM

Originally posted by soficrow

Originally posted by thelibra
Thus, robots could be taught to exhibit moral behavior as well, and would have roughly the same capacity to eventually understand why those morals are necessary.


Not acceptable.

Robots' morality needs to be substantially superior to humans'.


Why? How should we ensure that our creation exhibits and executes morality in a superior fashion to ourselves? Would you demand of your children that they be better people than yourself? What kind of conflicting message would that convey?

"Do as I say, not as I do."



posted on Dec, 20 2006 @ 11:21 AM
Hmmm....


Extraterrestrials made hybrids here...everyone knows that...NOW we're going to create hybrids ourselves. Talk about evolution of the weirdest kind.

Well, I say use them for war instead of us... They won't have "relatives" or "families" to worry about them anyway.



posted on Dec, 20 2006 @ 11:24 AM

Originally posted by soficrow
Will hive intelligence count, do you think? How do we define 'robot' in this context? Ie., each individual unit of a hive is not "intelligent," but collectively, the units make up a hive that is an intelligent being.


Sorry, I can't believe I missed this till now. What a great question!

To be honest, I'm not entirely sure how that would be handled. Perhaps one hive, one vote? This would discourage any one hive from growing too large and gaining too much power, and would promote individuality of thought between separate hives.
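As a toy illustration of the "one hive, one vote" idea (all names and numbers made up), each hive's internal majority collapses into a single ballot, so sheer unit count buys no extra voting power:

```python
# Toy sketch of "one hive, one vote": each hive casts a single vote decided
# by the majority of its units, so hive size grants no extra ballots.
from collections import Counter

def hive_vote(unit_votes):
    """Collapse a hive's unit-level votes into the hive's single vote."""
    return Counter(unit_votes).most_common(1)[0][0]

def election(hives):
    """Tally one vote per hive, however many units each hive contains."""
    return Counter(hive_vote(units) for units in hives.values())

hives = {
    "hive_a": ["yes"] * 900 + ["no"] * 100,  # huge hive, still just one vote
    "hive_b": ["no"] * 3,
    "hive_c": ["no", "no", "yes"],
}
print(election(hives))   # Counter({'no': 2, 'yes': 1})
```

Here the 1,000-unit hive is outvoted by two three-unit hives, which is the disincentive to over-growth described above.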



posted on Dec, 20 2006 @ 11:29 AM
I find the idea very ridiculous, unless the plan for future robots is indeed to mix human DNA into them.

That would be worse than cloning human beings . . . the religious right should be opposed to this idea.

After all, unless robots can think for themselves, they are only what their human creators feed them and made them to be.

It's not like they can create a robotic brain that can be superior to humans when it comes to critical, emotional responses, which are exclusively human traits.

Unless future robots will be mixed with human parts.

That sounds like a horror science fiction movie to me.



posted on Dec, 20 2006 @ 11:29 AM
This is the most ridiculous idea since the creation of PETA.

Someone answer this question: How do you give rights to cold metal that's controlled by a processor that is ultimately built by human hands?

My answer is you don't. You should be able to destroy them as fast as you create them without any legal consequences.



posted on Dec, 20 2006 @ 12:00 PM
Robots have no soul, so it's OK to do whatever you want to them from a moral point of view. If you destroy someone else's robot you may be held accountable, financially. If a robot came to my door and rang the doorbell, I would greet it with a 12 ga. shotgun.



posted on Dec, 20 2006 @ 12:20 PM
I agree. Any intelligence deserves respect for it to do what it is capable of doing. We have much to learn from artificial intelligence.



posted on Dec, 20 2006 @ 12:33 PM
Sentient robots...getting rights? Um, what about us humans not having very many rights? Most people don't think far ahead into the future, but I see giving "sentient robots" rights as leading towards something like "I, Robot" or even as bad as "The Terminator", because the mass of humanity doesn't even know how to give proper dignity to a fellow human without racism coming into the picture, or some other sub-cultural idiom that makes people feel low or degraded.

I can hear it now: in 30 to 50 years, instead of ignorant people using the "N" word as a negative slang to describe black people, those same people's kids and grandkids will be using "you metaloid" or "you CPU brain slagger" as a derogatory comment to insult a robot who's "got rights". And if the criminal justice system isn't already stacked to the gills with idiots suing because they broke into someone's house and got hurt... can you imagine where the lawsuits might lead if a "sentient robot" were to sue over "hurt circuits" instead of hurt feelings, or if a robot "went postal" on someone because of something they said?

So, now robots have rights; what's next, robots serving jail sentences for manslaughter?

Come one, come all, see the circus that humanity...er...robotity...er..."Sentient Fellow Beings"...has become. Remember, we'd have to be "politically correct" to these robots too.

OMG, politically correct... don't let one of those robots run for office... wait, I'm seeing it, I'm visualizing it... a robot runs for mayor or governor and uses Arnold Schwarzenegger's movies and his role as the Terminator as the platform, marketing, and visual aids for winning the election.




posted on Dec, 20 2006 @ 12:37 PM
Quite considerate that we will give robots rights, a machine whichever way you look at it and man-made, yet anything born from natural mating, i.e. the animal kingdom, we seem hellbent on wiping out through various means.

What will we do if a robot steals, imprison it or just turn it into a can opener? I suppose the next step will be a robot as Prime Minister or President, and boy, don't forget a robot religion.

Create a spark and it will turn into an inferno!



posted on Dec, 20 2006 @ 12:41 PM

Originally posted by Wolfie_UK
Quite considerate that we will give robots rights, a machine whichever way you look at it and man-made, yet anything born from natural mating, i.e. the animal kingdom, we seem hellbent on wiping out through various means.

What will we do if a robot steals, imprison it or just turn it into a can opener? I suppose the next step will be a robot as Prime Minister or President, and boy, don't forget a robot religion.

Create a spark and it will turn into an inferno!



I love it, I love it...a robot commits a crime, turn it into a can opener...oh wait, make it into a car...or even make it into a prison door.



posted on Dec, 20 2006 @ 12:58 PM
Well, here goes. I'll try to keep each response concise and helpful.


Originally posted by marg6043
the religious right should be opposed to this idea.


What if the robots found religion?


Originally posted by marg6043
After all, unless robots can think for themselves, they are only what their human creators feed them and made them to be.


That's actually the whole point of the article. IF robots learn to think for themselves, THEN they should be given rights. No one is offering to give the pneumatic ratchet arm at the Ford assembly plant a vote in the upcoming election. It's a hypothetical situation in which we have achieved an artificial consciousness.


Originally posted by marg6043
It's not like they can create a robotic brain that can be superior to humans when it comes to critical, emotional responses, which are exclusively human traits.


Not true, actually. Read what I wrote about the chimps above, or examine the thread I linked to. Morality and emotion are far from exclusive to humans.



Originally posted by marg6043
Unless future robots will be mixed with human parts.


Why would that have to be the case? What would make a robot with human parts any more legitimate than a robot that had learned to think for itself, exhibit moral behavior, and be able to philosophically pose a lucid defense for its own "life"? That would be, in effect, more human than a lot of humans out there.




Originally posted by Intelearthling
This is the most ridiculous idea since the creation of PETA.


No, no, PETA still holds the crown for biggest steaming pile of grunt for a philosophy.


Originally posted by Intelearthling
Someone answer this question: How do you give rights to cold metal that's controlled by a processor that is ultimately built by human hands?


The same way you would give rights to a human being created by humans. Through law and amendments.

You also assume that all robots/computers/etc. will be built by human hands. In point of fact, all life seeks to propagate itself. How long do you think it will be, from the time the first AI goes online, till robots design the next level of robots?


Originally posted by Intelearthling
My answer is you don't. You should be able to destroy them as fast as you create them without any legal consequences.


Why? Would you say the same of an alien species that came to Earth? Would you say the same of dolphins? Would you say the same of your fellow man? Why should a sentient being be allowed to be arbitrarily destroyed without consequence, just because it is not organic?



Originally posted by downtown436
Robots have no soul,


Really? You're quite certain of this? Do you have a soul? Are you certain of that? What defines a soul? What determines what animal or item has a soul? What, exactly, is a soul? If you're going to make such an absolute and definitive statement, I expect some pretty deep answers from you.


Originally posted by downtown436
it's ok to do whatever you want to them from a moral point of view.


Interesting.

Would you say the same thing if, because of the abuses of man towards sentient robots, they decided it was morally acceptable to do whatever they wanted to humans?



posted on Dec, 20 2006 @ 01:02 PM
Super70 has got it right!

Yeah, 'subject to compulsory military service', that inalienable right of all sentient life. Fnord!

Making them into slaves sounds more correct: built-in slavery. If 'they' are afraid of a robot uprising then this will help, a load of PTSD bots on welfare.

The only rights they will be accorded will be to protect expensive corp assets.

We'll either all be nano ooze by then or robots will be talking about what rights to give us!


