
Killer robots a small step away and must be outlawed, says top UN official


posted on Aug, 28 2014 @ 01:51 AM
Yet another example of how quickly this space is evolving; there are going to be many debates ahead about robots and how much we should allow machines to think and learn.

Killer robots programmed to open fire without human control are just a “small step” from the battlefield and military powers should agree to outlaw them, a top United Nations official has said.

Angela Kane, the UN’s high representative for disarmament, said governments should be more open about programmes to develop the technology and she favoured a pre-emptive ban before it was too late.

She said: “Any weapon of war is terrible, and if you can launch this without human intervention, I think it’s even worse. It compounds the problem and dehumanises it in a way.

“It becomes a faceless war and I think that’s really terrible and so to my mind I think it should be outlawed. The decision is really in the hands of the states who have the capability to develop them."

Ms Kane said there was “a great deal of concern” about the prospect of killer robots being developed that would commit war crimes on the battlefield.

She's correct. This is a small step away. Machine learning is advancing rapidly, and people have to realize that machine intelligence will not be exactly like human intelligence. Over time it will actually be more efficient.

Here's what an expert said in this article:

Huw Williams, Unmanned Systems Editor at IHS Jane’s International Defence Review, said he knew of no programmes to make killer robots and even the most advanced machines had little ability to act on their own.

He said: “Autonomy at the moment is quite limited. You set the task parameters, give it way points and tell it to go and do x, y and z. Then the machine decides: 'How do I get from x to y to z in the most efficient manner and get things done?' But in terms of real thinking, real autonomy, then no."

First, I think the debate is great, and it's something we need to start thinking about. The phrase he used is "real thinking," and this again goes to my point. Some people think that if you program information for a robot to use, then it's not really thinking or learning. Of course it is; we program information into our children from the first day of school to the 12th grade. He said the MACHINE DECIDES how to get from x to y to z in the most efficient manner. This is machine learning, and it doesn't matter that the information was programmed into the machine. Over time, if you put a human and a robot against each other on a task, the robot will find the most efficient way to carry it out much faster than the human will.
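
The "machine decides how to get from x to y to z most efficiently" step that Williams describes is really just a small search problem. Here's a toy sketch in Python (purely illustrative, not based on any real system: the function name, the coordinates, and the brute-force search are all my own invention) of a machine "deciding" the most efficient order to visit fixed waypoints:

```python
# Toy illustration of the "machine decides" step: given fixed waypoints,
# the program searches every visiting order and keeps the one with the
# shortest total travel distance. A tiny planning task, not "real autonomy".
from itertools import permutations
from math import dist

def best_route(start, waypoints):
    """Return (order, cost) for the cheapest order in which to visit all waypoints."""
    best_order, best_cost = None, float("inf")
    for order in permutations(waypoints):
        cost, pos = 0.0, start
        for point in order:
            cost += dist(pos, point)  # straight-line distance for each leg
            pos = point
        if cost < best_cost:
            best_order, best_cost = order, cost
    return best_order, best_cost

order, cost = best_route((0, 0), [(5, 0), (5, 5), (0, 5)])
print(order, round(cost, 2))
```

The machine "decides" nothing beyond exhaustively comparing routes the programmer told it how to score — which is exactly the limited autonomy Williams is talking about.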

I'm not against allowing robots to fight on the battlefield if it will save human lives. I think you can give a robot a specific target and through voice and face recognition it could take out that target. These robots could never be equipped with the best machine learning programs though. You want their missions to be strict and limited.

posted on Aug, 28 2014 @ 02:07 AM
"These robots could never be equipped with the best machine learning programs though. You want their missions to be strict and limited."


posted on Aug, 28 2014 @ 02:20 AM
a reply to: neoholographic

I believe that in matters of war, if something is not worth risking one's life to do, then it should not be done at all.

Perhaps I am skewed in the head (it would not be the first time!) but it seems to me that without the risk posed to those looking to get something done, the achievement is worth less than it would be if achieved in the traditional way.

For example, there was an officer in the Second World War who was famed for saying that any man going onto the battlefield without a sword was improperly dressed. His name was Lieutenant Colonel John Churchill, or Mad Jack as he was known to his subordinates. He captured 42 Nazi soldiers, USING A BROADSWORD! He would storm beaches with a zweihander and an English Longbow, from which he would launch Native American arrows, and was the only person in modern military history to score kills with a bow.

This man may have been eccentric, but he was a badass because he risked it all, using antiquated weapons and tactics which totally flummoxed his enemies and put the fear of God into them. The fellow was Rambo in Harris tweed for goodness sake!

And much like the way you do not get good customer service from an automatic checkout machine, good soldiering and inspiring leadership are not programmable. You can tinker with the numbers all you like, but all you end up with is a gun carriage. You might argue that training is programming, but I would argue that each soldier trained and issued a weapon adds what training they receive to their pre-existing understanding of morality and ethics, their own courage and personality, things with which they are already equipped when they join up.

Robots have no redeeming feature in that regard, and I believe we would be better off either taking care of business ourselves, or leaving jobs undone, than to allow some death machine to do our labours for us.

posted on Aug, 28 2014 @ 02:47 AM
a reply to: TrueBrit

I completely agree. Especially with your first sentence. That's one reason I don't really care for drones that fly around and make it possible to blow stuff up and kill people with the push of a button.

Taking someone's life should always involve the human element, present in that time and place. When we remove that element, I think we will become more animalistic. It is easier to convince yourself that you aren't really hurting or killing people if you aren't there in the flesh (at least for some). Life is precious, and ending it should take more thought than a robot would ever give it, IMO.

As for fighting for other things and using robots? I don't really agree with that facet of it either. Anything worth having is worth fighting for. If you do not earn it or put yourself at stake for it, the appreciation diminishes, if it doesn't disappear altogether. One may never weigh whether something is truly worth the fight if one doesn't have to do the fighting oneself. That would likely end up costing more lives, because decisions could be made on the careless whim of whatever a person wanted at that moment, even if it was virtually worthless.

Of course these are just my opinions and I am fully aware that there are many that disagree vehemently. Any time life is on the line, the choice should not be made by something that places no value on what it can't possibly understand.


posted on Aug, 28 2014 @ 03:37 AM
a reply to: Kangaruex4Ewe

You bring up another very good point Kanga...

The value placed upon a human life will change the moment battlefield robotics come into major play on the world stage. Not only will lives taken by robots be taken without consideration for the seriousness of that act and its import, but by degrees, lives lost this way will come to seem commonplace, and eventually it will not just be the robots who forget to value them.
