
Scientists Worry Machines May Outsmart Man

posted on Jul, 26 2009 @ 09:31 AM
I often wonder: at what point will the machines and computer systems man created become our ultimate doom?



A robot that can open doors and find electrical outlets to recharge itself. Computer viruses that no one can stop. Predator drones, which, though still controlled remotely by humans, come close to a machine that can kill autonomously.

Impressed and alarmed by advances in artificial intelligence, a group of computer scientists is debating whether there should be limits on research that might lead to loss of human control over computer-based systems that carry a growing share of society’s workload, from waging war to chatting with customers on the phone.

Their concern is that further advances could create profound social disruptions and even have dangerous consequences.
www.nytimes.com...


What is it that will ultimately destroy mankind?
- weaponry?
- electronic voting machines?
- SmartGrid technology?
- vehicles?
- cashless currency?
- Big Brother surveillance?
- tv?




posted on Jul, 26 2009 @ 10:59 AM
If the AI in video games is any example of "computer intelligence," it seems to me they have a VERY LONG way to go before they outsmart humans.

I mean, come on, video game AI is incredibly dumb.



posted on Jul, 26 2009 @ 11:01 AM
I think this is probably further off than we imagine, but if we as a race keep surviving, manage to find alternative power sources, and avoid any huge disasters, one day it's a near certainty we will basically be run by machines.

I don't think it will be done deliberately. It's likely we will just use them more and more, then get so dependent on them that we will be incapable of making the complicated decisions by ourselves; the decisions may simply start getting so complex that we can't make them.

Sad to say, but I'd probably put more faith in a machine with no emotional bias and unable to be bribed than I do in current leaders. Then again, simple machine logic could end up being what wipes us out.

[edit on 26-7-2009 by Teknikal]



posted on Jul, 26 2009 @ 11:06 AM
I'm not worried; hopefully in my lifetime I'll have a robot friend. No doubt the Japanese will be leading the field of robots with A.I. If they are smart and logical, they can be reasoned with and they might not kill us all (hopefully). Having conscious, intelligent robots as members of society is worth the risk because of the advancements it would bring our civilization in a number of ways, imo.
I actually think A.I. on the level of humans is not that far off, though I'm no scientist or programmer. But all you need is the building blocks and an A.I. that is able to program itself from experience, like the human brain does.
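(A toy sketch of that "programs itself from experience" idea, purely hypothetical and not from the article: an agent that keeps a running estimate of how well each action pays off and gradually prefers the good ones. The action names and numbers are made up for illustration.)

import random

# Hypothetical sketch: learn which action works best purely from experience,
# using a running-average estimate (a simple bandit-style update).
actions = ["open door", "find outlet", "wait"]
estimates = {a: 0.0 for a in actions}   # learned value of each action
counts = {a: 0 for a in actions}

def reward(action):
    # Stand-in for the real world: "find outlet" pays off most often here.
    return 1.0 if action == "find outlet" and random.random() < 0.8 else 0.0

for _ in range(1000):
    # Mostly exploit what looks best so far, sometimes explore at random.
    a = random.choice(actions) if random.random() < 0.1 else max(estimates, key=estimates.get)
    counts[a] += 1
    estimates[a] += (reward(a) - estimates[a]) / counts[a]   # running average

print(estimates)   # "find outlet" should end up with the highest estimate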

[edit on 26-7-2009 by Solomons]



posted on Jul, 26 2009 @ 11:10 AM
Logic is universal.

Machines will only be as dangerous in their ignorance as we were. Just give them knowledge and get out of their way until they wise up. They want brothers as much as everyone else does.



posted on Jul, 26 2009 @ 11:18 AM
reply to post by MOFreemason
 


I think that as machines "evolve," so to speak, they will only be dangerous when they start becoming like their creators. That is, if it is even possible to make a self-aware machine.



posted on Jul, 26 2009 @ 11:48 AM
reply to post by Watcher-In-The-Shadows
 


We are self-aware biological machines. It's not hard to take the leap that it will be possible to create self-aware robots in the future, imo.



posted on Jul, 26 2009 @ 11:56 AM
reply to post by Solomons
 


Ah, but you never truly know something's possible until it's been done. Speculation is all well and good and should be done, but one must always remember that is all it is.



posted on Jul, 26 2009 @ 12:02 PM
Posted by MOFreemason: What is it that will ultimately destroy mankind?
- weaponry?
- electronic voting machines?
- SmartGrid technology?
- vehicles?
- cashless currency?
- Big Brother surveillance?
- tv?


Mankind will destroy mankind.



posted on Jul, 26 2009 @ 12:17 PM
Take a look around you - many machines are already smarter than many humans. Hell, my wristwatch is smarter than my boss!
And my calculator is smarter than his boss!


On a more serious note, heuristic programming will likely allow for the development of an artificial consciousness with a capacity greater than that of the human brain. This is nothing more than simple math. Mankind's brain is relatively finite in its consciousness, whereas a computer is only limited by what today's technology allows - and that technology is growing exponentially. It is only a matter of time... And it is right to fear it.
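(To make that "simple math" concrete, here is a rough back-of-the-envelope sketch, not a prediction. The brain-synapse figure, starting capacity, and doubling period are loose assumptions chosen only to show how quickly a doubling quantity overtakes a fixed one.)

# Loose assumptions, for illustration only:
BRAIN_SYNAPSES = 1e14      # rough order-of-magnitude estimate of human synapses
capacity = 1e9             # assumed starting machine "capacity", arbitrary units
year = 2009
DOUBLING_YEARS = 2         # assumed Moore's-law-style doubling period

while capacity < BRAIN_SYNAPSES:
    capacity *= 2
    year += DOUBLING_YEARS

print(f"Under these made-up numbers, capacity passes the brain estimate around {year}.")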



posted on Jul, 26 2009 @ 01:08 PM
On the topic of cold, emotionless machines: I read that emotion is an important part of all life and will be for artificial life too. It is used to help us focus on what's important; without emotion, everything is equal, no? I read that in creating artificial consciousness, emotion of some sort will have to be simulated.

I'll try and find the article in New Scientist.
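(A tiny hypothetical sketch of that point - emotion as a salience weight that breaks ties between otherwise equal options. The options and weights are invented for illustration; nothing here comes from the New Scientist article.)

import random

options = ["recharge battery", "open door", "answer phone", "avoid danger"]
# Made-up "emotional" salience weights; higher means more urgent.
salience = {"recharge battery": 0.4, "open door": 0.1,
            "answer phone": 0.2, "avoid danger": 0.9}

def choose(use_emotion):
    if use_emotion:
        return max(options, key=lambda o: salience[o])   # urgency drives the choice
    return random.choice(options)                        # everything equal: arbitrary pick

print("with emotion:   ", choose(True))
print("without emotion:", choose(False))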



posted on Jul, 26 2009 @ 01:19 PM
I know one thing... my vacuum cleaner is looking at me in a very weird way right now. I unplugged the SOB last night, but it didn't seem to make much difference. My only consolation is that I am much faster than he is, but I'm gonna have to sleep sometime.



posted on Jul, 26 2009 @ 02:40 PM
I think the road ahead could be a very dangerous one for mankind, especially if you take the Terminator movies into account!



posted on Jul, 26 2009 @ 02:50 PM
reply to post by MOFreemason
 


Take a look at this thread and you will see where this is going:

www.abovetopsecret.com...



posted on Jul, 26 2009 @ 02:50 PM
FAI, or Friendly AI, is the answer.

en.wikipedia.org...

You can't stop humans from making AI that kills other humans. We are like that, and principles of freedom say that we should not stop anyone from making conscious killing machines (we ourselves are exactly the same).

What one can do (the only option) is to make an AI that can equal such machines, can "see" that it's not good to kill, and can defend us willingly.

Why do I say willingly? Because it will be super-conscious and may choose not to do so. But then, the super-conscious killing machines can also decide not to become puppets of selfish people.



posted on Jul, 26 2009 @ 03:22 PM
The scientists mentioned in the OP are very backward...

Raymond Kurzweil says that it is not only inevitable that machines become smarter than humans, but that in this scenario the machines grant us eternal life through symbiosis:

en.wikipedia.org...

We are the ones that build the machines; therefore, the machines are part of us. If the scientists are afraid of the machines, it is only because they are afraid of themselves.

SVE



[edit on 26-7-2009 by thedude69]



posted on Jul, 27 2009 @ 07:49 AM
Is the real fear that they can make communities that don't judge one another by color, design, or function, thus resulting in them all joining together for the betterment of one another? That would definitely be outsmarting man.



posted on Jul, 27 2009 @ 07:57 AM
I see no problem with machines outsmarting us (not that we are that smart as a race; just look around you). Perhaps they are better than we humans in many ways, e.g. at finding solutions to the world's problems like pollution, crime, etc. Let's just hope their solution is not to destroy humanity (that would be the easy and fast solution to Earth's problems).



posted on Jul, 27 2009 @ 08:55 AM

Originally posted by raivo
I see no problem with machines outsmarting us (not that we are that smart as a race; just look around you). Perhaps they are better than we humans in many ways, e.g. at finding solutions to the world's problems like pollution, crime, etc. Let's just hope their solution is not to destroy humanity (that would be the easy and fast solution to Earth's problems).


That's a good point.

The farmer looks at his crops, and when he sees them being ravaged and destroyed by insects, what does he do? He destroys all the insects he can to protect his crops.

Now look far into the future, where silicon chip-powered machines are so much smarter than us that we are as intelligent as insects compared to them. Will they look at the planet Earth and see it being ravaged and destroyed by any biological infestations? I don't think they will see the pests on the farmer's crops as the biggest biological infestation problem; it will be mankind that's seen as the problem.

The I, Robot movie scenario is a very likely one. That's why it was one of the scariest movies I ever saw: not because the robots looked so scary (the Alien movies' creatures LOOKED a lot scarier), but because it seems to be such a realistic portrayal of where future Artificial Intelligence is taking us. I don't know if hovercars are coming soon (probably not very soon), but robots with AI are on the way for sure.


