
The Machines


posted on Aug, 16 2003 @ 05:12 AM
Over the years, computers have become more and more advanced, able to do more tasks and to communicate even more. But is the technology going too far?

Is it possible that machines could get too advanced for us and turn on humans? This could happen if machines gain the ability to think. In 1997, a computer called "Deep Blue" was able to beat the best chess player in the world. Deep Blue could search through thousands of moves per second and choose the best one. Now, Deep Blue isn't going to plan to take over the world, but a machine designed to be under the control of the army could turn on its masters. If a machine understands that it is more powerful than humans, it may act on that power.
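The kind of look-ahead search described above can be sketched with minimax, the basic algorithm chess programs are built on (Deep Blue's actual search was vastly deeper and heavily optimized; the tiny game tree below is a made-up example, not chess):

```python
# Minimax sketch: pick the move with the best guaranteed outcome,
# assuming the opponent always replies with their best move.

def minimax(node, maximizing):
    """Return the best achievable score from this position."""
    if isinstance(node, int):        # leaf: an evaluated position
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Toy tree: three candidate moves, each leading to two opponent replies.
tree = [[3, 5], [2, 9], [0, 1]]

# The opponent minimizes each branch, so the best guaranteed score is
# max(min(3, 5), min(2, 9), min(0, 1)) = 3.
best = max(minimax(child, False) for child in tree)
print(best)  # prints 3
```

The same idea scales up: a real engine evaluates millions of positions this way and simply plays the branch with the highest guaranteed score, with no "thinking" beyond the search.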

Films like T3 show how machines could get too advanced. A computer system called "SKYNET" was designed to destroy computer viruses, and it would take control of other computer systems to find them. But SKYNET took control of the nuclear weapons and started a nuclear war. In current times this is highly unlikely and will not happen, but if we ever get the technology to do this, you never know.

The key thing is for us to know when machines have got too advanced. Machines are now being made to think. IBM developed a computer which could respond to human speech and was able to think and respond to commands. This could be the first step to our downfall. If IBM is developing thinking machines, who knows what the Pentagon is developing. There could be an army of thinking machines out there which we don't know about, and they could be discussing among themselves how to take control over humans.

I believe that at this current stage, a takeover by computers is highly unlikely. We should let films, books and science fiction be warning signs for us about machines. We need to stay in control of them, not them over us. If we keep developing thinking machines, humans may not remain in control of this planet.

-infinite-

[Edited on 16-8-2003 by infinite]



posted on Aug, 20 2003 @ 02:52 AM
i think it's a real possibility.
we've already become subservient to systems. responsibility has shifted from people to incorporeal entities which exist only on paper (companies, institutions). our children have strict guidelines imposed on their learning.
i know my definition of machine is broad, but i see these things as analogs of each other.
they are designing fridges that are hooked to the internet, so you never have to think about ordering food or groceries. the fridge does it for you. GPS systems take away the need to know where you are and how you got there. furnaces automatically prevent you from freezing.
we gradually give away our powers of survival over to "the machine".
there was a movie in the seventies, "the demon seed", which was about an automated house gone malicious.
machines will do what we program them to do. the more complicated the programming, the more you're playing with "the monkey's paw". be careful what you wish for, yeah?



posted on Aug, 20 2003 @ 02:54 AM
There is one small factor not mentioned, the ultimate flaw in machines. They are made by humans...



posted on Aug, 20 2003 @ 03:02 AM

Originally posted by Thorfinn Skullsplitter
There is one small factor not mentioned, the ultimate flaw in machines. They are made by humans...


less and less. more and more machines are made by machines.



posted on Aug, 20 2003 @ 03:11 AM
It had to start somewhere. I am sure we have left our legacy on machines, no matter how advanced they become or whether they become completely independent.

Just like if aliens created humans (which I think is a good explanation). Obviously we have some major flaws that have existed for hundreds of thousands of years; apparently our creator(s) left their mark as well...

[Edited on 20-8-2003 by Thorfinn Skullsplitter]



posted on Aug, 21 2003 @ 06:25 PM
No matter how "intelligent" a machine is, it's still a machine. We just program it to know things or do things; it's never really working on its own. If we ever made Artificial Intelligence, there would be safeguards and protocols that prevent it from going out of control.



posted on Aug, 21 2003 @ 06:39 PM

Originally posted by billybob
there was a movie in the seventies, "the demon seed", which was about an automated house gone malicious.


I have not seen the movie, but I did read the novel on which it is based, by Dean Koontz. He updated it significantly for the '90s, I believe. Not a very good book IMHO, but an interesting read for the man vs. machine fan.



posted on Aug, 22 2003 @ 12:11 PM
Yeah, I reckon like you, Infinite, that machines will some day dominate us; it is our destiny to hand over the world to them.



posted on Aug, 22 2003 @ 12:19 PM
this is the thing though: computers cannot really think, they process lines of code. for a computer to think it would have to be programming itself...

now, the sheer speed of some of the computers nowadays could mean that a realtime thinking computer in 50 years or so could, within a few seconds, have the real intelligence of a young adult..


i mean, imagine AI reading all recorded history. all humans really do is kill, oppress and destroy...

i honestly, truly think AI would almost instantaneously perceive humans to be a threat to its existence...

it doesn't feel pain, it doesn't have morality; killing would not be a problem for an electronic entity



posted on Aug, 22 2003 @ 12:21 PM
i think though, that true AI is impossible with silicon-based tech; only a biological computer could become self-aware....



posted on Aug, 22 2003 @ 12:22 PM
Yeah, but who knows what the future holds? Computers could function from a human, I mean a cyborg, just like in that movie Virus.



posted on Aug, 22 2003 @ 12:32 PM
Indeed, a scary thought...we must put our faith in whoever designs such machines, that they have built-in protocols and self-destruct mechanisms....(q.v. Blade Runner androids)...

Personally, I don't care how advanced we think we are, I have no need for my fridge to be hooked up to the internet...hehe....



posted on Aug, 22 2003 @ 12:34 PM

Originally posted by Gazrok
Indeed, a scary thought...we must put our faith in whoever designs such machines, that they have built-in protocols and self-destruct mechanisms....(q.v. Blade Runner androids)...

Personally, I don't care how advanced we think we are, I have no need for my fridge to be hooked up to the internet...hehe....



"Fridge hooked up to internet" I like that!



posted on Aug, 28 2003 @ 03:53 PM
It'll never happen. We already have machines that can learn, but machines that can think? Never. Whether or not you believe in God and the human soul, you have to admit that there is an element separating us from the animals, which exist by instinct. Or as I like to think of them, as machines. They do not ask questions, but merely respond to chemical impulses in their brains. Also, this may be a flawed analogy, but think about the question of whether "God can create a rock that he can't lift." (as seen in an earlier discussion) We can't create something equal to ourselves. It goes against the laws of cause and effect.
If you think that we humans can "create" reason, by making thinking computers, then shoot me down. I'm interested to hear what you have to say.



posted on Aug, 28 2003 @ 07:49 PM
I think nothing is impossible. I don't think "real" AI is achievable, but someday machines will be so complex, with functions letting them literally adapt themselves (like some computer viruses already do), that they may present a threat to humanity. Probably not an important one; maybe just some wrong line of code that will make some robot go berserk and kill a couple of humans, or cause some big disaster.

But I don't think we have to fear anything as big as what's presented in some sci-fi scenarios by really imaginative authors. But again, hey, we never know. With nano-technology, and the introduction of bio-technology into machines, where will all this stop? And what will we achieve in the end? Will "true" AI be possible?



posted on Aug, 29 2003 @ 12:13 PM
I don't think machines could ever take over, as long as there is a human at the switch to turn them off when they become unruly.



posted on Aug, 29 2003 @ 12:19 PM
If we ever create a highly intelligent AI, it will probably be the downfall of humans. A highly intelligent AI would be able to do many tasks and would have thinking ability like humans. Let's hope the AI doesn't plan to take over the world or something.



posted on Aug, 29 2003 @ 12:35 PM
The flaw with AI is the first word: 'artificial'. Artificial intelligence is by definition flawed and will have a back door, or a way that it can be shut down or control regained over it. I agree with the point that for any AI to be viable, it will have to be a biological computer and not a silicon-based form. AI will never gain control over humans, because our greatest strength lies in something they can never truly have: independent thought and feelings. AI can never output anything that is not input into it (even if they are learning machines that self-input). We, on the other hand, are capable of original thought. This will allow us to always have the upper hand.



