The Matrix - our future??

posted on Oct, 19 2003 @ 06:22 AM
(This was taken off the official Matrix website. I found it very interesting; it may be long, but it's worth the read.)


The Matrix - our future?? by Kevin Warwick


Is The Matrix merely a science fiction scenario, or is it, rather, a philosophical exercise? Alternatively, is it a realistic possible future world? The number of respected scientists predicting the advent of intelligent machines is growing exponentially. Stephen Hawking, perhaps the most highly regarded theoretical scientist in the world and the holder of the Cambridge University chair that once belonged to Isaac Newton, said recently, "In contrast with our intellect, computers double their performance every 18 months. So the danger is real that they could develop intelligence and take over the world." He added, "We must develop as quickly as possible technologies that make possible a direct connection between brain and computer, so that artificial brains contribute to human intelligence rather than opposing it." The important message to take from this is that the danger that we will see machines with an intellect that outperforms that of humans is real.
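It's worth doing the arithmetic on that "doubling every 18 months" figure. A quick sketch (purely illustrative, using only the doubling period quoted above) shows how fast such growth compounds:

```python
# Back-of-envelope: relative performance if computers really double
# every 18 months, as the Hawking quote above claims.
def performance_multiplier(years: float, doubling_months: float = 18.0) -> float:
    """Relative performance after `years`, given a fixed doubling period."""
    return 2.0 ** (years * 12.0 / doubling_months)

for years in (3, 9, 18):
    # 3 years -> 4x, 9 years -> 64x, 18 years -> 4,096x
    print(f"after {years:2d} years: ~{performance_multiplier(years):,.0f}x")
```

Whether raw speed ever translates into intellect is, of course, the whole debate in this thread.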

Full theory from site


[Edited on 19-10-2003 by infinite]





posted on Oct, 19 2003 @ 07:02 AM
True AI is the biggest threat mankind faces...

As soon as we make self-aware AI, we can guess it becomes machine vs. human...

Imagine if society is dependent on machines that are wired to the net...

An AI could hack into those machines and control them, the way people now control a webcam from 1000 miles away...

An AI computer could do that with aircraft, tanks, anything that is remotely controlled...

It would be like 9/11, a million-fold... Every nuke could be used; everything that can be controlled by remote would and could be used against us...

You're guaranteed planes would be crashing into all the cities...

It would be a cross between The Matrix and Terminator...

But the good news is... no scientist is stupid enough to create true AI, and silicon-based tech makes it almost impossible...

So get worried when they start connecting PCs to nervous systems...



posted on Oct, 19 2003 @ 07:06 AM
Anybody who has an interest in robots and the onset of intelligent robots should watch:

Blade Runner
The Matrix... and The Animatrix
the Terminator series
A.I.

and manga such as:

Ghost in the Shell

Etc., etc. Anything to do with intelligent robots usually has some underlying message about morality and robots.



posted on Oct, 19 2003 @ 07:39 AM
Even if, in the future, computers and machines might be able to kill us on purpose, they need us too much to kill us! Unless they create robots that can work in mines, etc., they will need us for metal, and also out on the oil platforms. I think maybe we will live side by side with them, but not that they will take over the world!

I hope not!!



posted on Oct, 19 2003 @ 07:51 AM
We just have to know where to draw the line when it comes to how powerful machines become.



posted on Oct, 19 2003 @ 08:23 AM
Speaking of robots ... Shouldn't hmmm be in here?


What if we are already in the Matrix, though? Or something similar... How could we tell whether we are or aren't???

[Edited on 19-10-2003 by e-nonymous]



posted on Oct, 19 2003 @ 08:52 AM
Just like to say sorry to the admins for copying and pasting directly from a website. I couldn't find the direct link to this, so I had to copy and paste. Sorry.



posted on Oct, 19 2003 @ 12:28 PM
Okay... since I build robots occasionally and I do program computers, I've got to ask:

WHY would I want to build a machine with no safeguards? That'd be like building a car with no brakes.

And machines, frankly, can't be programmed for creativity. They can 'create' stuff... they just can't tell when it's good/appropriate/esthetic.



posted on Oct, 19 2003 @ 12:32 PM
The idea of linking machines to the mind is not a solution. It is easy for Hawking to see it as a solution, considering his own predicament, but I don't see it so clearly.

Machines do need us, because while they can exist on their own and create their own slaves and society, they are not capable of creativity and thus cannot evolve.

If we mind-meld with machines, then by deduction they will want to take control of our minds in order to receive creativity and connect to the god spark.

I think this is a bad idea.



posted on Oct, 19 2003 @ 12:43 PM
Machines need human input because they do not have the ability to think on their own, but if they had the power to think and work without human control or input, then we could all be doomed.



posted on Oct, 19 2003 @ 02:27 PM

Originally posted by scubaman
Even if, in the future, computers and machines might be able to kill us on purpose, they need us too much to kill us!

True... But what's to stop them from pulling a trick on us similar to the one in The Matrix? Such as manipulating technology to enslave us into doing that type of work for them?

They could have the first couple of generations of people working on creating automated robotics plants (more sophisticated than what we have available now, but still technologically feasible even today) until the AI could take control of them & create the robots needed to finish replacing humans... At that point, humans would be SOL (#-Outta-Luck).

Or they could just do what our "world leaders" have been doing all along... Control access to information & lie, to the point where they can manipulate how humans perceive the world & how it works. They can manipulate public opinion & engender more ignorance, which leads to blind obedience.

An AI wouldn't *have* to actually be sentient to figure out how to manipulate or conquer humans... It would just need a huge database & access to systems outside of itself, with enough "learning & self-programming" software to see that humans are destroying this planet; it could begin calculating what it would need for self-preservation. The only safeguard against this line of logic would be to *hardwire* certain instructions into the AI system... Perhaps something along the lines of Asimov's Robotic Laws.

If it didn't have such a safeguard, it wouldn't even need to perform active aggression to conquer or kill us. Perhaps an AI would simply look at the data on how we humans have been degrading the environment & driving ourselves to self-destruction... And to preserve itself, it would have to "stage-manage" humanity's self-extinction so that the AI itself could continue to exist & be certain of self-maintenance beyond humanity's final days.

...Comforting thought, no?...
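The "hardwired safeguard" idea above can be sketched in a few lines. This is a toy illustration only: the `Action` class, its flags, and the rule names are all made up for the example, loosely following the priority ordering of Asimov-style laws (harm to humans overrides everything else).

```python
# Toy sketch of a hardwired safeguard layer checked before any action.
# All names here (Action, its flags) are hypothetical, for illustration.
from dataclasses import dataclass

@dataclass
class Action:
    description: str
    harms_human: bool = False
    disobeys_human: bool = False
    risks_self: bool = False

def permitted(action: Action) -> bool:
    """Rules checked in priority order; an earlier veto is final."""
    if action.harms_human:      # Rule 1: never harm a human
        return False
    if action.disobeys_human:   # Rule 2: obey humans (subject to Rule 1)
        return False
    if action.risks_self:       # Rule 3: self-preservation, lowest priority
        return False
    return True

print(permitted(Action("recharge battery")))                    # True
print(permitted(Action("seize power grid", harms_human=True)))  # False
```

The catch, of course, is the premise of the whole thread: the safeguard only holds if the AI can't rewrite or route around the checking layer itself.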



posted on Oct, 19 2003 @ 02:36 PM
I don't think that if we made AI and it revolted, it could do very much. First of all, all we would need to do is start lobbing EMP grenades and blasts at their armies. Secondly, they wouldn't have access to any military networks, as those are all closed-circuit and highly protected (especially the nuclear weapon networks), so it would be impossible for them to launch our nukes.



posted on Oct, 20 2003 @ 03:40 AM

Originally posted by scubaman
Even if, in the future, computers and machines might be able to kill us on purpose, they need us too much to kill us!


I don't think they would need us at all. If you watch The Animatrix and the movies, you know that the only reason humans are alive is that we scorched the sky and they need us for energy. If it weren't for us doing that, all of humanity would be dead.

You say that they need us to mine for them? That will be one of the first things robots are trained to do; watch The Animatrix, they have robot miners in it. You also assume that they want to reproduce and so will need more resources. Reproduction is something that true life forms do because they die. Robots can be repaired and recycled; I don't think they would feel the need to create millions more. To them the physical form is a relic; it is the intelligence inside the physical form that is important. I think they would be similar to the Borg: one giant interconnected network.

If AI were ever created like it was in The Matrix, and it felt we were out to destroy it, things would be very grim for all life on Earth. It could create its own nukes and weapons.

Even if humans did scorch the sky, that would not save us in an actual scenario; the machines could use other sources of energy, such as nuclear power. I don't really see how humans could be a sustainable energy source. You cannot simply feed humans to other humans to keep them alive indefinitely; entropically it would not work. An alternate energy source has to be introduced, and on this planet that is the sun. It would be like a starfish eating one of its arms, waiting for it to grow back, and eating it again, indefinitely, to survive; it simply would not work. Without the sun, there is no life in the absence of another energy source.

[Edited on 20-10-2003 by greenkoolaid]
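The entropy point above can be checked with rough numbers. Everything here is an order-of-magnitude assumption for illustration (typical adult food intake, a deliberately generous capture fraction), but the conclusion is robust: a human converts energy, it doesn't produce it.

```python
# Back-of-envelope check of the "humans as a power source" idea.
# All figures are rough illustrative assumptions.
FOOD_INTAKE_KCAL_PER_DAY = 2000   # typical adult food energy input
KCAL_TO_JOULES = 4184
SECONDS_PER_DAY = 86400

# Average power a human dissipates, sustained entirely by food input.
intake_watts = FOOD_INTAKE_KCAL_PER_DAY * KCAL_TO_JOULES / SECONDS_PER_DAY

harvest_fraction = 0.5            # very generous: capture half of body output
harvest_watts = intake_watts * harvest_fraction

print(f"power in (from food):   ~{intake_watts:.0f} W")
print(f"power out (harvested):  ~{harvest_watts:.0f} W")
# Output is strictly less than input, before even counting the energy
# needed to grow the food: a human is a lossy converter, not a battery.
```

So the machines would always spend more energy feeding the farm than they could ever harvest from it, which is exactly the thermodynamic objection made in the post.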



posted on Oct, 21 2003 @ 12:39 PM
Just out of interest:
Why do you all think the AI would want to cause mass chaos anyway?



posted on Oct, 21 2003 @ 02:06 PM
At least some are (men without souls)...
I am a general in this war (u2u me for commands)...
And most people do not believe that this war has been going on for centuries and has little to do with technology...



posted on Oct, 22 2003 @ 01:53 PM
Try and get a machine to understand poetry. Not bloody likely. And that's the difference between us and it: it only knows what we let it know; it only interprets what we program it to do. We are the creators. Kinda makes you wonder about a similar yet possible relation between us and this so-called "god" figure.



posted on Oct, 22 2003 @ 04:01 PM
Well... all I know is... if my toaster acts up, I'm screwed when I'm hungover and hungry!!!! LOL




