
Will humans become obsolete?

posted on Aug, 29 2004 @ 03:18 AM
It is projected that in around 20 years computers will match or even exceed the complexity of the human brain, but does that really imply that AI has been created, or just a really fast computer?

ABSTRACT

This paper describes how the performance of AI machines tends to improve at the same pace that AI researchers get access to faster hardware. The processing power and memory capacity necessary to match general intellectual performance of the human brain are estimated. Based on extrapolation of past trends and on examination of technologies under development, it is predicted that the required hardware will be available in cheap machines in the 2020s.
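
To make the "extrapolation of past trends" part concrete, here is a rough sketch of that kind of calculation in Python. The numbers used (brain-equivalent operations per second, a cheap machine's speed today, and the doubling time) are illustrative guesses of mine, not the paper's own estimates:

# Back-of-envelope sketch of a Moore's-law-style extrapolation.
# All numbers below are illustrative assumptions, not figures from the paper:
#   brain_ops      - assumed ops/sec needed for brain-level performance
#   start_ops      - assumed ops/sec of a cheap machine in start_year
#   doubling_years - assumed doubling time for cheap computing power

import math

def year_of_parity(brain_ops=1e14, start_ops=1e9,
                   start_year=2004, doubling_years=2.0):
    """Year when cheap hardware reaches brain_ops, given exponential
    growth with a fixed doubling time."""
    doublings_needed = math.log2(brain_ops / start_ops)
    return start_year + doublings_needed * doubling_years

print(round(year_of_parity()))  # ~2037 with these made-up numbers

Shift any of those assumptions and the predicted year moves by a decade or more, which is why the paper's 2020s figure hinges on its particular hardware estimates.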


I would like to hear what people think about this very interesting subject.



www.transhumanist.com...

[edit on 29-8-2004 by ufo3]



posted on Aug, 29 2004 @ 04:18 AM
AI fascinates me as much as it somewhat scares me. The side I find amazing is obviously the more 'front end' aspect: very useful and interesting extensions of our own abilities, usually with significant boosts in speed and/or accuracy.

The scary side is actually not what most would typically have in mind, either. I don't find the usual theories along the sci-fi path all that credible, although some do raise the occasional idea worth pondering. The scary side, IMO, has more to do with why I also question its actual success in the Big Picture, so to speak. What I'm talking about is the fact that it is of human creation, and that will include, to some degree, human error as well. Now, of course, many typical errors will be corrected by the functions of the AI itself, and eventually even the complete design, function, etc. will be decided upon by the most advanced AI unit of the day.

However, considering how flawed we as a race are even now, and perhaps counting just our top 10%, I can't see us being able to create something without it being extremely flawed because of our own ignorance. In other words, I don't find ideas like 'Terminator' or 'The Matrix' all that scary, not because I doubt our ability to create something similar, but because once we created it, it wouldn't be quite what we expected.

Even if it ended up killing us or whatever, I doubt that would happen as a result of it being a superior and more creative intellect; more likely it would be the result of us being careless enough to create our own destruction. If anything, should AI of our own creation actually develop to surpass humans in every respect, I imagine that instead of killing us off, it would more than likely correct our 'bad programming/wiring' for us. Doing so would benefit it far more than destroying us.

Perhaps eventually, in cooperation, AI and biological intelligence would join together for our mutual evolutionary benefit, who knows. The point being that while many people theorize about AI surpassing humans and then choosing to destroy us (for various reasons, most of which unfortunately ring true), I doubt those would be the actions of a superior intellect.

For example, rats can be very destructive, disease-carrying pests, and throughout history they have caused problems for mankind in many ways. Sometimes just a massive, uncontrolled breeding of them can lead to the destruction of entire crops, habitats, etc. However, never in a million years would anyone be so stupid as to wipe out the entire rat population of the earth, as it would set off a devastating reaction within the natural food chain and order of life. In a similar way, I imagine we'd be viewed along those lines, only perhaps as much more useful than we normally consider rats to be.


