Technological Singularity

posted on Mar, 1 2009 @ 09:03 AM
While reading "The Intelligent Universe" by James Gardner, I came across a mention of the technological singularity. I was awed by its beauty and, to be honest, somewhat concerned as well.

John von Neumann was the first person to give serious thought to the implications of a looming technological singularity:



One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.


The statistician I. J. Good then remarked:



Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an “intelligence explosion,” and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.
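Good's compounding dynamic can be sketched as a toy model (purely illustrative; the function name, growth rule, and parameters are my own assumptions, not anything Good specified):

```python
# Toy model of I. J. Good's "intelligence explosion" (illustrative only):
# each machine designs a successor, and a smarter designer produces a
# proportionally larger improvement, so the gains compound.
def intelligence_explosion(start=1.0, rate=0.1, generations=10):
    levels = [start]
    for _ in range(generations):
        current = levels[-1]
        # the size of the next improvement scales with current intelligence
        levels.append(current * (1 + rate * current))
    return levels

levels = intelligence_explosion()
# each generation's jump is larger than the last: runaway, not steady, growth
gains = [b - a for a, b in zip(levels, levels[1:])]
print(gains)
```

Because the improvement rate itself depends on the current intelligence level, the growth is super-exponential, which is the whole point of Good's "last invention" remark.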


At a conference, Vernor Vinge made his famous comment about the technological singularity:



Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended. Is such progress avoidable? If not to be avoided, can events be guided so that we may survive? These questions are investigated. Some possible answers (and some further dangers) are presented.


What are the consequences of this event? When greater-than-human intelligence drives progress, that progress will be much more rapid. In fact, there seems no reason why progress itself would not involve the creation of still more intelligent entities—on a still-shorter time scale. The best analogy that I see is with the evolutionary past: Animals can adapt to problems and make inventions, but often no faster than natural selection can do its work—the world acts as its own simulator in the case of natural selection. We humans have the ability to internalize the world and conduct "what ifs" in our heads; we can solve many problems thousands of times faster than natural selection. Now by creating the means to execute those simulations at much higher speeds, we are entering a regime as radically different from our own human past as we humans are from the lower animals.

The lessons of our evolutionary past were, in Vinge's view, not exactly comforting:

From the human point of view this change will be a throwing away of all previous rules, perhaps in the blink of an eye, an exponential runaway beyond any hope of control. Developments that before were thought might only happen in "a million years" (if ever) will likely happen in the next century. [One commentator] paints a picture of the major changes happening in a matter of hours.... [The most disturbing consequence of the technological singularity is that any hyper-intelligent machine] would not be humankind's "tool"—any more than humans are the tools of rabbits or robins or chimpanzees.

Any comments?

Sorry, I didn't know where to post this, so I put it here. Feel free to move it to a more suitable forum.



posted on Mar, 1 2009 @ 10:02 AM
My take on it.

Repost due to laziness.

Death Through Technology
---------------------------------

Think of a circle drawn on a piece of paper. Everything you understand and know is on the inside of the circle; everything you are ignorant of is on the outside. Let's say you expand your knowledge and understanding: now the circle has to grow. The area on the inside is larger, but the circumference looking outwards has also increased. In other words, you have learned more, but with that knowledge comes the counter-intuitive realization that you now face more ignorance than when you started.

Bottom Line - As the area of your understanding increases in your "circle of knowledge", so does the circumference increase in your "circle of ignorance".
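The geometry behind the analogy checks out, and can be verified in a few lines (a sketch; the function name is my own):

```python
import math

def knowledge_circle(radius):
    """Return (area, circumference) of the 'circle of knowledge':
    the area is what you know, the circumference is the boundary
    facing everything you don't."""
    return math.pi * radius ** 2, 2 * math.pi * radius

# Doubling the radius quadruples the known area, yet it also doubles
# the circumference: the frontier of ignorance grows alongside knowledge.
for r in (1, 2, 4):
    area, edge = knowledge_circle(r)
    print(f"r={r}: area={area:.2f}, circumference={edge:.2f}")
```

Note that the area grows faster than the circumference (quadratic vs. linear in the radius), so strictly the metaphor says knowledge outpaces the *boundary* of ignorance, even though both keep growing.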

Now most people might say "so what?": with our new knowledge we can just combat whatever detrimental effects come along with our newfound ignorance. And you're right, you can, for a time. Yet for every step of technological progress forward there are two steps backward. Most people are unaware of this because the world we lived in seemed infinite for a time, and those two steps backward could be accommodated with more land, more resources, and more technology.

Once we reached the barriers of our world, the law of perpetual growth within a finite environment began to rear its ugly head. So the two steps backward we would have remedied with expansion turned into two steps inward. We are feeling the effects of this tightening spiral on an almost hour-to-hour basis. As we approach the singularity (the beginning of the outward spiral now returning to its home), we bring with it all the energy of our "outward/inward" journey. With that energy we destroy ourselves, and then we get to start it all over again.


Peace


PS. For more on this, check out "Timewave Zero" by Terence McKenna.



posted on Mar, 1 2009 @ 02:34 PM
My idea is that we are already in the singularity, and we are actually working backwards: we are trying to examine the ways in which we are not in the singularity.

My opinion is that there was no big bang. We and the entire universe are still in the singularity. There is only a perception of space and time. We are still one mass the size of an atom. We and the entire universe are a life form of particles smaller than a pea, learning how to compute. When we finally do realize this, it will be strange, to say the least.


Just my opinion, but good thread to the OP.


