The technological singularity is a theoretical future point of runaway technological growth, expected to occur during a period of accelerating change sometime after the creation of a superintelligence.
In 1965, I. J. Good first wrote of an "intelligence explosion", suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unforeseen by their designers, and thus recursively augment themselves into far greater intelligences. The first such improvements might be small, but as the machine became more intelligent it would become better at becoming more intelligent, which could lead to an exponential and quite sudden growth in intelligence.
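Good's compounding dynamic can be illustrated with a toy model (a sketch only; the growth rate and starting level are hypothetical parameters, not anything Good specified):

```python
# Toy model of Good's "intelligence explosion": each generation improves
# itself in proportion to its current intelligence, so gains compound.
def intelligence_explosion(start=1.0, improvement_rate=0.1, generations=10):
    """Return the intelligence level after each self-improvement cycle,
    assuming each cycle multiplies the current level by a fixed factor."""
    levels = [start]
    for _ in range(generations):
        levels.append(levels[-1] * (1 + improvement_rate))
    return levels

levels = intelligence_explosion()
# Growth is exponential: after 10 cycles at 10% per cycle the level is
# roughly 2.59x the starting point, and each later cycle adds more in
# absolute terms than the one before it.
```

The point of the sketch is only that small early gains snowball once the improver itself is what improves; nothing hangs on the particular rate chosen.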
In 1993, Vernor Vinge called this event "the Singularity", drawing an analogy between the breakdown of modern physics near a gravitational singularity and the drastic change in society he argues would follow an intelligence explosion. (Vinge 1993) In the 1990s, Vinge popularized the concept in lectures, essays, and science fiction. More recently, some prominent technologists, such as Bill Joy, co-founder of Sun Microsystems, have voiced concern over the potential dangers of Vinge's singularity. (Joy 2000)
Robin Hanson proposes that multiple "singularities" have occurred throughout history, each dramatically increasing the growth rate of the economy. Like the agricultural and industrial revolutions of the past, a technological singularity would increase the rate of economic growth, in his estimate by a factor of between 60 and 250. An innovation that allowed the replacement of virtually all human labor could trigger this event.
Futurist Ray Kurzweil argues that the inevitability of a technological singularity is implied by a long-term pattern of accelerating change that generalizes Moore's Law to technologies predating the integrated circuit, and which he argues will continue to other technologies not yet invented.
I'm glad you asked that question; I never really thought of sharing that information until now. Yes, I do indeed feel each major dip and peak, and I experience it on varying levels - personally, as a member of the population, and in some cases physically. It's very difficult to describe what I feel during the 2-3 d
Originally posted by DJM8507
If the singularity does occur it would be in a form that would enhance the enslavement of Mankind and would probably not free anyone except for the elite who would control this 'matrix' that we would all be subject to.
Originally posted by Sirius20
reply to post by RazorX
I think that's actually worse than having AI surpass us on its own. Frankly, I don't want to be part man, part machine. That takes away what makes us human. You can try to rationalize it as much as you want, but I personally would rather stay the way I am and hopefully ascend consciously, not through technology that we add to ourselves to become immortal and superhuman. I guess we'll have to see what happens, but I would still rather evolve in a more natural way. If becoming part man, part machine appeals to you, then knock yourself out.
Originally posted by adeeze
The power of technology, which is a NATURAL part of human evolution, can potentially allow us to be or do anything, and the possibility of a higher consciousness only seems achievable through technology. What happens when a computer can out-process our own brains? Won't people want to use them to replace their own processing brain matter? It doesn't seem like it now, but you'll be sure to see it happen in the near future.