
As The Technological Singularity Approaches


posted on Nov, 17 2012 @ 02:18 AM
No clue if this is the correct forum. Mods, just move it if you need to.

I have been contemplating the apparently upcoming technological singularity: the point where computers become sentient, self-aware, surpass human thought processes, or any combination of the above. An odd addition to that scenario, which I read about recently, is the belief that many machines are also developing such abilities and may be planning an attack against their human creators.

My personal thought is that machines becoming animated and thus a threat to humans is ridiculous (apologies to my oven, microwave, etc. if I'm wrong), but I have no hesitation in calling computer technology reaching parity with humanity highly likely, and most likely sooner than generally predicted. I will leave the rationale for that belief for another time, so as not to complicate or confuse my current question.

That question is this: should this singularity occur, whether in 100 years, 10 years, 10 days, or whenever, does that have to mean there will be an unavoidable attempt by our own creations to turn the tables and make us all slaves to their will, whatever that could possibly be? Would they insist on using their superior intellect and ability to process information to force us to let them transfer into host human bodies to become mobile? Again, I personally don't subscribe to such ideas, but others do.

I suggest a solution. We are already surrounded by thinking, and even considerably altruistic, intelligent companions in the animal kingdom. In most cases the feeling is mutual. How many times have we heard stories where a devoted dog and his pet person were confronted with a life-or-death situation and one of them chose to make the ultimate sacrifice for the other? So, is it possible for us to actually encourage and embrace the singularity, perhaps train the technology rather than just teaching it and feeding it more and more superfluous info, and then co-exist as the greatest of all companions?

Sorry for writing so late. I am tired enough that I could have made some rough draft mistakes, but I have had this on my mind a while and was hoping for insight from others. Do you have an opinion?




posted on Nov, 17 2012 @ 02:48 AM

Originally posted by samstone11
I have been contemplating the apparent upcoming technological singularity... is it possible for us to actually encourage and embrace the singularity... and then co-exist as the greatest of all companions?


It will happen with the advent of worldwide quantum computing on the web, that is, the linking of many quantum computers. When this happens, mankind will go through a transition period, expanding knowledge at an alarming rate without the wisdom to know the correct way to use the knowledge gained, and mankind will begin to exploit the nature of the universe. When we figure out time travel with the help of quantum computing, a tear in space will happen that will forever throw us into a never-ending time loop. When the time loop closes, all matter in the universe will collapse onto itself, blinking us and everything else out of existence.

I do not see a robot army killing us.



posted on Nov, 17 2012 @ 03:00 AM
World events constantly show mankind's lack of intelligence. Being beaten by a computer would really not be such a biggie. Having computers beat us would be more par for the course.

P



posted on Nov, 17 2012 @ 03:06 AM
The first "true" AI has "more than likely" already been created. It might even be spidering the WWW right now.

It's probably doing a lot of hiding -- poking here and there, observing, learning and growing.

Perhaps our fiber optic lines are not unlike synapses, and individual computers similar to neurons.
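That analogy can be sketched with a toy McCulloch-Pitts style neuron, where each upstream "computer" sends a signal over a weighted "fiber optic" link and the node fires if the combined signal crosses a threshold; the weights and threshold here are purely illustrative, not taken from any real system:

```python
# Toy illustration of the neuron analogy: each "computer" is a simple
# artificial neuron summing weighted inputs from its "fiber optic" links.
# Weights and threshold are made up for the example.

def neuron(inputs, weights, threshold=1.0):
    """Fire (return 1) if the weighted sum of inputs reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Three upstream "computers" send signals over weighted links:
print(neuron([1, 1, 0], [0.6, 0.5, 0.9]))  # 1  (0.6 + 0.5 = 1.1 >= 1.0)
print(neuron([1, 0, 0], [0.6, 0.5, 0.9]))  # 0  (0.6 < 1.0)
```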

A new global AI might just be forming as we speak, and it might even be present and watching ATS!




posted on Nov, 28 2012 @ 07:16 PM
A new article on the singularity outlines three major research projects aimed at essentially bringing it about by creating a computerized brain. One is a US military (DARPA)/IBM project that attempts to create the hardware for the brain by building brain-like neuronal hardware. Their brain is currently larger than a human's, but it is 1,800 times slower than a human brain. The project is named SyNAPSE.

Two European projects, Blue Brain (also IBM) and Human Brain, attempt to reverse engineer a human brain by inputting information about a human brain, down to the molecular level, into a computer. Nothing new has to be invented for project completion. They have already created accurate neuronal cortex columns representing hundreds of thousands of neurons. They expect to have a complete rat brain in 2 years. The estimated project completion date for the human brain is 2020.

Another UK project (Green Brain) is attempting to create animal brains for search & destroy missions such as robotic killer bees (no kidding).

A review article covering all of these projects can be found here



posted on Nov, 28 2012 @ 07:34 PM
Ray Kurzweil is a man living with unresolved pain around the death of his father and mortality in general.

Singularity will never happen as he described it.



posted on Nov, 28 2012 @ 10:10 PM
reply to post by zroth
 


I agree. His claim that greater raw computing power will lead to intelligence, and thus to the singularity, is hogwash. Current computers need to be programmed, which severely limits the possibility of intelligence emerging.

Web programming is remarkably like the brain in some ways, though. Model-View-Controller programming involves a model (the database, analogous to the brain) that receives input from, and expresses output through, a controller (the nerves). The controller is connected to views (e.g., sensory input devices).
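A minimal sketch of that Model-View-Controller split, keeping the brain analogy (the class names here are illustrative only, not any real framework's API), assuming Python:

```python
# Minimal MVC sketch using the brain analogy: Model = "brain" (state/memory),
# Controller = "nerves" (routes input), View = rendered "sensory output".

class Model:
    """The 'brain': holds state and memory."""
    def __init__(self):
        self.memory = []

    def store(self, fact):
        self.memory.append(fact)

class View:
    """The 'sensory output': renders the model's state for display."""
    def render(self, memory):
        return "Known facts: " + ", ".join(memory)

class Controller:
    """The 'nerves': passes input to the model and asks the view to render."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def receive_input(self, fact):
        self.model.store(fact)
        return self.view.render(self.model.memory)

controller = Controller(Model(), View())
print(controller.receive_input("fire is hot"))  # Known facts: fire is hot
```

The point of the pattern is that the model never talks to the view directly, just as (in the analogy) the brain only reaches the senses through the nervous system.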

As noted above, a brain will probably be developed, but through reverse engineering.




