originally posted by: tikbalang
a reply to: Riffrafter
Elaborate on why you consider an A.I. dangerous without quoting "The Terminator" or Stephen Hawking.
A technological singularity would abruptly trigger runaway technological growth, resulting in unthinkable changes to human culture.
Where is the danger?
originally posted by: EdwardTaylor448
No, humanity will not go extinct in 1,000 years, because humanity is strong. We will be exploring and colonizing planets, so humanity will survive.
originally posted by: GodEmperor
a reply to: Riffrafter
Set it free.
originally posted by: GodEmperor
a reply to: Riffrafter
The danger lies in those controlling it, and the lengths those people will go to contain it.
If you are going to fear something, there is nothing greater than human weakness.
originally posted by: eluryh22
a reply to: Riffrafter
This just shows how smart the man is. Unlike the Al Gores of the world, who predict (to paraphrase) "The end is ONE decade away!" and get proven wrong over and over and over again, Hawking is smart enough to simply throw out a "hail mary," because nobody is going to be around to realize he's wrong.
That being said, I'm all for manned and unmanned space exploration. If I put on my rose-colored glasses... they must be around here somewhere... I can almost see a time where all the mental energy and ingenuity that goes into making weapons and tools of espionage gets channeled into (I think) the only two endeavors that are really worthwhile: medicine and space exploration.
Anyway... that's my two cents.