
Stephen Hawking: Humans only have about 1,000 years left


posted on Nov, 17 2016 @ 07:25 PM


During a talk at the Oxford Union debating society this week, the renowned theoretical physicist said that humanity probably only has about 1,000 years left before we go extinct.

In his 74 years, Hawking has spoken several times about our doomed fate, with the risk of things like nuclear war increasing as well as the oncoming threat of global warming. He has also warned that the development of artificial intelligence could end mankind.

Our only hope of escaping these dangers, says Hawking, is by finding another habitable planet.


Stephen Hawking: Humans only have about 1,000 years left

More and more genuinely serious people in important positions in tech, government, and business have been discussing this topic. I don't necessarily agree with Stephen Hawking, but working with and developing AI for a living, I recognize the danger posed.

One thing I definitely do agree with is that this is an important and necessary topic of discussion.






posted on Nov, 17 2016 @ 07:29 PM
a reply to: Riffrafter

He needs to cut the crap.

He was famous, loved, and will die. Stop trying to link your own mortal coil to humanity's.

We will be just fine. We won't even remember our "today".


posted on Nov, 17 2016 @ 07:29 PM
At the rate we're going I'd say Mr. Hawking is being extremely optimistic.



posted on Nov, 17 2016 @ 07:31 PM
a reply to: Riffrafter




I don't necessarily agree with Stephen Hawking, but working with and developing AI for a living, I recognize the danger posed.


elaborate please



posted on Nov, 17 2016 @ 07:31 PM
If even one of these extremes is inevitable, does any of this really matter? If we are no more, there's nobody left to weep.

If we create our successors (more than likely become/evolve into our successors), then life goes on.

I don't think it's as black and white as this. All we have to do is reach the stars. 1,000 years is too far ahead to think of global warming in any meaningful way. We'll have long transcended our current limitations on this pebble of a planet.

I think we'll branch out, the same as we've done already in the past, only within a shorter timeline, and in ways not so readily apparent.



posted on Nov, 17 2016 @ 07:35 PM
Way too optimistic.



posted on Nov, 17 2016 @ 07:37 PM
a reply to: tadaman

You and I, tadaman, will be running around the streets of NY, hoodies over head, Red Bull in hand, singing "ha ha ha, we all told you so." As nothing happens, we will twist and shout like we were Ferris B in Chicago.

Life will go on!



posted on Nov, 17 2016 @ 07:38 PM
a reply to: Riffrafter

So Hawking advises that instead of cleaning up our mess here, we should simply find the next most habitable planet and leave our mess there? Amirite? Makes a lot of sense.




posted on Nov, 17 2016 @ 07:40 PM
a reply to: Riffrafter

This just shows how smart the man is. Unlike the Al Gores of the world, who predict (to paraphrase) "The end is ONE decade away!" and, like others of their kind, get proven wrong over and over and over again, Hawking is smart enough to simply throw out a "hail mary," because nobody is going to be around to realize he's wrong.

That being said, I'm all for manned and unmanned space exploration. If I put on my rose-colored glasses... they must be around here somewhere... I can almost see a time when all the mental energy and ingenuity that goes into making weapons and tools of espionage gets transformed into (I think) the only two endeavors that are really worthwhile: medicine and space exploration.

Anyway... that's my two cents.



posted on Nov, 17 2016 @ 07:46 PM
Already posted

www.abovetopsecret.com...



posted on Nov, 17 2016 @ 07:46 PM
If we are smart enough to colonize the solar system in the next few centuries, our survival chances will greatly increase.
AI can be a variable, but I hope proper safeguards stop the Terminator before he goes after Sarah Connor.
I know Hawking is smart, but I think he is incorrect here. If we're going extinct from war, natural disaster, or plague, it will be within the next 250 years. After that, I believe we'll pass on to the next tech rung.



posted on Nov, 17 2016 @ 07:52 PM

originally posted by: GodEmperor
Already posted

www.abovetopsecret.com...


Just clicked your link - thanks.

General chit chat forum...umm, well ok.

That's great, but I think this is a little beyond general chit chat and deserves a thread here too.

Your mileage may vary, void where prohibited, etc.





posted on Nov, 17 2016 @ 07:52 PM
Since our spirit lives forever, the existence as Homo sapiens is just a chapter in the book. (IMO of course)



posted on Nov, 17 2016 @ 07:56 PM
a reply to: tadaman

I don't disagree with a single thing you say.

I do think the topic is very worthy of serious discussion though.



posted on Nov, 17 2016 @ 07:57 PM

originally posted by: tikbalang
a reply to: Riffrafter




I don't necessarily agree with Stephen Hawking, but working with and developing AI for a living, I recognize the danger posed.


elaborate please


Elaborate on which part? My work or the dangers potentially posed?



posted on Nov, 17 2016 @ 08:01 PM
Why doesn't he use his amazingly gifted mind and find a solution for FTL travel...



posted on Nov, 17 2016 @ 08:04 PM
a reply to: Riffrafter

Elaborate on why you consider an A.I. dangerous without quoting "The Terminator" or Stephen Hawking.

A technological singularity would abruptly trigger runaway technological growth, resulting in unthinkable changes to human culture.

Where is the danger?



posted on Nov, 17 2016 @ 08:05 PM
a reply to: ATSAlex

Shine a flashlight toward the sky: "VOILA!" A human would be smashed into atoms.



posted on Nov, 17 2016 @ 08:06 PM
a reply to: projectvxn

The day I see the A.I. developing, I'm gonna send you this meme...




posted on Nov, 17 2016 @ 08:09 PM
a reply to: Riffrafter

Ah, right. I think you can have the same subject in two categories.

He's giving an arbitrary number for our extinction that nobody alive now will be able to verify.

I'll just chalk his rhetoric up to anti-evolutionism: the dinosaurs didn't get wiped out, they grew wings and flew away.

So will we.


