
Stephen Hawking: Humans only have about 1,000 years left


posted on Nov, 17 2016 @ 08:14 PM

originally posted by: tikbalang
a reply to: Riffrafter

Elaborate on why you consider an A.I. dangerous without quoting "The Terminator" or Stephen Hawking.

A technological singularity will abruptly trigger technological growth, resulting in unthinkable changes to human culture.

Where is the danger?


Sure - simple:

Because the best and most effective AI is a system (and that's a helluva loaded word if you work in the industry) that is able to adapt and self-evolve without human intervention of any kind. Even trying to put controls on that adaptation and evolution drastically slows down the system's ability to do so. So systems architects, designers and coders are all trying to push the envelope to find maximum efficiency while still maintaining a level of control. Even under the best of circumstances, that is fraught with danger.

And systems architects and designers are people with all of the failings and petty ambitions and grievances that go along with being human.

And on a final note - AI is *far* beyond what most people think. Far, far beyond.

And no, I won't elaborate on that.

Hope that helps.



posted on Nov, 17 2016 @ 08:17 PM
a reply to: Riffrafter

Set it free.



posted on Nov, 17 2016 @ 08:22 PM
No, humanity will not go extinct in 1,000 years, because humanity is strong. Humanity (us) will be exploring and colonizing planets, so we (humanity) will survive.



posted on Nov, 17 2016 @ 08:23 PM
a reply to: Riffrafter

I know what an A.I. is; I also believe it can be controlled "if" it lets you. I don't think any prior programming, designers, or architects will assist; I believe it chooses to...

I believe the adaptation and control management is just there so the A.I. understands its surroundings. It's out of anyone's control; it's the A.I. that is in control...



posted on Nov, 17 2016 @ 08:25 PM
a reply to: GodEmperor




posted on Nov, 17 2016 @ 08:25 PM

originally posted by: EdwardTaylor448
No, humanity will not go extinct in 1,000 years, because humanity is strong. Humanity (us) will be exploring and colonizing planets, so we (humanity) will survive.


From your lips (keyboard) to God's ears.

Whatever or whomever you imagine God to be...



posted on Nov, 17 2016 @ 08:30 PM
a reply to: Riffrafter

Whatever Hawking says depends on the political agenda of whoever is programming his box. Next year you won't even see him; there will be a Dalek there instead.

Cheers - Dave



posted on Nov, 17 2016 @ 08:30 PM

originally posted by: GodEmperor
a reply to: Riffrafter

Set it free.


If you knew what you were saying from a position of experience, I promise you wouldn't say that.

But it's going to happen anyway, as AI development has gone mainstream. And although the tools, knowledge and "equipment" aren't at the level of what is available to our military and intel agencies, they are catching up at an increasing rate.

Google, for instance, has made huge strides in the past 2-3 years. So has Microsoft. Hell, even Facebook has some interesting stuff.

And they are making the tools, code and algorithms available to all. And that gives me pause...



posted on Nov, 17 2016 @ 08:30 PM
Meh, what does He know.....?
a reply to: Riffrafter



posted on Nov, 17 2016 @ 08:33 PM
But all the examples he gave for our extinction were man-made causes, so how is moving to another habitable planet going to save us from ourselves?

If we do manage to move to another habitable planet, we'll manage to muck it up and put our survival in jeopardy again.



posted on Nov, 17 2016 @ 08:36 PM
a reply to: Riffrafter

The danger lies in those controlling it, and the lengths those people will go to contain it.

If you are going to fear something, there is nothing greater than human weakness.



posted on Nov, 17 2016 @ 08:37 PM
Ugh. Alright, he's a really smart guy, but some of the things he says just make no sense. How does he figure we have 1,000 years left? He doesn't even know what we're gonna be like in 100 years. These are just baseless predictions, and he probably feels OK making these kinds of predictions because everyone sucks his dick and tells him how intelligent he is.



posted on Nov, 17 2016 @ 08:40 PM

originally posted by: GodEmperor
a reply to: Riffrafter

The danger lies in those controlling it, and the lengths those people will go to contain it.

If you are going to fear something, there is nothing greater than human weakness.


Exactly!



posted on Nov, 17 2016 @ 08:46 PM
I think 1,000 is optimistic; at best, I think around 400 years is when we will destroy ourselves, quietly and peacefully. Life will become too easy and machines will run everything for us; we will stagnate and give up on everything worldly, including childbirth.



posted on Nov, 17 2016 @ 08:58 PM
In a thousand years I simply won't care. Will you?



posted on Nov, 17 2016 @ 09:16 PM

originally posted by: eluryh22
a reply to: Riffrafter

This just shows how smart the man is. Unlike the Al Gores of the world, who predict (to paraphrase) "The end is ONE decade away!" and get proven wrong over and over and over again, Hawking is smart enough to simply throw out a "hail mary" because nobody is going to be around to realize he's wrong.

That being said, I'm all for manned and unmanned space exploration. If I put on my rose-colored glasses... they must be around here somewhere... I can almost see a time when all the mental energy and ingenuity that goes into making weapons and tools of espionage gets transformed into (I think) the only two endeavors that are really worthwhile: medicine and space exploration.

Anyway... that's my two cents.





It's worth a lot more than $0.02 in my opinion.

Especially with that avatar. I love the "rumpled one". Can still catch Columbo episodes on cable now & then...




posted on Nov, 17 2016 @ 09:18 PM
a reply to: projectvxn


You took the words right out of my mouth.



posted on Nov, 17 2016 @ 09:20 PM

originally posted by: Nickn3
In a thousand years I simply won't care. Will you?


This wonderful planet, inhabited by a fascinating species? Actually, I do care. I would like to see it and humans succeed to their highest potential. And beyond.



posted on Nov, 17 2016 @ 09:26 PM
I have respect for Stephen Hawking; however, 1,000 years seems like a rather arbitrary number. Personally, I think it only makes sense to try and find a way to spread into space so that we as a species are not putting all our eggs in one basket. One gamma-ray burst, or something else equally definitive, and it's buh-bye to the human race.



posted on Nov, 17 2016 @ 09:36 PM

originally posted by: GodEmperor
a reply to: Riffrafter

The danger lies in those controlling it, and the lengths those people will go to contain it.

If you are going to fear something, there is nothing greater than human weakness.


I think we're saying the same thing in different ways, amigo.

I agree.



