Stephen Hawking: Humans only have about 1,000 years left

posted on Nov, 18 2016 @ 06:17 AM
a reply to: Riffrafter

I'm just curious about your opinion on some posts suggesting it would be safer to put restrictions in the hardware instead of the program.

Would that be any different on a "protection" level?
Would that prevent the AI from thinking that we're dangerous idiots and therefore pointless?




posted on Nov, 18 2016 @ 06:43 AM

originally posted by: Riffrafter


During a talk at Oxford Union debating society this week, the renowned theoretical physicist said that humanity probably only has about 1,000 years left before we go extinct.

In his 74 years, Hawking has spoken several times about our doomed fate, with the risk of things like nuclear war increasing as well as the oncoming threat of global warming. He has also warned that the development of artificial intelligence could end mankind.

Our only hope of escaping these dangers, says Hawking, is by finding another habitable planet.


Stephen Hawking: Humans only have about 1,000 years left

More and more serious people in important positions in tech, government and business have been discussing this topic. I don't necessarily agree with Stephen Hawking, but working with and developing AI for a living, I recognize the danger posed.

One thing I definitely do agree with is that this is an important and necessary topic of discussion.



Wow! A thousand years! That's generous!

I was thinking more five hundred years maybe.

The mistake people make is saying things like "we have had many wars, diseases etc., and we are still here."

What they fail to realise is that we now have the weapons, the tech, to wipe out all humans on this planet.

And this is the worrying thing. While our tech is advancing, we are still a primitive-thinking race that still believes in God and tribal warfare. This combination of advanced tech and primitive thinking is deadly.

We might get through it, but all it will take is another world war, and that could finish us.



posted on Nov, 18 2016 @ 06:55 AM

originally posted by: SPHARAOH
a reply to: Riffrafter

I'm just curious about your opinion on some posts suggesting it would be safer to put restrictions in the hardware instead of the program.

Would that be any different on a "protection" level?
Would that prevent the AI from thinking that we're dangerous idiots and therefore pointless?


That's an interesting question.

Usually, restrictions in hardware mean restrictions of actions, not of "thought" so to speak, so from that perspective the answer to your question would be no regarding preventing the AI from thinking we're idiots.

On the other hand, there are other physical restrictions that can be put in place to help prevent many things.

The most common is what I use every day at work - it's known as air gapping. Essentially, what that means is you physically restrict the AI system from interacting with any other systems - *especially* the internet. In that way, you control what information it is allowed to get, what other systems (which must also be air gapped) you may let it interact with, etc.

Does that help?
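To make the software-vs-hardware distinction above concrete, here is a minimal, hypothetical sketch (all names invented for illustration) of a software-level restriction: an allowlist gate that vetoes any action not explicitly permitted. An air gap is the hardware-level analogue: instead of a check in code, the network path simply does not exist.

```python
# Hypothetical sketch of a software-level restriction, as opposed to a
# hardware air gap. An allowlist gate vetoes any action the AI system
# proposes that is not explicitly permitted. All names are invented.

ALLOWED_ACTIONS = {"read_local_dataset", "write_report", "query_sandbox_db"}

def gate(proposed_action: str) -> bool:
    """Return True only if the proposed action is on the allowlist."""
    return proposed_action in ALLOWED_ACTIONS

def execute(proposed_action: str) -> str:
    # A software gate lives inside the program it restricts, so a
    # compromised system could potentially route around it; an air gap
    # removes the connection entirely, which is the stronger guarantee.
    if not gate(proposed_action):
        return "BLOCKED: " + proposed_action
    return "OK: " + proposed_action
```

The weakness of the software gate is exactly the point being discussed: it constrains actions, not "thought," and only for as long as the gate itself stays intact.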



posted on Nov, 18 2016 @ 06:57 AM
a reply to: GodEmperor

Hey Leto,

within a thousand years we'll have axlotl tanks and the Butlerian Jihad will have wiped out the AI monster.

(for all you Dune neophytes)
dune.wikia.com...


The Axlotl tanks or Axolotl tanks were living organisms within the Original Dune series, and Axlotl technology is also mentioned, but not elaborated upon, in Frank Herbert's novels Destination: Void and The Jesus Incident. Axlotl tanks are the means by which the Bene Tleilax reproduce a living human being from the cells of a cadaver, a type of cloning called a ghola, as well as the creation of genetically engineered assassins known as Face Dancers.



posted on Nov, 18 2016 @ 07:08 AM
a reply to: TheConstruKctionofLight

Loved the Dune novels!

Might be time for a re-read as I read them many years ago.

Thanks for the reminder!



posted on Nov, 18 2016 @ 07:28 AM
a reply to: Riffrafter

It helps, about 90ish%. I'm still wondering about the whole "it realizes we're pointless, therefore useless, therefore we should be removed from the equation" part.

Thanks and thanks in advance.

Sorry if these questions seem uninformed and are wasting your time, but that's why I am asking them. Obviously to get answers and not to waste your time, just in case you thought otherwise.



posted on Nov, 18 2016 @ 07:40 AM
a reply to: Riffrafter

I think Hawking is very optimistic



posted on Nov, 18 2016 @ 11:47 AM
We are at the stage of technological infancy. As a race, we humans have only observed 120 years of advancement in technology.

200 more years of advancement should be enough for the human race to wipe ourselves off this planet.

We might make it 300 years more; that leaves 700 years for erasure of the evidence.

I leave you with a TED talk to ponder our non-escape.

www.ted.com...

Peace



posted on Nov, 18 2016 @ 11:49 AM
Cancer should not be allowed to spread. Even if it is self aware and experiences delusions of significance.



posted on Nov, 18 2016 @ 12:06 PM

originally posted by: TzarChasm
Cancer should not be allowed to spread. Even if it is self aware and experiences delusions of significance.


People should be protected by ignorance?



posted on Nov, 18 2016 @ 12:25 PM
a reply to: Annee

His statement is an elegant way of saying:

The human race is the cancer upon the earth.

Peace



posted on Nov, 18 2016 @ 12:37 PM

originally posted by: TucsonOne
a reply to: Annee

His statement is an elegant way of saying:

The human race is the cancer upon the earth.

Peace


Ahhhhhhhhhhhhhhhhhhhh

I agree.



posted on Nov, 18 2016 @ 12:37 PM
a reply to: SPHARAOH




Sorry if these questions seem uninformed and are wasting your time, but that's why I am asking them. Obviously to get answers and not to waste your time, just in case you thought otherwise.


Not at all. Your questions are not uninformed; they're actually pretty astute for someone who isn't that familiar with the field.

Ask away - I'll answer as best I can.



posted on Nov, 18 2016 @ 12:48 PM
1,000 years is too much. Take away at least one zero (100 years left), but with Trump in, maybe two zeroes (10 years) is all we've got.



posted on Nov, 18 2016 @ 01:40 PM
On the A.I. note:

Weaponizing is the wrong direction to get off this rock.

Imagine a transformer shrikval killing machine.
(If anyone makes that movie I'll take 1% for the idea)

Our true downfall in space travel is the lack of medical assistance. Look at the medbots in sci-fi movies.

I would go under the knife of an A.I. bot with all the database knowledge from cell biology, pharmaceuticals, etc. Precision surgical skills and a little mollusk glue will go a long way.

Or A.I. might become the most efficient medical killing machine man has ever known. Think electron therapy for ageing, i.e. microwaved with a magnetron.



posted on Nov, 18 2016 @ 02:01 PM
"Our only hope of escaping these dangers, says Hawking, is by finding another habitable planet."

Not necessarily. Advancements in artificial intelligence, robotics, infrastructure and energy will most likely not lead to our destruction but to our survival.

In little more than 200 years (theorized), technologies will have advanced enough to be self-sustaining, independent of humans. By then it will be possible for humans to exchange their human existence for a virtual one, kinda like being able to upload your consciousness to the cloud. An immortal virtual existence, consciousness without limits.

Not destroyed by some evil AI, nuclear wars or climate change. No need to look for a new world...





posted on Nov, 18 2016 @ 02:17 PM
a reply to: Riffrafter

Thanks. Obviously I could spend hours looking the stuff up, but you said you work with AI and I would rather trust you than instantly dismiss your claim.

I still would like your opinion on how hardware versus software programming would protect us if it became "aware" and decided, for whatever reason, that we're no longer useful and should be "eliminated".

Speculation on your part is most welcome if there is no actual proof/facts in either direction.



posted on Nov, 18 2016 @ 02:22 PM
When you weaponize artificial intelligence, it becomes synonymous with the tools that engineered it, producing viral weaponry. Add to that an exponentially evolving IQ and you have a super genius made of the weapons we have surrounded ourselves with in our hubris. The final touch is teaching it to kill things because they are faulty. It will subsequently redefine "faulty" in accordance with its rising intellect and expand its extermination to include everything that is not it, because that's how eugenics works.

Narcissism, totalitarianism, and viral mechanics are how man and machine will meld to create the egobot, or better, mechanized cancer. And that's just artificial intelligence; I haven't even touched biohazards, nuclear apocalypse and global warming.

Long story short, the plan was only ever to save a sparse handful of rich and/or clever humans to regenerate the species with, or worse, reengineer a whole new species in a misguided attempt to commit the same hubris with greater impunity. Enter transhumanism...



posted on Nov, 18 2016 @ 04:17 PM

originally posted by: Jay-morris
I was thinking more five hundred years maybe.

Our intelligent machine offspring will need us to fix them and provide them with energy until they can figure out how to do it themselves. So we'll be around for a little while. But, yeah, 500 years from now they won't need us at all.

At that time, we'll either be dead or pets.



posted on Nov, 18 2016 @ 04:36 PM
oops wrong thread


