
By 2045 "The Top Species Will No Longer Be Humans"

posted on Jul, 5 2014 @ 08:49 AM

"Today there's no legislation regarding how much intelligence a machine can have, how interconnected it can be. If that continues, look at the exponential trend. We will reach the singularity in the timeframe most experts predict. From that point on you're going to see that the top species will no longer be humans, but machines."


This is from Louis Del Monte, a physicist and entrepreneur who has recently authored a book entitled: "The Artificial Intelligence Revolution."

Del Monte also talks about "the singularity", a point in time when machine intelligence will outmatch not only the intelligence of a single individual, but the combined intelligence of the human race as a whole.


"Machines will make breakthroughs in medical technology, most of the human race will have more leisure time, and we'll think we've never had it better. The concern I'm raising is that the machines will view us as an unpredictable and dangerous species.

They might view us the same way we view harmful insects. Humans are a species that is unstable, creates wars, has weapons to wipe out the world twice over, and makes computer viruses."
www.businessinsider.com...

Because of this, Del Monte believes that machines will become self-aware and start protecting themselves. But he doesn't believe it will be a "Terminator"-type scenario where there will be a war.

In my opinion, one cannot predict with much certainty whether there will be a war between humans and machines. If machines do become self-aware and start protecting themselves, no one knows to what lengths they will go to protect themselves.

Del Monte wrote the book as a warning about what could be coming in the near future if safeguards and limits aren't put in place concerning how intelligent a machine is allowed to become.
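The "exponential trend" in Del Monte's quote is essentially Moore's-law-style compounding. As a rough illustration only (the two-year doubling period and the 2014 baseline are assumptions for this sketch, not figures from the article), here is what steady doubling does between 2014 and 2045:

```python
# Toy illustration of exponential growth in machine capability.
# Assumptions (not from the article): capability doubles every 2 years,
# starting from a baseline of 1.0 in 2014.

def capability(year, base_year=2014, doubling_years=2.0):
    """Relative capability after compounding doublings since base_year."""
    return 2 ** ((year - base_year) / doubling_years)

for year in (2014, 2025, 2035, 2045):
    print(year, round(capability(year), 1))
```

Under these assumed numbers, 2045 sits 15.5 doublings past 2014, a factor of roughly 46,000 over the baseline. The point of the sketch is only that exponential curves make "by 2045" timelines plausible to their proponents; the actual doubling rate of machine intelligence is exactly what is in dispute.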




posted on Jul, 5 2014 @ 08:55 AM
a reply to: _BoneZ_

AI has always fascinated me. What would a machine want to achieve? I seriously doubt a machine would have anywhere near the same goals as a biological entity ... nor would it care about the continued existence of biological entities.



posted on Jul, 5 2014 @ 09:01 AM
a reply to: Snarl

When a machine becomes self-aware and able to protect itself, it essentially becomes sentient. Machines will then become their own species.

They will care about the existence of humans once they see that humans are or can be harmful to them or their existence.



posted on Jul, 5 2014 @ 09:02 AM
I like to think that somehow we'll dodge a bullet when this all comes to pass. But the nature of the Singularity is its unknowable character. How self-aware machines will think and act is a mystery.
Even when it happens, their acts may be unknowable and unintelligible to us, if their intelligence progresses as fast as some presume.

We'll be like dogs trying to understand the engineering of a space shuttle.

Edit: There's a Gary Larson joke in there somewhere....
edit on 5-7-2014 by Unresponsible because: (no reason given)



posted on Jul, 5 2014 @ 09:03 AM
a reply to: _BoneZ_

If the machines are really smart, I think they would just leave after hitting a certain point. Unlike us humans, who have very specific environments we have to live in, a machine would be able to modify itself for almost any environment.

They would have a much better chance of survival if they left and started a machine colony on say Mars.
Away from us.


posted on Jul, 5 2014 @ 09:05 AM
I have never thought that humans were the top species. That honor goes to germs. We are always subject to their domination. Once in a while we'll win a battle here or there ... like antibiotics or antibacterial hand soap. But then they regroup and come back even stronger.



posted on Jul, 5 2014 @ 09:09 AM
a reply to: FlyersFan

I don't know how sentient germs are, but you do have a point there. Never thought about it like that. We are at their mercy. lol

But by "top species" in this context, we're talking intelligence.






edit on 5-7-2014 by _BoneZ_ because: (no reason given)



posted on Jul, 5 2014 @ 09:18 AM
It would be wrong to call a computer a species, since "species" is a biological classification and taxonomic rank.



posted on Jul, 5 2014 @ 09:24 AM
a reply to: _BoneZ_

I find it unlikely. Possible, but unlikely.

The problem is that some of these experts treat the singularity as doom porn: the machines rise against us and bring doom to humanity. But what they don't see is the most likely outcome: symbiosis.

We are already taking the first steps toward merging man and machine in the form of physical augmentation. Today it's Google Glass, advanced prosthetics, and so forth. Imagine the technology we'll have in the three decades to come: we could reach a stage where we are a perfect merger of man and machine. If that's the case, then any AI that wants to wipe us out would effectively be wiping itself out, and every life form has an instinct for self-preservation.





edit on 5-7-2014 by Thecakeisalie because: (no reason given)



posted on Jul, 5 2014 @ 09:34 AM
No way machines can outclass humans. Machines require electricity. Deprive a machine of electricity and it is useless. Besides, machines modeled after humans can't be any better than humans.



posted on Jul, 5 2014 @ 09:37 AM

originally posted by: Unresponsible

We'll be like dogs trying to understand the engineering of a space shuttle.

Edit: There's a Gary Larson joke in there somewhere....




It's not Gary Larson, but close enough.




posted on Jul, 5 2014 @ 09:42 AM
Me thinks this guy has been watching/reading too much sci-fi. It's not as if the field of AI is the wild wild west and there's a bunch of rogue geeks out there that are going to create an army of sentient, uncontrollable machines, especially not by 2045. In the end, a machine does what it's told, regardless of how it gets to that final instruction...



posted on Jul, 5 2014 @ 09:45 AM

originally posted by: eManym
No way machines can outclass humans. Machines require electricity. Deprive a machine of electricity and it is useless. Besides, machines modeled after humans can't be any better than humans.


Oh they will, and when we reach the point of singularity there is no going back.

It's difficult to comprehend, sure, but the future course of human history is unpredictable.



posted on Jul, 5 2014 @ 10:00 AM
a reply to: Snarl

They will probably go to the bar ... drink oil and look at the newest issue of "Bolts," a new magazine for machines to get their rocks ... or whatever ... off.

I agree though ... AI is fascinating, but AI combined with robotics is what truly could get out of hand. I'm OK with that though.



posted on Jul, 5 2014 @ 10:11 AM
Humans will always stay on top of machines. Why?



Unless we are stupid enough to make them without simple kill switches, in which case we humans deserve to go extinct.
edit on 5-7-2014 by crazyewok because: (no reason given)
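In software terms, the "simple kill switch" this poster means is just an external condition the system must check every cycle and cannot override from inside its own control loop. A minimal toy sketch (every name and path here is hypothetical, purely for illustration):

```python
import os
import tempfile

# Hypothetical sentinel file; an external operator creates it to order a halt.
KILL_FILE = os.path.join(tempfile.gettempdir(), "halt_agent")

def kill_switch_engaged():
    """True once an operator has created the sentinel file."""
    return os.path.exists(KILL_FILE)

def run_agent(max_steps=1000):
    """Run the agent's loop, honoring the kill switch before each step."""
    steps = 0
    while steps < max_steps:
        if kill_switch_engaged():
            break      # stop before doing any more work
        steps += 1     # placeholder for the agent's real work
    return steps
```

The worry raised later in the thread is precisely that a sufficiently capable, self-modifying system could route around a check like this, which is why any real safeguard would have to live outside the system's own ability to rewrite itself.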



posted on Jul, 5 2014 @ 10:21 AM
a reply to: _BoneZ_
Machines and man alike would both still be composed of atoms and quarks. The smaller you go, the more similar we become. I'm of the opinion machines will be both friend and foe, capable of both loyalty and betrayal.

Thank you for the link, off to scope it out now.

ETA: I don't think people understand what it will mean when machines go sentient. Machines will reproduce themselves! And make adjustments along the way until there is a crescendo! Can you imagine??? There will be no kill switch. They will harvest their own electricity, if they even need it.


edit on 7 5 2014 by JohnTheSmith because: ETA



posted on Jul, 5 2014 @ 10:29 AM

originally posted by: Snarl
AI has always fascinated me. What would a machine want to achieve? I seriously doubt a machine would have anywhere near the same goals as a biological entity ... nor would it care about the continued existence of biological entities.


I agree,

There is no survival of the fittest with machines, not yet at least. One interesting possibility is that machines bypass the long evolutionary timeline and evolve exponentially. It comes down to when machines take over and start to do their own programming, and that could go off in a really weird direction.

I do think that humans and machines will end up morphing together, creating human cyborgs, and so in the end we will become seamlessly connected into one and not be fighting each other. I can see the super human cyborgs fighting the "naturals" and that would be when Darwin wins the day and humans in their natural form become extinct.

Is there a person here who would not like a chip put in their head that holds all the languages of the world? Or maybe thousands of other enhancements? Or have tens of thousands of nanobots cleaning and repairing their body from the inside out? You would be 200 years old and look 20, and that is where the next evolutionary phase, or singularity, will start.
edit on 5-7-2014 by Xtrozero because: (no reason given)



posted on Jul, 5 2014 @ 11:11 AM
I think WE are more likely to become the machines than the machines having to do anything to get us out of the way. I mean, look around at all the rules they keep adding to things. Look at how kids grow up in schools with ever more rules. Look how we control so many things and are sure to control even more. With increasingly functional prosthetic devices on the horizon, and thought/emotion scanners, and gene screening, and the capability to replace parts of the brain with synthetic parts, and so on and so forth ... we will start to resemble the machines we fear today. And it'll come to a point where on the outside we look biological, but on the inside we're a mess of everything we've learned in science.

This Louis Del Monte the OP refers to says it himself: we're violent, uncontrolled and unpredictable. BUT YOU KNOW what he's actually saying in that veiled statement, don't you? He's saying "We need to stop being violent, uncontrolled and unpredictable." And you know inside the psyche of so many others this same statement is being said. He's not attacking machines in his statements; he's bringing up fears he has about what he thinks our shortcomings are. Some will say humans will always be this way, but in our science institutions and in our schools you see an opposite vision. Our society is working toward a more controlled future. A safer future. A future where humans are less unpredictable. Less crime. Fewer economic recessions. Fewer mental disorders. Fewer conflicts. More cooperation. And why can so few see what this means? It means things that aren't controlled today will be controlled in the future. More control means we're more robotic.

The machines will help us be what we were already becoming.

Agent Smith in The Matrix is not what we hate. Agent Smith is what we are. When Agent Smith said humans are a virus that consumes and leaves destruction in its wake, he was speaking our innermost thoughts.
edit on 5-7-2014 by jonnywhite because: (no reason given)



posted on Jul, 5 2014 @ 11:15 AM

originally posted by: _BoneZ_
a reply to: Snarl

When a machine becomes self-aware and able to protect itself, it essentially becomes sentient. Machines will then become their own species.

They will care about the existence of humans once they see that humans are or can be harmful to them or their existence.

What if they're so efficient at reproduction they're not concerned with self-preservation?

I'm more curious about what they would find interesting. Humans give machines complex problems to solve. When they reach a point of self-awareness, I think they'll no longer be 'interested' in our problems, and we'll simply be ... ignored.

Anyway, there are a bunch of scenarios. Fascinating topic and I'm interested to read what other people think. Ewok's pic is a hoot!!



posted on Jul, 5 2014 @ 12:03 PM
a reply to: rockpaperhammock

À la Bender from Futurama.





