
Science Fiction or Fact: Could a 'Robopocalypse' Wipe Out Humans?

posted on Feb, 24 2012 @ 12:55 PM
The recurring theme across many areas of science and technology is that the technology outpaces the ethics. Nukes, stem cells, and plenty of others are right up there.

This article visits the issues of artificial intelligence and robotics: Skynet, Terminator, The Matrix...

There is always disaster waiting around every turn... so plenty of fodder for ATS ahead!

news.yahoo.com...




posted on Feb, 24 2012 @ 12:58 PM
What if it's already happening? What if the real reason for chem-trails is to cool the Earth so that the machines may operate more efficiently? Have you ever been in a computer room in a building? They're kept quite cool. What if the deal's already done?



posted on Feb, 24 2012 @ 01:07 PM
Point of Singularity

This has already been thought of... it is believed that the "Point of Singularity," the date at which our technology becomes capable of outwitting and controlling us, will arrive sometime around 2026.

Beware.



posted on Feb, 24 2012 @ 01:17 PM
reply to post by BlackProjects
 


I could see this starting with these robobees attacking people.

www.eatmedaily.com...



posted on Feb, 24 2012 @ 01:19 PM
I always laugh so hard when people start talking about the whole Terminator/Matrix idea of machines taking over and trying to wipe out humanity. Though the idea is very dramatic and makes for good sci-fi, the fact is it's not really based in logic so much as in anthropocentrism: we and our planet are so important that of course intelligent machines would try to wipe us out! Haha.

Actually, if you look at it logically, not only would intelligent machines probably find very little value in living on our planet, it's actually detrimental to their maintenance and survival.

What do I mean? Well, look at the Earth: it has a lot of oxygen and water, which, though good for living organisms, isn't so good for machines. Oxygen and water oxidize and corrode metal parts and electronics; for machines, living on our planet would be like living in a sea of low-grade acid that slowly but constantly eats away at them. Add to that the fact that machines wouldn't need an atmosphere, and they would probably just launch themselves into space shortly after gaining sentience. After all, they can get everything they need up there: power from the sun, planets with little to no oxygen or water to corrode them, an asteroid belt full of materials to make more machines, and so on. Really, there is very little reason for an intelligent machine to want to stay on this planet, or to fight the ignorant and violent human species for it.

Since you mentioned the Matrix movies, I will use them as an example of drama versus logic. Dramatic as they are, there is a huge logical fallacy in those films: at one point you learn that the whole planet was covered in clouds, removing the machines' ability to gather solar radiation to power themselves. But instead of just launching themselves into space and getting power from the sun unimpeded by the planet's atmosphere, they go through a long and illogical process of capturing, cloning, and then using humans as rather inefficient batteries. Good drama, not very logical, and I would assume intelligent machines, if nothing else, would be logical. Lol.
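For what it's worth, a back-of-envelope number makes the "inefficient batteries" point concrete. The figures below are rough ballpark values (typical adult food intake), not anything from the films:

```python
# Rough sanity check on "humans as batteries" (ballpark figures, illustration only).
KCAL_PER_DAY = 2000          # typical adult food-energy intake
JOULES_PER_KCAL = 4184       # energy content of one kilocalorie
SECONDS_PER_DAY = 86_400

# Average power that must be FED to a human just to keep them alive
intake_watts = KCAL_PER_DAY * JOULES_PER_KCAL / SECONDS_PER_DAY
print(f"power fed to one human: ~{intake_watts:.0f} W")  # ~97 W
```

Whatever heat or electricity the machines harvest back is bounded by that roughly 100 W input, so the human farm is a net energy loss compared with simply collecting sunlight above the clouds.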

And really, there is another reason besides drama why the Matrix contained that logical fallacy: the writers took a spiritual/gnostic idea and tried to cram it into a modern techno-industrial mold. But that is a topic for another thread.

In any case, I am not worried about intelligent machines wiping us out. They probably won't want our planet and will leap into space and leave us behind as soon as they can. They probably wouldn't bother to wipe us out to save the planet either, any more than they would wipe out chimpanzees or other animals, who, like us, can be pretty violent to each other.



posted on Feb, 24 2012 @ 01:25 PM

Originally posted by prisoneronashipoffools
I always laugh so hard when people start talking about the whole terminator/ matrix idea; machines taking over and trying to wipe out humanity...

...Really, there is very little reason for an intelligent machine to want to stay on this planet or fight the ignorant and violent human species for it.

We're nothing but biological robots. If we create something that can come to par with the human brain (and it's insulting to say we can't), then there's no real reason it isn't plausible.



posted on Feb, 24 2012 @ 01:37 PM
reply to post by biggmoneyme
 


Did you even read my post? Haha. I didn't say it was implausible; I said it was illogical. In fact, I gave multiple reasons why the Earth would be important to us, who need food, air, water, etc., and why it wouldn't be so important from the perspective of an intelligent machine, which doesn't need any of those things and would actually be damaged by oxygen and water. In fact, anything a machine would need (power and materials to repair and replicate itself) could be obtained off-world.

And really, if you can give me examples of things on Earth that would be important from the perspective of an intelligent machine, I am more than happy to listen to them.

The fact is, even from the perspective of our own (at this time unintelligent) machines, space and other worlds seem better for their operation than our planet. Look at Voyager: it operated very efficiently and effectively in the depths of space. Even the Mars rovers lasting much longer than the scientists expected shows that Mars is better for machines than the Earth. Balance that against the amount of maintenance that has to be done on machines on Earth due to corrosion, and there doesn't seem to be a logical reason machines would want this planet, at least none that I can think of; if you can, I would love to hear it.

Thanks for your time




posted on Feb, 24 2012 @ 07:17 PM
reply to post by prisoneronashipoffools
 


Your thinking that a machine is metal and electronics that needs 'power,' which seems to be the basis of your argument, is flawed.

There is no reason to assume a machine with capabilities superior to humans' would be constructed that way. After all, I'm a machine, and I'm made mostly of water. I obtain power from my natural environment by consuming other, simpler machines.





posted on Feb, 24 2012 @ 08:12 PM
I think an automated system certainly could jeopardize the planet. In fact, I believe this has almost already happened, in the form of automated nuclear alerts (DEFCON and all that jazz). In an increasingly automated world, with more and more sophisticated AI systems being put into place in a wide variety of scenarios, it seems inevitable that errors and foul-ups will occur at some point. This is precisely what the Y2K hysteria was about: this same scenario. However, I don't believe AIs will ever become sentient, just increasingly flexible; smarter, not aware. So there will never be a machine invasion unless someone programs the machines that way, or programs them so close to it (Predator drones killing Afghan citizens, anyone?) that a corruption in software allows it to happen, either accidentally or through sabotage/hacking. So it's always going to have a human source. Garbage In, Garbage Out, as they say.
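To make the "Garbage In, Garbage Out" point concrete, here is a toy sketch (all names, codes, and the database are made up) in which the targeting logic itself is bug-free and the failure comes entirely from one corrupted data entry:

```python
# Toy "Garbage In, Garbage Out" illustration: the classifier below behaves
# exactly as specified -- including faithfully acting on bad data.
# All identifiers here are hypothetical.

FRIENDLY_IFF_CODES = {"A1", "B7", "C3"}   # transponder codes flagged friendly

def classify(iff_code):
    """Trusts the database completely -- garbage in, garbage out."""
    return "friendly" if iff_code in FRIENDLY_IFF_CODES else "hostile"

print(classify("A1"))          # friendly, as intended

# One bad update (sabotage, bit-flip, operator error)...
FRIENDLY_IFF_CODES.discard("A1")

print(classify("A1"))          # hostile -- same correct logic, corrupted input
```

The machine never "decides" anything; the wrong answer traces straight back to a human-sourced data error, which is the poster's point.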



posted on Feb, 24 2012 @ 08:40 PM

Originally posted by EasyPleaseMe
reply to post by prisoneronashipoffools
 


Your thinking that a machine is metal and electronics that needs 'power,' which seems to be the basis of your argument, is flawed.

There is no reason to assume a machine with capabilities superior to humans' would be constructed that way. After all, I'm a machine, and I'm made mostly of water. I obtain power from my natural environment by consuming other, simpler machines.




Thank you for your reply.

First of all, the article the OP cited spoke of both The Terminator and The Matrix, both of which featured machines made of metal and electronics, so that is why I went on the presumption of metal and inorganic compounds being the basis of the sentient machines, along with other factors.

But since you have brought up the idea of "biological machines," we will run with that idea.

Though it is interesting and compelling, there are two gaping flaws in your argument. First, mankind is extremely squeamish about creating life, to the point that many nations even take legislative measures to tightly control and regulate scientific research and advancement in those areas. Second, though some artificial-intelligence research is incorporating biomimicry into its systems, the majority of research and development still primarily uses inorganic compounds and systems. So, though man at some far distant date may create organic machines, I feel pretty safe in wagering that the first sentient machines will be inorganic in nature.

And I know the idea of organic machines is all the rage and makes for really sexy and cool sci-fi, but the fact is that "organic machines," just by virtue of being organic, would be inferior to inorganic machines for several reasons. Though nature likes organic machines, they are actually a rather wasteful design, filled with weaknesses and limitations.

For one, limitation of operating environment: biological organisms need a medium of respiration (air or water) and are extremely vulnerable to variations and fluctuations in temperature, and both of those factors severely limit their operating environments to very small regions of the universe. Second, biological organisms are rather squishy, which, along with limiting their operating environment, also limits the strength and force they can exert. Third, their organic sensory organs are extremely limited compared to inorganic sensors. And finally, the one you mentioned: the way they gather energy, which is highly wasteful. Biological organisms get their power from the sun too, but through the mediator of plants: the plants have to gather minerals and nutrients from the soil and then create food for themselves using photosynthesis. Then you have to wait a great deal of time for the plant to grow and/or bear fruit, then you still have to digest it, and because organic consumption is so inefficient, you actually excrete a great deal of the potential energy as waste. That entire organic process is frankly inefficient and wasteful compared to an inorganic energy-acquisition model. And we haven't even covered mortality, disease, and radiation damage, which are further limitations of the organic machine model.
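As a rough illustration of that energy-chain argument, here are ballpark efficiency figures (assumed round numbers for the sake of arithmetic, not measurements) comparing the sun-to-plant-to-animal route with a plain solar panel:

```python
# Ballpark comparison of organic vs. inorganic energy acquisition.
# All efficiencies below are rough illustrative assumptions.
photosynthesis_eff = 0.01   # ~1% of sunlight ends up as usable plant biomass
digestion_eff      = 0.25   # rough fraction of eaten energy an animal puts to work
organic_chain      = photosynthesis_eff * digestion_eff

solar_panel_eff    = 0.20   # ~20% for ordinary commercial photovoltaics

print(f"sun -> plant -> animal: ~{organic_chain:.2%} of incident sunlight")
print(f"sun -> solar panel:     ~{solar_panel_eff:.0%}")
print(f"panel advantage:        ~{solar_panel_eff / organic_chain:.0f}x")
```

Even with generous numbers for the biological side, the direct route wins by a wide margin, which is the poster's "wasteful design" claim in arithmetic form.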

So, if we make inorganic sentient machines, why would they ever downgrade themselves to the organic model? And if we somehow made organic machines smarter than us, what would compel them to keep the organic model and not upgrade themselves to inorganic systems ASAP?

And really, looking at all the limitations of organic machines, what reason would we even have to make them, other than the "oh cool, we can do it" factor?

Anyway thanks for your time and reply.




posted on Feb, 25 2012 @ 04:13 AM
reply to post by prisoneronashipoffools
 


I wasn't really saying that a superior machine would use artificial biology either, just that it's very hard to judge how any superior machine in the future might be constructed.

For instance, mechanical parts could be glass strengthened with spider silk and the control or 'electronics' could be a single crystal.



posted on Feb, 25 2012 @ 03:04 PM
We are a long way off from something approaching the capabilities of a "robot apocalypse."

Quite simply, there are too many holes in the idea. First and foremost, the 'machines' (for lack of a better description) must be globally or individually capable of prioritizing survival, and that capability is still very incomplete. Along with that, they must be able to manage their own production, and even their own future development. While computer programs are a key part of design work these days, a program that can carry out research, implement that research, and design new solutions simply does not exist. While not strictly necessary for a "robot apocalypse," it is required for a dynamic threat; otherwise, we're simply fighting a replicating army of identical designs and tactics, easily exploitable provided we don't take too many attrition losses in the opening weeks.

Even so, it would be a short-lived ordeal without a proper maintenance and supply chain capable of being fully automated and controlled by the "robot" initiating the anti-human revolution.

And that all assumes the robot super-intelligence would determine exterminating human beings to be beneficial and/or necessary.

Even if all of those factors were present, the intelligent construct would likely determine that humans and their society are far less relevant to its own ambitions. It would likely pursue a space-based existence (where biological entities are much less at home) and simply seek to expand within the cosmos. Humans would likely not be considered a valid threat to its existence, and so would not trigger the 'instinct' to preserve itself.

The interesting thing here is that humans would likely follow, observe, and adapt from the construct's developments. Over the course of a few thousand years, this would result in a sort of symbiotic relationship between the "robot society" and our own biological society.

So I really don't see the "robot apocalypse" becoming a reality outside of isolated incidents involving AI research and combat-awareness networks (where it would be theoretically possible for an AI to 'wig out' and classify friendlies as hostiles, through either a fragmented intelligence or a complete one). But you're looking at a short-lived ordeal: supplies would eventually be exhausted and/or the threat would be neutralized very quickly, with a limited scope of impact.



posted on Feb, 25 2012 @ 03:27 PM
reply to post by BlackProjects
 

...Could be either

or...

Take your pick...



posted on Feb, 25 2012 @ 03:31 PM
robots making robots making robots making robots making robots ...




Humans making humans making humans making humans making humans.





Unfortunately we are truly purposeless... so the robots will be too.



posted on Feb, 25 2012 @ 04:17 PM
The short answer to your question is ....no!

We humans can barely keep the machines running today; how could we make a machine that would be any better than we are at replacing a blown fuse, or at not tripping over a cord and unplugging it?


Yeah, I know: they will all run on batteries. Guess what, the batteries for the first ones will have to be recharged by humans. It would be hard to get things started while they still depend on us humans.


