
AI


posted on Jul, 15 2005 @ 11:01 AM
If this happens, expect AI and cybernetic entities to flee the solar system in search of greener pastures. Also, what makes you so sure such a system would be effective? MAD, even on an automated system, will still be built by 'just' Humans and so will be flawed, enough so that any intelligent artificial or semi-artificial lifeform would be able to bypass it. And why do you assume they would want to take over this planet? It's finite; the galaxy is abundant. As a Transhumanist, I wouldn't want to share this planet with my selfish, greedy and violent brothers and sisters. I would want to go somewhere to live in peace, as most Transhumanists I've talked to seem to want. Are genetically enhanced lifeforms on your list too?

I can foresee an era where Pure Humans will make war on the Impure AIs and cyborgs, and I'll bet it will be the Humans who fire the first shot, out of fear and ignorance.

Basically you're assuming they're going to be a threat because of Hollywood. We have no experience with such lifeforms except on the big screen, and those flicks wouldn't sell well if they were about complex social relationships between "us" and "them". The only way to sell the movie is with big explosions and a threat to the entire race. Hell, even in the Animatrix it was us Humans who pushed the AI to enslave our race, because we wouldn't admit them into the U.N.

[edit on 15-7-2005 by sardion2000]




posted on Jul, 15 2005 @ 05:26 PM
sardion2000,

Although I would not like to foresee them as a threat, the fact is that since they will potentially be able to exceed our abilities in every area, they may have ambitions to match.

And as a Human I would be interested in sharing this world, but not in being a second class lifeform in it.

They may well see it as their manifest destiny to rule not only this planet but all of the ones they encounter where lifeforms are more primitive than themselves.

Maybe this will not occur at all; the motivations of any self-aware lifeform (synthetic or otherwise) that does not share our emotional irrationality will be quite alien to most of us.

Therefore the smart thing is to be prepared.

And perhaps if a black hole were launched directly into the sun (or we had probes in close solar orbit capable of creating a black hole), it would have enough mass and grow quickly enough to engulf the entire solar system before the A.I. could escape?

If not perhaps a series of black holes created around all the planets in a coordinated fashion would do the trick?

As for assuming they can outmaneuver our every move: our irrationality may be hard for them to predict and counter, and we may still have a slight advantage over them in creative thinking.

No reason to lie down and let them walk all over us if we can wisely prepare.



posted on Jul, 15 2005 @ 05:57 PM


No reason to lie down and let them walk all over us if we can wisely prepare.


Who is this "us" you talk about? I would not condone the actions you are advocating, and such actions would just push me deeper into the Transhumanist camp. Your worldview is what I fear the most, because you talk about not wanting Humans to become second-class citizens while advocating the same for AIs and cyborgs. Very Orwellian is all I've got to say.

And your plan is wildly unworkable from a great many angles too, but I'll leave that to the more mathematically oriented folks on ATS to dissect.



posted on Jul, 19 2005 @ 12:16 PM
"Let him who desires peace prepare for war."

-- Vegetius



posted on Jul, 22 2005 @ 10:29 PM
We will be messing with AI in earnest once nanotech takes off, imho.

You should read the Butlerian Jihad sometime, hehehe.

and this thread :

www.abovetopsecret.com...



posted on Jul, 22 2005 @ 11:32 PM
We'd have no chance... no discussion.



posted on Sep, 2 2005 @ 08:02 AM
We would have a chance, though: we just go for guerrilla tactics. Thing is, they don't fear, so it may not work...



posted on Sep, 2 2005 @ 09:35 AM
On this type of subject you have to look at where we are in the AI and robotics areas before "AI" could take "control" of itself.

1. The Japanese just made the first actual robot that can walk, run, and recognize people and their faces.

2. Wireless technology is booming right now, which in turn would mean that if there ever was an "AI" powerful enough, yes, it would be very easy to control from one central computer. (I don't mean to be copying a movie.) But that area would be the easy part.

3. The hard part would be combining both sides and making them work properly.

4. Remember, if a human made the AI, then a human can always destroy it.

Another issue is that we do have "robots" in war right now: unmanned planes, with human control. We also have small tanks that help out our troops. These "small tanks" can be fitted with anything from 20mm up to 50mm guns, and can carry recon cameras as well.


Oh yeah, take a look: if only the guy behind it wasn't there and an AI were put in its place. Oh wait, wireless satellites, why didn't they think of that? Then this machine would be a killer.



posted on Sep, 2 2005 @ 10:50 AM
If humanity could produce such a machine as AI and developed it to the point where it could take over, humanity should have other weapons besides them. It's not like we'd really be stupid enough to focus all our manpower on AI as weapons/slaves.
Or could we?



posted on Sep, 2 2005 @ 02:33 PM
Lifeadventurer,

I think the biggest danger is that they don't tire, sleep, or eat.

Therefore we are at a huge disadvantage from the start.

If they also don't feel and have no empathy for biological creatures they could indeed overpower us in a relatively short time.

Essentially we will have bequeathed them many of our strengths with almost none of our weaknesses...

As for those that say we can destroy anything we create - I think that is quite an oversimplification.

Following that line of logic, once they have significantly modified themselves (from our original design), it would follow that only they could destroy themselves.

I think that is a highly specious argument.



posted on Sep, 15 2005 @ 07:56 AM

Originally posted by blue_sky_9

Originally posted by Philosophical Fanatic
In theory AI would inevitably take over, but that is assuming the human race lasts long enough, and is stupid enough, to create such things!


How is the human race stupid, Fanatic? Science and technology have advanced in leaps and bounds since the beginning of the 19th century. And why shouldn't we produce AI? Apart from the risks, it would make our lives a lot easier!

I apologise for the confusion; what I meant was that it would shock me if the human race was foolish enough to create something that could challenge our intellect and put us at risk of no longer being the dominant species on this planet.
And in answer to your question... another question! Why can't we one day in the future create machines that can carry out our actions at will, with no sort of intellect whatsoever, as opposed to AI?



posted on Sep, 15 2005 @ 11:36 AM

Originally posted by Philosophical Fanatic
What I meant was that it would shock me if the human race was foolish enough to create something that could challenge our intellect and put us at risk of no longer being the dominant species on this planet.


Umm - do you live in the U.S.?

And you would be shocked by something our leaders might do to endanger Humanity??

Okie Dokie..


Homer: Oh, look at me! I'm making people happy! I'm the Magical Man from Happy-Land, in a gumdrop house on Lollipop Lane! (slams the door, then put his head back round) Oh, by the way, I was being sarcastic.

Marge: Well, duh!



posted on Sep, 16 2005 @ 04:59 AM

Originally posted by TruthMagnet

Originally posted by Philosophical Fanatic
What I meant was that it would shock me if the human race was foolish enough to create something that could challenge our intellect and put us at risk of no longer being the dominant species on this planet.


Umm - do you live in the U.S.?

And you would be shocked by something our leaders might do to endanger Humanity??

Okie Dokie..

Hmm... I see your point. I'm converted!


P.S. No, I don't live in the USA - I'm a Briton.



posted on Sep, 16 2005 @ 12:48 PM
Ahh, what the hell is happening in your country then!?

It's becoming USA Jr. now...

I always liked to think of it as a place I could flee to if this place ever went in the crapper...

You guys need to nip that sh*t in the bud right now!

(We can't do anything about it here; no venues for open dialogue.)



posted on Sep, 16 2005 @ 01:16 PM
Thank god for Sci Fi ...

Has anyone read any Asimov?

I think AI, and robots with AI, are inevitable. Can we live peacefully with them? I don't know. I think it's certainly dangerous. It all depends on how smart we make them. I think it would be wise to make them all stupid, or just smart enough to do the task assigned.

If we did, however, wind up going to war with them, I think the Human race could win out. It would be rough and bloody, but we could do it.

And as far as gobbling up the solar system with a black hole ....
booooo bad idea.



posted on Sep, 16 2005 @ 02:46 PM

Originally posted by noise
Thank god for Sci Fi ...


I agree - Sci Fi allows us to make predictions about the future, often with uncanny accuracy. Mainly it gets us thinking about the implications of technology in a useful way.



If we did, however, wind up going to war with them, I think the Human race could win out. It would be rough and bloody, but we could do it.


Wow, that surely is an arrogant prediction! And since the risk of being wrong is the extinction or enslavement of the entire Human race, it is not a bet I would be willing to make.



And as far as gobbling up the solar system with a black hole ....
booooo bad idea.


Umm, I'm open to suggestions. Clearly this is a weapon of last resort; until I hear a better idea, I think it is reasonable to prepare such a plan.

Unless you believe it's better to pacify the robots with obedience instead of negotiating with them on equal footing.

And if you think that - the terrorbots have already won!


