
Here's 2 news items that bring us closer to Artificial Intelligence


posted on Nov, 3 2014 @ 08:50 AM
a reply to: ZetaRediculian

Is there a Psychopathy API from Microsoft yet?

A computer (and an AI) is a tool. If the tool is capable of "evil" and suddenly wants, as the OP fears, to "destroy mankind", then the one who built the tool did something very wrong, and spent a lot of effort making the tool so evil. And since computers are simply tools and cannot become evil on their own, the blame should not fall on AIs, but on the programmer and his sinister intent.





posted on Nov, 3 2014 @ 09:01 AM
a reply to: swanne

That brings into question the entire method by which an AI would learn and experience the world. Since its "genetics" are programmed, its environment and experiences are the only drivers of personal growth for the machine.

The programmer can only do so much to make his/her creation safe. But like a child, an abused and/or neglected AI that was otherwise programmed to do "good" might take a different path based on its experiences.



posted on Nov, 3 2014 @ 09:18 AM
a reply to: ScientificRailgun

Perhaps, but there is a huge difference between a computer, which is nothing more than a fancy search engine, and an actual child who can experience pride, remorse, fear. The human brain is still a mystery to us, and pretending that a child's mind is even remotely comparable to a computer's search engine is an absolute scientism fantasy.



posted on Nov, 3 2014 @ 09:25 AM

originally posted by: swanne
a reply to: ZetaRediculian
Is there a Psychopathy API from Microsoft yet?

Not that I know of.



A computer (and an AI) is a tool. If the tool is capable of "evil" and suddenly wants, as the OP fears, to "destroy mankind", then the one who built the tool did something very wrong, and spent a lot of effort making the tool so evil. And since computers are simply tools and cannot become evil on their own, the blame should not fall on AIs, but on the programmer and his sinister intent.


Could be just a buggy program. Perhaps a surge took out the Microsoft morality servers.

You can't program "evil" or morality. Are computer viruses evil? But yes, the programmer is to blame for everything.

I am responsible for my kids and my dog, so why not my AI?



posted on Nov, 3 2014 @ 09:28 AM

originally posted by: swanne
a reply to: ScientificRailgun

Perhaps, but there is a huge difference between a computer, which is nothing more than a fancy search engine, and an actual child who can experience pride, remorse, fear. The human brain is still a mystery to us, and pretending that a child's mind is even remotely comparable to a computer's search engine is an absolute scientism fantasy.


Yes! We are on the same line of reasoning.



posted on Nov, 3 2014 @ 09:29 AM

originally posted by: ZetaRediculian
Could be just a buggy program. Perhaps a surge took out the Microsoft morality servers.


That's a funny thing - see, my video games have a basic AI (for the opponent characters' moves). Yet when they glitch or bug out, they just tend to freeze, not take over the world.





posted on Nov, 3 2014 @ 09:36 AM

originally posted by: swanne

originally posted by: ZetaRediculian
Could be just a buggy program. Perhaps a surge took out the Microsoft morality servers.


That's a funny thing - see, my video games have a basic AI (for the opponent characters' moves). Yet when they glitch or bug out, they just tend to freeze, not take over the world.


Good point. As a programmer, though, I have seen some odd bugs.



posted on Nov, 3 2014 @ 09:38 AM
a reply to: ZetaRediculian

So have I. But glitches and bugs are failures of the program to access its resources, a crash of its capabilities. They represent a failure of the program.

How can a sinking ship attack the port?



posted on Nov, 3 2014 @ 10:03 AM
a reply to: swanne


How can a sinking ship attack the port?

I have this little robot I have been working on, and I use a LiPo battery pack. They can catch fire or explode if I'm not careful. I have been very hesitant to give the little guy a self-charging system. If it's left unattended and there is a problem in the code or elsewhere, I could burn the house down.
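That kind of unattended-charging risk is usually handled in hobby builds with a software interlock on top of the hardware protections. Here's a minimal sketch in Python; all the thresholds, sensor readings, and names are invented for illustration, not taken from any real charger:

```python
# Hypothetical safety interlock for a self-charging hobby robot.
# All thresholds and inputs are invented for illustration.

SAFE_CELL_VOLTS = (3.0, 4.2)   # typical LiPo per-cell window, volts
MAX_PACK_TEMP_C = 45.0         # refuse to charge above this temperature

def charging_allowed(cell_volts, pack_temp_c, human_present):
    """True only if every cell is in its safe window, the pack is
    cool, and someone is around to intervene if things go wrong."""
    if not human_present:
        return False
    if pack_temp_c > MAX_PACK_TEMP_C:
        return False
    return all(SAFE_CELL_VOLTS[0] <= v <= SAFE_CELL_VOLTS[1]
               for v in cell_volts)
```

The point of the `human_present` flag is exactly the hesitation above: never let the robot dock and charge while nobody is home.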

Hopefully the sinking ship doesn't have nukes on it.



posted on Nov, 3 2014 @ 10:12 AM

originally posted by: ZetaRediculian
If it's left unattended and there is a problem in the code or elsewhere, I could burn the house down.


With the robot still inside, if I may point out.



posted on Nov, 3 2014 @ 10:14 AM
a reply to: swanne


With the robot still inside, if I may point out.


Yes, of course.

But for the greater good of all poorly programmed self-charging robots!





posted on Nov, 3 2014 @ 10:25 AM
a reply to: ZetaRediculian

Hehe, good point.



posted on Nov, 3 2014 @ 10:29 AM
a reply to: swanne

Maybe our criteria for AI differ a bit. My personal definition of true AI would be something that has the capacity for learning coupled with the ability to feel emotions, or at least the digital approximation of emotion.

True AI to me would be something akin to that childlike wonder of the world.



posted on Nov, 3 2014 @ 10:42 AM

originally posted by: ScientificRailgun
a reply to: swanne

Maybe our criteria for AI differ a bit. My personal definition of true AI would be something that has the capacity for learning coupled with the ability to feel emotions, or at least the digital approximation of emotion.


Learning is already a capability of many programs, and non-intelligent ones at that. It only requires memory.

But I agree, emotions would be one of the best ways of defining a true AI.

Instead of learning, I would add true evolution (not spoilers, like this cloud-sharing thing) to my definition of a true AI. If an AI can solve a problem for which it does not have any answer, a problem its programmer never programmed it to learn, then it has succeeded in achieving intelligence.
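To illustrate the "learning only requires memory" point above: here's a toy Python sketch (names invented for illustration) of a program that merely records the answers it is given and replays them. It "learns" in the weak sense being described, with no intelligence involved:

```python
# Toy illustration of "learning as memory": the program just records
# answers it was given and replays them. No intelligence involved.

class RoteLearner:
    def __init__(self):
        self.memory = {}

    def teach(self, question, answer):
        # "Learning" here is nothing more than storing a fact.
        self.memory[question] = answer

    def ask(self, question):
        # Replay what it was told, or admit ignorance.
        return self.memory.get(question, "I don't know")

bot = RoteLearner()
bot.teach("capital of France", "Paris")
```

By the stricter definition above, such a program fails: ask it anything its "programmer" never taught it and it has no way to produce an answer.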



posted on Nov, 3 2014 @ 10:56 AM
a reply to: ScientificRailgun


True AI to me would be something akin to that childlike wonder of the world.

No doubt that would be cool, but nothing close to that exists. There are some pretty spooky AIs, though, but they do exactly what they are told, even if it's to be random in their responses. Programming is nothing more than a script, a set of instructions to follow. How would you program emotions or feelings? Emotions are chemistry of the brain. That's why drugs affect us. You can't give a computer a drug.

Our feelings and senses are due to our biology. They are subjective. Taste, fear, understanding...
I asked earlier: what does a banana taste like? To me it tastes like yum yum, and that is particularly true when I need some sugar and potassium.



posted on Nov, 3 2014 @ 10:56 AM
a reply to: swanne

I agree on the problem solving, for sure. So, emotion, learning, and problem solving?



posted on Nov, 3 2014 @ 11:01 AM
a reply to: ScientificRailgun

That pretty much sums it up in my opinion.



posted on Nov, 3 2014 @ 11:04 AM

originally posted by: ZetaRediculian
No doubt that would be cool, but nothing close to that exists. There are some pretty spooky AIs, though, but they do exactly what they are told, even if it's to be random in their responses. Programming is nothing more than a script, a set of instructions to follow. How would you program emotions or feelings? Emotions are chemistry of the brain. That's why drugs affect us. You can't give a computer a drug.


But that is precisely my point. An evil AI is impossible, because true Artificial Intelligence is impossible for computers to achieve. They are always instructed to do something, and in the end the "AI" is nothing but a blurred mirror of its programmer.



posted on Nov, 3 2014 @ 11:24 AM

originally posted by: swanne
Perhaps, but there is a huge difference between a computer, which is nothing more than a fancy search engine, and an actual child who can experience pride, remorse, fear.

Yeah, but we can emulate that. That's why I keep thinking about Tamagotchi programming. Of course, a Tamagotchi doesn't "really" experience hunger or loneliness, but it doesn't matter. It's real to the machine. You put a range of those kinds of diminishing-need programs in a more complex machine that can weigh them and decide what's the most important thing at any moment to make a decision.

How do you create a "pride" parameter? You have a parameter that goes up when a programmer (an "interactor", basically) praises the machine for doing a good job at something. You use key words. You use image recognition of the interactor's smile. You have sensors on the machine's body that feel a warm touch or a pat or caress. Again, that requires a kind of physical body that can perceive things, but that can be built. You program it so that, as time goes by, if it doesn't get those things, the need for recognition increases, so the machine's behavior will start to become more influenced by that parameter. Of course, something like pride is not going to be as high a priority as "safety" or "food/energy", but in the long run it will have an effect on how motivated the machine is to learn things or do other good things to make its interactor proud of it.

No, it's not "real," but then again... what is?
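The Tamagotchi-style scheme described above can be sketched in a few lines of Python. Everything here (the need names, priorities, and growth/decay rates) is invented for illustration; the point is only the mechanism: unmet needs grow over time, interactions satisfy them, and the machine acts on whichever weighted need is currently strongest:

```python
# Sketch of the "diminishing parameters" idea: needs grow when unmet,
# shrink when satisfied, and compete by weighted urgency.
# Names, priorities, and rates are all invented for illustration.

class Need:
    def __init__(self, name, priority):
        self.name = name
        self.priority = priority  # how strongly this need drives behavior
        self.level = 0.0          # 0 = satisfied, 1 = urgent

    def tick(self, amount=0.1):
        # Unmet needs grow over time (e.g. no praise -> more "pride" need).
        self.level = min(1.0, self.level + amount)

    def satisfy(self, amount=0.5):
        # e.g. the interactor praises the machine, or it recharges.
        self.level = max(0.0, self.level - amount)

def most_pressing(needs):
    # The machine acts on whichever weighted need is currently strongest.
    return max(needs, key=lambda n: n.priority * n.level)

needs = [Need("energy", priority=1.0),
         Need("safety", priority=0.9),
         Need("pride", priority=0.3)]
```

With equal levels, "energy" dominates because of its higher priority, but once it's topped up, a long-neglected "pride" parameter can end up steering behavior, exactly the long-run influence described above.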







posted on Nov, 3 2014 @ 12:47 PM
a reply to: Blue Shift


No, it's not "real," but then again... what is?

If it seems real to someone who interacts with it, does it matter?



