
Could An Alien Pass the Turing Test?


posted on Jun, 28 2012 @ 12:10 AM

Across the gulf of space... intellects vast and cool and unsympathetic regarded this Earth with envious eyes, and slowly and surely drew their plans against us. – The War of the Worlds

The opening paragraphs of Wells' classic novel perpetuated an idea that has lived on ever since: intelligent aliens, in addition to being cleverer and technologically more advanced than ourselves, are likely to be cold, passionless creatures of logic, their advanced rationality incompatible with what we are used to calling 'human feelings'. In countless books, radio broadcasts, films and contactee testimonials, aliens are portrayed either as soulless, ruthless and amoral, or else as so psychologically and spiritually advanced that they are beyond the petty distractions of feeling. Devils, in other words, or angels.

Alan Turing, whose hundredth birthday fell on June 23 this year, was a seminal figure in the development of information technology and the drive for artificial intelligence. This great and tragic man famously created a test which, he claimed, would determine whether a machine could be regarded as intelligent or not. The Turing Test is conceptually very simple:


A human judge engages in a natural language conversation with a human and a machine designed to generate performance indistinguishable from that of a human being. All participants are separated from one another. If the judge cannot reliably tell the machine from the human, the machine is said to have passed the test.

The test does not check the ability to give the correct answer; it checks how closely the answer resembles typical human answers.
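
Purely to illustrate the structure of the test, here is a minimal sketch of the setup. The judge, human and machine objects and their methods (ask, answer, receive, guess_machine) are my own made-up placeholders, not anything from Turing's paper:

import random

def turing_test(judge, human, machine, rounds=5):
    # Hide the two respondents behind anonymous labels so the judge
    # cannot tell which is which.
    participants = {"A": human, "B": machine}
    if random.random() < 0.5:
        participants = {"A": machine, "B": human}

    # The judge converses with both respondents in natural language.
    for _ in range(rounds):
        question = judge.ask()
        for label, respondent in participants.items():
            judge.receive(label, respondent.answer(question))

    # The machine "passes" if the judge cannot reliably pick it out.
    guess = judge.guess_machine()   # returns "A" or "B"
    return participants[guess] is not machine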

Personally, I am not wholly convinced the Turing Test would work. A machine may be as intelligent as, or even more intelligent than, a human, yet still fail the test if it could not simulate human feelings and the special insight and understanding that come from them. Ask it whether you should sell your car or your wife and you might not get the answer you expect.

Which brings us to an interesting question. Would an intelligent alien be able to pass a Turing Test?

Imagine SETI picked up a signal from outer space and started a conversation. Might we conclude from the aliens' words that they were soulless robots? Are feelings and emotions peculiar to Earthlings, or are such attributes universal among intelligent beings?

Interesting questions indeed. I have my own opinions, of course, but I don't want to dictate the terms of conversation in this thread, so I will keep my views to myself for the moment and simply ask you what you think. Let's see where the discussion takes us.

Could an alien pass the Turing Test? Can intelligence evolve in the absence of emotion? Might alien emotions be so foreign to us that we cannot understand or even perceive them? Or is the Turing Test itself flawed?

I look forward eagerly to reading your responses.



posted on Jun, 28 2012 @ 12:46 AM


Could an alien pass the Turing Test?

Insufficient data......Can not compute!


Can intelligence evolve in the absence of emotion?

Interesting question. Though completely unverifiable.

Might alien emotions be so foreign to us that we cannot understand or even perceive them? Or is the Turing Test itself flawed?


I think the test is flawed.

It would not be hard to write a program that could pass the test. I wrote an AI algorithm in the late '80s that could have been easily programmed (easy but time consuming) to beat the Turing test.

On a parallel tack... the AI system I designed had an emotional factor in the algorithm, allowing the system to make decisions based on an artificial emotional value.
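
Something along these lines, just as a rough toy illustration (this is my own made-up sketch, not the actual algorithm described above): each candidate action carries a logical utility and an artificial emotional value, and the choice blends the two.

def choose_action(actions, mood, emotion_weight=0.3):
    # actions: list of (name, utility, emotional_value) tuples
    # mood: the agent's current artificial emotional state, -1.0 .. 1.0
    # emotion_weight: how strongly "feeling" biases the logical choice
    def score(action):
        name, utility, emotional_value = action
        # Actions whose emotional value matches the current mood get a boost.
        affinity = 1.0 - abs(mood - emotional_value)
        return (1 - emotion_weight) * utility + emotion_weight * affinity
    return max(actions, key=score)

# A 'cheerful' agent (mood 0.8) picks the friendly reply even though
# the blunt reply wins on pure utility.
options = [("blunt reply", 0.9, -0.5), ("friendly reply", 0.7, 0.8)]
print(choose_action(options, mood=0.8))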


edit on 28/6/2012 by OccamAssassin because: (no reason given)



posted on Jun, 28 2012 @ 12:53 AM
I don't think an alien could mimic human thought processes unless they had a common cultural and/or genetic origin with us, and even then it might be hard. There are probably different types: some might even be human and be able to pass the test, although with at least a little cultural difficulty, while some really might be completely logical.

Even if an alien was logical, however, he or she would still have an advantage over a computer - being alive and being able to assess the situation and choose responses, instead of having to be programmed for every single response imaginable. The alien could even purposefully choose a bad response, if it wanted to.
edit on 28-6-2012 by darkbake because: (no reason given)



posted on Jun, 28 2012 @ 12:59 AM
How different could their emotions be from ours? What if they consider us to be emotionless? We could have such differences that we are "incompatible", i.e. incapable of judging the other for what they really are, because our backgrounds are so totally different.

I think an alien could easily pass the test. But they may also completely fail, as their answer might just be too "not human" (do we really have to expect it to be humanlike?) to be considered for passing the test...



posted on Jun, 28 2012 @ 01:04 AM

Originally posted by OccamAssassin
It would not be hard to write a program that could pass the test. I wrote an AI algorithm in the late '80s that could have been easily programmed (easy but time consuming) to beat the Turing test.


There is a competition to find an AI which can give human-like responses, called the Loebner Prize.

As yet, no AI has ever actually won the major prize for being indistinguishable from a real human being. If you can write the AI, I'd suggest doing it if it will take under 300 working days, given the prize money is pretty awesome.


I always wondered why they never made a test to work out if a human is a computer. Though I guess all you need to do is give a human a massive sum and when they don't answer immediately, they're obviously not a computer.

On a side note, aliens should be able to pass such a test with abstract thinking. A computer can do vast sums quite easily, but can't distinguish other concepts that humans would find simple. The human brain is vastly different from a computer in how we see, perceive, and associate things. Humans can draw correlations between events etc., which computers struggle to do without specific instructions. Ask a computer to do maths and it will give the answer in a tiny fraction of a second... Ask a computer why the chicken crossed the road, or to abstractly relate a handful of objects, and it will struggle.
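
To make the contrast concrete, here's a toy sketch (my own invention, not a real chatbot): the arithmetic is one built-in call, while every "abstract" answer has to be scripted in advance or the program simply gives up.

# Every "abstract" answer has to be hand-written in advance.
responses = {
    "why did the chicken cross the road": "To get to the other side.",
}

def answer(question):
    q = question.strip().lower().rstrip("?")
    try:
        # Arithmetic is trivial: evaluate enormous sums instantly.
        return str(eval(q, {"__builtins__": {}}))
    except Exception:
        # Anything else falls back on canned pattern matching, or gives up.
        return responses.get(q, "I don't understand the question.")

print(answer("123456789 * 987654321"))                 # instant and exact
print(answer("Why did the chicken cross the road?"))   # works only because it's scripted
print(answer("How is a raven like a writing desk?"))   # no clue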
edit on 28-6-2012 by Pinke because: last paragraph



posted on Jun, 28 2012 @ 01:47 AM
reply to post by Pinke
 



As yet, no AI has ever actually won the major prize for being indistinguishable from a real human being. If you can write the AI, I'd suggest doing it if it will take under 300 working days, given the prize money is pretty awesome.


Maybe... it's nearly 25 years in the making now... the original design hasn't changed much, though the hardware to run it has. In all honesty, the tech is still lagging behind the design.

Things I am waiting to improve so I can build my system:

*Accurate facial recognition
*3D scanners that can map on the fly. Note: these are available now but (like the facial recognition) are glitchy at best.
*Quantum processors. Not required, but would reduce the number of CPUs needed to run the "brAIn".
*My kids to leave home.

*Faith in humanity not to turn my algorithm into a weapon.

Even if I get over the tech hurdles and manage to build it, the last point is still a worry.



posted on Jun, 28 2012 @ 02:11 AM
reply to post by Astyanax
 


Just an answer to the question posed in the title. Hypothetically speaking, an alien COULD be so advanced that our pathetic measures of sentience might not actually be applicable to it. It could just as easily be a ravening, acid-spitting, metal-eating, planet-destroying intelligence on a par with a pit bull, but packing molecular-stripping particle beams capable of taking diamond and titanium apart on the smallest scales.

To answer properly, one must wait until there is a subject to ask these questions.



posted on Jun, 28 2012 @ 02:41 AM

Originally posted by OccamAssassin
reply to post by Pinke
Things I am waiting to improve so I can build my system.


At the moment it's mostly just based on chat responses.

In theory, having an AI manage to think enough on its own to understand abstract concepts would be enough to win the prizes.

I've had another thought about it really, and I guess the Turing Test would have to be heavily modified to apply to aliens in some senses. I.e. animals couldn't pass such tests in many instances, but then for the last few hundred years we have tended to treat animals as if they don't have 'souls'.



posted on Jun, 28 2012 @ 04:30 AM

Originally posted by OccamAssassin
*Quantum processors. Not required, but would reduce the number of CPUs needed to run the "brAIn".


Why quantum computers? Are you factoring lots of integers? Does the human brain routinely break RSA?

To be honest, it's not out of the question that the human brain uses quantum effects, but I see this thrown about quite a bit without much justification.



posted on Jun, 28 2012 @ 06:01 AM
reply to post by Legos
 


It's one big mass of floating point arguments. I initially tried to limit computations to whole numbers but hit walls with a targeting system.



posted on Jun, 28 2012 @ 08:52 AM

Originally posted by Astyanax
Imagine SETI picked up a signal from outer space and started a conversation. Might we conclude from the aliens' words that they were soulless robots? Are feelings and emotions peculiar to Earthlings, or are such attributes universal among intelligent beings?


I don't agree that the Greys are cyber robots without emotions; it doesn't make any sense. Just because they carry out their procedures without talking much to the individual does not mean they are robots. There are other aliens of higher status than the Greys: yes, the Greys were given the mission of performing the cattle mutilations and abductions, while the other races involved with them sit back observing, some leading. The ones in charge of the Greys are reptilians, tall Greys, and insectoids.

Why would a cyber-genetic being conduct a study beyond the dream world and time travel in order to study human emotions? It is true they do not have emotions, but the Greys are creators on Earth: they can create weather, they can shapeshift, and their craft is living and can morph into anything. We see this as utter magic because we are not in this 4th density or higher, with the ability to create magic on Earth. We would not expect what we think of as a futuristic robot to use religions, secret societies, magic and witchcraft to control the populations of the world, or to show such a wide variety of interests and unique execution of events. Some are now being as foolish as those listening to War of the Worlds in 1938, thinking that computer robots from Mars have invaded us. There is a difference between made-up stories and witnesses seeing cattle beamed up into large balls of light.
edit on 28-6-2012 by greyer because: (no reason given)



posted on Jun, 28 2012 @ 09:00 AM
reply to post by Astyanax
 

It depends on the machine and how well it could mimic a human emotional response.

As far as ET emotions go, I believe they would still have the same base emotions and survival instincts as all other creatures that evolved on a planet similar to our own.



posted on Jun, 28 2012 @ 09:24 AM
reply to post by darkbake
 


I don't think an alien could mimic human thought processes unless they had a common cultural and / or genetic origin with us, and even then, it might be hard.

Fair enough. The Turing Test isn't really a test for humanity, though. It's a test for intelligence – nonhuman, specifically artificial intelligence. And yes, it does seem to be based on some unexamined cultural assumptions.

All the same, your view of the matter does raise some interesting questions. If an alien passes the Turing Test, do we only have to regard it as intelligent, or do we also have to regard it as human and award it human rights? More vexingly, if an alien fails the Turing Test, does that mean it cannot be regarded as legally human, and has no such rights? What if its behaviour showed strong signs of intelligence and sentience, yet it failed the test because it wasn't culturally or evolutionarily 'human' enough?

You can see how loaded these questions are. If exopolitics ever becomes a genuine field of human activity, the answers to them will be of literally universal interest.


There are probably different types, some might even be human and be able to pass the test, although with at least a little cultural difficulty...

I agree that there are probably many different species of intelligent alien in the Galaxy. But what does it mean to say 'some might be human'? Some may bear a superficial physical resemblance to us, but they would still be profoundly alien, having quite a different evolutionary history from our own. Or do you mean that some may resemble human beings culturally – though they may also have six legs and elephant ears?

I think this is where the inquiry becomes fascinating. What does it mean to be 'intelligent'? What does it mean to be 'human'? Are they the same thing? And if they are not, is any similarity of interest – let alone mutual intercourse – possible between intelligent beings from different worlds? Might all the races of the Galaxy be obliged to go their separate ways, with nothing to say to or do with one another? It's a saddening thought, but not an improbable one. What, after all, do we have to talk about with the dolphins?

*


reply to post by Balkan
 


As far as ET emotions go, I believe they would still have the same base emotions and survival instincts as all other creatures that evolved on a planet similar to our own.

What if they evolved on planets different from our own, or not on planets at all? Would that make a difference?


edit on 28/6/12 by Astyanax because: of economy.



posted on Jun, 28 2012 @ 09:29 AM
Considering Cleverbot, it's possible.



posted on Jun, 28 2012 @ 09:43 AM

Originally posted by Astyanax
What if they evolved on planets different from our own, or not on planets at all? Would that make a difference?


I can't see why any living creature would not share the same survival instincts. But it's a big gulf... who knows.



posted on Jun, 28 2012 @ 09:54 AM
reply to post by SpookyVince
 


How different could their emotions be from ours?

An excellent question. Can we conceive of emotions other than the ones we know of? Do people all over the world feel exactly the same emotions? Some sociologists emphasize the difference between 'guilt' cultures and 'shame' cultures. Are guilt and shame different emotions? Can you think of other possible examples?

*


reply to post by Balkan
 


I can't see why any living creature would not share the same survival instincts.

True, but do emotions arise from instincts? How would you answer SpookyVince's very interesting question, quoted above?



posted on Jun, 28 2012 @ 10:34 AM
I lean towards an instinctual origin for most of our base emotions, but who can say what sort of complex emotions an ET race may have. In any case, good old Wiki states:


Emotions are the various bodily feelings associated with mood, temperament, personality, disposition, and motivation and also with hormones such as dopamine, noradrenaline, and serotonin.



No definitive emotion classification system exists, though numerous taxonomies have been proposed. Some categorizations include:

"Cognitive" versus "non-cognitive" emotions
Instinctual emotions (from the amygdala), versus cognitive emotions (from the prefrontal cortex).
Universal emotions recognized cross-culturally based on research on identification of facial expressions.


en.wikipedia.org...



posted on Jun, 28 2012 @ 10:39 AM
Why would a biological entity need to pass a test exclusively designed for artificial intelligence?

Just curious.



posted on Jun, 28 2012 @ 10:41 AM
I'm glad you brought this book up, btw. Someone else mentioned it in my thread, but after reading a bit about it, it's definitely up my alley. I wonder why Wells thought they would be cold and emotionless? Was he envisioning some sort of Vulcan race? I can't see any advanced race, however intellectual and genetically advanced, being emotionless. Could the instinct to survive, the 'lust' for life if you will, be considered an emotion?

edit on 28-6-2012 by Balkan because: (no reason given)



posted on Jun, 28 2012 @ 12:17 PM
reply to post by DissonantOne
 


Why would a biological entity need to pass a test exclusively designed for artificial intelligence?

Follow the thread and we may both find out.


