
Virtual Child Passes Mental Milestone

posted on Mar, 12 2008 @ 11:14 AM
Virtual Child


A virtual child controlled by artificially intelligent software has passed a cognitive test regarded as a major milestone in human development. It could lead to smarter computer games able to predict human players' state of mind.

Children typically master the "false belief test" at age 4 or 5. It tests their ability to realise that the beliefs of others can differ from their own, and from reality.

The creators of the new character – which they called Eddie – say passing the test shows it can reason about the beliefs of others, using a rudimentary "theory of mind".

"Today's [video game] characters have no genuine autonomy or mental picture of who you are," researcher Selmer Bringsjord of Rensselaer Polytechnic Institute in Troy, New York, told New Scientist.

He aims to change that with future games and virtual worlds populated by genuinely intelligent computer characters able to predict and understand players' actions and motives.

Bringsjord's colleague Andrew Shilliday adds that their work will have applications outside of gaming. For example, search engines able to reason about the beliefs of a user might be better able to understand that user's search queries.


Does anyone else find this disturbing on multiple levels? Search engines that know your beliefs? What's next....will this DATA child have feelings? What happens when this kid figures out he is stuck in a machine and wants more? Strange days that go along with progress.....



posted on Mar, 12 2008 @ 11:37 AM
The "child" would never think it was "stuck" in a computer no more than we think we are "stuck" in our bodies. If the mind is created in a form, it will be that form.



Eddie can pass the test thanks to a simple logical statement added to the reasoning engine: if someone sees something, they know it and if they don't see it, they don't. The program can reason correctly that an avatar will not know the gun has moved unless it was there to see it.

An "immature" version of Eddie without the extra piece of logic cannot pass the test.

John Laird, a researcher in computer games and Artificial Intelligence (AI) at the University of Michigan in Ann Arbor, is not overly impressed. "It's not that challenging to get an AI system to do theory of mind," he says.
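
To make the rule in that excerpt concrete, here is a rough Python sketch of how a "seeing means knowing" rule could work. This is my own toy illustration, not the researchers' actual code, and the object and location names are made up: each agent only updates its belief about where the gun is when it is listed as a witness to the move, so Eddie can predict that the avatar will still look in the old spot.

class Agent:
    def __init__(self, name):
        self.name = name
        self.beliefs = {}  # object -> where this agent believes it is

    def observe(self, obj, location):
        # If the agent sees something, it knows it.
        self.beliefs[obj] = location

    def believed_location(self, obj):
        # If the agent never saw it, it doesn't know it.
        return self.beliefs.get(obj)


def move_object(obj, new_location, witnesses):
    # Only agents present to see the move update their beliefs.
    for agent in witnesses:
        agent.observe(obj, new_location)


eddie = Agent("Eddie")
avatar = Agent("Avatar")

# Both see the gun placed, then it is moved while the avatar is away.
move_object("gun", "location A", witnesses=[eddie, avatar])
move_object("gun", "location B", witnesses=[eddie])

print(avatar.believed_location("gun"))  # location A -- the avatar's now-false belief
print(eddie.believed_location("gun"))   # location B -- where the gun actually is

Run as written, the last two lines print "location A" and "location B", which is exactly the split between belief and reality that the false belief test looks for.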



posted on Mar, 12 2008 @ 03:47 PM
Disturbing? Nah, not really, though I guess the potential for it to be disturbing certainly does exist. I just think this is an amazing AI development.



posted on Mar, 12 2008 @ 03:58 PM
reply to post by DancedWithWolves
 



Children typically master the "false belief test" at age 4 or 5. It tests their ability to realise that the beliefs of others can differ from their own, and from reality.


Does this mean that if you were to take a religious extremist, they would potentially fail this test?

That's off topic, but it's a thought...



posted on Mar, 12 2008 @ 05:27 PM
Skynet: the beginning.
Cue evil-sounding music.



Sorry, had to add that bit of silliness.



posted on Mar, 12 2008 @ 05:39 PM
A much bigger milestone will be when the little computer kid is caught in a lie, purposely lying in an attempt to improve its own position.

"Computer boy, who took the cookie?"
"Not me!"

That would be a big step toward being more human.



