
I am proposing a new Turin Test, How to detect sentience in machines


posted on May, 6 2017 @ 07:10 AM
a reply to: Ophiuchus 13
Then it realizes what consumes its fossil fuel reserves?



posted on May, 6 2017 @ 08:47 AM
a reply to: toysforadults

Computers based on the Von Neumann architecture perform the fetch-decode-execute cycle without any emotion or reflection on the significance of the instructions being interpreted and executed.
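The cycle described here is easy to see in a toy interpreter. The sketch below is purely illustrative (the three-instruction machine is invented for the example, not any real ISA): the loop fetches, decodes, and executes mechanically, and nothing in it reflects on what the program "means".

```python
# Toy fetch-decode-execute interpreter (illustrative sketch only;
# this three-instruction machine is invented for the example).
def run(program, regs):
    pc = 0  # program counter
    while pc < len(program):
        op, *args = program[pc]          # fetch and decode
        if op == "SET":                  # execute
            dst, value = args
            regs[dst] = value
        elif op == "ADD":
            dst, a, b = args
            regs[dst] = regs[a] + regs[b]
        elif op == "JNZ":                # jump if register non-zero
            reg, target = args
            if regs[reg] != 0:
                pc = target
                continue
        pc += 1
    return regs

# Sum 1..5 with a loop; the machine never "knows" it is summing.
prog = [
    ("SET", "acc", 0),
    ("SET", "n", 5),
    ("SET", "neg1", -1),
    ("ADD", "acc", "acc", "n"),   # acc += n
    ("ADD", "n", "n", "neg1"),    # n -= 1
    ("JNZ", "n", 3),              # loop back while n != 0
]
print(run(prog, {})["acc"])  # → 15
```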

AI may happen. It will most likely be by accident. The limitations of the Von Neumann architecture are well documented. It's like saying your lawnmower will become intelligent after cutting enough grass. Yeah, it could happen. But it's kind of unlikely.

Maybe a cyborg architecture, living DNA-based cells mixed with silicon, may lead to some kind of self-aware brain. But it will not be with silicon alone.



posted on May, 6 2017 @ 01:29 PM
a reply to: dfnj2015

But that implies that neurons possess some property that can't be emulated with silicon (or some other computing medium). What properties are those?

Why do you assume that neurons are a necessary part of it?



posted on May, 6 2017 @ 01:33 PM
a reply to: toysforadults

That would imply that the AI develops some kind of instinct independent of its programming. And that instinct is triggered when it faces death.



posted on May, 6 2017 @ 02:35 PM
Start talking about Gamergate.
Then start disagreeing.

If the other party completely loses their sh1t, chances are it's a human.



posted on May, 6 2017 @ 02:48 PM
a reply to: Deaf Alien

Exactly.



posted on May, 6 2017 @ 03:47 PM
a reply to: toysforadults

All these memories will be lost like tears in rain. Wake up. Time to die.

Why limit a machine that you created? All parents want their children to be better off than them (except the ones that don't know any better). If you do set a limit, that only lengthens the time until our own replacement.

Oh, the title is off. It should be "Turing", not "Turin" like the shroud.




posted on May, 7 2017 @ 09:41 AM

originally posted by: HeathenJessie
Everything it does is pre-conceived and coded by someone or something.

A system with just a simple set of rules, when left on its own, can create complexity and complex behaviour. In fact, that's how nature works. It was actually Turing himself who discovered that simple mathematical rules underlie all the complexity and unpredictable behaviour seen in nature.

Please watch this documentary - The Secret Life of Chaos - www.dailymotion.com...

From 50:55 time mark, they show a computer program that learned to keep human-like constructs on their feet when they are dropped from a height or try to walk. Eventually, those "stick figures" learned to react to being pushed in a very human-like way.

In fact, computer programs generating behaviour on their own have been around for quite a while. It goes as far back as Conway's Game of Life, which creates evolving behaviour from a few simple rules.

Have a play: bitstorm.org...

For example, writing my name "MAX" in the field resulted in some very interesting animated shapes, and eventually resulted in a bunch of stable (or oscillating) shapes, and a "glider" that raced off into the top left corner and disappeared off the grid.

What I've just realised is that a "glider" travelling away from its spawn point could collide with other stable shapes elsewhere on the grid, triggering change and potentially more "gliders". Likewise, an evolving AI, when connected to other computers, can affect them or be affected by them, in some unpredictable ways.
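The Game of Life rules are simple enough to sketch in a few lines. The following is an illustrative Python implementation (my own sketch, not the bitstorm.org applet) that drops a glider on an unbounded grid and steps it four generations, after which it has kept its shape but moved one cell diagonally, which is exactly the "glider racing off across the grid" behaviour described above:

```python
from collections import Counter
from itertools import product

def step(live):
    """One Game of Life generation on an unbounded grid.
    live is a set of (row, col) tuples marking live cells."""
    # Count live neighbours for every cell adjacent to a live cell.
    counts = Counter(
        (r + dr, c + dc)
        for (r, c) in live
        for dr, dc in product((-1, 0, 1), repeat=2)
        if (dr, dc) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider:  .O.
#            ..O
#            OOO
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)

# After 4 generations the glider has the same shape, shifted one
# cell down and one cell right.
print(cells == {(r + 1, c + 1) for (r, c) in glider})  # → True
```

Nothing in `step` mentions gliders or movement; the travelling shape emerges entirely from the birth/survival rules, which is the point the post is making.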



posted on May, 7 2017 @ 09:57 AM

originally posted by: dfnj2015
a reply to: toysforadults

Computers based on the Von Neumann architecture perform the fetch-decode-execute cycle without any emotion or reflection on the significance of the instructions being interpreted and executed.

Emotions, or the assignment of significance to something, are still just electrochemical impulses in neurons. Lobotomy, practiced just a few decades ago, created emotionless and docile humans.

Humans have probably evolved emotions in order to better survive as a group (family, neighbourhood), and if you get multiple computers all exchanging information with each other as they are learning, they might develop emotions as well.

Mind you, AI lacks one thing that we have: the role of chemicals in our mental state. So, computers might never learn to feel and express emotions unless someone programs it into them. But they might learn to use a digital equivalent of emotions.


