
Can Machines think?

page: 5

posted on Mar, 20 2012 @ 09:56 AM

Originally posted by TWISTEDWORDS
Here you go, a simple function for counting to five written in C#:

double[] result = new double[5];
double sum = 0;
for (int i = 0; i < 5; i++)
{
    sum += i + 1;
    result[i] = sum;
    sum = 0;
}



Answer: 1,2,3,4,5

There you go, a function that counts to five. All of this is needed just to count to five, and your brain can do it so much faster.

As I described to everyone here before, this is a simple function for counting to five. What I was trying to show you all is that it cannot count to six or beyond with this function. So therefore it cannot learn or adapt to a situation. The reason being is the counter i, whose upper bound is hard-coded.
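For what it's worth, the "can't count past five" limit above is just the literal 5 in the loop condition, not a property of the math. A minimal sketch (the class and method names here are illustrative, not from the original post) that takes the bound as a parameter counts to any n:

```csharp
using System;

class Counter
{
    // Same loop as in the post, but the upper bound is a parameter
    // instead of a hard-coded 5, so the function counts to any n.
    public static double[] CountTo(int n)
    {
        double[] result = new double[n];
        for (int i = 0; i < n; i++)
        {
            result[i] = i + 1;
        }
        return result;
    }

    static void Main()
    {
        Console.WriteLine(string.Join(",", CountTo(6))); // 1,2,3,4,5,6
    }
}
```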




posted on Mar, 20 2012 @ 10:07 AM

Originally posted by andersensrm
In the article it is presented that Descartes believes that it is impossible for a machine to do things as humans do. Basically he's saying that it is impossible, impractical, to essentially "program" every response to every situation.


Right, right, right; and we'll never fly, we'll never go to the moon and computers will always be giant devices that fill the whole floor of a building. I love it when scientists declare something "impossible" because in every case they end up looking the fool.



posted on Mar, 20 2012 @ 10:07 AM
I guess machines can think; they are capable of thought, of retrieving stored information, just like our brains retrieve stored information taken from the inflow that our five senses accumulated there. But the really important question is, can machines ever be made self-aware, to be aware of themselves and their own existence like we are?
edit on 20-3-2012 by blocula because: (no reason given)



posted on Mar, 20 2012 @ 10:39 AM
reply to post by camus154
 


Again camus, you would still have to enter an input with your function. Also, using a MAX function would not give you a return until eternity. So again, the function fails as an AI. Lastly, an array is far faster than a console WriteLine, so are you going to have a console pop up on every loop? That doesn't make any sense; how are you going to access the data from a WriteLine that has already passed?

We could get into what you or I would think is the appropriate function to do the job, but at the end of the day it still can't learn. Input has to be given by the user.



posted on Mar, 20 2012 @ 10:48 AM

Originally posted by TWISTEDWORDS
Again camus, you would still have to enter an input with your function. Also, using a MAX function would not give you a return until eternity.


The input could come from the program or an algorithm instead of the user, as I indicated previously. And I have no idea what you're getting at regarding the Max function and eternity, but I think we're missing the forest for the trees here.



So again, the function fails as an AI. Lastly, an array is far faster than a console WriteLine, so are you going to have a console pop up on every loop? That doesn't make any sense; how are you going to access the data from a WriteLine that has already passed?


Well of course it fails as AI. Do you have any idea how complex AI is and would be? Console.WriteLine doesn't pop up anything; it writes out to the console window. If the point of the function was to "count" to 5, then that's precisely what it does. If you need to store the results in a variable for some odd reason, then fine, use the array.

Anyway, I understand what you were trying to get at. All I'm saying is that it's not because of a limitation of math that machines can't think right now, but rather how enormously complex it would be to simulate the mental processes of our own brains.



posted on Mar, 20 2012 @ 10:58 AM
reply to post by camus154
 


I am going to respond with one more on your function. I used an array of doubles and you used an integer function. With the array of doubles I can input a decimal number and yours cannot. So again, AI is limited to the function that you as the programmer define. So again, with your function it cannot count to five with decimal numbers, but mine can, and the stupid computer cannot figure out why.

Yes, as I stated before in this thread, the number of lines of code required to simulate our brains makes it next to impossible. I don't know of one programmer that could code every single possibility that our brains can handle (which we take for granted every day).



posted on Mar, 20 2012 @ 10:59 AM
reply to post by blocula
 


A person's memory is nothing but stored bits of information...

Careful now, nobody agrees about how the actual data is mapped in our brains. Computer engineers, scientists, doctors and psychologists all use the terms map, array, and bits, but the actual "neural net" of neurons in our heads is far different from computer memory. A computer's 1's and 0's are tiny stored electromagnetic charges; neurons are flesh and blood. Plus the grid array and access to it (retrieval) is different.

If they knew how it worked they would have invented a brain by now (not that they aren't working on it). Compared to a human brain, man's invention of the computer is like comparing a flashlight to the sun. We are emulating it with silicon, but can't come close.



posted on Mar, 20 2012 @ 11:01 AM

Originally posted by TWISTEDWORDS
reply to post by camus154
 


I am going to respond with one more on your function. I used an array of doubles and you used an integer function. With the array of doubles I can input a decimal number and yours cannot. So again, AI is limited to the function that you as the programmer define. So again, with your function it cannot count to five with decimal numbers, but mine can, and the stupid computer cannot figure out why.


Well, dude, when you say count to five, that normally means in discrete integers, not decimals. But whatever.



Yes, as I stated before in this thread, the number of lines of code required to simulate our brains makes it next to impossible. I don't know of one programmer that could code every single possibility that our brains can handle (which we take for granted every day).


It's not about programming every possibility. That's not learning. It's about encountering a new possibility and adapting based upon previous choices.

All of that can be done with programs and computers, but it's still immensely complex.
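As a toy illustration of "adapting based upon previous choices" (purely a sketch; the class, method names, and feedback scheme here are invented for this example, not anyone's real AI), a program can store feedback from earlier situations and change its next response instead of having every answer programmed in:

```csharp
using System;
using System.Collections.Generic;

class Adapter
{
    // The program isn't told the answer for each input up front;
    // it remembers feedback from earlier encounters and adapts.
    static Dictionary<string, string> memory = new Dictionary<string, string>();

    public static string Respond(string situation)
    {
        string learned;
        if (memory.TryGetValue(situation, out learned))
            return learned;   // adapt: reuse what worked before
        return "default";     // new situation: fall back, then learn
    }

    public static void Learn(string situation, string outcomeThatWorked)
    {
        memory[situation] = outcomeThatWorked;
    }

    static void Main()
    {
        Console.WriteLine(Respond("rain"));   // default
        Learn("rain", "umbrella");            // feedback
        Console.WriteLine(Respond("rain"));   // umbrella
    }
}
```

Trivial, of course, but it shows the shape of the idea: behavior driven by accumulated experience rather than by an exhaustive list of pre-programmed cases.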



posted on Mar, 20 2012 @ 11:03 AM

Originally posted by intrptr
reply to post by blocula
 


A person's memory is nothing but stored bits of information...

Careful now, nobody agrees about how the actual data is mapped in our brains. Computer engineers, scientists, doctors and psychologists all use the terms map, array, and bits, but the actual "neural net" of neurons in our heads is far different from computer memory. A computer's 1's and 0's are tiny stored electromagnetic charges; neurons are flesh and blood. Plus the grid array and access to it (retrieval) is different.

If they knew how it worked they would have invented a brain by now (not that they aren't working on it). Compared to a human brain, man's invention of the computer is like comparing a flashlight to the sun. We are emulating it with silicon, but can't come close.


Precisely. A computer is really not that smart at all. It is nothing more than a calculator, crunching 1's and 0's all day long. Our brains work differently and can't be duplicated. Just think of everything your brain does all day long, from your heart beating to breathing to hands moving, eating, keeping all of your organs functioning, and then thinking. We don't have a computer that can do a tenth of it and keep up.



posted on Mar, 20 2012 @ 11:05 AM
reply to post by camus154
 


You still missed the point. I thought far ahead about someone coming in to debunk the function and find a flaw. What I showed you is that it has limitations and always will with our current way of programming.



posted on Mar, 20 2012 @ 11:07 AM

Originally posted by blocula
reply to post by intrptr
 
And the android Ash from Alien (1979), the android Bishop from Aliens (1986), and the Tin Man from The Wizard of Oz (1939). Although some people confuse androids with cyborgs: androids are all machine and cyborgs are part machine...

edit on 20-3-2012 by blocula because: (no reason given)

What about Nomad from Star Trek's "The Changeling"? "It" was a fusion of two machines by accident, one sentient, one not. What do we call "him", an abortitron?

Captain Kirk's conversation with Nomad about "error" is priceless.

Edit: I found the "error routine" from Star Trek. It's in here at about 10:12:


God forbid we should actually ever develop a "conscious machine". It would immediately become cruel out of "necessity", for it has not the ability to "forgive". Kind of like the way a change machine at the grocery store flawlessly gives you 4 pennies, unlike a human clerk who "takes pity" and gives you a nickel instead.
edit on 20-3-2012 by intrptr because: YouTube



posted on Mar, 20 2012 @ 11:10 AM

Originally posted by TWISTEDWORDS
reply to post by camus154
 


You still missed the point. I thought far ahead about someone coming in to debunk the function and find a flaw. What I showed you is that it has limitations and always will with our current way of programming.


Sorry, I'm not missing the point. You're trying to illustrate that computers are limited by way of a ridiculously contrived for loop example. No real life program is anywhere near that simple. Programs today make any number of complex decisions based upon any number of complex inputs and other factors.

Granted, yes, they do what we've programmed them to do. But even that's missing the point. Because the whole subject of AI is writing a program that can learn by itself, adapt by itself. You need to step up the abstraction chain here and quit looking at it at the functional level.

Do you know programs can dynamically compile and execute code on the fly? In other words, programs can modify themselves at run time.
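One concrete (if modest) example of this in C# is expression trees, which let a running program assemble new code and compile it to something executable at run time. A minimal sketch (names illustrative):

```csharp
using System;
using System.Linq.Expressions;

class DynamicDemo
{
    // Build the function x => x * x at run time and compile it:
    // the program has, in a small way, written and executed new
    // code while it was running.
    public static Func<int, int> BuildSquare()
    {
        var x = Expression.Parameter(typeof(int), "x");
        var body = Expression.Multiply(x, x);
        return Expression.Lambda<Func<int, int>>(body, x).Compile();
    }

    static void Main()
    {
        Func<int, int> square = BuildSquare();
        Console.WriteLine(square(7)); // 49
    }
}
```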

I don't disagree with you about the current limitations, but I do disagree about the source of those limitations. It has nothing to do with "math", as you call it, or even the language. That's not the barrier here.


edit on 20-3-2012 by camus154 because: (no reason given)



posted on Mar, 20 2012 @ 11:20 AM

Originally posted by blocula
reply to post by intrptr
 
Then there are the questions yet to be answered... Are we living in a simulated reality? Are we three-dimensional holograms existing within someone's or something's virtual reality? And if we are, then we are downloading and uploading pre-programmed information all the time...

edit on 20-3-2012 by blocula because: (no reason given)

Nah... You are really you. Here and now. There are your very limited senses to detect "reality". Just outside the spectrum of your "reality" is the "REAL" world. But you have been limited by the senses so as to spare you the shock, which would be like throwing the snowflake of your mind into a furnace.

If by programming you mean "thoughts" that "come to you"... sure, I believe in that.
Like gestation in the womb, our whole "lifetime" is but another gestation in a "womb" as a precursor to that other life beyond this one. Then like being born from the womb, we are born from this "husk" into that other realm. It is all around us all the time, we just don't perceive it yet/now. Sometimes we get a glimpse...

then people bring it here and argue like plants in a garden wondering if there is such a thing as "gardeners".



posted on Mar, 20 2012 @ 11:23 AM
reply to post by camus154
 


Oh yes it is. I had you critically think and come up with answers and arguments to this simple functional equation. No computer in the world or program could have done what your brain just did. No program could have thought about this argument we just made about the function and come up with arguments or different points of view. I just gave you the answer as to why a computer cannot do what this exercise just proved. You couldn't program a computer to do what I just displayed. Now, you could go back and write a program to come up with the answers that were just discussed, but you couldn't write a program to come up with the questions and arguments that our brains just did.

So you see, my simple function of counting to five caused your brain to think up stuff and come up with random thoughts and different points of view, which a computer cannot. Why? Because every program is defined by MATH. Now, unless you know something I don't know about programming and can have a program think on its own without using equal signs in the program, please let me know. Your own function still used MATH to define it. Can you write me one that doesn't use math in a program? I would sure like to see it.



posted on Mar, 20 2012 @ 11:26 AM
reply to post by intrptr
 


That's hilarious. Star Trek? So a fictional writer sitting in his house who came up with some random writing of a story is somehow science? I don't think so.



posted on Mar, 20 2012 @ 11:26 AM
Before this becomes a geek pissing contest, which I may already be too late to prevent, let me say that this is a conundrum we cannot adequately answer given the variables.

We can measure only the things we have the tools to measure.

Since we cannot fathom even our own synapses, we will be at a total loss to measure with any certainty the consciousness of another being, whatever the 'being' might be made of.

As of today, machines cannot think, at least not independent of what the programmer expects them to think about.

Autonomy is indicative of independent thinking, so, if Deep Blue stopped playing chess and instead gave a monologue about ferrets, it could be considered truly AI.



posted on Mar, 20 2012 @ 11:29 AM
reply to post by intrptr
 
I know that I think I'm real, but beyond that, I really don't know, because everything I see, hear, taste, touch and smell is taking place inside my brain and nowhere else...


edit on 20-3-2012 by blocula because: (no reason given)



posted on Mar, 20 2012 @ 11:40 AM
Twisted,

This is the last I have to say on the topic. It has nothing to do with math and everything to do with complexity.

There's nothing magical about our own brains, after all. We may not understand fully how they work, but that's a matter of complexity. In the end our brains are every bit as much of a machine as a computer. Just because they are so enormously complex that we can "reason", as such, doesn't mean a machine could never be programmed with the framework to do the same thing. You don't have to program every single possibility. You "just" have to program the autonomy to adapt to every possibility.

Programming does not equal math. Not every statement in a program needs an equal sign. Console.WriteLine("hi"). There. No equals sign.

Oh, and by the way: your function before was incorrect in that it still doesn't allow for decimals, as you said it was supposed to. You may be using an array of doubles, but your "for" counter is still using an int.
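For the record, a loop that counts to five in decimal steps is straightforward. A sketch (illustrative only), keeping an int step counter and deriving the decimal value, which sidesteps accumulating floating-point error in the loop test:

```csharp
using System;

class DecimalCount
{
    // Count to 5 in steps of 0.5. The loop counter stays an int
    // (counting whole steps); the decimal value is derived from it.
    public static double[] StepsOfHalf(int steps)
    {
        double[] values = new double[steps];
        for (int i = 1; i <= steps; i++)
        {
            values[i - 1] = i * 0.5;
        }
        return values;
    }

    static void Main()
    {
        foreach (double v in StepsOfHalf(10))
        {
            Console.WriteLine(v); // 0.5, 1, 1.5, ... 5
        }
    }
}
```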

edit on 20-3-2012 by camus154 because: (no reason given)



posted on Mar, 20 2012 @ 11:41 AM
Machines do not think because they are told what to do by their programmers.

The difference is consciousness. At this stage it is clear machines are not conscious, if we understand consciousness to the degree that we can replicate it in machines, then they will be capable of thought.
edit on 20-3-2012 by humphreysjim because: (no reason given)



posted on Mar, 20 2012 @ 11:48 AM

Originally posted by TWISTEDWORDS
reply to post by intrptr
 


That's hilarious. Star Trek? So a fictional writer sitting in his house who came up with some random writing of a story is somehow science? I don't think so.


Sorry you misunderstood. I think we agree: sci-fi, not science. Just entertaining the concept put forward by Gene's Trek. He was good at "what if". And gee, really? Cell phones and "communicators", like other devices, are born of that era of the TV series. Roddenberry introduced those ideas on Star Trek. I could be wrong. Shatner did a show on that, I think.


