Our brains don't 'process information' like a computer

posted on Nov, 2 2016 @ 10:43 AM
link   
a reply to: ColeYounger

I'm asking for an actual scientific source of these "many" neuroscientists who believe such a thing, not a blog post on a creationist (i.e. anti-science) website.




posted on Nov, 2 2016 @ 10:45 AM
link   


Really? You can record language? There's a Nobel Prize in your immediate future!




You can't record language? Then why have many ancient languages been lost? Because they weren't recorded.

Recording language

I don't know if you're just trying to argue semantics here.





posted on Nov, 2 2016 @ 11:57 AM
link   
a reply to: ColeYounger

No, I'm making a point. There's a difference between language and writing that is seriously germane to artificial intelligence. Artificial intelligence requires much more than the ability to recognize symbols or wave patterns; it requires the ability to comprehend and form concepts from the information received.

Your link is simply an attempt to record the words/alphabet corresponding to languages which are being replaced.

Thus far no one has found a way to record the ideas conveyed, only symbols and wave patterns that we correlate into language as part of our own intelligence. That is a critical difference.

TheRedneck



posted on Nov, 2 2016 @ 12:18 PM
link   
I doubt many mothers would agree with him.

Most mothers know their child was born who they are, with their personalities, and abilities.

Even in the womb, each child behaves differently.

You really can only enhance a child's natural talents/abilities - - - you can't create them.

You can also affect them negatively. Something I find interesting is how some are so resilient to horrible treatment and some so sensitive to the smallest amount of criticism.



posted on Nov, 2 2016 @ 01:00 PM
link   
a reply to: ColeYounger

I was going to make a thread on this but I'll put it out here instead.

Where is the Mind? Is it inside the brain or outside the brain? Is it inside the skull? If it's inside the brain then it is definitely inside the skull but if it's outside the brain then where does it reside?



posted on Nov, 2 2016 @ 01:14 PM
link   

originally posted by: ChaoticOrder
a reply to: ColeYounger


There are neuroscientists who don't believe that memories are stored in a brain, like data on a hard drive.

Of course memories aren't stored like data on a hard drive, they are stored in our highly complex neural network which computes information in a distributed fashion and uses association to provide triggers for memory retrieval. At the end of the day there's no reason we cannot simulate the neural activity of a human brain on a computer once we understand exactly how it works.
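For what it's worth, here's a minimal sketch of what "distributed storage with associative retrieval" can look like in code. It's a toy Hopfield-style network in Python, not anything described in the article or by the posters above: each stored pattern is smeared across one shared weight matrix, and a partial or noisy cue is enough to pull the full pattern back out.

```python
# Toy Hopfield-style associative memory (illustrative only).
# Patterns live in one shared weight matrix rather than in separate "slots",
# and recall works by association from an imperfect cue.
import numpy as np

def train(patterns):
    # Hebbian rule: every pattern contributes to every weight.
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)
    return w / len(patterns)

def recall(w, cue, steps=10):
    # Repeatedly nudge the state toward the nearest stored pattern.
    state = cue.astype(float).copy()
    for _ in range(steps):
        state = np.sign(w @ state)
        state[state == 0] = 1
    return state

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
w = train(patterns)

noisy = patterns[0].copy()
noisy[0] = -noisy[0]          # corrupt one element of the first pattern
print(recall(w, noisy))       # the complete first pattern is recovered
```

The point is only the flavour: nothing is looked up at an address; the cue and the stored weights together reconstruct the memory.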

What if it requires some manner of quantum-computing? (link) I don't know enough to say. And what if the computing power needed is much higher than presently believed? Presently, they say it'll be about 10 years. However, I came across an article a year or two ago which suggested the computing power necessary to simulate a neuron was much higher than previously thought. This suggests to me computers won't be reproducing the human brain for some time. Sorry I can't provide the link at this time, it's on my other HD.

EDIT: I forgot I have a copy. Here's the link:
www.spacedaily.com - UNC neuroscientists discover new 'mini-neural computer' in the brain...

Dendrites, the branch-like projections of neurons, were once thought to be passive wiring in the brain. But now researchers at the University of North Carolina at Chapel Hill have shown that these dendrites do more than relay information from one neuron to the next. They actively process information, multiplying the brain's computing power.

"Imagine you're reverse engineering a piece of alien technology, and what you thought was simple wiring turns out to be transistors that compute information," Smith said. "That's what this finding is like. The implications are exciting to think about."

And btw I reside in your camp. I think we'll reproduce parts of the brain or the whole brain--eventually--on computers. However, if quantum things are going on in the brain, that'll complicate it. I found some links recently about quantum processes in photosynthesis enabling the quickest paths to be found nearly instantly for efficient transfer of energy. Without a quantum computer to replicate this, traditional computers require more time to process. On the scale of a whole brain, that might make it impossible to do on a conventional computer on timescales practical for our needs.

www.wired.com - Everywhere in a Flash: The Quantum Physics of Photosynthesis...



posted on Nov, 2 2016 @ 06:12 PM
link   
a reply to: jonnywhite


What if it requires some manner of quantum-computing?

It very well may; then we'll just have to use "quantum computers".


This suggests to me computers won't be reproducing the human brain for some time.

Yes, I'd agree we aren't really close yet; we need to work more on highly parallel computing solutions before it will really be possible, and even when we have the necessary power we still need to find the right algorithm. My estimate is that between 2040 and 2050 is when we'll really start to see serious progress.



posted on Nov, 2 2016 @ 06:21 PM
link   
a reply to: ColeYounger

It is entirely wrong to call the brain "empty" - clearly, a Buddhist-centric perspective.

On the other hand, I completely agree that all those terms are emergent properties that depend upon non-stop interactions between the human being and its environment, as its brain dynamics attune to environmental expectancies.

The brain is FILLED, i.e. emotion and feeling move through it, and indeed, actually generate it. Neural evolution wouldn't be possible without the coherency of positive emotions and the control (or gradual reduction) of negative affect. What else binds human minds together but the feeling-dynamics which operate between them?



posted on Nov, 2 2016 @ 06:24 PM
link   
a reply to: ChaoticOrder

What if quantum computers exist but they are just still classified?
They had Cyborg code cracking technology back in the 1940's.
Some military sites likely had laser disk storage in the 1960's but it would never have been reported by the press or available for sale to the public.



posted on Nov, 2 2016 @ 06:29 PM
link   
a reply to: ChaoticOrder

Computers aren't dialogical, and even more basically, AI theorists are probably fundamentally wrong about the "substrate doesn't matter" position. Indeed, origin-of-life researchers like Nick Lane think that silicon isn't versatile enough to generate life the way carbon does. Even more to the point, biologists are beginning to recognize the role water - the forgotten molecule which literally acts as a 'matrix' for all biodynamical processes - plays in mediating 'quantum entanglement' between molecular processes throughout the body.

If AI is ever to be achieved, it'll have to use carbon as a base; however, if carbon serves as a base, would that not just replicate natural life processes, albeit, with Human design intentions?

Even more strangely, life seems to be about "maintaining coherence", whether it be bacterial or multi-cellular, molecular dynamics are always pursuing the most "cost-effective" pathways. This same dynamic appears at the animal level in the cognitive and perceptual tendencies of animals as they "predict" their environments for more cost-effective relations with their worlds.

Could it be done? Possibly. But it would be useless and purposeless.

Robots - according to the technology of the present paradigm - will never achieve the computational and ONLINE COMPLEXITY (this being the main difference) of the Human brain.

Just do the math: around 86 billion neurons, each with around 10,000 synaptic connections. Some people would leave it at this - but when you look at neurons and the number of molecular components which make them up, we are entering a level of organizational complexity that Humans will likely never match - not even with the help of "quantum computing".
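As a quick sanity check on those numbers (both figures are rough, order-of-magnitude estimates, not exact counts):

```python
# Rough arithmetic behind the figures quoted above.
neurons = 86e9              # ~86 billion neurons
synapses_per_neuron = 1e4   # ~10,000 connections each

total_synapses = neurons * synapses_per_neuron
print(f"{total_synapses:.1e}")  # ~8.6e14 synapses, i.e. on the order of 10^15 connections
```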

Of course, technology will continue to progress, but we should also be humble about what we're actually doing.



posted on Nov, 2 2016 @ 06:31 PM
link   
A better example would be quantum computers, which do have very physical hard drives, even though making things smaller is always the future. The reason for this analogy is that standard computers are linear, and it's pretty easy to prove brains are not.

If you break it down further, a 'quantum computer' is just a variable system of functions that incorporates values beyond 1s and 0s and allows simultaneous representation, yet the reality is that it can still be broken down into a string of numbered code.

There's honestly nothing more interesting about a function being represented by a 7, or even single functions having double or triple digits, when it comes to quantum variable sets.

By the way, it's often 'made to seem really far off' based on this linear technology, and this is not the case. It makes perfect sense that linear computers are not designed to exploit multi-variable algorithmic functions, so naturally they would be bad at doing the things quantum computers do, and look bad in comparison.



posted on Nov, 2 2016 @ 06:56 PM
link   
a reply to: GetHyped

Very old debate; just search for the keyword "Turing test", which is what the OP probably intended.



posted on Nov, 2 2016 @ 07:37 PM
link   
a reply to: Cauliflower

I'm not seeing what the Turing test has to do with creationist blog posts, neurologists and claims of dualism.



posted on Nov, 2 2016 @ 07:46 PM
link   

originally posted by: roadgravel
In a similar light, a computer doesn't store a number. It stores a representation that can be retrieved and reinterpreted as the same number. 10 is stored (in a physical form) and later treated as a binary value which is then interpreted to be the number value 2.


Sort of. Most data in a computer is stored not just as the raw byte, but alongside additional information about what that byte represents. To give an example, I can pass a computer the byte 01000001, which is the number 65. This same byte however can also represent the ASCII character A. So normally, in addition to the byte, I'm also going to give the computer some data as to whether that byte needs to be interpreted as a 65 or as an A. This becomes relevant because if I feed the computer 65+66 it has to know if the answer I want back is the integer 131 or the string AB.
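The same point in a couple of lines of Python (for illustration only; the bit pattern and values are the ones from the paragraph above):

```python
# One byte, two interpretations.
b = 0b01000001
print(b)         # 65
print(chr(b))    # 'A'

# And the "65 + 66" ambiguity: the declared types decide the answer.
print(65 + 66)             # 131  (integer addition)
print(chr(65) + chr(66))   # 'AB' (string concatenation)
```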

I could take this a step further too; Gödel figured this stuff out before we even had computers. All objects can be described with a single (large) number. I could take the word Hello, which as ASCII characters is made up of the bit string 0100100001100101011011000110110001101111, and could also express that as the number 310939249775. They have the same meaning.
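A short sketch of that Gödel-numbering idea, reproducing the poster's own numbers (Python used only as a convenient calculator):

```python
# "Hello" as bits, as one big integer, and back again - same information throughout.
word = "Hello"

bits = "".join(f"{b:08b}" for b in word.encode("ascii"))
number = int.from_bytes(word.encode("ascii"), "big")

print(bits)    # 0100100001100101011011000110110001101111
print(number)  # 310939249775
print(number.to_bytes(5, "big").decode("ascii"))  # Hello
```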

From what I've read, our brains process stuff a bit more abstractly: they store connections between objects. Computers, on the other hand, don't really store connections; they store lists of objects and access those lists by index numbers.
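A purely illustrative contrast between the two styles (the "memories" below are invented for the example):

```python
# Index-based access: you have to know *where* the item lives.
memories = ["grandmother's kitchen", "smell of bread", "a yellow apron"]
print(memories[1])  # retrieved by position

# Association-based access: one item cues the items connected to it.
associations = {
    "bread":   ["smell of bread", "grandmother's kitchen"],
    "kitchen": ["a yellow apron", "smell of bread"],
}
print(associations["bread"])  # retrieved by connection, no index needed
```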

There's a lot of research into AI that tries to take a mathematical approach to accessing data in the same way that people do, but it's computationally intensive and really not all that advanced, even though the problem has had the world's greatest minds working on it for 50 years. An example of this would be image recognition: it's a field computers are getting better at, but simply telling the difference between two similar items like a box of butter sticks and a box of teabags is near impossible.



posted on Nov, 2 2016 @ 07:55 PM
link   

originally posted by: TheRedneck
a reply to: ColeYounger

No, I'm making a point. There's a difference between language and writing that is seriously germane to artificial intelligence. Artificial intelligence requires much more than the ability to recognize symbols or wave patterns; it requires the ability to comprehend and form concepts from the information received.

Your link is simply an attempt to record the words/alphabet corresponding to languages which are being replaced.

Thus far no one has found a way to record the ideas conveyed, only symbols and wave patterns that we correlate into language as part of our own intelligence. That is a critical difference.

TheRedneck


To try and explain this another way: we can record an alphabet, sounds, and so on, but language also carries other aspects with it, like pop culture references and contextual humor. Simply reading a language centuries after it has been written down doesn't convey all of the ideas that were recorded, because we lack the context the language was written in.

One example that comes to mind here is books. If you read a modern fiction novel, you're going to pick out a bunch of pop culture references. Authors 100 or 200 years ago included the same sorts of then-modern references, but they completely bypass us today when we read those novels.

Even a so-called universal language like math is subject to this. If we wanted to express 2.5 + 2.5 to ancient Egyptians, they wouldn't understand it because they used fractions rather than decimals. But if you give many modern everyday people the problem 5/2 + 5/2, they're going to struggle with it, or at least have to think about it a bit. 200 years from now we may not use a decimal system or even a base-10 system, at which point the mathematical principles may remain but the notation will be much harder to understand.
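To make the decimal-versus-fraction point concrete (ordinary rational arithmetic here, not the Egyptians' actual unit-fraction notation):

```python
# The same quantity in the two notations mentioned above.
from fractions import Fraction

print(2.5 + 2.5)                        # 5.0  (decimal notation)
print(Fraction(5, 2) + Fraction(5, 2))  # 5    (fraction notation)
```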



posted on Nov, 2 2016 @ 08:05 PM
link   
a reply to: Aazadan

Are you aware of the more realistic, action-based work by cognitive scientists, à la Andy Clark, Ned Block, Karl Friston, Anil Seth, etc.?

If you really think about the nature and sheer complexity of the work - and then realize how primitive it is relative to actual living systems - it seems like it will take a long, long time before autonomous systems 'self-organize' to produce layered and hierarchical systems based on real interactive "experience".

Right now, philosophers, engineers and scientists are speculating on how this could work; recent robotics shows how a robot can "learn" through interactions, but what it learns is so simple relative to real living systems, and furthermore, it encodes information in a "dissociative" way (the robot is made of metal, chips, and sensors, not the far-from-equilibrium molecular dynamics that underlie the existence and functionality of actual living beings).

Thus, AI research seems to be very much based in an idealistic metaphysics, and researchers will have to develop a more realistic metaphysics if they wish to generate complex artificial life.



posted on Nov, 2 2016 @ 08:23 PM
link   
a reply to: Astrocyte

I am not; I do little to no reading, it's just not something I have time for. I'm actually in an AI class right now. I've found that the fantasy of AI is a lot cooler than the reality, and I'm actually pretty dismissive of the field in general. That could all change one day, but right now I think AI (or more specifically machine learning, which is what most people mean by AI) is largely just a waste of CPU cycles in most cases (though it does occasionally come up with a very impressive result).

You mention robots learning; that's something I'm pretty familiar with, and it's why I'm somewhat dismissive of AI in general. Humans can learn quickly and are capable of making accurate inferences. With computers it takes hundreds or thousands of sessions' worth of training data, and they apply that data to new scenarios in an inefficient way. Even the systems that "efficiently" learn feel to me much closer to brute-forcing problems than to actually being intelligent.



posted on Nov, 2 2016 @ 08:37 PM
link   
a reply to: Astrocyte




It is entirely wrong to call the brain "empty" - clearly, a buddhist-centric perspective.


Can't you understand that the title "The Empty Brain" is an allegory? It's the author's literary device! Read the OP!

I posted the thread because I thought it was interesting. But wait! Enter the self-proclaimed geniuses. The know-it-alls. They know more than the OP's author, Robert Epstein, who is a senior research psychologist at the American Institute for Behavioral Research and Technology in California. A Harvard University PhD, he is the author of 15 books and more than 250 scientific and mainstream articles, as well as the former editor-in-chief of Psychology Today.
According to some of the posters here, he's just ignorant. They can easily disprove his findings.


I welcome a good debate, but there are people on these forums that are literally psychopathic. They are so damned self-important that they think their opinion is the last word. They would tell Einstein that his theory of gravitational waves is silly, because they understand physics better than he did.

The arrogance of some people is unbelievable.



posted on Nov, 2 2016 @ 08:42 PM
link   
a reply to: ColeYounger

Edit: That came across harsher than intended.

I love good philosophical conversations on technology (in fact, I meet a former philosophy professor of mine once every week or two for that very thing), but some people have actual knowledge of these concepts. Some understand brains (as much as anyone can) and some understand computers very well. In particular, with the computer knowledge comes knowing that brains and computers work on very different principles. Sometimes the results look the same, but those results are arrived at through very different means.

Brains work on much more of a symbolic logic system that connects properties between objects and looks for similarities. At the heart of all computer operations, though, is a comparison: greater than, less than, or equal.



posted on Nov, 2 2016 @ 08:53 PM
link   
a reply to: Aazadan

Yes, it's a little depressing that some people can be so excited about something which is objectively unimpressive relative to the real thing (i.e. real life).

The question is: could you combine origin-of-life research (autocatalysis, etc.) with AI? It's plausible, except our most impressive invention might be tiny in comparison to the scale our culture has conditioned us to see as possible (i.e. the Jetsons).

Computers can function as external memory storage, and for that they're very useful. I don't think anyone fails to enjoy the way the internet allows us to access information whenever we want it.

But some people forget that it is STATIC information - so the internet is never going to "come alive" as some very silly, not very reflective people believe. There is no "phase shift" because there is no organized center of information, no ecological system of needs, no "self-organizational" nature that is independent of the Human USERS of the internet.

Yes, some people are very imaginative.







 