
Our brains don't 'process information' like a computer


posted on Nov, 2 2016 @ 09:02 PM
a reply to: ColeYounger

I'm a scientist. My area of research is mind-brain-world interactivity.

I'm not saying this out of ignorance, but from a good deal of book knowledge: cognitive science, philosophy of mind, etc.

The "empty" brain is a concept that has been pursued by researchers like Anthony Chemero and Daniel Hutto (look em up on Amazon to see) in which content is see to be "non-essential", leading to a sort of "empty container" consciousness.

Now, if the issue is cognitive contents, I agree that cognitions seem 'ancillary' and dependent on dynamics; however, it must be understood that dynamical processes are synonymous with affective processes; thus, the 'content' of brain-mind might be legitimately regarded as the "feeling" which flows between human beings. Brains then could be regarded as 'containers' of the relational contents (feelings) structured by cognitive dynamics.

ps. I get how you could be irritated by this, but I think you chose the wrong person to criticize. I largely agree with the author's conclusions; I was just clarifying that "empty" could be better defined as "amorphous", i.e. dynamical, and that the 'content' would be the feelings which indicate the state of the system.



posted on Nov, 2 2016 @ 09:07 PM
a reply to: Aazadan

And that's the issue: semiotics is not the same as binary logic. Computers are the creation of human linear thinking (as you said earlier), whereas living beings make sense of their worlds by modelling "good" and "bad" within their dynamical processes, inhibiting or enabling movement based upon their lived experience of the environment.



posted on Nov, 2 2016 @ 09:24 PM

originally posted by: Astrocyte
a reply to: Aazadan

And that's the issue: semiotics is not the same as binary logic. Computers are the creation of human linear thinking (as you said earlier), whereas living beings make sense of their worlds by modelling "good" and "bad" within their dynamical processes, inhibiting or enabling movement based upon their lived experience of the environment.


Beautifully put forth.



posted on Nov, 3 2016 @ 03:39 AM
a reply to: Aazadan

Most data doesn't have meta-data stored to specify how the data should be interpreted. That would be unnecessarily wasteful. Instead, the onus is on the reader to know what the data represents (hence file formats are a thing).



posted on Nov, 3 2016 @ 07:45 AM
a reply to: GetHyped

The early AI machines probably would have been best described as simply parsers.
Encrypted information was projected into a holographic Hilbert space along with random filler.
By matching the beat of the encryption device apposite information was aligned.
Like Venus arising from the Quantum foam the message appears.



A young Hekawi walking beside the cliff.
Then you feel it..



You turn your head to look and there in the cliff is an opening...
The psychosis begins..

I have forgot much, Cynara! gone with the wind,

And you walk with your shadow at your side...



posted on Nov, 3 2016 @ 09:22 AM
Here's the thing though... he's right in the sense that the brain itself is not a computer. But that's about where I think the analogy ends.

Why is that? Somewhere it's been found that each nerve cell is its own complete computing device. That is, a neuron on its own has processing and storage, and can perform various computing functions entirely by itself. But it's not bit-based on 0s and 1s; it's doing something akin to processing with analog wavetables. The information nerves send to each other, and process, comes in waveform packets.

The better analogy for the brain is that it's a distributed computing package: something like a very large server farm, or perhaps the internet itself with cloud storage and distributed processing. Again, each nerve that makes up the brain is its own self-complete computer. Admittedly, some groups of nerves do have specializations, but that's because it's more efficient to process related information over a shorter distance. So instead of saying the brain is a computer, a model akin to a network of computers with various backbone routing and server packages is more like it.



posted on Nov, 3 2016 @ 09:47 AM

originally posted by: GetHyped
a reply to: Aazadan

Most data doesn't have meta-data stored to specify how the data should be interpreted. That would be unnecessarily wasteful. Instead, the onus is on the reader to know what the data represents (hence file formats are a thing).


That only matters when you're reading data from a file, and even then, not always. For example, I was working with genetic algorithms the other day and had to write out the results of each generation to a text file. The contents of the file were strings when written, but had to be explicitly read back as bytes and ints.

Another example would be a side project I've been working on for a year which involves writing information to a database and retrieving it. The type of data being written has to match the data type of the field it's writing to (int to int, for example), and the program itself also needs to know, when querying, whether the value it's reading from the database is an int vs. a string vs. whatever.
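
To make the first example concrete, here's a minimal sketch of that round trip (the file name and values are hypothetical stand-ins, not the actual project code): numbers written out as text come back as strings and have to be explicitly converted before the program can treat them as numbers again.

```python
# Hypothetical round trip: write one generation's scores as text,
# then read them back. The file stores characters, not ints; the
# reader has to supply the interpretation.

scores = [17, 42, 3]                            # one generation's results

with open("generation_001.txt", "w") as f:
    f.write(",".join(str(s) for s in scores))   # stored as characters

with open("generation_001.txt") as f:
    raw = f.read().split(",")                   # ["17", "42", "3"] -- strings
    restored = [int(s) for s in raw]            # explicitly read as ints

print(restored)                                 # [17, 42, 3]
```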



posted on Nov, 3 2016 @ 11:01 AM
a reply to: pauljs75

Nail. Head. Bang.

Each neuron functions as a small, simple analog computing device. It works by establishing relative weights for its various inputs... i.e. an input pattern which elicits positive feedback (pleasure) is remembered as a good pattern, and the result is easily passed on to the output. A pattern which elicits negative feedback (pain) has the opposite effect and is blocked.

There may also be neurons whose job it is to remember negative feedbacks and pass those to the output for corrective action to be taken. But the processing is the same.

With billions of neurons all acting together, the result is that neural pathways corresponding to pleasure and pain stimuli and the proper corrective actions are established. The magic happens not inside the neuron, but in the connections (synapses) and how they are interlinked.
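
As a rough illustration, a toy version of that neuron might look like the sketch below: a weighted sum with a firing threshold, where feedback nudges the weights up or down. The names and constants are purely illustrative, not a claim about the real biology.

```python
# Toy "neuron": weighted sum of inputs against a threshold, with
# feedback-driven weight adjustment. Illustrative sketch only.

def fires(weights, inputs, threshold=1.0):
    return sum(w * x for w, x in zip(weights, inputs)) >= threshold

def adjust(weights, inputs, feedback, rate=0.1):
    # Positive feedback ("pleasure") strengthens the inputs that were
    # active; negative feedback ("pain") weakens them.
    return [w + rate * feedback * x for w, x in zip(weights, inputs)]

weights = [0.4, 0.5, 0.3]
pattern = [1, 0, 1]                      # an input pattern
print(fires(weights, pattern))           # False: weighted sum 0.7 < 1.0

for _ in range(3):                       # a few rewarded exposures
    weights = adjust(weights, pattern, feedback=+1)
print(fires(weights, pattern))           # True: the pattern now passes
```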

An artificial electronic version of a neuron could be easily built, in production, for maybe 50 cents. That's not the issue for AI. The issue is that 86 billion neurons would then cost $43,000,000,000. If one wished to interlink them, and if one connection could be made per second, and assuming 16 inputs each, that would be a little over 43,600 years to accomplish the feat of wiring them all up. And that doesn't include the power supply leads or the sensors to supply initial inputs.

The beauty of life is it is self-replicating, and nowhere is that ability more fantastically indicated than in the intricately wired brain.

TheRedneck



posted on Nov, 3 2016 @ 01:31 PM
I remember reading somewhere that the neurology of the nematode worm C. elegans was mapped out, and that to some extent researchers were able to get single neuron cells to perform basic math functions when wired up to electronics. So in that regard they were able to see that neurons themselves are complete computing units.

The crazy thing in comparison, though, is that to some extent neural networks can also be simulated. (There's some overhead in doing that, but it's easier than trying to manufacture a hardware-based neural network.) The thing is, AI researchers playing with that stuff have on occasion had those setups get to a point where they can't really figure out what's going on.

Some years ago I played an old version of the game N.E.R.O. which was a bit of dabbling with a relatively simple AI. (Link here: www.aigameresearch.org...)
What I did was make a side game out of the bot behavior in training mode, creating maze runners; the game used a genetic algorithm to build a simulated neural net. I basically trained bots to run a maze by killing off the ones that couldn't reach the goal marker in a certain timeframe, along with manually culling ones that had other undesirable behaviors. By the time I got good maze runners that didn't go in circles or crowd against the walls, the save file for the simulated net was approaching a gigabyte. So the mechanism for "learning" this way is really something complex. It also crashed the game on my computer. I haven't played it in a long while, but I'm guessing I'd still run into that same problem.
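
Roughly, that cull-and-breed loop worked like the sketch below. Everything in it is a hypothetical stand-in (the genome encoding, the reach_goal_time placeholder, the mutation rate); the game's real algorithm was far more involved.

```python
import random

def reach_goal_time(genome):
    # Stand-in for an actual maze trial; the game measured real runs.
    return random.uniform(0, 60)

def mutate(genome, rate=0.05):
    return [w + random.gauss(0, rate) for w in genome]

def evolve(population, time_limit=30.0):
    # Kill off bots that can't reach the goal marker in time...
    survivors = [g for g in population if reach_goal_time(g) <= time_limit]
    survivors = survivors or [random.choice(population)]  # never go extinct
    # ...and refill the ranks with mutated copies of the survivors.
    children = [mutate(random.choice(survivors))
                for _ in range(len(population) - len(survivors))]
    return survivors + children

population = [[random.random() for _ in range(8)] for _ in range(20)]
for generation in range(100):
    population = evolve(population)
```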

Of course that doesn't qualify me for much (there are people who know a hell of a lot more than I do), but I still find the topic interesting and occasionally read up on it.



posted on Nov, 3 2016 @ 02:18 PM
They both use electricity and think about stuff, so they are the same.

-Drops the mic to thunderous applause-



posted on Nov, 3 2016 @ 02:32 PM
a reply to: Aazadan


That only matters when you're reading data from a file, and even then, not always. For example, I was working with genetic algorithms the other day and had to write out the results of each generation to a text file. The contents of the file were strings when written, but had to be explicitly read back as bytes and ints.


It seems that you're talking about human-readable file formats in your example, though. That still requires assumptions in the same manner as binary data, i.e. "interpret values of type string between non-escaped quotation marks as an array of bytes representing Unicode characters" or whatever.

My point is that the meta-data is not inherently a requirement, as at the fundamental level a pattern of bits is just that; it's up to the reader (be it man or machine) to interpret that bit pattern as convention dictates.

For example, take bitmap files. The first two bytes determine the file type. There's no meta-data embedded in them; it's just a pattern of 16 bits. The onus is on the reader to correctly interpret that pattern as two ASCII characters and act according to the rules dictated by the file format. Otherwise... things go south.

So yes, you can have file formats that have type info embedded with the data values (i.e. your example, JSON files, XML files, etc.) but this is not an inherent requirement of data storage in general.
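
To make the bitmap example concrete, here's a short sketch (Python, purely for illustration): the same 16 bits can be read as two ASCII characters or as an integer, and nothing in the file itself says which reading is correct.

```python
import struct

# The BMP signature is just two bytes. Read one way, they're the ASCII
# characters 'B' and 'M'; read another, a little-endian 16-bit integer.
magic = b"BM"

def looks_like_bmp(path):
    with open(path, "rb") as f:
        return f.read(2) == magic        # convention supplies the meaning

print(magic.decode("ascii"))             # "BM"   -- one interpretation
print(struct.unpack("<H", magic)[0])     # 19778  -- same bits, another one
```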



posted on Nov, 3 2016 @ 02:44 PM

originally posted by: AshFan
They both use electricity and think about stuff, so they are the same.

-Drops the mic to thunderous applause-


The more I learn, the more I realize I and others just don't know. Electricity or energy, or electrical energy?




But Science classes are different than English classes. In Science, reality rules, and if a large group of non-scientists tries to change the description of the real world, tries to define coulombs as being units of energy, then that large group falls into error. It doesn't matter how many people "vote" for the change, because Nature isn't listening. If "electricity" originally means electric charge, and if people try to change it so that the word "electricity" now means energy, then we have a special word for their actions: MISTAKEN TERMINOLOGY.

I don't quite know how to solve the problem regarding the word "electricity." Too many reference books contain the errors. The word has been misused for so many decades that I am tempted to follow the lead of the scientists: just give up! Just admit that the word Electricity is irretrievably contaminated, and simply abandon it. Abandon it silently, that way nobody has to be called out for public embarrassment. Yet doing this silently has caused serious problems in the past. It doesn't fix the problem, it just covers it up.

Abandoning the word electricity might defend Science against the brain-damage caused by contradictory terminology, but it does nothing to fix all of the reference books which are filled with confusing explanations of "electricity." More importantly, if we quietly abandon the word "electricity" without discussion, this will do nothing to help all of the poor souls who are currently confused by the incorrect "electricity" concepts. Neither will it give any aid to all of the poor science students who are butting their heads against the contradictory material still present in their science textbooks.


amasci.com...



posted on Nov, 3 2016 @ 03:47 PM
a reply to: InTheLight

If it helps with that particular example... we treat "electricity" as a context-driven word where I work. If we're talking about electrical charge, then 'electricity' is taken to mean electric charge. If we're talking electronics, it is taken to mean electric current. If the context is vague, we typically add a word to clarify, like "electrostatic" or "electric current."

Typically on here, I mean electric current when I say electricity. It seems to go over better than to try and differentiate. The author is right though... it can be confusing, especially for a layman.

Another good example is the gauss. It is a unit of magnetic flux density, but at one time it was also used to describe magnetizing force... what is now called the oersted.

TheRedneck



posted on Nov, 3 2016 @ 04:08 PM
a reply to: TheRedneck

I just don't believe that the biological energy we create in our brain can be the same 'stuff' that we call electricity.



posted on Nov, 3 2016 @ 04:16 PM
a reply to: InTheLight

It really isn't.

The messaging that occurs in the brain is not even energy, although it is carried by ions. The messaging is information. Think about it like this: the hard drive in your computer is made of matter. The encoded bits stored in it are magnetic, made of energy. The program that those bits represent is neither... it is information.

There is an electric current running through the brain, but it is not even what we think of when we think about electricity. Instead of being made of electrons as we're used to seeing, neural electrical currents are made of ions... charged particles. Each input has acceptors that detect certain ions, and the output releases ions to the inputs of the next stage. Inside the neuron... I'm really kinda in the dark about the nature of that. Sorry.

TheRedneck



posted on Nov, 3 2016 @ 04:28 PM
a reply to: TheRedneck

It is very complex, that is for sure.




posted on Nov, 3 2016 @ 04:43 PM
a reply to: TheRedneck




With billions of neurons all acting together, the result is that neural pathways corresponding to pleasure and pain stimuli and the proper corrective actions are established. The magic happens not inside the neuron, but in the connections (synapses) and how they are interlinked.


We sometimes forget how much we abstract. Are neurons the source of consciousness? Well, when they're damaged, we seem to lose a corresponding functionality within ourselves: we become deaf if we incur damage to the auditory regions of the temporal cortex, blind if we incur damage to the visual cortices, and we lose the capacity to 'know what we feel' if we suffer a stroke in the insula. Conversely, snipping the vagus nerve will block all interoceptive communication with the viscera, and so leave you epistemologically "removed" from the normal thinking process. Damasio showed this in Descartes' Error with his examples of people with orbitofrontal cortex damage and their inability to factor in "learned knowledge", i.e. what Damasio calls a "somatic marker": a record of how something that has 'hurt us' should prompt our feeling bodies to respond with antipathy when presented with a past exemplar (a learned pattern). Damasio shows that the pattern is indeed "learned" (i.e. increased galvanic skin response); it just can't be 'communicated' to, or worked upon by, consciousness, i.e. the forebrain.

Also, how many components would an AI system have? I think the biggest problem is that the neuron alone has BILLIONS of molecular components interacting in very complex ways, with so many variables to keep track of: from water (the most easily ignored factor, but something recent research (Ho, 2008, 2012; Pollack, 2013) sees as relevant to action potentials), to ions (sodium, calcium, magnesium, potassium: the actual drivers of depolarization in the cell), to microtubules (the internal structures affected), etc.

Anything we do is always an approximation that doesn't even capture the inherent complexity of the real thing.



posted on Nov, 3 2016 @ 04:44 PM
a reply to: InTheLight

It's definitely related to electricity, at least as far as our ability to measure electricity can tell us.

But yes, thought and feeling seem 'above', yet curiously continuous with, the phenomena of EM.



posted on Nov, 3 2016 @ 08:23 PM
a reply to: Astrocyte

You're right: we don't have the ability to exactly duplicate the brain, nor do I wish we did. There are still aspects of human intelligence that we can't begin to explain in detail. For instance, the "gut feeling" some people get. Imagination. Inspiration. Self-awareness.

My work thus far has ignored those for the time being. My objective is simply to create a minor intelligence, say, comparable to a frog, that can learn from experience and failure. To do that, I am postulating that the neuron can be closely approximated using electronic means.

We know the following: each neuron has several inputs and one output. The inputs and outputs, though quantized by ionic acceptance, operate on an analog basis. Thus, the neuron is an analog computing device. The potential computations performed by an analog device are limited... it can sum, difference, detect thresholds, etc. We also know that use can either strengthen or weaken synaptic connections. Based on that, my proposal is to construct summing amplifiers with self-adjustable weighted inputs, adjusted through some means of RF or magnetic feedback throughout the system.

The operation is simple: as patterns of base inputs produce positive feedback, those inputs producing the pattern are increased in weight. As patterns produce negative feedback, the weight of the corresponding inputs is decreased. Over time, neural patterns should be produced which will provide outputs that allow the system to learn how to achieve positive feedback.
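
In sketch form (illustrative names, not my actual design), that rule might look like the code below: after each trial, one global feedback signal scales the weights of whichever base inputs were active, across every unit at once.

```python
# System-level update: one feedback value per trial, applied to the
# active inputs of every simulated unit. Illustrative sketch only.

def train_step(units, inputs, feedback, rate=0.05):
    # units: a list of weight vectors, one per simulated neuron
    for weights in units:
        for i, x in enumerate(inputs):
            if x:                              # only inputs in the pattern
                weights[i] += rate * feedback  # + grows them, - shrinks them

units = [[0.1, 0.2, 0.3], [0.3, 0.1, 0.2]]
train_step(units, inputs=[1, 0, 1], feedback=+1)   # a rewarded trial
```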

But as I stated earlier, the construction is immensely complex.

I have looked into using a computer to simulate such a neural net, but that in itself is problematic. Firstly, a minimum of a terabyte of RAM would be required, assuming the neuron object could be simulated using a single kilobyte...which I deem possible. Secondly, even at 5 GHz clock speeds, I am estimating a minimum of 64 cores would be required. Thirdly, the OS would have to be proprietary, written to eliminate memory leaks and concentrate solely on running the simulation.
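
For what it's worth, here is a quick back-of-envelope on that RAM figure, taking the 1 KB-per-neuron assumption at face value; the neuron count is the free variable (both counts below are round illustrative numbers):

```python
# Memory needed at 1 KB per simulated neuron object.
KB = 1024
human_scale = 86_000_000_000            # full human neuron count
smaller_net = 1_000_000_000             # a far smaller network

print(human_scale * KB / 2**40, "TiB")  # ~80 TiB for human scale
print(smaller_net * KB / 2**40, "TiB")  # ~0.93 TiB, near the 1 TB figure
```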

In other words, I would have to design and build, from scratch, a supercomputer that would rival or possibly exceed anything existing on the planet, then program the thing from the ground up. I would also have to develop, design, and build thousands of sensors and actuators to act as the I/O for the system.

Frankly, I am not up to that challenge. My degree is in EE, not CPE, and I could not dream of financing such a project.

TheRedneck



posted on Nov, 4 2016 @ 12:39 AM

originally posted by: InTheLight

originally posted by: AshFan
They both use electricity and think about stuff, so they are the same.

-Drops the mic to thunderous applause-


The more I learn, the more I realize I and others just don't know. Electricity or energy, or electrical energy?


Well, don't choke on casual layman's use of terminology vs. how an engineer or physicist would use the words, or your head will explode on ATS, given the gross misuse of things like energy, field, frequency and quantum.

Neurons have electric fields associated with them. But they don't work like wires. Other than saltation between nodes of Ranvier or through gap junctions, electron flow (or "electricity" if you will) is not how neurons function. The electron flow in a neuron is through the cell membrane, at right angles to the signal.

As the action potential propagates down the axon, the electrons move at right angles to it. Except for saltation. So it's not at all like the way a wire conducts a signal.



