
Memory Uploads and Consciousness

page: 2

posted on Jul, 27 2013 @ 06:09 PM
reply to post by neoholographic
 


Look buddy, you're just ignorant. Several members have tried to help you out, and you have no clue what you're talking about.

There's no point in trying to reach you on this one.

Quantri is out of here!



posted on Jul, 27 2013 @ 06:47 PM
This deserves to be posted in the Philosophy and Metaphysics forum as the underlying assumption - that the brain "creates" consciousness - is unproven.

But you posted it here, perhaps because you believe all the zealous hype behind this claim. I've been studying neuroscience for a while and have been interested in this question in particular; if it IS possible (a whimsical idea, in my opinion), we are at the very least 100 years away from being able to test it.

Some scientists who spend their days involved in neuroscientific research allow themselves to get so giddy about the 'possibilities' that they make downright quixotic predictions on a time-scale that is entirely improbable.

The difficulties and complexities are somewhat overwhelming; "optimists" believe we should press on nonetheless. Of course, I support the project of understanding the brain-mind connection. I'm not that afraid of death, so the question of uploading my brain to a computer is not that important to me; figuring out the causes of neurological diseases and improving memory, focus and attention are far more practical, and in my opinion more interesting, subjects.

An additional problem is financial: the scientists involved in mapping out the brain have to compete for funds with scientists involved in research that changes lives right now.

Will the brain ever be figured out? Are the quadrillion or so synaptic connections between the brain's 100 billion neurons too daunting for our finite minds to handle? No; eventually the knowledge will be gathered, and computers will help us discover the relevant pieces of information. But this "project" of moving a mind via a "connectome" to a computer is so complex that maybe halfway through the next century - sometime around 2150 - we could see it become a reality. And of course, this is all based on the assumption that the "connectome" carries our consciousness.



posted on Jul, 27 2013 @ 06:58 PM
reply to post by QuantriQueptidez
 


Too bad the biological sciences - genomics, epigenomics, neuroscience - lag WAY BEHIND the computer sciences.

This is why technologists who make predictions about "transferring consciousness" to a computer make absurd predictions about when this might be possible. Ray Kurzweil wrote a book called "Live Long Enough to Live Forever". LOL. He thinks, or rather hopes, that halfway through this century brain scientists will have a full knowledge of the brain.

Being involved in computer technology could perhaps give someone that misconception; computer science has changed our world. Our TVs get better every year, our phones get better every year; computers compute more, graphics become more life-like, more fields are becoming automated, and more and more areas of life are becoming further integrated. This seems to mean "progress", but alas, nature is so much more complex than the computers we design.

At this point, we don't even have a computer that can handle the 1 quadrillion synaptic connections that our brain does, let alone make billions of computations every second.
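To put that quadrillion figure in perspective, here's a back-of-envelope storage estimate. The bytes-per-synapse figure is purely an assumed value for illustration, not a real specification:

```python
# Rough storage needed just to LIST 1 quadrillion synaptic connections.
# Assumption (hypothetical): 8 bytes per synapse, enough for two packed
# neuron IDs plus a small connection weight.
SYNAPSES = 10**15        # ~1 quadrillion connections in a human brain
BYTES_PER_SYNAPSE = 8    # assumed encoding size, not a measured figure

total_bytes = SYNAPSES * BYTES_PER_SYNAPSE
petabytes = total_bytes / 10**15
print(f"~{petabytes:.0f} petabytes just to enumerate the connections")  # → ~8 petabytes
```

And that's only a bare list of who connects to whom, before any of the weights, dynamics, or chemistry that would be needed to actually simulate anything.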



posted on Jul, 27 2013 @ 07:54 PM

Originally posted by neoholographic

First, consciousness has to be outside of the material brain. How else can you recall specific memories at will and how can the material brain know the difference between these memories and which memories you wish to recall?


None of this makes sense. First of all, a computer is material and can distinguish between different memories without confusing them. And what is this about recalling specific memories at will? What do you mean by that, and how does it relate to the supposed location of "consciousness"?



posted on Jul, 27 2013 @ 08:09 PM

Originally posted by QuantriQueptidez
reply to post by neoholographic
 


Look buddy, you're just ignorant. Several members have tried to help you out, and you have no clue what you're talking about.

There's no point in trying to reach you on this one.

Quantri is out of here!


Translation:

Damn, you're right.

The fact is, people turn the material brain into a magical mechanism that can do all things. This is purely based on the a priori assumption that consciousness is an emergent property of the material brain.

This is fantasy island stuff.

How does the material brain impose its will on the material brain, lol?

Again, the material brain can't recall specific memories at will. The material brain doesn't know the difference between specific memories. The material brain doesn't know what memories I wish to recall.

The brain can process vast amounts of information. You turn the material brain into something magical when you say it can impose its will on itself. It's just kooky talk.

If I'm thinking of 5 business ideas, how does the material brain know which idea I want to think about in that moment? How does the material brain get the message that I wish to recall that specific idea at that moment in time?

Again, the brain processes information; it isn't something magical that can do all things.



posted on Jul, 27 2013 @ 08:25 PM

Originally posted by neoholographic
How does the material brain impose its will on the material brain, lol?

So a material thing cannot regulate itself? Or are you saying a material system cannot exist in which different elements within that system can exert influence on each other?



posted on Jul, 27 2013 @ 08:39 PM
reply to post by Tearman
 


Nope unless you have some scientific evidence that this is remotely possible.

How can the material brain recall specific memories at will? How does the material brain know the difference between these memories? How does the material brain know which memories I wish to recall? How does the material brain activate the right brain cell to recall a specific memory, and how does it know I wish to recall this specific memory?

I just recalled a memory of a trip to Chicago in 1985. How does the material brain know I wish to recall that specific memory? How does the material brain know the difference between a trip to Chicago and a trip to Florida?

At the end of the day, the brain processes information; it isn't a magical object sent from Middle Earth, LOL.



posted on Jul, 27 2013 @ 10:26 PM
reply to post by Tearman
 


You're not going to be able to reason with this guy.

His ability to reason is stuck in grade school.



posted on Jul, 27 2013 @ 10:30 PM
reply to post by Astrocyte
 


I refer you to the quantum computer I referenced in my first post.

Pretty well invalidates your linear thought process regarding incremental technological progress.

There are spurts of innovation that push us way ahead in terms of capabilities. These are technological "game changers".

There is no good reason to think that the circumstances which exist today do not make for a fertile field of technological growth.

“Necessity is the mother of invention.” ― Plato



posted on Jul, 28 2013 @ 12:10 AM

Originally posted by neoholographic
reply to post by Tearman
 


Nope unless you have some scientific evidence that this is remotely possible.

How can the material brain recall specific memories at will? How does the material brain know the difference between these memories? How does the material brain know which memories I wish to recall? How does the material brain activate the right brain cell to recall a specific memory, and how does it know I wish to recall this specific memory?

I just recalled a memory of a trip to Chicago in 1985. How does the material brain know I wish to recall that specific memory? How does the material brain know the difference between a trip to Chicago and a trip to Florida?

At the end of the day, the brain processes information; it isn't a magical object sent from Middle Earth, LOL.


You're not making ANY sense. What reason do you have to believe the material brain isn't capable of doing these things? Computers can do these same things. Are you saying there is something immaterial about computers as well?



posted on Jul, 28 2013 @ 12:11 AM

Originally posted by QuantriQueptidez
reply to post by Tearman
 


You're not going to be able to reason with this guy.

His ability to reason is stuck in grade school.

I was holding out hope that there might actually be some kind of reason behind his argument. As it stands, I don't see it.



posted on Jul, 28 2013 @ 01:07 AM
reply to post by QuantriQueptidez
 

I read a story similar to this where the organic brain is replaced by a synthetic kind.

"Learning to Be Me" by Greg Egan, p. 448 in The Year's Best Science Fiction: Eighth Annual Collection, ed. Gardner Dozois, 1991.

Essentially, the story is that because the synthetic brain leaves out neuron death, the personality inevitably changes after X amount of time. What it means is that a person whose brain is replaced becomes increasingly different from who they would have been with an organic brain. The catch, from a plot standpoint, is that people are lied to and told they're the same person.



posted on Jul, 28 2013 @ 01:13 AM

Originally posted by jonnywhite
reply to post by QuantriQueptidez
 

I read a story similar to this where the organic brain is replaced by a synthetic kind.

"Learning to Be Me" by Greg Egan, p. 448 in The Year's Best Science Fiction: Eighth Annual Collection, ed. Gardner Dozois, 1991.

Essentially, the story is that because the synthetic brain leaves out neuron death, the personality inevitably changes after X amount of time. What it means is that a person whose brain is replaced becomes increasingly different from who they would have been with an organic brain. The catch is that people are lied to and told that they're the same person, even as their brain is being sucked out and replaced.
For better or for worse?



posted on Jul, 28 2013 @ 08:32 AM
reply to post by jonnywhite
 


That's because it's not the same brain. An organic brain goes through neuronal death constantly; there is a delicate balancing act between neurogenesis and cell death going on at all times. The genetic component that produces individual variance in these brain functions would certainly give rise to aspects of personality.

If you replace the brain with synthetic neurons that have the same capacity for neuronal death, yet simply clean up the genetic damage accumulated over one's lifetime and increase telomere length, you'd just have a fresher brain. I would think your personality might revert to more of the characteristics you had in your youth, but it seems you'd still be you.



posted on Jul, 28 2013 @ 11:27 AM
reply to post by QuantriQueptidez
 



Pretty well invalidates your linear thought process regarding incremental technological progress.


Or, perhaps you just aren't educated enough to know where you went wrong.

What understanding do you have of neuroscience? How do you think scientists compile knowledge of the brain? Here are the two technologies we use so far to study it at this level: electron microscopy and microtomes; the latest machines combine the two.

There's a major problem with this method: it takes a lot of time.

Put it this way. Before information is passed into a computer - and that's where the "quantum computer" comes onto the scene - scientists need to cut extremely thin slices of the brain and analyze them under an electron microscope. Now, the reason why computers can't do this right now is simple: we don't have a taxonomic understanding of the various neurons. Currently, about 30 different types have been identified - but we haven't quite finished yet; there are more. In addition, we don't even know what each neuron does - which other neurons it's more likely to form connections with, and so on.

And this problem is only the tip of the iceberg. The entire brain needs to be mapped; a 1 mm piece of brain has MILLIONS of neurons in it. In addition to knowing the different classes of neurons, we also need to know the mechanism of how they relate to other neurons - the cellular information: which receptors are found in each particular neuron?

Before computers come onto the scene, we need the KNOWLEDGE to feed into these computers, and at the current time our methods are too time-consuming. It would be nice if we could develop a brain imaging machine with the resolution of an electron microscope - the only technology that allows us to see neurons, axons and synapses - but the problem is that an MRI produces an image with 1000X less resolution than light microscopy, which in turn produces an image with 1000X less resolution than electron microscopy.
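That resolution gap can be made concrete with a quick sketch. The sizes below are ballpark assumptions for illustration (roughly 1 mm for MRI voxels, 1 micrometre for light microscopy, 1 nanometre for electron microscopy), not exact instrument specs:

```python
# Approximate linear resolution of each imaging method, in nanometres.
# All three figures are assumed ballpark values, not instrument specs.
MRI_NM = 1_000_000    # ~1 mm voxels
LIGHT_NM = 1_000      # ~1 micrometre
EM_NM = 1             # ~1 nanometre

mri_vs_light = MRI_NM // LIGHT_NM   # how much coarser MRI is than light microscopy
light_vs_em = LIGHT_NM // EM_NM     # how much coarser light microscopy is than EM
print(mri_vs_light, light_vs_em)    # → 1000 1000
```

Two factors of 1000 in linear resolution: that's the distance between scanning a living brain and actually seeing its synapses.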

Sebastian Seung, author of "Connectome" - a computational neuroscientist and professor at MIT, pretty much the leading thinker in this field - predicts our knowledge of the brain will be complete enough to produce a "connectome" - the sum of all synaptic connections, a veritable neural "map" of our minds - by the END OF THIS CENTURY.

The biggest constraint is our sheer lack of knowledge. Computers simply cannot figure out this information for us. We first need to provide the basics - to map the brain, which involves understanding the classes and types of neurons, which neurons they hook up with, etc. - BEFORE we create any algorithms that will know what to look for and what to tell us.



posted on Jul, 28 2013 @ 04:05 PM
reply to post by Astrocyte
 


End of the century?

When did you bother with this research?

You realize how long they thought it was going to take to map the human genome, versus when it was actually completed, correct?

We're already mapping out the human brain. Obama put in a good chunk of money for it.

We will have a full understanding of the human brain before mid century.



posted on Jul, 28 2013 @ 06:50 PM
reply to post by QuantriQueptidez
 




You realize how long they thought it was going to take to map the human genome, versus when it was actually completed, correct?


The brain is far more complex than the human genome.

The human genome has 6 billion base pairs; the human brain has 100 billion neurons, with roughly 1 quadrillion synaptic connections between them, which make up the "connectome". As mentioned before, not all neurons are the same; they have different shapes and forms and, it's assumed, functions; this would imply that they make special connections with other specific neuron types.

I get that you want to believe Ray Kurzweil - a technologist, not a neuroscientist - but the fact is, currently, we are nowhere near producing an encyclopaedic knowledge of the human brain. I study this subject; I'm currently in school for neuroscience. Besides the problems Sebastian Seung outlined in his book "Connectome", there is still the question of "what do glial cells do?" (this is what I study). Glia are the non-neuronal cells in our brains, making up a good 90% of brain tissue. Numbers-wise, there are about 1 trillion glial cells of various sizes and shapes - astrocytes, oligodendrocytes, microglia, and a multitude of subtypes within these 3 basic branches. Some neuroscientists think these cells contribute to cognition; that it isn't just the "neurons". Of course, the neurons contain the basic information, but the glia may perform an important ancillary role beyond feeding neurons glutamate, sucking up potassium and recycling neurotransmitters (astrocytes), myelinating axons (oligodendrocytes), or performing immunological duties (microglia), amongst other things.

If this shows us anything, it shows how naive and incomplete our knowledge of the brain is. Like I said, it's mainly technologists who are making sanguine predictions about "mid-century". I don't even know of an actual biological scientist - i.e. someone who actually studies and follows the progress in brain sciences - who agrees with that absurd timetable. Those who hope to transfer their consciousness are making plans to cryonically freeze their brains, or plastinate them, in hopes of avoiding death. And, obviously, this is all based on the assumption of emergence: that the brain "creates" consciousness, rather than acting as an antenna.

Seung even pokes fun at Kurzweil, letting him, his readers, and even himself know that one day, we are all going to die.



posted on Jul, 28 2013 @ 10:14 PM
reply to post by QuantriQueptidez
 


Also, you seem to be confused about what the human genome project accomplished.

6 billion base pairs were sequenced - a great accomplishment. But does that mean we now know how to create a living creature? Of course not. Knowing the genes doesn't say anything about knowing WHAT GENES DO - which genes they work with, and the numerous other epigenetic factors involved.

If you want to compare numbers, compare 6 billion base pairs with 1 quadrillion synaptic connections. How long did the Human Genome Project take? Now, if you know how to do intermediate mathematics, you should be able to estimate how much longer it would take to figure out the differences between the 30-50 types of neurons that exist, figure out their mechanisms of action, and then map out their connections in an individual brain.
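Taking that invitation literally, a naive linear extrapolation looks like this. It's purely illustrative: it assumes sequencing a base pair and mapping a synapse take comparable effort, and it ignores the fact that technology improves. The ~13-year figure is the rough duration of the Human Genome Project (1990-2003):

```python
# Naive linear scaling: if ~6 billion base pairs took ~13 years, how long
# would 1 quadrillion synapses take at the SAME per-item pace?
# (Deliberately simplistic; real progress is nothing like linear.)
GENOME_BASE_PAIRS = 6 * 10**9
CONNECTOME_SYNAPSES = 10**15
HGP_YEARS = 13  # Human Genome Project, roughly 1990-2003

scale_factor = CONNECTOME_SYNAPSES / GENOME_BASE_PAIRS
naive_years = scale_factor * HGP_YEARS
print(f"~{scale_factor:,.0f}x the data points, ~{naive_years:,.0f} years at the same pace")
```

Even granting that methods will speed up by many orders of magnitude, the raw count alone is five orders of magnitude beyond the genome, which is the point being made above.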

And this ignores the prerequisite knowledge: we would need a one-to-one correspondence between particular neuronal pathways and any particular thought, emotion, feeling, etc. Mapping the brain means integrating the visual cortex, parietal cortex, frontal lobes, cerebellum, brain stem, and subcortical pathways, all of which are activated in watching a bee pollinate a flower: you see it, feel a certain way about it, and might think something about it; unconscious processes proprioceptively note the bee's movement relative to its spatial environment; unconsciously, your body adjusts itself to stay balanced; unconsciously, you breathe, sweat, etc.

Again, 1 quadrillion computations. And before any simulation can be designed, we need to first discover the neurons involved in each of these processes. A computer model, for example, would need to create a feedback loop between the internal experience and the external environment; vision, hearing, smell, taste, touch, proprioception, interoception, balance - all the basic senses that go into normal human experience would have to be integrated into a computerized human/environment synthesis.

Besides the enormous technical difficulties, again, philosophically, it's all "guess-work".

The value of a connectome would first be in its usefulness in treating certain mental disorders like schizophrenia, autism, post-traumatic stress disorder, OCD, depression, etc. Modern-day eschatologists like Ray Kurzweil get their fans all giddy about living forever in a computer program - but it's not gonna happen in their lifetime, and there's a good chance that it isn't possible to begin with.



posted on Jul, 29 2013 @ 10:12 AM

Originally posted by neoholographic
I think a couple of discoveries tell us more about the brain, memory, consciousness and where we may be headed as a species.

You had scientists implanting false memories into mice. I also remember seeing an episode of Through the Wormhole where a scientist was working on implanting memories of how to drive a car for someone who has never driven a car before, or memories of how to be an expert at a video game you've never played.

This tells us a couple of things.

First, consciousness has to be outside of the material brain. How else can you recall specific memories at will, and how can the material brain know the difference between these memories and which memories you wish to recall? This tells us that the brain processes information at the time an event occurs. In order to recall that event, the brain cells that processed that information have to be activated. There has to be a consciousness outside of the material brain that exerts its will on the material brain and activates the brain cells that will recall a specific memory from, say, 1985, when I went on a vacation to visit relatives in Chicago.

Secondly, it's what I call memory uploads. I know it sounds Matrix-like, but it could be a future reality. Imagine uploading the memories of a famous surgeon, or someone who never took a physics course uploading the memories of a theoretical physicist and then being able to write complicated equations.

In this case, the brain could easily reach information overload. Maybe this could lead to the beginning of Robo-Sapien. Maybe we will be able to upload a brain onto a computer chip, and the first Robo-Sapiens would have a 10-chip brain, which would be like 10 brains to process more information.

The future could be very interesting, if we don't nuke ourselves back to the Flintstone era first.




Consciousness does not exist outside the material brain; consciousness is an emergent property of the brain. No brain, no consciousness. There doesn't need to be "a consciousness outside of the material brain that exerts its will on the material brain and activates brain cells that will recall a specific memory..." when the material brain does exactly this.

You can implant memories insofar as you can incite the structures of the brain (e.g. neurons, dendrites, axons) to behave in a certain way. As far as I know, memories only exist because experiences cause a specific constellation or cascade of activity in the brain; it doesn't matter whether that pattern of activity comes about due to past experiences or the tinkering of a scientist.

Edit: I take issue with calling the mice's memories "false"; they're just memories. They aren't any more fake because they were caused by a scientist rather than by experience. A differentiation between "experience-bound memories" and "artificial memories" or the like should be used instead.



posted on Jul, 29 2013 @ 10:39 AM
reply to post by Tetrarch42
 





There doesn't need to be "a consciousness outside of the material brain that exerts it's will on the material brain and activates brain cells that will recall a specific memory..." when the material brain does exactly this


You do realize though that this is a philosophical presupposition, correct?

Those who disagree with the premise that consciousness is just brain function disagree because it isn't very intelligible; it doesn't describe a mechanism of action; it simply presumes, based on limited understanding, that because the brain can give or take away consciousness, the brain therefore IS consciousness.

And the argument that it "isn't" a necessary assumption is, again, a personal feeling about how you define the word "necessary". Does necessary mean that the brain changes consciousness, therefore the brain equals consciousness? Or does necessary mean that the mental and physical are two completely different things, and therefore consciousness needs to be explained without recourse to the physical? In other words, it's what you find most interesting that determines what you consider necessary. If the idea of a "magical" creation of a new property from nothing more than the collective interactions of trillions of physical neurons sounds reasonable to you, then you will accept the premise that the brain is "all there is" to consciousness; you won't bother wondering further. However, if consciousness - cognition and value - seems to transcend any physical substance, then you will insist that science is at present incomplete; that the mental must be a basic part of the natural process; that from insects, to animals, to humans, the mental is a common property.

Neo-Darwinian evolution does not provide any logical explanation for why the universe should have developed in this way; why did "consciousness" arise when it didn't exist at the beginning? Contemplate that for a second and think how absurd it sounds: that life emerged is strange enough; that creatures with "consciousness" emerged out of that process adds a degree of unintelligibility to the claim "consciousness = brain" that some people have a difficult time accepting.







 