When you read those words and start THINKING, as you say, and the neuronal activity associated with that memory is activated, how does the brain know that this neuronal activity is associated with that memory?
A simulation is a copy of the real thing.
Again no evidence lol!!
You said the brain fires a specific way, how does it know which neuronal activity is associated with specific memories??
What abstract math are you talking about exactly??? What questionable assumptions are you talking about exactly?
The firing is a result of the search algorithm. Different pathways are triggered by different stimuli so it will look different depending on the stimuli.
Again, NO EVIDENCE. I can say there's Hobbits on Kepler 22b but it means nothing if I don't provide any evidence to support my claim.
Do you even know what I mean when I say the symmetry is broken between Heisenberg and Schrödinger? Do you know what it means and why they're equivalent?
So until you start quoting some actual scientific evidence, instead of saying "our brain operates like a Google search engine" and expecting people to just accept what you're saying because you said it, I'm done responding to any of your bloviating.
He answers the question, "Does the Moon only exist when someone is looking at it?"
In mid-2004, John Conway and Simon Kochen of Princeton University proved the Free-will Theorem. This theorem states "If there exist experimenters with (some) free will, then elementary particles also have (some) free will." In other words, if some experimenters are able to behave in a way that is not completely predetermined, then the behavior of elementary particles is also not a function of their prior history. This is a very strong "no hidden variable" theorem.
The theorem states that, given the axioms, if the two experimenters in question are free to make choices about what measurements to take, then the results of the measurements cannot be determined by anything previous to the experiments. Since the theorem applies to any arbitrary physical theory consistent with the axioms, it would not even be possible to place the information into the universe's past in an ad hoc way. The argument proceeds from the Kochen-Specker theorem, which shows that the result of any individual measurement of spin was not fixed independently of the choice of measurements. As stated by Cator and Landsman regarding hidden variable theories:[3] "There has been a similar tension between the idea that the hidden variables (in the pertinent causal past) should on the one hand include all ontological information relevant to the experiment, but on the other hand should leave Alice and Bob free to choose any settings they like."
Counterfactual quantum cryptography (CQC) is used here as a tool to assess the status of the quantum state: Is it real/ontic (an objective state of Nature) or epistemic (a state of the observer's knowledge)? In contrast to recent approaches to wave function ontology, that are based on realist models of quantum theory, here we recast the question as a problem of communication between a sender (Bob), who uses interaction-free measurements, and a receiver (Alice), who observes an interference pattern in a Mach-Zehnder set-up. An advantage of our approach is that it allows us to define the concept of "physical", apart from "real". In instances of counterfactual quantum communication, reality is ascribed to the interaction-freely measured wave function (ψ) because Alice deterministically infers Bob's measurement. On the other hand, ψ does not correspond to the physical transmission of a particle because it produced no detection on Bob's apparatus. We therefore conclude that the wave function in this case (and by extension, generally) is real, but not physical. Characteristically for classical phenomena, the reality and physicality of objects are equivalent, whereas for quantum phenomena, the former is strictly weaker. As a concrete application of this idea, the nonphysical reality of the wavefunction is shown to be the basic nonclassical phenomenon that underlies the security of CQC.
Song's work also shows consciousness is not like other physical systems like neurons, atoms or galaxies. "If consciousness cannot be represented in the same way all other physical systems are represented, it may not be something that arises out of a physical system like the brain," said Song. "The brain and consciousness are linked together, but the brain does not produce consciousness. Consciousness is something altogether different and separate. The math doesn't lie."
Daegene Song obtained his Ph.D. in physics from the University of Oxford and now works at Chungbuk National University in Korea as an assistant professor. To learn more about Song's research, see his published work: D. Song, Non-computability of Consciousness, NeuroQuantology, Volume 5, pages 382–391 (2007). arxiv.org...
HOW CAN SHE KNOW THERE'S NOTHING SPECIAL ABOUT A CONSCIOUS OBSERVER IF SHE DOESN'T KNOW HOW MEASUREMENT WORKS???
How can the machine carry out a measurement if a conscious observer didn't make a choice to create the machine?
In other words, if some experimenters are able to behave in a way that is not completely predetermined, then the behavior of elementary particles is also not a function of their prior history. This is a very strong "no hidden variable" theorem.
Again, this is just a meaningless statement. Song showed you this is the case through the math of quantum theory, one of the most powerful scientific theories ever produced. He's not saying this because it's something he wants to believe, he's saying this because it's what the math shows.
When an observer observes their own reference frame this fundamentally changes the math of quantum theory.
It is showing that non-conscious systems are capable of collapsing the wave function, not just "machines" as we define them. If non-conscious systems can collapse the wave function then the universe can easily exist without conscious observers to collapse the wave functions and make it exist.
I would like if Song could be much more specific about what this means. When he is talking in terms of Turing machines and the halting problem he is talking my language, but it's entirely unclear what it means to say "an observer observes their own reference frame". He tries to argue there is some physical interpretation of this idea but his explanation seemed extremely vague and didn't answer any of my questions.
Again, NO EVIDENCE. Show me the evidence that supports what she's saying. Where is this LOADS OF RESEARCH?
Quantum mechanics needs no consciousness
It has been suggested that consciousness plays an important role in quantum mechanics as it is necessary for the collapse of wave function during the measurement. Here we formulated several predictions that follow from this hypothetical relationship and that can be empirically tested. Experimental results that are already available suggest falsification of these predictions. Thus, the suggested link between human consciousness and collapse of wave function does not seem viable. We discuss the implications of these conclusions on the role of the human observer for quantum mechanics and on the role of quantum mechanics for the observer's consciousness.
Why decoherence solves the measurement problem
Although the solution, within standard quantum physics, of the problem of outcomes has been published several times, many authors continue to treat measurement as an unsolved fundamental dilemma. The solution lies in the formation of entangled subsystems, the non-local nature of the measurement state, and the resulting distinction between mixed-state local outcomes and the pure-state global outcome. Upon "measurement" (i.e. entanglement), the quantum system and its measurement apparatus both decohere and collapse into local mixed states while the unitarily-evolving global state remains coherent and un-collapsed. The states we observe are the local, collapsed states. Considerable experimental evidence supports this conclusion. Theoretical objections to this conclusion are rebutted, and a new perspective on measurement and entanglement is noted.
In quantum mechanics, quantum decoherence is the loss of coherence or ordering of the phase angles between the components of a system in a quantum superposition. One consequence of this dephasing is classical or probabilistically additive behavior. Quantum decoherence gives the appearance of wave function collapse, which is the reduction of the physical possibilities into a single possibility as seen by an observer. It justifies the framework and intuition of classical physics as an acceptable approximation: decoherence is the mechanism by which the classical limit emerges from a quantum starting point and it determines the location of the quantum-classical boundary. Decoherence occurs when a system interacts with its environment in a thermodynamically irreversible way. This prevents different elements in the quantum superposition of the total system's wavefunction from interfering with each other. Decoherence was first introduced in 1970 by the German physicist H. Dieter Zeh and has been a subject of active research since the 1980s.[1]
Decoherence can be viewed as the loss of information from a system into the environment (often modeled as a heat bath),[2] since every system is loosely coupled with the energetic state of its surroundings. Viewed in isolation, the system's dynamics are non-unitary (although the combined system plus environment evolves in a unitary fashion).[3] Thus the dynamics of the system alone are irreversible. As with any coupling, entanglements are generated between the system and environment. These have the effect of sharing quantum information with—or transferring it to—the surroundings.
Decoherence does not generate actual wave function collapse. It only provides an explanation for the observation of wave function collapse, as the quantum nature of the system "leaks" into the environment. That is, components of the wavefunction are decoupled from a coherent system, and acquire phases from their immediate surroundings. A total superposition of the global or universal wavefunction still exists (and remains coherent at the global level), but its ultimate fate remains an interpretational issue. Specifically, decoherence does not attempt to explain the measurement problem. Rather, decoherence provides an explanation for the transition of the system to a mixture of states that seem to correspond to those states observers perceive. Moreover, our observation tells us that this mixture looks like a proper quantum ensemble in a measurement situation, as we observe that measurements lead to the "realization" of precisely one state in the "ensemble".
These striking statements to the contrary, I do not believe that either detailed theoretical calculations or recent experimental results show that decoherence has resolved the difficulties associated with quantum measurement theory. This will not be a surprise to many workers in the field of decoherence; for example, in their seminal paper on decoherence as a source of spatial localization, Joos and Zeh (1985) state “Of course no unitary treatment of the time dependence can explain why only one of these dynamically independent components is experienced.” And in a recent review on decoherence, Joos (1999) states “Does decoherence solve the measurement problem? Clearly not. What decoherence tells us is that certain objects appear classical when observed. But what is an observation? At some stage we still have to apply the usual probability rules of quantum theory.” Going back a few years, an informative and lively debate on these issues can be found in the Letters column of the April 1993 Physics Today (starting on page 13 of that issue and continuing over many pages), in response to an earlier article in that journal by Zurek (1991). An enlightening discussion of the measurement problem has been given by Bell (1990), and there also are extensive discussions of both the measurement problem and the role of decoherence in the philosophy of physics literature. A careful analysis of the measurement problem has been given by Brown (1986), who reviews earlier work of Fine (1969) and others. Rebuttals to the claim that decoherence solves the measurement problem have been given in the books of Albert (1992), Barrett (1999) and Bub (1997), with Bub's treatment closest in spirit to the formulation given below.
A detailed analysis of decoherence within the consistent histories approach has been given by Kent and McElwaine (1997), and discussions of decoherence in the context of the many-worlds approach can be found in Bacciagaluppi (2001) (who gives an extensive bibliography on decoherence as it relates to the measurement problem) and in Butterfield (2001). Despite the existence of these and other prior discussions, I think it worthwhile to revisit the substantive issues, particularly in the light of recent claims that decoherence resolves the measurement problem.
Show me the evidence that says the wave function collapses. Secondly, where did I say consciousness causes the wave function to collapse? Where did I even say the wave function collapses?
Secondly, decoherence doesn't solve the measurement problem, and that's why scientists still debate the measurement problem. That's why you still have debates about 4 or 5 different interpretations of QM surrounding the measurement problem.
Decoherence does not generate actual wave function collapse. It only provides an explanation for the observation of wave function collapse, as the quantum nature of the system "leaks" into the environment.
And in a recent review on decoherence, Joos (1999) states “Does decoherence solve the measurement problem? Clearly not. What decoherence tells us is that certain objects appear classical when observed. But what is an observation? At some stage we still have to apply the usual probability rules of quantum theory.”
While decoherence explains why a quantum system begins to obey classical probability rules after interacting with its environment (due to the suppression of the interference terms when applying Born's probability rules to the system), it does not explain what an observation actually is. Thus, it does not explain why the environment is seen to be in one definite state rather than in a superposition of states.
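The point about suppressed interference terms can be sketched numerically. The snippet below uses NumPy with purely illustrative amplitudes (the numbers are not from any specific experiment): coherent amplitudes add before the Born rule is applied, so a cross term appears; after decoherence the probabilities simply add.

```python
import numpy as np

# Two complex amplitudes for a particle reaching a detector via two paths.
# The magnitudes and phases here are illustrative, not from any real setup.
a = 0.6 * np.exp(1j * 0.0)    # path 1
b = 0.4 * np.exp(1j * np.pi)  # path 2, opposite phase

# Coherent case: amplitudes add first, then the Born rule is applied.
# |a + b|^2 = |a|^2 + |b|^2 + 2*Re(a* b)  -- the last term is interference.
p_coherent = abs(a + b) ** 2        # ~0.04: strong destructive interference

# Decohered case: the cross term is suppressed, probabilities add classically.
p_decohered = abs(a) ** 2 + abs(b) ** 2   # 0.52: "probabilistically additive"

print(p_coherent, p_decohered)
```

This is exactly the sense in which decoherence recovers classical probability rules; note that neither number tells you which outcome actually occurs on a given run, which is the unresolved part.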
Again, you don't understand what decoherence is or is not.
If decoherence solves the measurement problem, why is there any uncertainty as to which measurement will occur?
Again, if the measurement problem were solved, you wouldn't have most physicists still accepting Copenhagen.
Do you even know the difference between low and high entropy? Why are humans high entropy systems?
A mixed state cannot be described as a ket vector. Instead, it is described by its associated density matrix (or density operator), usually denoted ρ. Note that density matrices can describe both mixed and pure states, treating them on the same footing. Moreover, a mixed quantum state on a given quantum system described by a Hilbert space H can always be represented as the partial trace of a pure quantum state (called a purification) on a larger bipartite system H ⊗ K for a sufficiently large Hilbert space K.
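The purification/partial-trace statement above can be checked on the smallest possible example. The sketch below (NumPy, two qubits, my own illustrative choice of states) takes a system qubit in a superposition, entangles it with an "environment" qubit via a CNOT, and traces the environment out: the reduced state of the system is a maximally mixed density matrix even though the global two-qubit state stays pure.

```python
import numpy as np

# System qubit in an equal superposition; environment qubit in |0>.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
env0 = np.array([1, 0], dtype=complex)

# Before interaction: a product state, the system is still coherent.
before = np.kron(plus, env0)

# A simple entangling interaction (CNOT: the environment "records" the system).
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
after = cnot @ before  # Bell state (|00> + |11>)/sqrt(2): still globally pure

def reduced_system(state):
    """Partial trace over the environment (second qubit)."""
    m = state.reshape(2, 2)   # rows index the system, columns the environment
    return m @ m.conj().T

print(reduced_system(before))  # off-diagonals 0.5: local coherence present
print(reduced_system(after))   # diag(0.5, 0.5): locally a mixed state
```

So the Bell state is a purification of the maximally mixed single-qubit state, exactly as the quoted passage says; locally, entanglement with the environment looks like decoherence.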
Say you have a particle with a 60/40 probability split between up and down. In a pure state, it would be in superposition and interference terms would be present. When decoherence occurs it goes to a mixed state: the interference terms are no longer present, but you still have a quantum ensemble of probable states.
When a measurement occurs and a single outcome is observed, you don't see an ensemble of states. WHY? If decoherence solved the measurement problem, then observation should look mathematically like a mixed state, but it doesn't. It goes from a wave function to a Dirac delta function when a measurement occurs and we perceive a single state. This is the measurement problem, and it's why decoherence doesn't solve it AT ALL.
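The 60/40 example above can be written out as density matrices. This is a minimal sketch in NumPy: the pure superposition has nonzero off-diagonal (interference) terms, the decohered mixed state keeps both outcomes on the diagonal, and the post-measurement state is something different again, a single definite outcome.

```python
import numpy as np

# Basis states |up> and |down>.
up = np.array([1, 0], dtype=complex)
down = np.array([0, 1], dtype=complex)

# Pure state with 60/40 weights: |psi> = sqrt(0.6)|up> + sqrt(0.4)|down>.
psi = np.sqrt(0.6) * up + np.sqrt(0.4) * down
rho_pure = np.outer(psi, psi.conj())        # off-diagonals ~0.49: interference

# Full decoherence: off-diagonal terms go to zero, leaving a classical
# 60/40 mixture -- still an ensemble containing BOTH outcomes.
rho_mixed = np.diag(np.diag(rho_pure))      # diag(0.6, 0.4)

# A single observed outcome, by contrast, is one definite state.
rho_up = np.outer(up, up.conj())            # diag(1, 0)

print(rho_pure)
print(rho_mixed)
print(rho_up)
```

The gap between `rho_mixed` (an ensemble of both outcomes) and `rho_up` (the one outcome actually seen) is the step decoherence by itself does not supply.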
The high entropy system acts as the observer and when it interacts with the lower entropy system it makes a measurement which forces that system to take on a more rigid definition (aka forces the wave function to collapse randomly), leading to a diminishing of the superposition effect as both systems merge into a single system.
Although entropy must increase over time in an isolated or “closed” system, an “open” system can keep its entropy low — that is, divide energy unevenly among its atoms — by greatly increasing the entropy of its surroundings. In his influential 1944 monograph “What Is Life?” the eminent quantum physicist Erwin Schrödinger argued that this is what living things must do. A plant, for example, absorbs extremely energetic sunlight, uses it to build sugars, and ejects infrared light, a much less concentrated form of energy. The overall entropy of the universe increases during photosynthesis as the sunlight dissipates, even as the plant prevents itself from decaying by maintaining an orderly internal structure.