
How does the material brain initiate the material brain?


posted on May, 22 2015 @ 08:58 PM
a reply to: neoholographic


You have read about entropy but you don't understand what you're talking about so you say these asinine things WITHOUT A SHRED OF EVIDENCE to support anything that you're saying.

Well, perhaps if you took a moment to stop insulting me you would realize I'm talking about entropy in the context of information theory (i.e. Shannon entropy), not thermodynamics. If you wrote a novel and then compressed its text as much as you possibly could, the result would look like a totally random string, and you would not be able to compress it any further than that. If I were trying to pack the maximum amount of information I possibly could into this post, it would look completely random, and the more random it is, the more entropy it has. This is why we call a source of random numbers a source of entropy in computing.
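Here's a quick way to see that, sketched in Python with the standard zlib module (the sample text is only an illustration):

import zlib

# Stand-in for "a novel": any reasonably long piece of English-like text.
text = ("It was the best of times, it was the worst of times, "
        "it was the age of wisdom, it was the age of foolishness. ") * 50

once = zlib.compress(text.encode(), 9)   # squeeze the text as hard as zlib can
twice = zlib.compress(once, 9)           # now try to compress the compressed output

print(len(text), len(once), len(twice))
# The first pass removes the redundancy, so the output looks like random bytes;
# the second pass gains nothing (it is usually a few bytes larger due to overhead).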


Entropy as information content

Entropy is defined in the context of a probabilistic model. Independent fair coin flips have an entropy of 1 bit per flip. A source that always generates a long string of B's has an entropy of 0, since the next character will always be a 'B'.

The entropy rate of a data source means the average number of bits per symbol needed to encode it. Shannon's experiments with human predictors show an information rate between 0.6 and 1.3 bits per character in English;[10] the PPM compression algorithm can achieve a compression ratio of 1.5 bits per character in English text.
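For what it's worth, the two cases in that quote are easy to check numerically; a minimal Python sketch:

import math
from collections import Counter

def entropy_bits_per_symbol(s):
    # Empirical Shannon entropy of a string, in bits per symbol.
    counts = Counter(s)
    n = len(s)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(entropy_bits_per_symbol("BBBBBBBBBBBBBBBB"))  # 0.0: the next character is always 'B'
print(entropy_bits_per_symbol("HTHHTTHTHHTTTHHT"))  # 1.0: a balanced run of fair coin flips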


So there is absolutely nothing invalid about saying that a human has a higher level of entropy than an elementary particle. Compared to the universe as a whole we may not have a high level of entropy, but relative to quantum-scale objects our level of entropy is almost infinitely larger.


There are debates today as to why the universe began in a low entropy state.

And if your description of entropy were correct, why would a perfectly symmetrical singularity with a very simple structure have a low level of entropy? I said that simple things have less entropy than complex things because they require less information to be defined, so it makes perfect sense to say that a singularity is a low entropy state.


You have it mixed up. You think a high level of entropy means there's a high level of complexity and information. This is just EMBARRASSINGLY WRONG. A system that has a higher level of complexity and information is in a low entropy state as it's further away from equilibrium.

Ok well it seems to me you're the one who is embarrassingly wrong here. If entropy doesn't increase with the complexity/information density of the system then why do we say the universe started in a low entropy state and gains entropy as it evolves and becomes more complex? Just watch this video and educate yourself:




posted on May, 22 2015 @ 09:35 PM
a reply to: ChaoticOrder

You said:


So there is absolutely nothing invalid about saying that a human has a higher level of entropy than an elementary particle. Compared to the universe as a whole we may not have a high level of entropy, but relative to quantum-scale objects our level of entropy is almost infinitely larger.


First, everything in the video supports what I've said. You're just all over the place trying to find something coherent in the noise of your nonsense.

Yes, it's INVALID. You jump from self-aware androids to decoherence and now to entropy, and you don't understand what you're talking about.

At the end of the day, you said:

The high entropy system acts as the observer and when it interacts with the lower entropy system it makes a measurement which forces that system to take on a more rigid definition (aka forces the wave function to collapse randomly), leading to a diminishing of the superposition effect as both systems merge into a single system.

UTTER NONSENSE!!

This just illustrates why everything you have been saying is INVALID. You never provide any Scientific evidence to support the nonsense you say. You then try to skip to something new once you realize how silly your previous statement sounds. So back to what you said:

The high entropy system acts as the observer and when it interacts with the lower entropy system it makes a measurement which forces that system to take on a more rigid definition (aka forces the wave function to collapse randomly), leading to a diminishing of the superposition effect as both systems merge into a single system.

Show me the scientific evidence that says that when a higher entropy system interacts with a lower entropy system, the wave function is forced to collapse into a more rigid definition and superposition is diminished.

Show me the SCIENTIFIC paper or experiment that says this. It should be front-page news on scientific websites, because it would mean the measurement problem is solved and we know that the wave function collapses and why it collapses. I just saw a debate about wave function collapse vs. apparent wave function collapse, but if you're right that debate was silly and physicists throughout the world need to stop debating these things.

Also, show me where in the math of quantum theory this DEFINITIVE wave function collapse is found.

I'm tired of you hopping around when you make these silly statements. So we will stick to your claim and maybe we will finally get some scientific evidence.



posted on May, 22 2015 @ 09:55 PM
a reply to: neoholographic


First, everything in the video supports what I've said. You're just all over the place trying to find something coherent in the noise of your nonsense.

You said, and I quote, "A system that has a higher level of complexity and information is in a low entropy state", which is absolutely incorrect! How can you possibly think that video supports such a claim when it clearly explains why an increase in information corresponds to an increase in entropy? It explains exactly the same thing I've said in my last two posts. Your ability to skip around the fact that you were wrong is simply astonishing.


At the end of the day, you said:

The high entropy system acts as the observer and when it interacts with the lower entropy system it makes a measurement which forces that system to take on a more rigid definition (aka forces the wave function to collapse randomly), leading to a diminishing of the superposition effect as both systems merge into a single system.

No, that's not what I said "at the end of the day"; it's just what you want to focus on. It's one theory I brought up to explain the measurement problem. I also posted a reference to a recent paper which explains why decoherence is enough to solve the measurement problem. There are plenty of physicists who believe that decoherence is the answer to the measurement problem, whether my interpretation of decoherence is correct or not.

I never claimed that my personal theory about high entropy systems interacting with low entropy systems is a rock solid theory or even a well developed theory. I already told you it's just an easy way to think about it, using simplified terminology to describe the process of "realness" leaking between systems, the same way heat will be transferred between two objects with different temperatures until both of them reach an equilibrium.

I don't really know why you are so compelled to believe that the collapse cannot be easily explained. Is there a reason your belief framework requires the collapse to occur in a specific way, or do you just not believe in any wave collapse theory? If not, then it's unclear to me exactly what your interpretation of QM is; since you seem to accept the free-will theorem, I assume you must believe in a non-deterministic universe.



posted on May, 22 2015 @ 10:46 PM
a reply to: ChaoticOrder

No it's not incorrect because a system in a low entropy state is further away from equilibrium. This is just common sense when looking at statistical thermodynamics. A new deck of cards is in a lower entropy state vs. a deck of cards after a game of 52 pickup. So yes, a system further away from equilibrium is in a lower state of entropy and that's why I quoted Schrodinger from his book What is Life?

Also, you make these silly claims and offer zero evidence. So I will keep asking where's the evidence? You said:

The high entropy system acts as the observer and when it interacts with the lower entropy system it makes a measurement which forces that system to take on a more rigid definition (aka forces the wave function to collapse randomly), leading to a diminishing of the superposition effect as both systems merge into a single system.

I will keep quoting this until you show some SCIENTIFIC EVIDENCE that says the high entropy system acts as an observer and when it interacts with a low entropy system the wave function collapses.

You just said:

I don't really know why you are so compelled to believe that the collapse cannot be easily explained.

This is just ASININE!

You have debates about collapse from M.I.T. to Harvard, and I'm supposed to accept someone on a message board who doesn't provide any evidence and who says the silliest things, like COLLAPSE CAN BE EASILY EXPLAINED.

Show me the scientific evidence that shows the wave function collapses. Show me the experiment. You do know there's a debate between physicists from Harvard to Oxford on whether wave function collapse even occurs. So how do you EASILY EXPLAIN collapse, or even whether collapse occurs, when some of the most brilliant physicists in the world are still debating the issue?

Why should anyone in their right mind accept anything you're saying when you make nonsensical statements like this with NO EVIDENCE?



posted on May, 23 2015 @ 12:21 AM
a reply to: neoholographic


No it's not incorrect because a system in a low entropy state is further away from equilibrium. This is just common sense when looking at statistical thermodynamics. A new deck of cards is in a lower entropy state vs. a deck of cards after a game of 52 pickup. So yes, a system further away from equilibrium is in a lower state of entropy and that's why I quoted Schrodinger from his book What is Life?

Of course the deck is in a lower entropy state before it has been shuffled, because there are patterns in the ordering of the cards which allow it to be compressed. The order of a shuffled deck is more random and therefore it cannot be compressed as much as an unshuffled deck, which means more information is required in order to describe the order of the deck, hence it has a higher entropy. You said that a high entropy equates to a low complexity, but clearly a shuffled deck is more complex than an unshuffled deck because it has a higher information density.
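To put a number on the deck example, here is a rough Python sketch; the counting argument, not the code, is the point:

import math

# Bits needed to single out one ordering of a 52-card deck out of all 52!
# equally likely orderings, i.e. log2(52!).
bits_for_shuffled_deck = sum(math.log2(k) for k in range(1, 53))
print(round(bits_for_shuffled_deck, 1))  # about 225.6 bits

# A deck known to be in factory order needs essentially no extra bits:
# "it's in the standard order" is already the full description.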


You have debates about collapse from M.I.T. to Harvard, and I'm supposed to accept someone on a message board who doesn't provide any evidence and who says the silliest things, like COLLAPSE CAN BE EASILY EXPLAINED.

I don't expect you to believe anything I say, but when I say it's easy I mean there is not something fundamentally mysterious about the way the collapse works and it can be explained with traditional QM physics. I'm aware that many physicists still debate over the answer but that doesn't mean there isn't a clear cut answer. I have no idea why you are so focused on this topic of the measurement problem because you still haven't explained exactly what your interpretation of QM is or how the measurement problem relates to consciousness.



posted on May, 23 2015 @ 01:09 AM
a reply to: ChaoticOrder

First, you're all wrong about entropy. You said:

You said that a high entropy equates to a low complexity, but clearly a shuffled deck is more complex than an unshuffled deck because it has a higher information density.

A shuffled deck isn't more complex than an unshuffled deck. An unshuffled deck contains more available arrangements but it takes outside energy to put the unshuffled deck in an arrangement that increases complexity.

Take the words "I went to the park."

Then you have this string of letters abcd...z.

The words "I went to the park" have a lower state of entropy than a-z. A-z contains more information because there are more available states that the letters a-z can be in, but it takes energy to arrange the letters a-z in order to produce a novel, i.e. to take the string of letters a-z and put them in a lower state of entropy.

You have it mixed up. You're equating available arrangements with complexity. Just because a system has more ways it can be arranged doesn't mean it's more complex. In fact, it means it's more random until it's arranged in a more ordered way and that takes energy.

Here's one of my favorite videos on Boltzmann brains that explains this beautifully.



You said:

I don't expect you to believe anything I say

This is the first thing you have said that makes any sense BRAVO!

Again, I will repeat the nonsense you said and you haven't provided a shred of evidence to support it.

The high entropy system acts as the observer and when it interacts with the lower entropy system it makes a measurement which forces that system to take on a more rigid definition (aka forces the wave function to collapse randomly), leading to a diminishing of the superposition effect as both systems merge into a single system.

Please show the Scientific evidence that says a high entropy system interacts with a lower entropy system and this forces wave function collapse. You said this nonsense now please back it up with some Science that says this. If you have a "clear cut" answer I would like to see it and I'm sure every Physicist from America to China would like to hear it also.

You said:

I'm aware that many physicists still debate over the answer but that doesn't mean there isn't a clear cut answer.

Let's see SCIENTIFIC EVIDENCE of this "clear cut" answer; then we can notify Nature and other journals that someone on a message board with no evidence has all of the answers and they can stop accepting papers from physicists on this topic.



posted on May, 23 2015 @ 02:00 AM
a reply to: neoholographic


A shuffled deck isn't more complex than an unshuffled deck. An unshuffled deck contains more available arrangements but it takes outside energy to put the unshuffled deck in an arrangement that increases complexity.

This makes absolutely no sense. Of course energy is required to change the state of the deck, that doesn't invalidate anything I said. Just because low entropy states are much less common than high entropy states doesn't mean low entropy states can be considered more complex. They are more uncommon, but not more complex. A string containing only the same character could hardly be considered more complex than a totally random string of the same length.

The simple fact is that an unshuffled deck has a lower information density than a shuffled deck, because the shuffled deck is harder to compress. I guess it really comes down to how you want to define "complexity", but it seems pretty clear to me that if something requires more information to be defined then it is more complex. Its behavior won't necessarily be more complex than that of a system defined with less information, but its structure is harder to define, which makes it technically more complex.


Take the words "I went to the park."

Then you have this string of letters abcd...z.

The words "I went to the park" have a lower state of entropy than a-z. A-z contains more information because there are more available states that the letters a-z can be in, but it takes energy to arrange the letters a-z in order to produce a novel, i.e. to take the string of letters a-z and put them in a lower state of entropy.

The mere fact that you can compress the second string to "a-z" shows it has a low entropy. The second string would make a poor password because it has a very clear pattern. When you choose a password you are told to choose something with the highest possible entropy, not the lowest possible entropy. The phrase "I went to the park" would actually make a reasonable password, better than the second string anyway. I haven't talked about the number of possible states because I'm talking in terms of compression. The a-z string is longer than the first string, but it actually contains much less information than the first string; the information density is the ratio of actual information to available states.


You have it mixed up. You're equating available arrangements with complexity. Just because a system has more ways it can be arranged doesn't mean it's more complex. In fact, it means it's more random until it's arranged in a more ordered way and that takes energy.

I never said the number of available states equates to the level of complexity. I could have a string containing a billion characters but if they were all the same character the entropy would still be extremely low. What I said is that the complexity of any given isolated system is a measure of how much information is required to describe everything about the system. The amount of information you are left with after compressing the information as much as possible defines the level of complexity. A string containing the same character repeated a billion times can obviously be defined with a small amount of information and thus it can be said to have a low complexity.
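A scaled-down version of that repeated-character example, as a rough Python sketch (compressed size is only a stand-in for the minimum description, and a million characters stands in for the billion):

import os
import zlib

n = 10**6

repeated = b"a" * n           # the same character over and over
random_bytes = os.urandom(n)  # incompressible random data of the same length

print(len(zlib.compress(repeated, 9)))      # a few kilobytes at most
print(len(zlib.compress(random_bytes, 9)))  # roughly the full million bytes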


Here's one of my favorite videos on Boltzmann brains that explains this beautifully.

That video doesn't say anything I don't agree with and it doesn't invalidate anything I said. He says the entropy of a shuffled deck is higher than an unshuffled deck, which I completely agree with. The problem seems to be we are defining complexity differently. You're saying it's low when entropy is high but I'm saying it corresponds directly with entropy. In any case I don't see why we need to argue over semantics when we're essentially saying the same thing.



posted on May, 23 2015 @ 03:01 AM
Well after doing a little research it turns out the type of complexity I'm talking about has a name: Kolmogorov complexity


In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity (also known as descriptive complexity, Kolmogorov–Chaitin complexity, algorithmic entropy, or program-size complexity) of an object, such as a piece of text, is a measure of the computability resources needed to specify the object. It is named after Andrey Kolmogorov, who first published on the subject in 1963.[1][2]

For example, consider the following two strings of 32 lowercase letters and digits:

abababababababababababababababab

4c1j5b2p0cv4w1x8rx2y39umgw5q85s7

The first string has a short English-language description, namely "ab 16 times", which consists of 11 characters. The second one has no obvious simple description (using the same character set) other than writing down the string itself, which has 32 characters.

More formally, the complexity of a string is the length of the shortest possible description of the string in some fixed universal description language (the sensitivity of complexity relative to the choice of description language is discussed below). It can be shown that the Kolmogorov complexity of any string cannot be more than a few bytes larger than the length of the string itself. Strings, like the abab example above, whose Kolmogorov complexity is small relative to the string's size are not considered to be complex.


Another good example which the wiki mentions is the Mandelbrot set. Even though Mandelbrot fractals look extremely complex, they can actually be generated with a rather simple algorithm, so the amount of information required to define a Mandelbrot fractal is actually rather small. But if you take a randomly generated image and try to create an algorithm which will replicate that image, you will find yourself writing a rather long algorithm, because a random image has a high level of entropy and therefore a high level of complexity.
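Kolmogorov complexity itself isn't computable, but compressed length gives a computable upper bound; here's a minimal Python sketch using the two strings from that excerpt:

import zlib

patterned = b"abababababababababababababababab"      # "ab 16 times"
random_looking = b"4c1j5b2p0cv4w1x8rx2y39umgw5q85s7"  # no obvious short description

for s in (patterned, random_looking):
    print(len(s), len(zlib.compress(s, 9)))
# The patterned string compresses to well under its own 32 bytes, while the
# random-looking one does not shrink at all once container overhead is counted.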



posted on May, 23 2015 @ 09:46 AM
a reply to: ChaoticOrder

Again, you make no sense.

Again, you provide no evidence to support what you're saying.

First it was self-aware androids, then you skipped to decoherence, next it was high entropy objects collapsing the wave function, and now your whole post is about the complexity of a string, which has nothing to do with the nonsense you have said. Here's what you said:

The high entropy system acts as the observer and when it interacts with the lower entropy system it makes a measurement which forces that system to take on a more rigid definition (aka forces the wave function to collapse randomly), leading to a diminishing of the superposition effect as both systems merge into a single system.

The reason you jumped away from decoherence, once you realized you had no understanding of it, was to talk about collapse of the wave function and how easily measurement can supposedly be explained.

You spent a few posts on this, and then when you realized there wasn't a shred of evidence to support what you're saying you tried to leap to something else. I want the SCIENTIFIC EVIDENCE that shows a high entropy object can interact with a low entropy object and force the wave function to collapse.

What you're quoting about Kolmogorov complexity has nothing to do with what you said which is this:

The high entropy system acts as the observer and when it interacts with the lower entropy system it makes a measurement which forces that system to take on a more rigid definition (aka forces the wave function to collapse randomly), leading to a diminishing of the superposition effect as both systems merge into a single system.

Again, there's a pattern here. You come out with all this wild nonsense, you don't provide any Scientific evidence and then when you realize it's nonsense you try to move on to something else.

Kolmogorov complexity is related to the entropy of the source. Here's a part of the Wiki page you forgot to post. Remember, this is about Entropy and your assertion that high entropy states interact with low entropy states and force the wave function to collapse.


It can be shown[14] that for the output of Markov information sources, Kolmogorov complexity is related to the entropy of the information source. More precisely, the Kolmogorov complexity of the output of a Markov information source, normalized by the length of the output, converges almost surely (as the length of the output goes to infinity) to the entropy of the source.
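As a rough illustration of that convergence claim for the simplest case (an i.i.d. biased source, which is a trivial Markov source), sketched in Python; a general-purpose compressor only upper-bounds the true rate:

import math
import random
import zlib

p = 0.9          # probability that the source emits '0'
n = 100_000
output = "".join("0" if random.random() < p else "1" for _ in range(n))

# Theoretical entropy rate of this source, in bits per symbol.
h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
print(round(h, 3))  # about 0.469 bits/symbol

# Compressed length, normalized by output length, upper-bounds the same quantity;
# the quoted result says the normalized Kolmogorov complexity converges to h.
print(8 * len(zlib.compress(output.encode(), 9)) / n)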


This simply supports everything I'm saying and has nothing to do with what you have said. Kolmogorov complexity has nothing to do with your claim or a system's entropy like the human body. In fact most people say you can't apply it to the human body and there's ZERO EVIDENCE that it applies to a higher entropy object interacting with a lower entropy object and collapsing the wave function.

Kolmogorov complexity can be used to argue in favor of Intelligent Design, as Dembski used it for specified complexity:


Dembski's proposed test is based on the Kolmogorov complexity of a pattern T that is exhibited by an event E that has occurred. Mathematically, E is a subset of Ω, the pattern T specifies a set of outcomes in Ω and E is a subset of T. Quoting Dembski[16]

Thus, the event E might be a die toss that lands six and T might be the composite event consisting of all die tosses that land on an even face.

Kolmogorov complexity provides a measure of the computational resources needed to specify a pattern (such as a DNA sequence or a sequence of alphabetic characters).[17] Given a pattern T, the number of other patterns that may have Kolmogorov complexity no larger than that of T is denoted by φ(T). The number φ(T) thus provides a ranking of patterns from the simplest to the most complex. For example, for a pattern T which describes the bacterial flagellum, Dembski claims to obtain the upper bound φ(T) ≤ 10^20.


en.wikipedia.org...

So again, everything I said refutes your silly assertions and you have things mixed up about entropy. The reason you're having these problems is because you're trying to mix all of these different things into the nonsense you come up with and when you're called on that nonsense and asked to present Scientific evidence to support it you jump to something else. Sorry, no jumping this time.

Show the scientific evidence that says a high entropy system interacts with a low entropy system and forces the wave function to collapse. This was your whole reason for jumping from decoherence, which you didn't understand, to entropy. You said:

The high entropy system acts as the observer and when it interacts with the lower entropy system it makes a measurement which forces that system to take on a more rigid definition (aka forces the wave function to collapse randomly), leading to a diminishing of the superposition effect as both systems merge into a single system.

UTTER NONSENSE!!

But if you have Scientific evidence like published papers or lab experiments to support this claim, I'm sure myself and every other Physicist in the world will want to see it so they can stop wasting time debating the issue.



posted on May, 23 2015 @ 08:39 PM
a reply to: neoholographic


Kolmogorov complexity is related to the entropy of the information source.

This simply supports everything I'm saying and has nothing to do with what you have said.

Of course the complexity is related to the entropy; I said that as the entropy increases the complexity will increase, because random systems are harder to define than systems which contain patterns that are easy to compress. That's exactly what that statement means. It doesn't mean a system with a low entropy is more complex than a system with a high entropy, which is what you explicitly claimed earlier on.


Kolmogorov complexity has nothing to do with your claim or a system's entropy like the human body. In fact most people say you can't apply it to the human body and there's ZERO EVIDENCE that it applies to a higher entropy object interacting with a lower entropy object and collapsing the wave function.

Kolmogorov complexity has everything to do with my claim about how the wave function collapses. I said that simple objects such as elementary particles are much easier to define because they are less complex, meaning less information is required to define an elementary particle compared to a chemical element, which is a collection of elementary particles. Kolmogorov complexity is exactly the type of complexity I'm talking about; it is dictated by the minimum amount of information required to describe an object such as a string or a particle. I have also shown that as the entropy of an object increases the Kolmogorov complexity of the object increases, but in reality we don't even need to talk about entropy. We can simply talk about high complexity systems and low complexity systems becoming entangled, instead of high and low entropy systems.

Low complexity systems require less information to be defined; even their exact position isn't well defined, so in some sense they are less real than systems which require a lot of information to be described. There is no reason this concept cannot be extended beyond elementary particles and chemical elements; we can consider larger objects, including humans. Obviously we aren't perfectly isolated from our environment and so we aren't a closed system, but it's still possible to approximate the number of particles, the types of particles and the way they are laid out inside any given person. Another way is to measure the amount of information contained in our DNA. Also, I should point out that the arguments made by Dembski appear to be complete pseudo-science; the wiki article you referenced mentions this under the criticisms section:


The soundness of Dembski's concept of specified complexity and the validity of arguments based on this concept are widely disputed. A frequent criticism (see Elsberry and Shallit) is that Dembski has used the terms "complexity", "information" and "improbability" interchangeably. These numbers measure properties of things of different types: Complexity measures how hard it is to describe an object (such as a bitstring), information measures how close to uniform a random probability distribution is and improbability measures how unlikely an event is given a probability distribution.

When Dembski's mathematical claims on specific complexity are interpreted to make them meaningful and conform to minimal standards of mathematical usage, they usually turn out to be false.[citation needed] Dembski often sidesteps these criticisms by responding that he is not "in the business of offering a strict mathematical proof for the inability of material mechanisms to generate specified complexity".[21] On page 150 of No Free Lunch he claims he can demonstrate his thesis mathematically: "In this section I will present an in-principle mathematical argument for why natural causes are incapable of generating complex specified information." Others have pointed out that a crucial calculation on page 297 of No Free Lunch is off by a factor of approximately 10^65.[22]



But if you have Scientific evidence like published papers or lab experiments to support this claim, I'm sure myself and every other Physicist in the world will want to see it so they can stop wasting time debating the issue.

How many times do I have to explain to you that it's just my personal interpretation of how decoherence can cause quantum objects to take on classical behavior? I tried to explain it in terms of Shannon entropy and Kolmogorov complexity, which led to a long debate about entropy and complexity. The only reason you continue to demand evidence for my personal hypothesis is because you know I'll have a hard time finding any mainstream evidence for it. Well, who knows, there might be some evidence out there; I haven't bothered to check. All I know is that many of the physicists in the debate you mention are arguing the position that decoherence can fully explain the collapse, and I find myself gravitating towards that position.



posted on May, 23 2015 @ 09:26 PM
a reply to: ChaoticOrder

Everything you said makes no sense.

How many times do I have to explain to you that it's just my personal interpretation of how decoherence can cause quantum objects to take on classical behavior? I tried to explain it in terms of Shannon entropy and Kolmogorov complexity, which led to a long debate about entropy and complexity. The only reason you continue to demand evidence for my personal hypothesis is because you know I'll have a hard time finding any mainstream evidence for it. Well, who knows, there might be some evidence out there; I haven't bothered to check. All I know is that many of the physicists in the debate you mention are arguing the position that decoherence can fully explain the collapse, and I find myself gravitating towards that position.

Show me where most physicists say wave function collapse can be explained by decoherence. You keep saying these silly things with no evidence. Many physicists who look to decoherence say the wave function doesn't collapse.

Show me a physicist arguing your position. I want to see one, JUST ONE, that says this in a published paper:

The high entropy system acts as the observer and when it interacts with the lower entropy system it makes a measurement which forces that system to take on a more rigid definition (aka forces the wave function to collapse randomly), leading to a diminishing of the superposition effect as both systems merge into a single system.

You just throw out silly claim after silly claim with no evidence.

You said:

Kolmogorov complexity has everything to do with my claim about how the wave function collapses.

Show me the SCIENTIFIC EVIDENCE that says Kolmogorov complexity has anything to do with high entropy objects interacting with low entropy objects and forcing the wave function to collapse.

You didn't stop there with this nonsense.

You said this leads to diminishing superposition.

Where's the evidence to support any of this nonsense?



posted on May, 23 2015 @ 10:45 PM
a reply to: neoholographic


Everything you said makes no sense.

Maybe not to you.


Show me where most physicists say wave function collapse can be explained by decoherence.

I never said "most" physicists say that, I said many do. I already referenced a recent paper arguing that decoherence is enough to explain the measurement problem and there are many more papers on the subject arguing the same position.



Why decoherence solves the measurement problem (2013)

The solution of the quantum measurement problem, entirely within conventional quantum physics, has been published on at least four occasions (Scully, Shea, & McCullen, 1978; Scully, Englert, & Schwinger, 1989; Rinner & Werner, 2008; Hobson, 2013) . A similar solution has been presented by (Dieks, 1989; Dieks, 1994; Lombardi & Dieks), who propose it as a fundamental postulate that amounts to a new "modal interpretation" of quantum physics. Yet many articles in this and other journals continue to treat measurement as an unsolved fundamental problem whose resolution requires either exotic interpretations or fundamental alterations of quantum theory. For example, Adler (2003) has published an article titled "Why decoherence has not solved the measurement problem," despite the fact that, as will be reviewed below, decoherence has solved the measurement problem.



Measurement in Quantum Mechanics: Decoherence and the Pointer Basis (2012)

What, then, is the final verdict of the decoherence theory? Has it resolved the conceptual problems of quantum mechanics? There are many who believe that the conceptual problems of quantum mechanics are still unresolved and decoherence does not answer many issues. At the end of the day we can say that the decoherence explanation takes away some of the mystery from the idea of wave function collapse and provides a conventional mechanism to explain the appearance of a classical world. Many physicists find this a satisfactory explanation and there is no doubt that the experiments discussed clearly show how decoherence washes away quantum coherences providing a fairly convincing evidence for explaining the absence of Schrödinger's Cats in the real world. For all practical purposes, the decoherence explanation finds favour as a satisfactory settlement of the quantum measurement problem.



Decoherence, einselection, and the quantum origins of the classical (2003)

The manner in which states of some quantum systems become effectively classical is of great significance for the foundations of quantum physics, as well as for problems of practical interest such as quantum engineering. In the past two decades it has become increasingly clear that many (perhaps all) of the symptoms of classicality can be induced in quantum systems by their environments. Thus decoherence is caused by the interaction in which the environment in effect monitors certain observables of the system, destroying coherence between the pointer states corresponding to their eigenvalues.



Show me the SCIENTIFIC EVIDENCE that says Kolmogorov complexity has anything to do with high entropy objects interacting with low entropy objects and forcing the wave function to collapse.

I never said Kolmogorov complexity can explain such a thing; I said it can be used as a tool for determining how complex any given object/system is. It is a way to measure the purified information content of the system, the smallest amount of information required to describe everything about the system. The reason I brought up Kolmogorov complexity was that we were using different definitions of what "complex" means; now we have a very clear way of defining it, so there's no reason to argue over semantics. The logic is clear.



posted on May, 23 2015 @ 11:51 PM
a reply to: ChaoticOrder

Everything you posted is MEANINGLESS with respect to what you said.

Of course there are some physicists who think decoherence solves the measurement problem, but there are also many who don't, because decoherence doesn't tell you why you perceive a single outcome and why you go from a wave function to a Dirac delta function.

You have papers like this all the time, and there are also papers that say decoherence doesn't solve the measurement problem, but it gets worse for you.

You said:

The high entropy system acts as the observer and when it interacts with the lower entropy system it makes a measurement which forces that system to take on a more rigid definition (aka forces the wave function to collapse randomly), leading to a diminishing of the superposition effect as both systems merge into a single system.

THIS HAS NOTHING TO DO WITH WHAT YOU QUOTED ABOVE.

In fact, many people who support decoherence say the wave function doesn't collapse, but you say the wave function is forced to collapse when a high entropy state interacts with a low entropy state.

Where is your evidence that this causes the wave function to collapse?

Look at what one of the papers you quoted above said:

At the end of the day we can say that the decoherence explanation takes away some of the mystery from the idea of wave function collapse and provides a conventional mechanism to explain the appearance of a classical world.

This supports exactly what I've been saying. Many people look at Decoherence as a way to avoid wave function collapse BUT YOU SAID:

The high entropy system acts as the observer and when it interacts with the lower entropy system it makes a measurement which forces that system to take on a more rigid definition (aka forces the wave function to collapse randomly), leading to a diminishing of the superposition effect as both systems merge into a single system.

Where's the evidence??

The problem with this view is that it turns QM into a philosophy when people try to claim decoherence just gives the appearance of collapse. Here's more:

Quantum Experiment Verifies Nonlocal Wavefunction Collapse for a Single Particle


Physicists from Griffith University have demonstrated “spooky action at a distance” for the first time with no efficiency loophole by splitting a single photon between two laboratories and experimentally testing if the choice of measurement in one lab really causes a change in the local quantum state in the other lab.


scitechdaily.com...

One more:


Physicists snatch a peep into quantum paradox

Measurement-induced collapse of quantum wavefunction captured in slow motion.


www.nature.com...

Those who try to say decoherence solves the measurement problem have to explain how decoherence causes this apparent collapse of the wave function when scientists are seeing an actual collapse. Where is any evidence of an apparent collapse? How do you even show evidence of an apparent collapse when we can't measure the global wave function that people say is still evolving?

Your problems run deeper though.

First, you have no evidence that decoherence causes collapse or explains why the wave function goes to a Dirac delta function when measured and observed. Why are there still probable states when decoherence occurs?

Secondly, where is the evidence that says a high entropy observer interacts with a low entropy system and forces the wave function to collapse?


