Real-world events always proceed in the direction of increasing entropy, even though the laws of physics don’t require it. The reason we never see events that reduce entropy is that they cannot leave behind any evidence of having happened, according to a new theory.
The mathematical laws of physics work just as well for events going forward or going backward in time. Yet in the real world, hot coffee never unmixes itself from cold milk. A theorist publishing in the 21 August Physical Review Letters offers a new explanation for this apparent conflict between the time-symmetry of the physical laws and the forward “arrow of time” we see in everyday events. When viewed in quantum terms, events that increase the entropy of the Universe leave records of themselves in their environment. The researcher proposes that events that go “backward,” reducing entropy, cannot leave any trace of having occurred, which is equivalent to not happening.
In the hypothesis of formative causation, discussed in detail in my books A NEW SCIENCE OF LIFE and THE PRESENCE OF THE PAST, I propose that memory is inherent in nature. Most of the so-called laws of nature are more like habits.
In the quantum world, an entropy-lowering demon would have a different chore, because in the quantum mechanical version of entropy, it isn’t heat that flows when entropy changes, it’s information. Lorenzo Maccone of the University of Pavia, Italy, and the Massachusetts Institute of Technology, describes a thought experiment to illustrate the consequences of reducing quantum entropy. An experimenter, Alice, measures the spin state of an atom sent by her friend Bob, who is otherwise isolated from Alice’s laboratory. The atom is in a combined state (superposition) of spin-up and spin-down until Alice measures it as either up or down.
From Alice’s perspective, her lab gains a single bit of information from outside, and it’s then copied and recorded in her memory and on her computer’s hard drive. That information flow from atom to lab increases entropy, according to Alice. Maccone argues that because Bob doesn’t see the result, from his perspective the spin state of the atom never resolves itself into up or down. Instead it becomes quantum mechanically correlated, or “entangled,” with the quantum state of the lab. He sees no information flow and no change in entropy.
Bob plays the role of Maxwell's demon; he has total control of the quantum state of Alice's lab. To reduce the entropy of the lab from Alice's point of view, Bob reverses the flow of that one bit of information by removing any record of the atom's spin from Alice's hard drive and her brain. He does so by performing a complicated transformation that disentangles the lab's quantum state from that of the atom.
Maccone writes that such a reversal violates no laws of quantum physics. In fact, from Bob's perspective, the quantum information of the atom plus Alice's lab is the same whether or not the two are entangled; there is no change in entropy as viewed from the outside. Such reversals could happen in real life, Maccone says, but because the Universe, like Alice, would retain no memory of them, they would have no effect on how we perceive the world. His paper goes on to show mathematically how this reasoning applies in general, with the Universe taking the place of Alice.
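The two perspectives described above can be sketched numerically as a toy model. This is not Maccone's actual calculation; it assumes, as the article suggests, that Alice's measurement looks to Bob like a unitary copying interaction (here a CNOT gate), and that Bob's reversal is the inverse of that same unitary. The helper names are mine.

```python
import numpy as np

def entropy_bits(rho):
    """Von Neumann entropy -Tr(rho log2 rho), in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]        # drop (numerically) zero eigenvalues
    return float(-np.sum(evals * np.log2(evals)))

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
atom = (up + down) / np.sqrt(2)         # Bob's atom: superposition of up/down
lab = up                                # Alice's memory: blank reference state

# From Bob's isolated perspective, Alice's measurement is a unitary that
# copies the atom's basis state into the lab memory: a CNOT gate.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

joint = CNOT @ np.kron(atom, lab)       # entangled atom+lab state
rho = np.outer(joint, joint)

# Alice's local state: partial trace over the atom (the first qubit)
rho_lab = rho.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

print(entropy_bits(rho_lab))   # 1.0 -> Alice has recorded one bit (entropy up)
print(entropy_bits(rho))       # 0.0 -> Bob sees a pure state (no entropy change)

# Bob's reversal: apply the inverse unitary (CNOT is its own inverse),
# disentangling the lab and erasing every record of the result.
restored = CNOT @ joint
rho_lab2 = np.outer(restored, restored).reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)
print(entropy_bits(rho_lab2))  # 0.0 -> Alice's entropy is back down, record gone
```

The point the toy model makes is the one in the article: the global state stays pure (zero entropy) throughout from Bob's view, while Alice's local entropy rises by one bit at measurement and falls back to zero when Bob undoes the entanglement, leaving no trace.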
This is why you needed to actually read what I said before posting. Simply put, you saw the name Rupert and your brain shut down and you used the name Rupert as an excuse to remain ignorant.
Again, as long as entropy increases and there's an arrow of time, the universe records every event that occurs.
This is just pure hogwash. Show me the peer reviewed paper that says this.
You're just making it up as you go along.
Life reverses entropy and creates order.
If you make the statement that the entropy of the universe is constantly increasing, then in the case of life that would be false, because life is an activity of reversing entropy.
It's the creation of complex order. Is the creation of complex order over time an act of entropy?
And eventually, even these clues to what occurred are dispersed and swallowed up by the random dance of quantum events.
In 1961, Rolf Landauer argued that the erasure of information is a dissipative process [1]. A minimal quantity of heat, proportional to the thermal energy and called the Landauer bound, is necessarily produced when a classical bit of information is deleted. A direct consequence of this logically irreversible transformation is that the entropy of the environment increases by a finite amount. Despite its fundamental importance for information theory and computer science [2-5], the erasure principle has not been verified experimentally so far, the main obstacle being the difficulty of doing single-particle experiments in the low-dissipation regime. Here we experimentally show the existence of the Landauer bound in a generic model of a one-bit memory. Using a system of a single colloidal particle trapped in a modulated double-well potential, we establish that the mean dissipated heat saturates at the Landauer bound in the limit of long erasure cycles. This result demonstrates the intimate link between information theory and thermodynamics. It further highlights the ultimate physical limit of irreversible computation.
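The Landauer bound in the abstract is k_B * T * ln(2) of heat per erased bit. A quick back-of-the-envelope check (assuming room temperature, 300 K) shows why the experiment is so hard, since the bound is a few zeptojoules:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant in J/K (exact SI value)
T = 300.0            # assumed room temperature in kelvin

# Landauer bound: minimum heat dissipated when one classical bit is erased
Q_min = k_B * T * math.log(2)

print(f"Landauer bound at {T:.0f} K: {Q_min:.3e} J")   # ~2.87e-21 J
print(f"In electronvolts: {Q_min / 1.602176634e-19:.4f} eV")
```

At roughly 0.018 eV per bit, this is far below the dissipation of any conventional electronics, which is why the single-colloidal-particle setup was needed to probe it.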
Landauer's principle, first argued in 1961 [1] by Rolf Landauer of IBM, is a physical principle pertaining to the lower theoretical limit of energy consumption of a computation. It holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment" (Bennett 2003).
If no information is erased, computation may in principle be thermodynamically reversible, requiring no release of heat. This has led to considerable interest in the study of reversible computing.
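A standard illustration of such logical reversibility (my example, not from the quoted text) is the Toffoli gate, which computes AND while keeping its inputs, so no bit ever has to be erased:

```python
def toffoli(a, b, c):
    """Toffoli (CCNOT) gate on three bits: flips c iff a and b are both 1.
    Logically reversible: no input information is destroyed."""
    return a, b, c ^ (a & b)

# With the target bit initialized to 0, Toffoli computes AND while
# preserving both inputs in the output:
a, b, c = toffoli(1, 1, 0)
print((a, b, c))   # (1, 1, 1): c now holds 1 AND 1, inputs intact

# Reversibility: applying the gate twice restores the original bits exactly.
assert toffoli(*toffoli(1, 0, 1)) == (1, 0, 1)
```

Because the output determines the input uniquely, no "merging of computation paths" occurs, and Landauer's principle imposes no minimum heat cost on the operation itself.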
Take sound, for example. Sound is air vibrating. You speak, and the motion of your vocal apparatus causes the air surrounding it to vibrate. Now these vibrations reduce in intensity over time, but they never actually cease altogether. The words you spoke last night are still vibrating in the atmosphere. So, for that matter, are the words of Socrates. Yet, no matter how sensitive a microphone you use, you will never pick up those words. Why not? Because the amplitude of the acoustic vibration has fallen below the average amplitude of Brownian motion, the random movement of air molecules. Those words are lost for ever. The record of them has been erased by the fundamental randomness of reality at the Planck scale.