I do some Computer Programming on the side and I build websites, and the world as a simulation makes more sense to me than the world being the result of some random physics. There's no such thing as randomness to me. Everything is governed by a set of rules, laws and values, and you just get variations of these rules and laws.
originally posted by: Mianeye
Let's hope we never find the source code for this simulation, weird # might start to happen
In quantum computation, a series of quantum gates has to be arranged in a predefined sequence that leads to a quantum circuit, in order to solve a particular problem. What if the sequence of quantum gates is known, but both the problem to be solved and the outcome of the quantum circuit so defined remain in the shadow? This is the situation of the stock market. The price time series of a portfolio of stocks are organized in braids that effectively simulate quantum gates under the hypothesis of the Ising-anyon quantum computational model. Following the prescriptions of the Ising-anyon model, 1-qubit quantum gates are constructed for a portfolio composed of four stocks. Adding two additional stocks to the initial portfolio results in 2-qubit quantum gates and circuits. The Hadamard gate, Pauli gates and the controlled-Z gate are some of the elementary quantum gates identified in the stock market structure. Adding further pairs of stocks, eventually representing a market index such as the Dow Jones Industrial Average, results in a sequence of n-qubit quantum gates that form a quantum code. Deciphering this mysterious quantum code of the stock market is an issue for future investigations.
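Just to make the jargon in that abstract concrete, here's a minimal numpy sketch of the standard matrix forms of the gates it names (Hadamard, Pauli, controlled-Z) and a tiny 2-qubit circuit built from them. This is not the paper's braid construction from stock prices, just the textbook gates it says show up.

```python
# Minimal sketch (not the paper's braid construction): the standard matrix
# forms of the gates the abstract names, so the terminology is concrete.
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)          # Hadamard: maps |0> to (|0>+|1>)/sqrt(2)
X = np.array([[0, 1],
              [1, 0]])                        # Pauli-X (bit flip)
Z = np.array([[1, 0],
              [0, -1]])                       # Pauli-Z (phase flip)
CZ = np.diag([1, 1, 1, -1])                   # controlled-Z on two qubits

# Unitarity check: U†U = I for every gate
for name, U in [("H", H), ("X", X), ("Z", Z), ("CZ", CZ)]:
    assert np.allclose(U.conj().T @ U, np.eye(U.shape[0])), name

# A 2-qubit circuit in the spirit of the abstract: H on both qubits, then CZ,
# acting on |00>. The result is a maximally entangled graph state
# (a Bell state up to local rotations).
state = np.zeros(4); state[0] = 1.0           # |00>
state = np.kron(H, H) @ state                 # H on each qubit
state = CZ @ state
print(np.round(state, 3))                     # [ 0.5  0.5  0.5 -0.5]
```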
Researchers at Google could unveil a quantum computer that is superior to conventional computers by the end of next year.
Google's engineers just achieved a milestone in quantum computing: they’ve produced the first completely scalable quantum simulation of a hydrogen molecule.
That’s big news, because it shows similar devices could help us unlock the quantum secrets hidden in the chemistry that surrounds us.
originally posted by: neoholographic
I do some Computer Programming on the side and I build websites, and the world as a simulation makes more sense to me than the world being the result of some random physics. There's no such thing as randomness to me. Everything is governed by a set of rules, laws and values, and you just get variations of these rules and laws.
More and more people are talking about the universe as a simulation. Here's a talk from Neil deGrasse Tyson, James Gates and others on the topic.
Here's another talk called Our Universe as a Hologram
The point is, scientists are finding it easier to explain our universe as a construct of 2D information rather than as an objective material reality. This goes back to Plato's Allegory of the Cave and to Digital Physics, which has been around for a while. There was a recent study about Einstein showing there's no signal behind what he called "spooky action at a distance."
As described in a paper posted online and submitted to Physical Review Letters (PRL), researchers from NIST and several other institutions created pairs of identical light particles, or photons, and sent them to two different locations to be measured. Researchers showed the measured results not only were correlated, but also—by eliminating all other known options—that these correlations cannot be caused by the locally controlled, "realistic" universe Einstein thought we lived in. This implies a different explanation such as entanglement.
The NIST experiments are called Bell tests, so named because in 1964 Irish physicist John Bell showed there are limits to measurement correlations that can be ascribed to local, pre-existing (i.e. realistic) conditions. Additional correlations beyond those limits would require either sending signals faster than the speed of light, which scientists consider impossible, or another mechanism, such as quantum entanglement.
In the best experimental run, both detectors simultaneously identified photons a total of 6,378 times over a period of 30 minutes. Other outcomes (such as just one detector firing) accounted for only 5,749 of the 12,127 total relevant events. Researchers calculated that the maximum chance of local realism producing these results is just 0.0000000059, or about 1 in 170 million. This outcome exceeds the particle physics community's requirement for a "5 sigma" result needed to declare something a discovery. The results strongly rule out local realistic theories, suggesting that the quantum mechanical explanation of entanglement is indeed the correct explanation.
phys.org...
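As a quick sanity check on the quoted numbers (my own back-of-envelope, not part of NIST's analysis): a probability of 0.0000000059 really is about 1 in 170 million, and it sits well past the one-sided 5-sigma threshold of roughly 2.9 × 10^-7.

```python
# Quick sanity check of the quoted numbers (my own back-of-envelope, not NIST's
# analysis): the p-value 5.9e-9 expressed as odds, and how it compares to the
# 5-sigma bar used in particle physics.
from scipy.stats import norm

p = 5.9e-9
print(f"odds: about 1 in {1/p:,.0f}")          # ~1 in 169,000,000 (~170 million)

five_sigma_p = norm.sf(5)                      # one-sided p for 5 sigma, ~2.87e-7
print(f"5-sigma threshold: p = {five_sigma_p:.2e}")
print(f"equivalent significance: {norm.isf(p):.1f} sigma")  # ~5.7 sigma, past the bar
```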
Local realism is dead and has been dead for a while, but some will cling to it because the thought that our universe isn't the sum of all that exists scares them: if the universe is virtual, a simulation, then their limited view of "reality" based on localism is false.
This brings us to entanglement. Entanglement is starting to look like a key player in just about everything: it's tied to the holographic universe, black hole thermodynamics, gravity and more.
If you look at entangled particles, they only make sense in the language of computation. Calling them particles conjures up the image of particles of sand or salt, and of course if you have two particles of sand at opposite ends of a beach, they will not be correlated. I think semantics plays a role in why these things can be hard to grasp at times.
Instead of particles they should be called pixels and space-time should be called a screen. So what we call particles are more like pixels on a space-time screen rather than particles of sand.
I can write a program where red dots move randomly on the screen, and then write into the program that every time a red dot is entangled with another red dot, clicking on it turns the dot it's entangled with green. I don't need any signal between the dots because they all share the same screen. If I blow the computer screen up to the size of the universe, the dots will still be connected no matter where they are on the screen. Now it's just a matter of processing information.
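Here is a minimal console sketch of that thought experiment (the names and structure are just my illustration, not a physics simulation): the "entanglement" is nothing but a shared entry in the program state, so updating one dot's partner needs no signal across the screen.

```python
# Minimal console sketch of the thought experiment above (names and structure
# are my own illustration, not a physics simulation): dots wander randomly on a
# "screen", entangled pairs share one record, and "clicking" one dot updates its
# partner instantly because both are just entries in the same program state --
# no signal travels between them.
import random

WIDTH, HEIGHT = 80, 24

class Dot:
    def __init__(self, dot_id):
        self.id = dot_id
        self.x = random.randrange(WIDTH)
        self.y = random.randrange(HEIGHT)
        self.color = "red"

dots = {i: Dot(i) for i in range(6)}
# Entangle dots in pairs: 0<->1, 2<->3, 4<->5
partner = {0: 1, 1: 0, 2: 3, 3: 2, 4: 5, 5: 4}

def step():
    """Move every dot one random step; position is irrelevant to the correlation."""
    for d in dots.values():
        d.x = (d.x + random.choice([-1, 0, 1])) % WIDTH
        d.y = (d.y + random.choice([-1, 0, 1])) % HEIGHT

def click(dot_id):
    """'Measure' one dot: its entangled partner turns green in the same update,
    however far apart the two happen to be on the screen."""
    dots[partner[dot_id]].color = "green"

for _ in range(100):
    step()
click(0)
print({d.id: (d.x, d.y, d.color) for d in dots.values()})
# Dot 1 is green no matter where it ended up -- the correlation lives in the
# program state, not in anything sent across the screen.
```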
I think this is entanglement, and there's no need for any spooky action at a distance, because what we call "particles" are more like pixels and space-time is more like a computer screen. When the program is being processed, vast amounts of data on the 2D horizon are being projected. M.I.T. Professor Seth Lloyd calculated the Computational Capacity of the Universe.
Computational capacity of the universe
Merely by existing, all physical systems register information. And by evolving dynamically in time, they transform and process that information. The laws of physics determine the amount of information that a physical system can register (number of bits) and the number of elementary logic operations that a system can perform (number of ops). The universe is a physical system. This paper quantifies the amount of information that the universe can register and the number of elementary operations that it can have performed over its history. The universe can have performed no more than 10^120 ops on 10^90 bits.
arxiv.org...
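For anyone who wants to see where a number like that comes from, here's a rough back-of-envelope in the spirit of Lloyd's estimate, not his actual derivation: the Margolus-Levitin bound of at most 2E/(πħ) operations per second for energy E, applied to the mass-energy inside the Hubble volume over the age of the universe. The constants are standard rough values; only the order of magnitude matters.

```python
# Rough back-of-envelope in the spirit of Lloyd's estimate, not his derivation:
# Margolus-Levitin bound (at most 2E/(pi*hbar) operations per second for energy E)
# applied to the energy inside the Hubble volume, over the age of the universe.
# Crude inputs, so only the order of magnitude matters.
import math

hbar = 1.055e-34          # J*s
c = 3.0e8                 # m/s
t = 4.35e17               # s, ~13.8 billion years
rho = 9.5e-27             # kg/m^3, roughly the critical density

R = c * t                            # Hubble radius, ~1.3e26 m
V = 4.0 / 3.0 * math.pi * R**3       # Hubble volume
E = rho * V * c**2                   # mass-energy inside it, ~1e70 J

ops = 2 * E * t / (math.pi * hbar)   # Margolus-Levitin rate times the age
print(f"E ~ {E:.1e} J, ops ~ {ops:.1e}")   # ~2e121, within an order of magnitude
                                           # of the paper's ~10^120
# The ~10^90-bit figure comes from a separate entropy accounting (photons,
# neutrinos, matter) that this sketch does not reproduce.
```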
You can look at our universe as a supercomputer and the multiverse as a quantum computer. This is why classical physics and quantum mechanics have been at odds, but many scientists recognize this, and there are theories being proposed that confine the classical universe to our brane or pocket while quantum mechanics plays out across all branes or pockets.