To appreciate the feasibility of computing with almost no energy or heat, let's consider the computation that takes place in any ordinary rock. Although it may appear that nothing much is going on inside a rock, the approximately 10^25 (ten trillion trillion) atoms in a kilogram of matter are actually extremely active. Despite its apparent solidity, the atoms are all in motion, sharing electrons, changing particle spins, and generating rapidly moving electromagnetic fields. All of this activity represents computation, though not MEANINGFULLY organized. -The Singularity Is Near
In terms of computation, and just considering the electromagnetic interactions in a 1 kilogram rock, there are at least 10^15 changes in state per bit per second, which represents about 10^42 (a million trillion trillion trillion) calculations per second. YET THE ROCK REQUIRES NO ENERGY INPUT AND GENERATES NO APPRECIABLE HEAT. -The Singularity Is Near
If we were to add up the processing power of all human brains on the planet right now, we would have an estimated 10^19 calculations per second. -The Singularity Is Near
we are unable, in principle, to run software programs backwards. At each step, the input data is discarded (erased) and the results pass on to the next step. The act of erasing data generates heat and therefore requires energy. According to the laws of thermodynamics, the erased information bit is essentially released into the surrounding environment, thereby increasing its entropy.
The fundamental concept is that if you keep all of the intermediate results and then run the algorithm BACKWARDS, upon completion of the calculation you end up where you started, have used NO energy, and generated NO heat. And along the way, you have still calculated your result. -The Singularity Is Near
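The idea can be sketched with a toy example: if every step of an algorithm is an invertible operation, the whole computation can be run backwards to restore the initial state exactly, with no information ever erased. A minimal Python sketch (an illustration of mine, not from the book):

```python
def forward(a, b):
    """Two invertible steps applied in order."""
    b = b ^ a        # step 1: XOR is its own inverse
    a = a + b        # step 2: addition is undone by subtraction
    return a, b

def backward(a, b):
    """The exact inverse of forward(), steps applied in reverse order."""
    a = a - b        # undo step 2
    b = b ^ a        # undo step 1
    return a, b

state = (12, 7)
result = forward(*state)        # compute the result: (23, 11)
restored = backward(*result)    # run the algorithm backwards
assert restored == state        # we end up where we started
```

Because no step ever destroys information, nothing is "released into the environment" along the way; only copying the result out is fundamentally irreversible.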
Because of essentially random thermal and quantum effects, logic operations have an inherent error rate. We can overcome errors with error-detection codes; however, fixing an error is irreversible and requires energy. Generally, error rates are low. If we have, say, 1 error per 10^10 operations, we have succeeded in reducing the energy requirement by a factor of 10^10, though not eliminating energy dissipation altogether. -The Singularity Is Near
Reversible models of computation have the distinction that the basic components are microscopically reversible. This means that the macroscopic operation of the computer is also reversible. This fact allows us to address the question “What is required for a computer to be maximally efficient?” The answer is that if the computer is built out of microscopically reversible components, then it can be perfectly efficient. How much energy does a perfectly efficient computer have to dissipate in order to compute something? The answer is that the computer does not need to dissipate any energy.
Computations are performed by manipulating nano-scale rods, which are effectively spring-loaded. After each calculation, the rods containing intermediate values return to their original position, thereby implementing reverse computation. The device has 10^12 (a trillion) processors and provides an overall rate of 10^21 calculations per second, enough to simulate one hundred thousand human brains PER CUBIC CENTIMETER.
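As a quick sanity check on those figures (my own arithmetic, assuming the ~10^16 cps "functional" brain estimate used elsewhere in the book, rather than the conservative 10^19):

```python
import math

processors = 1e12            # a trillion mechanical rod-logic processors
total_cps = 1e21             # overall rate quoted above

cps_per_processor = total_cps / processors   # ~1e9 calculations/s each
brain_cps = 1e16             # functional-equivalent estimate per human brain
brains_simulated = total_cps / brain_cps     # ~1e5: one hundred thousand brains
```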
There is a directly proportional relationship between the energy of an object and its potential to perform computation. The potential energy in a kilogram of matter is very large, as we know from Einstein (E = mc^2). The potential of matter to compute is also governed by Planck's constant: 6.6 × 10^-34 joule-seconds. This is the smallest scale at which we can apply energy for computation. We obtain the THEORETICAL limit of an object to perform computation by:
dividing the total energy (the average energy of each atom times the number of particles) by Planck's constant (6.6 × 10^-34 joule-seconds).
Since the amount of energy is so large, and Planck's constant is so small, we get an extremely large number: about 5 × 10^50 CPS for 1 kilogram of matter.
If we relate this to the most CONSERVATIVE estimate of human brain capacity (10^19 CPS), it represents the equivalent of about 5 billion trillion human civilizations.
A 1 kilogram, perfectly efficient cold computer would be able to perform the equivalent of all human thought over the last 10,000 years in 1/10,000 of a NANOsecond.
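The arithmetic above can be checked directly. Dividing E = mc^2 for one kilogram by Planck's constant h gives about 1.4 × 10^50; the ~5 × 10^50 figure comes from the slightly tighter Margolus-Levitin bound, 2E/(πħ), which Seth Lloyd used in his "ultimate laptop" calculation. Both are the same order of magnitude. A check in Python (my own numbers for the brain-count assumptions):

```python
import math

c = 2.998e8                       # speed of light, m/s
h = 6.626e-34                     # Planck's constant, joule-seconds
hbar = h / (2 * math.pi)          # reduced Planck's constant

E = 1.0 * c**2                    # energy in 1 kg of matter: ~9e16 J

naive_limit = E / h               # "divide by Planck's constant": ~1.4e50 ops/s
lloyd_limit = 2 * E / (math.pi * hbar)   # Margolus-Levitin bound: ~5.4e50 ops/s

# ~5 billion trillion civilizations, at 10^19 cps per brain, 10^10 brains each
civilizations = lloyd_limit / 1e19 / 1e10

# All human thought over 10,000 years, assuming the functional estimate of
# 10^16 cps per brain and 10^10 brains (3.156e7 seconds per year)
thought_ops = 1e16 * 1e10 * 10_000 * 3.156e7
seconds_needed = thought_ops / lloyd_limit   # ~6e-14 s, roughly 1/10,000 ns
```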
Originally posted by truthquest
That is most definitely one of the most nonsensical ideas I've ever read. Compute BACKWARDS? What a joke. Let's see, all you have to do is start with the answer to the problem, then go back and figure out how you got the answer. Ummm... no. You can't start with the answer without FIRST doing the algorithm! Come on, this cannot be serious.
and then run the algorithm BACKWARDS, upon completion of the calculation
Originally posted by VonDoomen
Originally posted by truthquest
That is most definitely one of the most nonsensical ideas I've ever read. Compute BACKWARDS? What a joke. Let's see, all you have to do is start with the answer to the problem, then go back and figure out how you got the answer. Ummm... no. You can't start with the answer without FIRST doing the algorithm! Come on, this cannot be serious.
What are you talking about? Did you even bother to read my post? REVERSIBLE means a process that, once done, can be reversed.
and then run the algorithm BACKWARDS, upon completion of the calculation
============================================
The way this works is you run the computation (algorithm) forward to compute the answer. You then send the answer out (the output, which requires energy), and then run the algorithm in reverse, returning everything to the initial state it was in before you ran the algorithm.
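That three-phase scheme (compute forward, copy out the answer, uncompute) is Bennett's trick. A rough Python sketch, with function names of my own choosing, where each step is stored with its inverse:

```python
def run_forward(steps, state):
    """Phase 1: apply each invertible step in order."""
    for f, _ in steps:
        state = f(state)
    return state

def run_backward(steps, state):
    """Phase 3: apply each inverse step in reverse order."""
    for _, f_inv in reversed(steps):
        state = f_inv(state)
    return state

# each step is paired with its exact inverse
steps = [(lambda x: x + 5, lambda x: x - 5),
         (lambda x: x * 3, lambda x: x // 3)]

final = run_forward(steps, 2)       # 2 -> 7 -> 21
answer = final                      # phase 2: copy the answer out (the energy cost)
start = run_backward(steps, final)  # 21 -> 7 -> 2: back to the initial state
assert start == 2
```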
============================================
The rock example is a perfect example of this, except that we don't get any output of information from a rock, which is why rocks don't generate any appreciable heat except due to random thermal/quantum effects (a very low amount).
Reversible programming and reversible logic gates have already been proven to work and to conserve energy. The only reason they haven't been implemented on a vast scale yet is that the programming and software are very tricky, and costly and time-consuming to develop.
[edit on 12/2/2009 by VonDoomen]
Originally posted by VonDoomen
Today we use what is called IRREVERSIBLE COMPUTING, meaning
we are unable, in principle, to run software programs backwards. At each step, the input data is discarded (erased) and the results pass on to the next step. The act of erasing data generates heat and therefore requires energy. According to the laws of thermodynamics, the erased information bit is essentially released into the surrounding environment, thereby increasing its entropy.
The fundamental concept is that if you keep all of the intermediate results and then run the algorithm BACKWARDS, upon completion of the calculation you end up where you started, have used NO energy, and generated NO heat. And along the way, you have still calculated your result.
=========================================
Reversible models of computation have the distinction that the basic components are microscopically reversible. This means that the macroscopic operation of the computer is also reversible. This fact allows us to address the question “What is required for a computer to be maximally efficient?” The answer is that if the computer is built out of microscopically reversible components, then it can be perfectly efficient. How much energy does a perfectly efficient computer have to dissipate in order to compute something? The answer is that the computer does not need to dissipate any energy.
=========================================
If a maximally efficient supercomputer works all day to compute a weather simulation problem, what is the minimum amount of energy that must be dissipated according to the laws of physics? The answer is actually very simple to calculate, since it is unrelated to the amount of computation. The answer is always equal to zero. -Edward Fredkin, physicist
Originally posted by truthquest
The explanations seem like dramatic over-simplifications... most notably the idea that a rock can be re-arranged into something comparable to an intelligent being.
Originally posted by VonDoomen
I'm sorry, truthquest. I just realized there was one portion I forgot to add into my first post, which would have (hopefully) made more sense.
I'm gonna edit it in after this post. It should have gone in before Eric Drexler's nanocomputer.
=========================================
Reversible models of computation have the distinction that the basic components are microscopically reversible. This means that the macroscopic operation of the computer is also reversible. This fact allows us to address the question “What is required for a computer to be maximally efficient?” The answer is that if the computer is built out of microscopically reversible components, then it can be perfectly efficient. How much energy does a perfectly efficient computer have to dissipate in order to compute something? The answer is that the computer does not need to dissipate any energy.
=========================================
If a maximally efficient supercomputer works all day to compute a weather simulation problem, what is the minimum amount of energy that must be dissipated according to the laws of physics? The answer is actually very simple to calculate, since it is unrelated to the amount of computation. The answer is always equal to zero. -Edward Fredkin, physicist
Originally posted by LordGoofus
To be honest, as a developer, I don't see how this could possibly work. We already have processes wrapped within "transactions" that allow the process to be reversed if an unexpected event occurs. In fact, most serious database management systems & enterprise applications do this on a daily basis.
I can take a bit, set it to one, then set it back to zero, but that doesn't "undo" the energy used, it does the exact opposite. You use electricity to move the heads on the hard drive and magnetise the bit (setting it to 1), then you move the heads on the hard drive and demagnetise the bit (setting it to 0). Each time energy (electricity) is required to move the heads, so you're actually doubling the energy requirements.
Rolf Landauer showed in 1961 that reversible logical operations such as NOT (turning a bit into its opposite) could be performed without putting energy in or taking heat out, but that irreversible logical operations such as AND (generating bit C, which is a 1 if and only if both inputs A and B are 1) do require energy.
www.research.IBM.com/journal/rd/053/ibmrd0503c.pdf
Subscription required.
In 1973 Charles Bennett showed that any computation could be performed using only reversible logical operations.
www.research.ibm.com/journal/rd/176/ibmrd1706g.pdf
-The Singularity Is Near
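One concrete way to see Bennett's result is the Toffoli gate (controlled-controlled-NOT): it is its own inverse, yet with a third bit initialized to 0 it computes the AND of the first two, so any Boolean circuit can in principle be rebuilt from reversible gates. A small Python sketch of mine:

```python
def toffoli(a, b, c):
    """Controlled-controlled-NOT: flips bit c iff a and b are both 1.
    Applying it twice restores the inputs, so the gate is its own inverse."""
    return a, b, c ^ (a & b)

def reversible_and(a, b):
    """AND embedded in a reversible gate: with an ancilla bit set to 0,
    the third output is a AND b, and the inputs pass through unchanged."""
    _, _, out = toffoli(a, b, 0)
    return out

# The gate is reversible: applying it twice returns every input triple.
for bits in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits

# With the ancilla, it reproduces the AND truth table.
assert [reversible_and(a, b)
        for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 0, 0, 1]
```

Unlike a plain AND gate, the Toffoli gate's output uniquely determines its input, so in Landauer's terms no information is erased and no minimum heat must be dissipated.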