posted on Sep, 5 2009 @ 04:00 PM
Let's see if I can get this right.
Binary code. In current computer language, there is a base: on and off. Everything done by a computer, every calculation, every representation, is a
combination or sequence of zeros and ones. On and off. Every calculation is a derivative of ones and zeros.
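To make the on/off idea concrete, here's a quick sketch (plain Python, names are my own, just for illustration) of how ordinary data boils down to ones and zeros:

```python
# Everything a classical computer touches is ultimately bits.
# Here a number and a character are shown as their raw binary patterns.

def to_bits(value, width=8):
    """Render an integer as a fixed-width string of 0s and 1s."""
    return format(value, "0{}b".format(width))

print(to_bits(42))        # the number 42 as 8 bits -> 00101010
print(to_bits(ord("A")))  # the letter 'A' (ASCII 65) -> 01000001
```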
A quantum computer is built from qubits (quantum bits) instead of ordinary bits. The state that can be represented is either 1, 0, (and here's the kicker) or a superposition of BOTH. It can
have a value of one AND zero at the same time, at least until you measure it. Confusing, I know, but in quantum mechanics, the idea of superposition is old hat.
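You can fake the "both at once" behavior on a classical machine, which helps the intuition. This is a toy simulation (my own sketch, not real quantum hardware): a qubit is a pair of amplitudes, and measuring it collapses the superposition to a plain 0 or 1.

```python
import random

# Toy model of one qubit: amplitudes (alpha, beta) for the states 0 and 1.
# |alpha|^2 and |beta|^2 are the probabilities of measuring 0 and 1.

def measure(alpha, beta):
    """Collapse the superposition: return 0 or 1 with the right odds."""
    p_zero = abs(alpha) ** 2
    return 0 if random.random() < p_zero else 1

# An equal superposition: "both 1 and 0 at the same time" until measured.
alpha = beta = 2 ** -0.5   # 1/sqrt(2), so each outcome has probability 1/2

counts = {0: 0, 1: 0}
for _ in range(10000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly 5000 of each
```

Each individual measurement gives a definite answer; the "both" part only lives in the amplitudes before you look.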
What this would represent is almost too big to explain.
In the computer world, things usually go in doubles: from 4 bits, to 8 bits, to 16 bits, to 32, 64, 128, etc... An example would be the Nintendo game
systems; the N64 got its name from its 64-bit processor. RAM and video go the same route. Almost anything digital will
follow the doubles rule. BUT, there are a few exceptions.
With a quantum computer, the numbers wouldn't just double. They would increase exponentially. It's the difference between 2D and 3D. A whole new
dimension. A register of n qubits is described by 2^n amplitudes at once, so every single qubit you add doubles the state space: 8 qubits give you 256 amplitudes, 16 qubits give you 65,536.
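The doubling-per-qubit point is easy to check with a few lines of arithmetic (plain Python, just tabulating 2^n):

```python
# Classical register of n bits: holds exactly one of 2**n values at a time.
# Quantum register of n qubits: its state is described by 2**n amplitudes
# at once -- the state space doubles with every single qubit you add.

for n in (8, 16, 32):
    print("{:2d} (qu)bits -> state space of {:,} amplitudes".format(n, 2 ** n))
# 8  -> 256
# 16 -> 65,536
# 32 -> 4,294,967,296
```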
Let's put it into perspective. Go from an 8-qubit register to a 16-qubit register and you haven't doubled your working space, you've multiplied it by 256.
Go to 32 qubits and you've multiplied it by over four billion. Classical upgrades creep along in doubles; think of the YEARS we waited between a
2.6 GHz Athlon and what we have today. The quantum scaling skips that crawl entirely.
This would affect ANYTHING digital: phones, communication, computer processing, data storage, broadband, wifi. With the same rate of advancement,
we'd be able to have holographic displays run from a thumbnail-sized computer that could perform every function available in a modern office
building. In a matter of two or three years.