reply to post by VonDoomen
When I say simulating a person, I mean simulating everything about that person, including the physical aspect down to the molecules, down to
the neuronal connections.
At that point, you would need more mass than it takes to be that person just to simulate that person molecule-for-molecule.
Nothing simulates a molecule quite like a molecule.
The only way around this gets into relativistic tricks - a computer of smaller mass could be placed well outside the gravitational
influence of our galaxy and, given an extreme enough difference in field density (and therefore in the rate at which time passes), simulate the
behavior of another mass molecule-for-molecule at a faster rate than the molecules being simulated are actually moving.
Otherwise - all simulations are merely functional models that describe behavior. An aluminum bar doesn't need to be simulated at the molecular level
until certain regions exceed the capabilities of a simpler model. For example - a wall is a wall; the only time you need to simulate it at a
molecular level is if, say, I were to punch it. For a concrete wall, you wouldn't even need to go molecular then - but were it the
wall of this apartment, where I could dent it, you might have to get into simulating particles and soft bodies, though you still likely wouldn't have
to resort to the molecular level.
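To put that wall example into code - here's a minimal sketch of the idea in Python, picking the cheapest model that still captures the interaction. The energy thresholds and model names are made up purely for illustration:

# Rough sketch: model the wall with the cheapest representation that still
# captures the behavior, and only refine where the simpler model breaks down.
# All thresholds here are invented for illustration.
RIGID_LIMIT_J = 50.0      # below this impact energy, a rigid plane is enough
SOFTBODY_LIMIT_J = 500.0  # below this, a soft-body/particle patch is enough

def pick_wall_model(impact_energy_joules: float) -> str:
    """Choose the cheapest wall model that can represent the interaction."""
    if impact_energy_joules < RIGID_LIMIT_J:
        return "rigid plane (no internal structure simulated)"
    if impact_energy_joules < SOFTBODY_LIMIT_J:
        return "soft-body / particle patch around the impact point"
    return "fine-grained patch embedded in a coarse wall model"

for energy in (1.0, 120.0, 5000.0):   # a tap, a punch, something much worse
    print(f"{energy:>7.1f} J -> {pick_wall_model(energy)}")

The hard part in practice is stitching the refined patch back into the coarse model, but that's the shape of it.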
Of course - this raises the question of whether or not this universe is itself a sort of computer. Say we wanted to compute some very complex physics -
we could create a parallel or sub-dimension that 'collapses' back into our own, with data gleaned from that universe's own entropic
demise and used to answer whatever questions we had. Now flip that around: if ours is the universe that was created, all of this may be happening
within a fraction of a second to them - and given information-energy equivalency, that would make for one hell of a data-crunching computer. It would
be almost impossible to say what they would be attempting to simulate, though - or how our presence fits in. Are we the "ghost in the machine" - the
product of quantum uncertainty - or the entire purpose of the simulation?
reply to post by Dance4Life
Hmm... I've been doing some more digging... right now the biggest thing to come about is, essentially, taking a data bus and tying it directly to laser
data links - which means computers in five to ten years may be radically different. I can only imagine the kinds of crazy that will happen with a
laser hyper-transport bus connecting a hundred or more "phenom x12" cores to several terabytes of RAM with less latency than existing buses. Buses
with faster switching and gating times than the processor itself are practically a complete inversion of how computers work - it is usually the other
way around, with the bus running roughly an order of magnitude slower than the processor clock.
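To put rough numbers on that inversion (these are illustrative figures, not specs for any real part):

# Illustrative figures only - not specs for any real bus or CPU.
cpu_clock_hz      = 3.0e9    # a ~3 GHz processor core
electrical_bus_hz = 4.0e8    # a conventional ~400 MHz bus clock
optical_bus_hz    = 3.0e10   # a hypothetical ~30 GHz optical link

print(f"electrical bus: {cpu_clock_hz / electrical_bus_hz:.1f}x slower than the core")
print(f"optical bus:    {optical_bus_hz / cpu_clock_hz:.1f}x faster than the core")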
It also means we could go to much wider buses and run them without risk of cross-talk. A thousand "angel hair" laser fibers could be run across a
single layer of board where traditional board design would have been restricted to 128 traces or fewer, and would have laid a considerable burden on
the designers to prevent cross-talk and equalize the lengths of the traces (so that signals arrive at the same time... when you're talking about
switching in the gigahertz range, that can mean the difference between chronic data corruption and not).
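Here's a back-of-the-envelope sketch of why that matters - assuming signals travel at roughly half the speed of light on a copper trace and that you can only tolerate about a tenth of a bit period in skew (both rough rules of thumb, not measured values):

# Back-of-the-envelope skew budget. The 50%-of-c trace speed and the 10%
# skew budget are rough rules of thumb, not measured values.
C_MM_PER_NS = 299.8                  # speed of light in vacuum, mm per ns
TRACE_SPEED = 0.5 * C_MM_PER_NS      # signal speed on a typical copper trace

def max_trace_mismatch_mm(bus_clock_ghz: float, skew_budget: float = 0.10) -> float:
    """Longest allowed length difference between two traces of the same bus."""
    bit_period_ns = 1.0 / bus_clock_ghz
    return skew_budget * bit_period_ns * TRACE_SPEED

for clock_ghz in (0.1, 1.0, 3.0):    # 100 MHz, 1 GHz, 3 GHz
    print(f"{clock_ghz:>4.1f} GHz bus -> match traces to within ~{max_trace_mismatch_mm(clock_ghz):.1f} mm")

At 100 MHz you get centimeters of slack; at a few GHz it's down to millimeters across the whole board.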
Anyway - looking for publications regarding actual photon processors is like looking for a mound of sugar in an ocean of salt. A lot of stuff that
looks similar, but is not what you're looking for.
Of interest in my findings, however:
www.rand.org...
The DoD has been looking into photonics applications since 1989 - and that document covers some of their notes. Apparently, at least at that point in
time, photonics was in the highest-priority tier of technologies to develop and acquire.
www.wired.com...
This is a rather interesting piece of the past... a lot of the talk of photonics comes from the early 90s, it seems. Silicon seems to have vastly
exceeded industry expectations and therefore delayed the market drive for photonics.
To quote:
Secondly, by making transistors smaller, one can make chips that run faster while using less power. This increase in the number of transistors on
a chip has souped-up operating speeds to supersonic levels. Today's densest CPU chips, which are clocked at up to 100-Mhz and have 32-bit data paths,
contain just a few million transistors per chip and run at speeds nearing 100-million instructions per second. Even within the limits of current
technology, the year 2000 could bring us 500-Mhz, 64-bit chips packed with more than 100-million transistors that could cruise at more than 2-billion
instructions per second.
RAM, the basic component of computer memory, will undergo a similar growth in capacity. Semiconductor manufacturers claim that they will be able to
produce gigabit (a billion bits) RAM chips within the next 10 years. By comparison, today's RAM chips just recently hit the 16-Mbit, or
16-million-bit, level. (An interesting aside: one of those manufacturers also predicts that by the year 2000, most RAM will go into HDTV sets, not
traditional computers.)
Kind of strange reading something like that from the past - both how correct and how wrong it was. And further on:
A far more radical approach to chip development argues for the abandonment of electronics in favor of photonics - a technology based on the use of
photons, the basic particles of light. Digital photonic processors have already been demonstrated by scientists at AT&T's Bell Laboratories, but
commercial applications are still many years away. AT&T's experimental photonic processor uses the world's smallest laser to send light through a
chip composed of many microscopic lenses and mirrors etched into quartz.
Photonic processors outshine electronic chips in several ways. Most important, light can carry more information than electricity, and more quickly.
AT&T scientists estimate that photonic processors will process more than 1,000 times as much information as today's most powerful supercomputers.
With photonic processors, one can also expand the capabilities of the chip's data paths - a profoundly important feature. The biggest performance
bottleneck of future systems will not be the speed of the processor, but rather the speed of getting information on and off the chip. With photonic
processors, a computer could have an optical backplane that provided more than 1,000 I/O channels, each running at speeds of more than a gigabit per
second, compared with the 5-Mbyte-per-second speed of standard personal computer buses.
- They nailed that one. That's exactly what is driving photonics today. ... Of course, they couldn't have imagined companies like IBM getting a
terabit out of a fiber line... no one knows what to do with a terabit now... much less seventeen years ago.
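Putting the article's own figures side by side - this is just arithmetic on the numbers quoted above:

# Just arithmetic on the figures quoted above.
optical_backplane_bps = 1000 * 1e9    # 1,000 channels at ~1 gigabit/s each
old_pc_bus_bps        = 5e6 * 8       # 5 Mbyte/s bus, converted to bits/s

print(f"optical backplane: {optical_backplane_bps / 8 / 1e9:.0f} GByte/s aggregate")
print(f"old PC bus:        {old_pc_bus_bps / 8 / 1e6:.0f} MByte/s")
print(f"ratio:             ~{optical_backplane_bps / old_pc_bus_bps:,.0f}x")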
Here's the Wikipedia article on photonics:
en.wikipedia.org...
Now, I'm going to do a bit of digging on photonic transistor breakthroughs... that's really where the bread and butter is - we need the photonic
transistor before we can make the photonic processor... well, if we want something that behaves like our present computers.
This is about the best thing I could find with regards to the potential for light-transistors:
www.photonics.com...
Though taking that and making a CPU out of it is probably quite a ways off.