
REVERSIBLE COMPUTING: Computers 10^10 more energy efficient!

page: 3

posted on Dec, 28 2010 @ 03:37 PM
Originally posted by harrytuttle
The fundamental problem with Kurzweil's predictions is that he believes thought/consciousness is caused by neural calculations (as does most of Western medical science).

The truth is that neural calculations are caused by consciousness.

So until you can get consciousness bound into these computers of the future, they will always be just dumb machines.

You can program all you want for the machine to mimic thought and decision making processes, but they will always only be models.
edit on 28-12-2010 by harrytuttle because: clarification


While I agree that Kurzweil and most other neuroscientists have it wrong in holding that consciousness arises from the physical/neural (I think the brain, neural calculations, thoughts, etc. all arise from consciousness), there have been interesting theories put forward recently about consciousness as a purely physical phenomenon. Physical meaning within the realm of what science currently considers natural/physical (a boundary that keeps shifting as science progresses).

Here's the homepage of Chris King, a professor/lecturer who has worked out some quite heavy mathematics to explain consciousness: www.math.auckland.ac.nz...




posted on Dec, 29 2010 @ 04:56 PM
reply to post by VonDoomen
 



When I say simulating a person, I mean simulating everything about that person, including the physical aspect, down to the molecules, down to the neuronal connections.


At that point, you would need more mass than is required to be that person, in order to simulate that person molecule-for-molecule.

Nothing simulates a molecule quite like a molecule.

The only way around this gets into manipulating something like this through relativistic means: a computer of smaller mass could be placed well outside the influence of gravity within our galaxy and, given an extreme difference in field density, simulate the behavior of another mass (molecule-for-molecule) at a faster rate than the molecule being simulated actually moves.
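A rough sketch of the gravitational time-dilation factor behind this idea, using the Schwarzschild approximation; the example mass and radius (the Sun's) are just illustrative numbers, and the factor turns out to be tiny for ordinary bodies, which is why the scheme needs an extreme difference in field density:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8    # speed of light, m/s

def dilation_factor(mass_kg: float, radius_m: float) -> float:
    """Ratio of proper time deep in a gravity well to time far away,
    in the Schwarzschild approximation: sqrt(1 - 2GM / (r c^2))."""
    return math.sqrt(1.0 - 2.0 * G * mass_kg / (radius_m * C * C))

# A clock at the Sun's surface vs. a clock in deep space:
sun = dilation_factor(1.989e30, 6.96e8)
print(f"Sun-surface clock runs at {sun:.8f} of the far-away rate")
```

Even at the surface of the Sun the difference is only a couple of parts per million, so a computer "outside gravity" gains almost nothing relative to us; a meaningful speed-up would require a simulated system sitting very deep in a well.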

Otherwise, all simulations are merely functional models that describe behavior. An aluminum bar doesn't need to be simulated at the molecular level until certain regions exceed the capabilities of a simpler model. For example, a wall is a wall; the only time you need to simulate it at a molecular level is if, say, I were to punch that wall. For a concrete wall, you wouldn't even need to simulate it at a molecular level - were it the wall of this apartment, where I could dent it, you might have to get into simulating particles and soft bodies, but would not likely have to resort to the molecular level.

Of course, this raises the question of whether or not this universe is a sort of computer. Let's say we wanted to compute some very complex physics - we could create a parallel or sub-dimension that 'collapses' back into our own; data is gleaned from that universe in its own entropic demise and used to answer whatever questions we had. Considering this may all be happening within a fraction of a second to them, and given information-energy equivalence, this would be one hell of a data-crunching computer. Though it would be almost impossible to say what they would be attempting to simulate - or how our presence fits in. Are we the "ghost in the machine" - the product of quantum uncertainty - or the entire purpose of the simulation?

reply to post by Dance4Life
 


Hmm... I've been doing some more digging. Right now the biggest thing to come about is, essentially, tying a data bus directly to laser data buses - which means computers in five to ten years may be radically different. I can only imagine the kinds of crazy that will happen with a laser hyper-transport bus connecting a hundred or more "phenom x12" cores with several terabytes of RAM and less latency than existing buses. Buses with faster switching and gating times than the processor itself are practically a complete inversion of today's computers - it is usually the other way around, with the bus running nearly an order of magnitude slower than the processor clock.

It also means we could go to much wider buses and run them without risk of cross-talk. A thousand "angel hair" laser fibers could be run across a single layer of board where traditional board design would have been restricted to 128 or fewer traces total - and would have laid a considerable burden on the designers to prevent cross-talk and equalize the lengths of traces (so that signals arrive at the same time; when you're switching in the gigahertz range, that can mean the difference between chronic data corruption and none).
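A back-of-the-envelope sketch of why lane count matters for aggregate bandwidth; the per-lane rates below are assumptions picked for illustration, not measured figures for any real bus:

```python
# Aggregate bus bandwidth = number of lanes * per-lane rate.
# Both per-lane rates here are illustrative assumptions.

def aggregate_gbps(lanes: int, gbps_per_lane: float) -> float:
    """Total throughput in Gbit/s for a bus of parallel lanes."""
    return lanes * gbps_per_lane

copper = aggregate_gbps(128, 1.0)     # 128 traces at ~1 Gbit/s each
optical = aggregate_gbps(1000, 10.0)  # 1000 fibers at ~10 Gbit/s each

print(f"copper:  {copper:,.0f} Gbit/s")
print(f"optical: {optical:,.0f} Gbit/s ({optical / copper:.0f}x)")
```

Even with these made-up numbers, widening the bus and raising the per-lane rate compound multiplicatively - which is the appeal of optical interconnects.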

Anyway - looking for publications regarding actual photon processors is like looking for a mound of sugar in an ocean of salt. A lot of stuff that looks similar, but is not what you're looking for.

Of interest in my findings, however: www.rand.org...

The DoD has been looking into photonics applications since 1989 - and that covers some of their notes. Apparently, at least at that point in time, photonics was among the highest-priority technologies to develop and acquire.

www.wired.com...

This is a rather interesting piece of the past... a lot of the talk of photonics comes from the early 90s, it seems. Silicon seems to have vastly exceeded industry expectations and therefore delayed the market drive for photonics.

To quote:


Secondly, by making transistors smaller, one can make chips that run faster while using less power. This increase in the number of transistors on a chip has souped-up operating speeds to supersonic levels. Today's densest CPU chips, which are clocked at up to 100-Mhz and have 32-bit data paths, contain just a few million transistors per chip and run at speeds nearing 100-million instructions per second. Even within the limits of current technology, the year 2000 could bring us 500-Mhz, 64-bit chips packed with more than 100-million transistors that could cruise at more than 2-billion instructions per second.

RAM, the basic component of computer memory, will undergo a similar growth in capacity. Semiconductor manufacturers claim that they will be able to produce gigabit (a billion bits) RAM chips within the next 10 years. By comparison, today's RAM chips just recently hit the 16-Mbit, or 16-million-bit, level. (An interesting aside: one of those manufacturers also predicts that by the year 2000, most RAM will go into HDTV sets, not traditional computers.)


Kind of strange reading something from the past, like that - both how correct and how wrong it was.


A far more radical approach to chip development argues for the abandonment of electronics in favor of photonics - a technology based on the use of photons, the basic particles of light. Digital photonic processors have already been demonstrated by scientists at AT&T's Bell Laboratories, but commercial applications are still many years away. AT&T's experimental photonic processor uses the world's smallest laser to send light through a chip composed of many microscopic lenses and mirrors etched into quartz.

Photonic processors outshine electronic chips in several ways. Most important, light can carry more information than electricity, and more quickly. AT&T scientists estimate that photonic processors will process more than 1,000 times as much information as today's most powerful supercomputers.



With photonic processors, one can also expand the capabilities of the chip's data paths - a profoundly important feature. The biggest performance bottleneck of future systems will not be the speed of the processor, but rather the speed of getting information on and off the chip. With photonic processors, a computer could have an optical backplane that provided more than 1,000 I/O channels, each running at speeds of more than a gigabit per second, compared with the 5-Mbyte-per-second speed of standard personal computer buses.


- They nailed that one. That's exactly what is driving photonics today. ... Of course, they couldn't have imagined companies like IBM getting a terabit out of a fiber line... no one knows what to do with a terabit now... much less seventeen years ago.

Here's the wikipedia article on Photonics: en.wikipedia.org...

Now I'm going to do a bit of digging on photonic transistor breakthroughs... that's really where the bread and butter is - we need the photon transistor before we can make the photon processor... well, if we want something that behaves like our present computers.

This is about the best thing I could find with regards to the potential for light-transistors: www.photonics.com...

Though taking that and making a CPU out of it is probably quite a ways off.



posted on Jan, 5 2011 @ 01:41 PM
reply to post by Aim64C
 


Well, at our current level of computation, yes, simulating a human down to that detail is merely a pipe dream.

I do, however, believe it will be possible, and will require less mass. Remember, we're talking about simulating with photons, not actual physical molecules.

One thing in your post I really liked, though:

The idea of putting computers into deep gravity wells (time dilation). I did a paper for school on time travel, and one of the ideas was that if you could somehow situate yourself at the center of Jupiter, you would experience a good amount of time dilation.
I'm rather embarrassed to say I never considered putting a computer in that situation!
Thanks for the new idea; it's not every day I genuinely hear something new!



posted on Jan, 6 2011 @ 02:12 PM
I might be barking up the wrong tree, but... if you are changing the particles inside the rock to something more meaningful, then at what point does the rock stop being a rock?



posted on Jan, 6 2011 @ 07:18 PM
reply to post by Solarity
 


Thanks for the question.

And the answer is no. The rock would, for all purposes, be the same.
Take one atom of this rock, for example: it has a proton, an electron, and a neutron. We could say that if the electron is spinning in one direction, it is considered a 1, and if it is spinning the other way, it is a 0. That is the beginning of storing information in a binary manner. This is just one example. But no, the rock would still be a rock. In the future, anything could be a computer!
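A toy sketch of that idea - the "up"/"down" labels are purely illustrative stand-ins for the two spin states, not how any real spintronic device is read out:

```python
# Treat each particle's spin as one bit. The mapping is an arbitrary
# labeling we impose on the matter; the rock itself "knows" nothing.

SPIN_TO_BIT = {"up": 1, "down": 0}

def read_bits(spins: list) -> int:
    """Interpret a sequence of spin states as one binary number."""
    value = 0
    for spin in spins:
        value = (value << 1) | SPIN_TO_BIT[spin]
    return value

# Five particles in the rock, read as the binary number 10110:
print(read_bits(["up", "down", "up", "up", "down"]))  # → 22
```

The point of the sketch is that the information lives entirely in our chosen encoding: relabel "up" as 0 and the same rock stores a different number.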



posted on Jan, 7 2011 @ 02:24 AM
So the spin of the atom or particle has no effect on what it's actually making?



posted on Jan, 14 2011 @ 04:45 PM
reply to post by Solarity
 


It could make a difference, but that is only one characteristic of an atom. To be honest, I do not know what the best characteristics to use are; I'm just giving an example.

Just remember: the particle states are what tell the computer 1 or 0. The rock is not "intelligent"; we are just imbuing the rock with meaning through its atomic state.



posted on Jan, 14 2011 @ 04:47 PM
Here is a cool update on this story.

According to this video, there is now a 16-qubit quantum computer. Traditionally, these computers had been limited to 1-2 qubits. This is an exponential leap in capability.
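The "exponential leap" can be made concrete: an n-qubit register is described by 2^n complex amplitudes, so going from 2 to 16 qubits multiplies the state space by 2^14. A quick illustration:

```python
# Each added qubit doubles the number of basis states the
# register's state vector spans.

def basis_states(n_qubits: int) -> int:
    """Number of computational basis states for n qubits."""
    return 2 ** n_qubits

for n in (1, 2, 16):
    print(f"{n:>2} qubits -> {basis_states(n):,} basis states")
```

So a 16-qubit machine works with 65,536 basis states where a 2-qubit machine has only 4 - exponential growth in the size of the state it can hold in superposition.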




posted on Jan, 14 2011 @ 04:47 PM
Gosh, reading this I'm thinking:
An alien civ comes and sees us, and they're like, "Wow, they still war. *sigh*... OMG, they actually figured out the most efficient way of computing."

This is super future stuff





posted on Sep, 5 2012 @ 06:58 PM
reply to post by Aim64C
 


I was reading over this thread again, and this is the post in particular I wanted to comment on.

Yes, you could require an insane amount of computational medium if you wanted to perfectly simulate a human. But I think we are missing one important variable, and that is time. I am not sure what the shortest amount of time physics allows (the Planck time, perhaps), but that would be the time frame in which reality constantly computes itself.

Our simulation of reality would not need to be done down to that exact timescale. Considering humans only perceive around 30-60 frames per second, there would really be no point in simulating anything at the minimum time scale physics provides. I believe this would drastically cut down the computational power required to simulate something.
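A rough sense of how much slack that buys - taking the Planck time (~5.39e-44 s) as the candidate "shortest time", which is this poster's speculation rather than established simulation physics:

```python
# How many Planck-time "physics ticks" fit inside one 60 fps
# perceptual frame, if reality updated at the Planck scale.

PLANCK_TIME_S = 5.39e-44   # CODATA value, approximate
FRAME_S = 1.0 / 60.0       # one frame at 60 fps

ticks_per_frame = FRAME_S / PLANCK_TIME_S
print(f"{ticks_per_frame:.2e} Planck times per 60 fps frame")
```

Roughly 3 x 10^41 fundamental ticks per frame - so simulating only at perceptual resolution skips an astronomical number of steps, which is the poster's point about time being the variable that makes simulation tractable.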


