
The new "bit"... IBM shatter's Moore's Law. New era of quantum computing has arrived!


posted on Jan, 14 2012 @ 07:50 AM
reply to post by YouAreLiedTo
 


Evolution has no peak. It simply produces what works.


I believe, therefore, that we are the pinnacle of logic, because if we weren't, we wouldn't understand the universe.

OK, yes, we don't yet understand the universe, but we have yet to come across something that is beyond explanation or understanding by our brains. The fact that we have learned so much in such a short time proves that our brains function logically at the top of the chart. For if we did not understand logic, how could we use the logic of the universe to our advantage?

Therefore, we can at least say that evolution always produces the same logic-processing system, and as such, we are the pinnacle of logic.

Each of our brains can construct entire universes and beyond. Each one of our brains is infinite, yet made up of finite matter.



posted on Jan, 14 2012 @ 08:04 AM
A smaller required area for data storage does not automatically mean you will get more storage for the same cost, nor that it will be fast. I think the more relevant questions are what it costs to produce and how fast it is, and to a lesser degree how much space it takes.

Although 12 atoms is very small, I have also heard of transistors with a depletion layer only 5 atoms thick. At this scale the usual transistor models no longer apply and a new way of engineering is required, so we are already at the limits of the current technology.



posted on Jan, 14 2012 @ 03:08 PM
reply to post by YouAreLiedTo
 



For the title? The only place to go from here is QBit quantum computing...


That has been the "only place to go" for the past 10 years of computer development.

That doesn't bring us "closer" to having functional quantum computing. It means we have pushed the existing technology and process as far as we know how to do it.


It physically is impossible to get any smaller at this point with the way magnetic charges work...


Wait 5 years.

It doesn't have to be magnetic fields, directly. Silicon manufacturing has been pushed beyond several plateaus in the past using a number of different methods. For example, we could begin seeing 'holographic' memory structures over the top of existing storage materials, once we have run out of linear advances.


Anything past this will be in the field of quantum computations...


No, not really. There are many phenomena out there that can be exploited to various ends. I remember, only a couple of years ago, reading about how a special arrangement of graphene was able to cause electrons to behave as though they were under a magnetic field several thousand times stronger than we can create at the macroscopic level.

Does that apply to this? I don't really know. The point is - there are plenty of other phenomena out there, aside from quantum mechanics, to exploit.

We are, actually, quite behind the curve. Processing and data busing should be done by optical components. However, several key advances in silicon lithography and internal data buses proved to be more economical than making the leap to light-based computing.

And if you want to talk about something that can utilize quantum mechanics... the photon is it (electrons don't really cut it; for one, they travel very slowly, and only their potential travels at roughly the speed of light). If you want to build a data and processing system off of quantum mechanical principles, the photon is going to allow many more direct interactions with the various components.



How is this sensationalized?


Because there is absolutely no relevance to quantum computing.

To form an analogy... this is like a breakthrough in fuel cell technology that allows virtually any hydrocarbon to be used in the fuel cell with twice the efficiency of existing internal combustion engines. That would be a -very- important and influential development.

But you post it with the title of: "GE shatters the oil industry - new era of free energy has arrived!"

Which is exactly how relevant this development is to quantum computing.



posted on Jan, 24 2012 @ 09:53 PM
reply to post by Aim64C
 

A photon-based technology will fulfill the illuminati's (light, light-bearers, etc.) long-term goals to a "T". Building and overlaying data computing and transportation on a foundation of both moving and steady-state photons will fully incorporate light itself, once again, into mankind's dominant communication systems.



posted on Jan, 24 2012 @ 10:00 PM
This isn't really THAT new. They've been talking about quantum computing and trying to store binary numbers on atoms for over a decade. If they can mass-produce this, and it's reliable, that's one thing. But this is probably 10-15 years away, and even then a more important breakthrough could occur in the meantime...



posted on Jan, 24 2012 @ 10:15 PM
Moore's Law stopped being Moore's Law a few years ago; otherwise we would be using 12 GHz computers right now. The problem is heat, along with programming tendencies and methods; their solution was to make dual/quad-core CPUs that aren't really any faster at running single applications; the speed is the same. Not many applications are made that take advantage of multi-core CPUs (Photoshop is one, for all you UFO "enthusiasts"; PS runs like a dream on dual/quad-core CPUs). I expect this might change in the future, but for now Moore's Law has run into these problems.
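
As a rough illustration of that point, here is a small Python sketch of Amdahl's law; the parallel fractions are made-up example values, not measurements of any real application, and the function name is just for illustration:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the fraction of the work that can run in parallel and n is the core count.
# The fractions below are made-up examples, not benchmarks.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for p in (0.0, 0.5, 0.95):        # fully serial, half parallel, mostly parallel
    for n in (1, 2, 4, 8):
        print(f"p={p:.2f}, cores={n}: ~{amdahl_speedup(p, n):.2f}x")
```

A fully serial program (p = 0) gets exactly 1.0x no matter how many cores you throw at it, which is the "dual/quad core isn't any faster" effect described above.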

There is no set-in-stone exponential growth other than the number of mouths to feed on planet Earth; Moore's Law ran into problems that need to be solved, just like anything else. With exponential growth come exponential problems; we are tripping over our own proverbial feet.
edit on 24-1-2012 by RSF77 because: (no reason given)



posted on Jan, 24 2012 @ 11:03 PM

Originally posted by RSF77
The problem is heat, along with programming tendencies and methods; ...

With exponential growth come exponential problems; we are tripping over our own proverbial feet


Thermodynamics. The Landauer limit is the relevant principle here. I wonder what Rolf Landauer would say to IBM today? Gibbs-slaps all around?

See?


Although the research took place at a temperature near absolute zero...


No no no... show, don't theorize...


the scientists wrote that the same experiment could be done at room temperature with as few as 150 atoms
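
For anyone curious what that Landauer limit actually works out to, here is a minimal Python sketch; the temperatures are illustrative assumptions (roughly "near absolute zero" vs. room temperature), not figures from the IBM paper:

```python
# Landauer's principle: erasing one bit dissipates at least E = k_B * T * ln(2).
# Temperatures below are illustrative, not taken from the experiment.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit_joules(temperature_kelvin: float) -> float:
    """Minimum energy (joules) to erase one bit at the given temperature."""
    return K_B * temperature_kelvin * math.log(2)

for temp_k in (1.0, 300.0):
    print(f"T = {temp_k:6.1f} K -> {landauer_limit_joules(temp_k):.3e} J per bit erased")
```

Either way, that is a thermodynamic floor many orders of magnitude below what today's memory actually dissipates per bit, so it isn't the binding constraint here; at this scale the issue is thermal stability of the magnetic state, which is why the article talks about needing around 150 atoms at room temperature.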

edit on 24-1-2012 by emberscott because: (no reason given)



posted on Jan, 24 2012 @ 11:23 PM

Originally posted by RSF77
Moore's Law stopped being Moore's Law a few years ago; otherwise we would be using 12 GHz computers right now. The problem is heat, along with programming tendencies and methods; their solution was to make dual/quad-core CPUs that aren't really any faster at running single applications; the speed is the same. Not many applications are made that take advantage of multi-core CPUs (Photoshop is one, for all you UFO "enthusiasts"; PS runs like a dream on dual/quad-core CPUs). I expect this might change in the future, but for now Moore's Law has run into these problems.

There is no set-in-stone exponential growth other than the number of mouths to feed on planet Earth; Moore's Law ran into problems that need to be solved, just like anything else. With exponential growth come exponential problems; we are tripping over our own proverbial feet.
edit on 24-1-2012 by RSF77 because: (no reason given)


Not necessarily...

We are already running 8.4 GHz on a single core...



Again, granted, we are super-cooling components here...

But it is still a proven scientific reality nonetheless...

And there is plenty of software that is made to fully optimize the use of 6-8 core processors...

Most people just don't happen to use them. But when the technology is cheap enough to produce, why would you not get a multi-core processor that is ready for the next generation of software/OS?

I got my wife a Sandy Bridge for less than the cost of an Xbox on Black Friday. Why in the world wouldn't you get it?



posted on Jan, 25 2012 @ 12:46 AM

Moore's Law stopped being Moore's Law a few years ago; otherwise we would be using 12 GHz computers right now.

Moore's Law has little to do with frequency.


Moore's law describes a long-term trend in the history of computing hardware: the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

en.wikipedia.org...'s_law


And essentially it has held, although it might be a little slower in the future.


The problem is heat, along with programming tendencies and methods; their solution was to make dual/quad-core CPUs that aren't really any faster at running single applications...


Silicon scales poorly past about 3.5 GHz, and it can be difficult to make applications that scale across many cores. So chip designers have been busy slowly adding cores, slowly increasing frequency, adding dynamic frequency scaling (Turbo Boost and Turbo Core), as well as increasing the number of instructions per clock.

A modern processor like the i7-2600 has a 3.4 GHz base frequency and a 3.8 GHz turbo frequency. Per-core performance is about 2.7 times higher than a Pentium 4 at the same frequency, which means an i7 is at minimum about 3 times faster than an old Pentium 4 @ 3.4 GHz in a single-threaded application, and at maximum about 11 times faster in a multithreaded application. So a modern processor will be much faster in single-threaded applications and massively faster in multithreaded ones.
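
To make that arithmetic explicit, here is a quick back-of-the-envelope check in Python; the 2.7x per-core figure and the clock speeds are the ones quoted above, and ideal scaling across four cores is assumed:

```python
# Back-of-the-envelope check of the speedup figures quoted above.
ipc_gain = 2.7                      # per-core performance vs. Pentium 4 at the same clock
p4_clock_ghz = 3.4                  # old Pentium 4
i7_base_ghz, i7_turbo_ghz = 3.4, 3.8
cores = 4

single_thread = ipc_gain * (i7_turbo_ghz / p4_clock_ghz)          # one core at turbo clock
multi_thread = ipc_gain * (i7_base_ghz / p4_clock_ghz) * cores    # ideal 4-core scaling

print(f"single-threaded: ~{single_thread:.1f}x")   # ~3x
print(f"multithreaded:   ~{multi_thread:.1f}x")    # ~11x
```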

Most modern demanding programs usually scale well with 2-4 cores, but not perfectly. This isn't 2006 anymore.

Die size for the Pentium 4 (Prescott) is 122 mm^2 with 125 million transistors; it came out in about 2005. Density is therefore about 1 million transistors per square mm. Die size of the quad-core 2nd-gen i7 (Sandy Bridge) is 216 mm^2 with 1160 million transistors; it came out in 2011. Bear in mind the Sandy Bridge die also includes the integrated graphics and north bridge. Density is therefore about 5.4 million transistors per square mm.

If Moore's law were completely accurate, we would expect about 8 million transistors per square mm. But this Sandy Bridge vs. Prescott comparison would suggest that transistor density goes up by about 75% every two years (1.75^(6/2) ≈ 5.4). In reality this is just one example, and it's transistor density rather than the total number of transistors in a chip. But in any case it's pretty close to Moore's law and shows the general trend. The total number of transistors, however, has gone up by a factor of about 9.
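
The arithmetic can be checked directly from the die sizes and transistor counts already quoted; here is a quick Python sketch (density only, ignoring that the Sandy Bridge die also carries the GPU and north bridge, as noted above):

```python
# Transistor density growth, Prescott (~2005) vs. quad-core Sandy Bridge (2011),
# using only the figures quoted above.
prescott_area_mm2, prescott_transistors = 122, 125e6
sandy_area_mm2, sandy_transistors = 216, 1160e6

d_old = prescott_transistors / prescott_area_mm2      # ~1.0 million / mm^2
d_new = sandy_transistors / sandy_area_mm2            # ~5.4 million / mm^2

periods = 6 / 2                                       # six years = three 2-year periods
growth = (d_new / d_old) ** (1 / periods)             # ~1.75, i.e. ~75% per 2 years
moore_density = d_old * 2 ** periods                  # ~8 million / mm^2 if density doubled

print(f"density: {d_old/1e6:.2f} -> {d_new/1e6:.2f} M transistors/mm^2")
print(f"growth per 2 years: ~{(growth - 1) * 100:.0f}%")
print(f"Moore's-law density prediction: ~{moore_density/1e6:.1f} M/mm^2")
print(f"total transistor growth: ~{sandy_transistors / prescott_transistors:.1f}x")
```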

The sweet spot for desktop processors at the moment is four cores, and that won't change soon. The replacement for Intel's current generation of desktop Core series processors will only be slightly faster (+100 MHz boost and 5-10% higher IPC) than its predecessors, because the core count is about the same and so is the architecture (the integrated graphics will be much faster, though). But Moore's Law will still not be broken by much, because the die size will decrease from 216 mm^2 to 172 mm^2 while the transistor count increases from 1.16 billion to 1.4 billion.

On the other hand, server and workstation software typically scales better with cores, so those chips have more cores at slightly lower frequencies.

Graphics cards have different design requirements: the fastest GPU at the moment is the 7970, which has 4313 million transistors and runs at 925 MHz. It has 2048 'cores', but that's partly marketing. Graphics workloads scale almost linearly with cores, unlike typical CPU performance.
edit on 25/1/12 by C0bzz because: (no reason given)




posted on Jan, 25 2012 @ 10:31 AM
Informative responses. I wonder, if graphene is all it's cracked up to be, whether there will be another push in the speed of single processors/cores. I don't know too much about graphene, but it seems a lot of people think it will solve the problem of heat in very fast processors?

Either way, I think there will probably be all kinds of new problems related to Moore's Law. Once the heat problems are overcome, I tend to think eventually there will be another barrier. For instance, there is bound to be a point in time when the complexity of processors gets so intense that we won't be able to continue making faster ones until there is a major scientific breakthrough. I suppose computing could push science ahead, but I still think there are quite a few barriers ahead as far as computing goes. Maybe I'm not thinking out of the box enough, with quantum computing and all that.

I just have a negative opinion of dual/quad-core CPUs, mainly because I think they ran into a brick wall a few years back and this was their answer to expectations like Moore's Law; yet no one was prepared for it, so it seems kind of ahead of its time and quickly pushed onto the shelves. It's innovative, though, and it will be interesting to see where they take this; perhaps it will lead to a revolution in programming and processors. Maybe some time in the future people will write programs that can take advantage of any number of cores.

The main reason I wouldn't get a quad-core for my own purposes right now is money, along with the fact that it wouldn't give me any real benefit in the very near future. I tend to buy stuff after the price goes down; I just bought a good dual-core CPU for pretty cheap, so maybe prices have already dropped and I just didn't notice.

Also... Intel.


Not liking some of the things they are up to.
edit on 25-1-2012 by RSF77 because: (no reason given)



posted on Jan, 25 2012 @ 11:25 AM
reply to post by DaRAGE
 


My thought was just to double the number of atoms and store the data as redundant info.

Add a parity bit to each copy; if the parity check fails on one data set, refer to the redundant info.
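
As a rough Python sketch of that scheme (purely illustrative; the "cells" here are ordinary integers, not 12-atom magnets, and a real device would use a proper ECC code rather than simple parity):

```python
# Toy sketch of redundancy + parity: keep two copies of each byte, each with its
# own even-parity bit, and fall back to whichever copy still passes its check.
# Note that a single parity bit only detects an odd number of flipped bits.

def parity(byte: int) -> int:
    """Even-parity bit for an 8-bit value."""
    return bin(byte & 0xFF).count("1") % 2

def store(byte: int):
    """Return two redundant (value, parity) records."""
    return [(byte, parity(byte)), (byte, parity(byte))]

def read(records) -> int:
    """Return the first copy whose stored parity still matches its value."""
    for value, p in records:
        if parity(value) == p:
            return value
    raise ValueError("both copies failed the parity check")

cells = store(0b10110010)
cells[0] = (cells[0][0] ^ 0b00000100, cells[0][1])  # corrupt one bit in the first copy
print(bin(read(cells)))                             # recovered from the intact second copy
```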



posted on Jan, 25 2012 @ 06:13 PM
reply to post by vogon42
 


Yes, well, that would work too. Sounds more practical too ;-P


