The Replacement of the Silicon Chip?

posted on May, 14 2008 @ 09:45 PM
The silicon chip is reaching its limits as software of all kinds demands more from common processors and the transistors that make them work. Current silicon processors also burn more and more electricity as they grow more powerful. Well, there's a new transistor on the block, and it isn't the long-awaited carbon transistor; this baby is in a class all its own. It's made of a compound called gallium nitride (GaN), which is more efficient in power use, faster at transferring information, and runs far cooler than your current silicon-transistor-based chip. The article below explains the rest.

Alternative To Silicon Chip Invented By Student
www.sciencedaily.com...

ScienceDaily (May 13, 2008) — Even before Weixiao Huang received his doctorate from Rensselaer Polytechnic Institute, his new transistor captured the attention of some of the biggest American and Japanese automobile companies. The 2008 graduate's invention could replace one of the most common pieces of technology in the world--the silicon transistor for high-power and high-temperature electronics.


Is this the face of the death of silicon? Or more wishful thinking? Do you suppose this will be put on the back burner while large computer corporations continue to flood the market with obsolete silicon chips, which are not only environmentally harmful but expensive, and don't do well in hot environments? Your thoughts?



posted on May, 15 2008 @ 01:12 AM
I think these people will buy buddy's chip but, like most businesses, will continue with the current technology until it's incredibly obsolete. That, or the "free market god" could make one of the companies competitive (a.k.a. desperate) and they could adopt it first. I await the future...



posted on May, 15 2008 @ 02:58 AM
If I were a betting man, I'd bet that major corporations will keep using silicon until their transistor sizes are down to 10-20nm, and only then switch to something altogether different, like quantum computing (if it no longer requires supercooling) or possibly some kind of optical computing technology.

Knowing the industry, I'd say that this kind of transistor will only see specialized applications.



posted on May, 15 2008 @ 05:53 AM

Originally posted by projectvxn
Current silicon processors also burn more and more electricity as they grow more powerful


That's not true; the market demands energy-efficient chips because they run cooler, which makes them more overclockable.

The new chips stay way cooler than the last generation of dual/quad cores and the Pentiums.



posted on May, 15 2008 @ 12:17 PM
reply to post by purplemonkey
 
That's only true to a point. They require external materials and devices to keep them cool, and fans take power. Thermal gel is oil-based. What I would like to see is cool-running processor technology rather than processor-cooling technology.



posted on May, 15 2008 @ 12:32 PM
reply to post by mdiinican
 
Quantum computing is advancing every day, but there are still major hurdles in creating an interface that wouldn't collapse the quantum channel it was transmitting on.

There's also been a lot of talk about the carbon processor, which is supposed to run super-cool (not cryogenic, but certainly much cooler than standard silicon chips).

But now here we have something that is well within our means, yet will allow for far better computers. The basic idea is still the same as the silicon structure, but now you can do far more with that structure, at lower temperatures, higher energy efficiency, and higher speeds.

Silicon is already incredibly obsolete. I think the only thing that could hold back a chip like this is greed.



posted on May, 15 2008 @ 04:12 PM

Originally posted by projectvxn
reply to post by mdiinican
 
Quantum computing is advancing every day, but there are still major hurdles in creating an interface that wouldn't collapse the quantum channel it was transmitting on.

There's also been a lot of talk about the carbon processor, which is supposed to run super-cool (not cryogenic, but certainly much cooler than standard silicon chips).

But now here we have something that is well within our means, yet will allow for far better computers. The basic idea is still the same as the silicon structure, but now you can do far more with that structure, at lower temperatures, higher energy efficiency, and higher speeds.

Silicon is already incredibly obsolete. I think the only thing that could hold back a chip like this is greed.



The only reason we build anything is greed. Unless this kind of transistor can be made in IC form on the same machines that are used to make silicon chips, it's probably not going to catch on; the investment would be massive. Switching materials is a huge step, even if it sounds simple. It would take years to work out what the subtle differences are, and how to get around little quirks in the process. The costs could seriously be in the billions of dollars.

Even with gallium nitride chips, there's no advantage in the minimum size of the transistors. They've got advantages in general, sure, but they'll hit rock bottom in terms of size right along with silicon. Transistors in general only have a paltry few decades before we have to go and find something we can make denser.
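
As a back-of-envelope sketch of that timescale (my own illustrative assumption, not anything from the article): if you assume the classic Moore's-law cadence of a roughly 0.7x linear shrink every two years, starting from the 45nm process current in 2008, the 10-20nm range mentioned earlier arrives quickly:

    # Back-of-envelope node-scaling projection (assumes a ~0.7x linear
    # shrink every two years, starting from 45nm in 2008).
    node_nm, year = 45.0, 2008
    while node_nm * 0.7 >= 10.0:   # stop once the next shrink would pass 10nm
        node_nm *= 0.7             # one full process-node shrink
        year += 2                  # roughly every two years
        print(f"{year}: ~{node_nm:.0f}nm")
    # Prints: 2010: ~32nm, 2012: ~22nm, 2014: ~15nm, 2016: ~11nm

On those assumptions the size wall is less than a decade out, and any further density gains would have to come from something other than shrinking.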

Of course, if the advantages are great enough, it will definitely see specialist applications. You can see all kinds of strange devices when cost is no issue. The aerospace industry, for example, already uses all kinds of strange materials, in the chips, packaging, and PCB substrate.



posted on May, 15 2008 @ 07:44 PM
It would be money well spent to upgrade our digital infrastructure at the most basic physical level. Think of the advances computer-based research fields could accomplish; they might even figure out that quantum interface issue. New programming languages would have to be developed. This would revitalize an already hurting, almost dead software/hardware industry in America. Couple these efforts with energy changes and you'll be spending trillions in the long run. But debts will be paid, new industries created, people will be working, education funding will grow, and so forth. These are the reasons to change an infrastructure. Societies in general have been changed deeply by just the right innovation, because there were people who took risks and saw to it that we live in a world of technological advancement.

Without risk, nothing can come of innovation.



posted on May, 15 2008 @ 10:34 PM

Originally posted by projectvxn
reply to post by purplemonkey
 
That's only true to a point. They require external materials and devices to keep them cool, and fans take power. Thermal gel is oil-based. What I would like to see is cool-running processor technology rather than processor-cooling technology.


No, that's not true at all. The Thermal Design Power (TDP) of processors is steadily going down because people demand more energy-efficient designs. The Pentium D (circa 2005) topped out at 130 watts. Core 2 Duo (circa 2007) topped out at 75 watts. The latest Core 2 chips top out at 35 watts. And that's just desktop processors. Intel has mobile processors with a TDP less than 5.5 watts.

In other words, the amount of external cooling processors require has steadily been decreasing. Intel's latest Atom processors for ultra-mobile computers don't even require a fan.
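
To put that trend in numbers, here is a quick sketch using only the TDP figures cited above (the percentage arithmetic is purely illustrative):

    # TDP figures quoted above, with each generation's drop relative
    # to the Pentium D baseline (illustrative arithmetic only).
    tdp_watts = {
        "Pentium D (2005)": 130,
        "Core 2 Duo (2007)": 75,
        "Latest Core 2 (desktop)": 35,
    }
    baseline = tdp_watts["Pentium D (2005)"]
    for chip, tdp in tdp_watts.items():
        drop_pct = 100 * (baseline - tdp) / baseline
        print(f"{chip}: {tdp}W ({drop_pct:.0f}% below the Pentium D)")
    # Core 2 Duo comes out 42% below the baseline; the latest parts, 73% below.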



posted on May, 16 2008 @ 01:43 AM
My old X850 had about 160 million transistors and a TDP of 90 watts. My 8800GT has about 720 million transistors and the same 90-watt TDP, with at least four times the performance. My old Pentium 4 650 was a single core with a rated TDP of 88 watts and 169 million transistors. My new quad-core Q6600 has a TDP of 95 watts and a total of 582 million transistors. At stock clocks of 2.4GHz, each core is 40% faster than my old P4, and I have four of them. These technologies are only three years apart.

Not only are these dramatic increases in efficiency, they're dramatic increases in speed for roughly the same thermal design power. On top of that, the maximum thermal temperature goes up with each line of processors; in the Core 2 Quad line, for example, it is constantly being raised. The QX9770 has a higher maximum temperature than the QX9650, which is again higher than the G0 Kentsfield, which itself is higher than the B3-revision Kentsfield.
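
Taking the figures above at face value (the transistor counts and TDPs are the poster's; the "4x" GPU speedup and the 40%-per-core CPU claim are carried over as-is, not independently measured), the performance-per-watt arithmetic works out like this:

    # Transistors per watt for the two GPUs quoted above (both 90W TDP).
    gpus = {"X850": (160e6, 90), "8800GT": (720e6, 90)}  # (transistors, TDP in W)
    for name, (transistors, tdp) in gpus.items():
        print(f"{name}: {transistors / tdp / 1e6:.1f}M transistors per watt")
    # X850: 1.8M/W vs 8800GT: 8.0M/W -- 4.5x the density at the same TDP

    # CPU side: four cores, each claimed ~1.4x a Pentium 4 650,
    # at nearly the same TDP (95W vs 88W).
    p4_tdp, q6600_tdp = 88, 95
    throughput_ratio = 4 * 1.4                      # ~5.6x aggregate throughput
    perf_per_watt = throughput_ratio / (q6600_tdp / p4_tdp)
    print(f"Q6600 vs P4 650: ~{perf_per_watt:.1f}x the performance per watt")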

And even my Q6600 is heavily UNDERCLOCKED when shipped. I can get 3GHz (from 2.4GHz) with the stock cooler at stock voltage while still staying within thermal specifications. The coolers can handle it. The silicon can handle it. Why not dramatically increase TDP for a dramatic increase in performance? There is no good reason not to; we're nowhere near a limit with what we're doing now, and won't be for many years.
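
The headroom being claimed is easy to quantify from the clocks quoted above (illustrative arithmetic only):

    # Stock-cooler, stock-voltage headroom claimed above for the Q6600.
    stock_ghz, oc_ghz = 2.4, 3.0
    print(f"Overclock headroom: +{100 * (oc_ghz / stock_ghz - 1):.0f}%")  # +25%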

For the record, I have my Q6600 at 3.6GHz, faster than a $1700AUD processor.

It would of course be fantastic to have a totally new manufacturing process, but let's remember that Intel spends millions, if not hundreds of millions, researching this kind of thing, and it hasn't been done. What would the yields be? Sure, there ARE 10nm and other experimental chips out there, but the yields would be so extremely low that it wouldn't be worth the investment. I know someone who gets engineering-sample video cards; the last one I knew about was a sample of the 2900XTX: 18000 in 3DMark06 almost two years ago. For obvious reasons it wasn't released to the market; yields were EXTREMELY low, meaning the majority made had defects.

In conclusion, there has been a massive amount of new manufacturing technology out there for computer chips; however, you simply won't get the return on your investment. To get that, the manufacturing process has to be mature, so I say keep funding experimentation. For the near future, though, it would be smartest to stick with 45nm and then move to 32nm.




The Pentium D (circa 2005) topped out at 130 watts. Core 2 Duo (circa 2007) topped out at 75 watts. The latest Core 2 chips top out at 35 watts. And that's just desktop processors.

Actually, the Core 2 line tops out at 136 watts. However, it's at least five times as fast as its Pentium D counterpart for the same power use. With a $50 cooler you can push the 3.2GHz QX9770 to 4.5GHz within maximum thermal design limits, completely stable.


Thermal Design Power: 136 watt.

processorfinder.intel.com...

The new duals put out:

Thermal Design Power: 65 watt.

processorfinder.intel.com...

That's the same as the E6700, an early Core 2; the Extreme version of an early Core 2 dual (X6800) was 75 watts.

The 5.5-watt processors are usually slow piles of crap not even worth considering. Whoever buys one, I laugh at, because the only reason they use so little power is that they are severely underclocked, read: HANDICAPPED. They are NOT more efficient than chips that use more power and run faster; they're just so friggen slow in order to save power.

If you want to save power, get something that's only what you need, then enable all its power-saving options. For the amount of electricity they use, processors are far more efficient than they've ever been, with more power-saving options than there have ever been.





In other words, the amount of external cooling processors require has steadily been decreasing. Intel's latest Atom processors for ultra-mobile computers don't even require a fan.

Any modern processor can run without a fan or even without a heatsink; however, it will automatically underclock itself so much that you'll be handicapped, just like Atom. Intel did it for their Core 2 demo back in 2006.

And you probably don't want to try removing the heatsink.


[edit on 16/5/2008 by C0bzz]


