
A Revolutionary Gaming Processor

posted on Nov, 1 2004 @ 06:31 PM
Now that processors are getting smaller and smaller, and more powerful by the minute, they require serious cooling systems because they produce so much heat. Instead of concentrating so much heat in a tiny little area, why not make processors somewhat larger, so it is easier to remove the heat from them?

Not so huge that they are unmanageable, but at least 5" by 5". That way, it is not too big, and we can remove heat faster and easier without expensive water cooling towers, or even liquid nitrogen (see "A Computer That Is Too Fast", bottom of page 1).

With larger processors, you might think that we could crank out a ton of computing power from them. No! Just keep them the same speed, and then overclock them to the extremes (8+ GHz) without pricey components.

I know that I post a lot about computers, but I guess it's just my thing.


[edit on 11/1/04 by diehard_democrat]



posted on Nov, 1 2004 @ 06:42 PM
lol, was that a joke?

Are you saying we should go back to the wheel? If you prefer old computers, you can buy them used.

As for cooling the processor down, they'll just create new materials that are resistant to high temperatures. No need for it to be cooled down.

Small processors have a lot more practical applications than large, bulky ones like the ones you're suggesting.

Think of it this way: why buy a watch when you can get a clock? The answer is ridiculously obvious; the question is just there to point out the differences between bulky, large components and lightweight, useful ones.

I also suggest checking out www.howstuffworks.com and typing "processor" into the search engine to find out where processors are used. Computers are not the only machines that use processors.



posted on Nov, 1 2004 @ 06:49 PM
Yeah, I know they're not just in computers. They're in calculators, there's even one in my telescope, and in Walkmans too.

And no, I didn't mean that we should go back and buy old processors, just turn a 3.6 GHz Pentium 4 from 3 cm by 3 cm into something like 10 cm by 10 cm by spreading out the internal components. That way, you have the same speed, but it is easier to cool because everything is spread out more. Hence, it's better for the processor if you overclock it, and with that cooling boost, you could overclock way beyond what you normally could with a traditional processor.
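For what it's worth, the heat-density side of that argument is easy to put numbers on. A rough sketch in Python (the 100 W figure is an assumed round number, not a real Pentium 4 spec):

```python
# Average heat flux over a square die: same power, bigger area,
# fewer watts per square centimetre for the cooler to deal with.
def heat_flux(power_w, side_cm):
    """Average heat flux in W/cm^2 for a square die of the given side."""
    return power_w / (side_cm * side_cm)

power_w = 100.0  # assumed total CPU dissipation, watts

print(heat_flux(power_w, 3.0))   # 3 cm x 3 cm die: ~11.1 W/cm^2
print(heat_flux(power_w, 10.0))  # 10 cm x 10 cm die: 1.0 W/cm^2
```

Spreading the same wattage over roughly ten times the area cuts the average heat flux by the same factor, which is the whole point of the suggestion (real chips also have hot spots, so the averages flatter the big die a little).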

Try to be supportive if you're going to post here, people.



posted on Nov, 1 2004 @ 08:00 PM

Originally posted by diehard_democrat
Try to be supportive if you're going to post here, people.


Try to be open minded when you post here. If you want to make a post so that everyone will pat you on the back, agree, and tell you how great your thoughts are, you're in the wrong place, because when you're wrong there's something you're missing, and other people need to know the truth too. Yes-men are for wankers.

Recently Slashdot ran an article on a guy who used a car radiator to water-cool his processor; it seems to have worked pretty well.

www.winbeta.org...



posted on Nov, 1 2004 @ 08:44 PM
Here's the future of CPU cooling:


Cooligy's Active Micro-Channel Cooling technology utilizes highly efficient means to absorb heat from the chip's hot spots and quickly dissipate it to keep the chip cool. The Cooligy cooling system employs a heat collector fabricated from a thin layer of micro-machined silicon that fits on top of a microprocessor package. A very dense area of Micro-Channels etched into the silicon enables fluid to circulate through the heat collector and efficiently absorb and take away heat. Cooligy's system has been shown to effectively cool microprocessor hotspots of up to 1000 watts per square centimeter.


Here is the company's website: www.cooligy.com...



posted on Nov, 1 2004 @ 08:53 PM
While it wouldn't really hurt ergonomically if a chip was bigger, there are timing considerations. Even though the signals are moving at almost the speed of light, a few microns can make a difference in the speed of a computation.
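A back-of-envelope sketch of that timing point, assuming signals propagate at about half the speed of light on-chip (the exact fraction varies with the interconnect, so these are illustrative numbers only):

```python
C_M_PER_S = 3.0e8                # speed of light in vacuum, m/s
signal_speed = 0.5 * C_M_PER_S   # assumed effective on-chip speed

def crossing_time_ns(die_side_m):
    """Time for a signal to cross the die once, in nanoseconds."""
    return die_side_m / signal_speed * 1e9

clock_period_ns = 1.0 / 3.6  # one cycle at 3.6 GHz, ~0.28 ns

print(crossing_time_ns(0.03))   # 3 cm die: ~0.2 ns, under one cycle
print(crossing_time_ns(0.127))  # 5 in die: ~0.85 ns, about 3 cycles
```

Crossing a 5-inch die takes roughly three clock periods at 3.6 GHz, so a signal could no longer make it corner to corner within a single cycle.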



posted on Nov, 1 2004 @ 08:55 PM
To answer your question: it is a matter of economics. The smaller the processor, the more of them they can get out of a sheet of material.

A 5" x 5" processor would cost $50-100K per unit or more. Just think how many 1/4" x 1/4" processors you can fit in 5" by 5".
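The area arithmetic behind that point, straight from the numbers in the post:

```python
# How many 1/4" x 1/4" dies fit in the footprint of one 5" x 5" die
# (ignoring scribe lines and wafer-edge losses, so this is optimistic).
big_side_in = 5.0
small_side_in = 0.25

per_big_die = (big_side_in / small_side_in) ** 2
print(per_big_die)  # 400.0
```

Every 5-inch die displaces about 400 quarter-inch dies, so the big part would have to sell for hundreds of times the price just to break even on silicon.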



posted on Nov, 1 2004 @ 09:08 PM
On top of that, the majority of the chips they make test bad anyway - so it is exponentially more wasteful.



posted on Nov, 1 2004 @ 09:25 PM
IMO, any time you work with the semiconductor wafers that chips are made from, on top of the cost considerations (which come first), you also need to take the properties of semiconductors as a whole into account.

The term semiconductor means not quite a conductor and not quite an insulator: a material, typically silicon, with a specific resistance to the flow of electrons that is much greater than that of the best conductors - silver first, copper second, gold third; even aluminum is a better conductor.

If you increase the distance between point A and point B in a semiconductor-based circuit, you increase the reactance/resistance between those two points as well, assuming the layers stay the same thickness. The heat created by the longer path and the higher resistance would offset any gains in speed. It would also make serious overclocking impossible: the extra resistive heating in the material itself would melt the larger chip under stress long before it reached the speeds a smaller chip could sustain.
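The resistance half of that argument follows from R = ρL/A: stretch a conductor and its resistance grows in proportion to the length. A sketch with assumed numbers (the copper resistivity is a standard approximate value; the cross-section is made up):

```python
def resistance_ohms(rho_ohm_m, length_m, area_m2):
    """Resistance of a uniform conductor: R = rho * L / A."""
    return rho_ohm_m * length_m / area_m2

RHO_CU = 1.7e-8   # copper resistivity, ohm*m (approximate)
AREA = 1.0e-12    # assumed 1 um^2 interconnect cross-section

short_run = resistance_ohms(RHO_CU, 0.003, AREA)  # 3 mm wire
long_run = resistance_ohms(RHO_CU, 0.009, AREA)   # 9 mm wire
print(long_run / short_run)  # 3.0: triple the length, triple the R
```

Longer paths dissipate more power as I²R heating for the same current, which is where the extra heat on a stretched-out chip would come from.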



posted on Nov, 1 2004 @ 09:37 PM
Liquid cooling or liquid nitrogen are the ways to go for extreme purposes; just make sure your system doesn't spring a leak :X Besides, the smaller the processor, the more I can fit into my spacious tower, which holds a motherboard larger than my damn monitor!



posted on Nov, 1 2004 @ 09:42 PM
I wish my computer was good enough to need a liquid-nitrogen-cooled processor. I wish I was rich enough to buy one. If processors get bigger, then there will be less room in the tower, which means the tower would have to get bigger, or they would need to make other stuff smaller.



posted on Nov, 1 2004 @ 09:55 PM

Originally posted by taibunsuu
While it wouldn't really hurt ergonomically if a chip was bigger, there are timing considerations. Even though the signals are moving almost the speed of light, a few microns can make a difference in the speed of a computation.

Agreed



posted on Nov, 1 2004 @ 10:32 PM
I work in a fab making chips. Can't say which one and for whom.

In addition to economics and timing, there is yield in relation to manufacturing. Even though these types of chips are made in cleanroom environments, there are all sorts of variables affecting the prospective yield of a single individual wafer on which the die are grown.

So shrink the die and you 1) get better speed and 2) get more die per wafer to increase your chances against those variables which kill die.

Increasing the die per wafer in turn helps get around all the nasty little things that can happen to that wafer in the manufacturing process. The cleanest cleanroom can still leave particles on your wafer, and a lot of that is from people handling the wafers. More advanced fabs use a pod-based system to seal the wafers away from people in between processes. But like every other fab, the wafers still have to go into the process tools, which are hard to keep clean over time.

Then there are scratches, which kill die. There are photolithography issues which can mess up parts of a wafer or the whole thing. Problems with metal/insulator layers can make areas too thick or too thin and kill die. Plasma/wet etch can over- or under-etch areas on a wafer. Copper, which is hot right now, is really tricky to work with and can contaminate wafers and process tools that don't run copper levels. Nasty, nasty stuff. Kills fabs dead.

And the list goes on and on; you can mix and match any of these manufacturing problems and really erode your yield down to where it's economically pointless.
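The yield effect described above is often modelled to first order with a Poisson defect model, yield ≈ e^(−D·A): the bigger the die area, the exponentially smaller the chance it comes out defect-free. A sketch with an assumed defect density (real fab numbers are closely guarded):

```python
import math

def poisson_yield(defects_per_cm2, die_area_cm2):
    """First-order Poisson yield model: Y = exp(-D * A)."""
    return math.exp(-defects_per_cm2 * die_area_cm2)

D = 0.5  # assumed defect density, defects per cm^2

print(poisson_yield(D, 1.0))              # 1 cm^2 die: ~61% yield
print(poisson_yield(D, 5.0**2 * 6.4516))  # 5"x5" (~161 cm^2): ~0%
```

Under this model a die the size of the whole 5-inch slab would essentially never come out defect-free, which is the "exponentially more wasteful" point made earlier in the thread.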

Smaller is better for those in-demand chips, if you, the individual consumer, are to be able to afford them.

So instead of making a big chip to solve the cooling issues, the appropriate action is to multi-core slower, cooler chips together, so that when leveraged together their average processing power is beyond that of a single chip flogging itself as fast as it can to do the same task.



posted on Nov, 2 2004 @ 05:52 AM
Capt Proton hit the nail on the head. The future doesn't lie in more clock cycles, but in more CPU cores crammed onto a single chip, clever software/firmware to take advantage of the multiple cores, 64-bit processing (128-bit in the semi-distant future), as well as new higher-speed buses (PCIX for example). This applies to both CPUs and GPUs (and as network speeds increase, probably NPUs... meh, I made that term up, but it sounds good hehe).

Smaller dies = less distance for a signal to travel = faster clock speed, BUT
faster speed = more wasted energy given off in the form of heat.

I'm not an engineer, but my understanding is that this heat is inevitable. Yes, you probably will see CPUs rated at 10 GHz in the future. But it won't be one single 10 GHz core. It will probably be, say, 2 x 5 GHz cores, or more realistically, 2 CPUs each holding 2 x 2.6 GHz cores.
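The reason two slower cores can win is that dynamic switching power scales roughly as P = C·V²·f, and a lower clock usually lets you drop the supply voltage too. A sketch with assumed capacitance and voltages (not real part specs):

```python
def dynamic_power(cap_farads, volts, freq_hz):
    """Approximate dynamic switching power: P = C * V^2 * f."""
    return cap_farads * volts ** 2 * freq_hz

C_EFF = 1.0e-9  # assumed effective switched capacitance, farads

one_fast = dynamic_power(C_EFF, 1.4, 5.0e9)      # one 5 GHz core
two_slow = 2 * dynamic_power(C_EFF, 1.1, 2.5e9)  # two 2.5 GHz cores
print(one_fast, two_slow)  # ~9.8 W vs ~6.05 W
```

Same nominal 5 GHz of aggregate clock, noticeably less heat: that voltage-squared term is why multi-core beats chasing one huge clock.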

AMD seem to have realised the benefit of architecture over clock speed a while ago. For example, an AMD 4000+, which competes with and actually surpasses an equivalent 4 GHz P4 in most situations, only runs at something like 2.4 GHz, also with lower thermal dissipation, thanks to clever engineering.



posted on Nov, 2 2004 @ 11:14 PM
A number of people have already posted about the issues of large wafers vs. cost and signal propagation across larger chips, correctly, so no need to repeat that here.

The answer lies not in pumping higher and higher frequencies through ever smaller chips, but in redesigning systems with multiple cores working in sync and async. You could probably get 40 cores on your 5 by 5 slab running at 2 GHz each, using some of the cooling tech you discussed. The next problem is evolving software to run in tandem threads. I'm a software developer; I can only dream of parallel programming and multi-threaded procs beyond today's dualies.
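A minimal sketch of that "tandem threads" idea in Python: a toy sum split across two worker threads and recombined. On a real dual-processor box each worker could run on its own CPU; the point here is just the decomposition.

```python
import threading

def worker(chunk, results, idx):
    """Sum one chunk of the data and store the partial result."""
    results[idx] = sum(chunk)

data = list(range(1_000_000))
mid = len(data) // 2
results = [0, 0]

threads = [
    threading.Thread(target=worker, args=(data[:mid], results, 0)),
    threading.Thread(target=worker, args=(data[mid:], results, 1)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sum(results))  # same total as the single-threaded sum
```

The hard part the post alludes to is exactly this step: most software of the day wasn't written to split its work into independent chunks like that, and not every workload can be.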

Most Macs are shipping as dual procs; I would not be surprised to see OSX2 or OSX3 going to multi-procs. The Gates gang is looking at wider buses and the Itanium chip for higher bandwidth.



posted on Nov, 4 2004 @ 06:35 PM
AlabamaCajun, I must agree with what you said about multi-core processing units. I'm going to purchase a computer from Puget Systems that has two Intel Xeon processors running @ 3.4 GHz each. I wish we could custom-build computers this way:

- The first main motherboard goes on top, with six 2.0 GHz 64-bit processors
- The next board is for the hard drive(s)
- The one after that is dedicated to the RAM (it should be able to hold 40 GB or something)
- Second to last is for the graphics card(s) and sound card(s)
- And lastly, the final board is for the power inlet, 8 USB ports, 6 FireWire ports, an Ethernet port, monitor outlet(s), and of course the mouse and keyboard ports

This way, the heat dissipation is concentrated near the top of the tower, instead of on one tiny spot on the motherboard.

Btw, I don't think they'd all be called "motherboards", because they're not really the mother of the system. "Mother-tower" - now, that might work....

And it'll have liquid-nitrogen cooling for the processors, hard drives, graphics and sound cards, plus extra heatsinks for the RAM and the USB and FireWire ports.

My best guess is that a computer like that will cost $15,000 or about 1/4 of the average income of Americans.

Hey, it's just a computer!


[edit on 11/6/04 by diehard_democrat]



