
CPU water heater?

posted on Aug, 20 2009 @ 09:44 PM
I was wondering: could you use a CPU as a water heater for your home? I know they make water coolers for CPUs, but is a CPU's heat enough to keep the water in a water heater hot? I ask because I usually leave my computer running, as do many people, so this would be a nice way to cool the CPU and make use of energy that's otherwise wasted.



posted on Aug, 20 2009 @ 09:51 PM
Maybe at a data bank or server farm, but from a home PC? I think not. Water-cooled PCs use a closed-loop system, so you'd need a heat exchanger, which would result in some heat loss - and given there isn't much heat to begin with, I think you'd be lucky to heat a cup of coffee.



posted on Aug, 20 2009 @ 09:59 PM
I agree with the second post...HOWEVER...I did see a one-egg skillet that could be placed on a CPU, and it would cook an egg given enough time. Water heater? Not so much.



posted on Aug, 20 2009 @ 10:17 PM
The amount of power put out by a CPU isn't sufficient to heat your water. A whole computer draws maybe 200-300 watts, tops (that's roughly the power supply's rating), so the CPU itself actually dissipates much less than that. But let's use 300 watts, for the sake of argument.

OK, how fast could that heat up a water heater full of water? You're going from room temperature to about 120 degrees (F), at least (let's say you're very green and keep the temperature relatively low). How many gallons, I don't know - let's say 40. No clue, since I don't own a house, but it's a SWAG.

To heat 1 gram of water 1 degree Celsius requires a calorie of heat.

So 40 gallons is about 150 liters, give or take - call it 160,000 grams of water to keep the numbers round.

Going from, say, 70 degrees F (room temperature) to 120 degrees F is a 50 degree F rise, or roughly 28 degrees Celsius - call it 25 to keep the arithmetic easy.

Raising 160,000 grams of water 25 degrees Celsius would require 160,000 x 25 calories, or about 4 million calories, or roughly 16.7 million Joules. A Joule is a Watt-second, so to deliver 16.7 million Joules you'd have to run your 300-Watt CPU for about 56 thousand seconds - over 15 hours.

This assumes that you don't lose any heat, which of course doesn't happen in reality. So no, you couldn't use the CPU to heat your water, though you might at least use it to pre-warm the water and save on how much energy you had to spend to get it hot. But to do that, you'd have to have your CPU connected to the water, which would be a pain. Also, as I pointed out, not all of the computer's power goes into the CPU, so this is not a viable solution.
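Redoing the arithmetic with exact unit conversions (same assumptions: a 40-gallon tank, a 70 F to 120 F rise, a steady 300 W, and zero heat loss) lands in the same ballpark:

```python
# Time for a hypothetical 300 W heat source to warm a 40-gallon tank
# from 70 F to 120 F, ignoring all heat loss (same assumptions as above).

GALLON_L = 3.785                  # liters per US gallon
grams = 40 * GALLON_L * 1000      # 1 mL of water weighs about 1 g
delta_c = (120 - 70) * 5 / 9      # a 50 F rise is about 27.8 C
calories = grams * delta_c        # 1 calorie heats 1 g of water by 1 C
joules = calories * 4.184         # 1 calorie = 4.184 joules
hours = joules / 300 / 3600       # 300 W = 300 J/s

print(round(joules / 1e6, 1), "MJ")   # about 17.6 MJ
print(round(hours, 1), "hours")       # about 16.3 hours
```

So call it 15-16 hours even before losses - a real tank bleeding heat to the room while a CPU trickles warmth in would simply never get there.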

OTOH, if we were to seek to use waste energy to heat our water, we might be able to save a little money. For instance, the heat from our refrigerators is wasted. We pump the heat out of the refrigerator and let it out into our homes. Then (at least in the summer) we air condition our homes, pumping some of that refrigerator heat outside. Wasteful. Why not let the waste heat warm up our water a little bit, so that less energy is needed to heat it to a comfortable temperature?



posted on Aug, 20 2009 @ 10:40 PM
Wow, that was an incredibly well thought out and scientific post.

A simple no would have sufficed


Star for you anyway



posted on Aug, 21 2009 @ 12:45 AM
High-end processors, such as the AMD Phenom (I) and the Core i7, put out only around 140 watts of heat, while your average Core 2 Duo or Core 2 Quad puts out 65-95 watts. It's similar for most performance video cards, though higher-end cards like the GT200 can put out up to around 200 watts. They only put out that much heat under dedicated conditions such as burn-in stress testing; otherwise they're generally putting out less than half of that.

As chiron613 said, this is nowhere near sufficient. On top of that, processors generally hit thermal walls at about 70 degrees Celsius for most CPUs and 100 degrees Celsius for video cards - again, not hot enough, and you probably don't want to run them constantly like that, as more heat means more electromigration and therefore a shorter life. If you want a water heater, buy a water heater, or make one if you know how.

[edit on 21/8/2009 by C0bzz]



posted on Aug, 21 2009 @ 01:05 AM
Obviously you folks haven't heard of a Flash Water Heater (or sometimes called Tankless Water Heater or Heat On Demand Water Heater)!

These are water heaters that heat water as needed, rather than storing it in a tank. In general, a flash water heater will be more efficient than a tank water heater, and will yield hot water consistently whenever it's required.

And guess what the earlier versions of these used to flash heat the water?

Wait for it...

Wait for it...

x286 CPUs!

No, they don't run the device, that is all handled in a ROM Chip on a Circuit Board attached to a Sensor. They use the x286 CPU to actually heat the Heat Exchanger that flash heats the water!

(x286 CPUs are also used in Automatic Cleaning Cat Litter Boxes and on anything we shoot into space, as they are the most powerful chip that won't throw errors from Gamma Particles passing through the CPU)

However, using a CPU to heat a Heat Exchanger isn't powerful enough when the water flow is too high, meaning what is needed for showers, Dishwashers, etc. So, all of the modern Flash Water Heaters are now Gas Powered, using a Gas Burner to heat the Heat Exchanger. This allows them to heat a higher volume of water in the same amount of time.

But conceivably, yes...you could use a CPU as the basis for Flash Water Heating, but only for light duty. It certainly wouldn't be able to heat a Tank Water Heater (as a previous poster mentioned and detailed as to why not).
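The flow-rate limit is easy to put rough numbers on. A quick sketch (the 8 L/min shower flow and 30 C temperature rise below are illustrative assumptions, not specs from any real unit):

```python
# Power needed to flash-heat flowing water by a given temperature rise.
# 4.184 J raises 1 g of water by 1 C, and 1 W is 1 J/s.

def heater_watts(flow_l_per_min, delta_c):
    grams_per_second = flow_l_per_min * 1000 / 60
    return grams_per_second * delta_c * 4.184

print(round(heater_watts(8, 30) / 1000, 1), "kW")  # shower-rate flow: ~16.7 kW
print(round(heater_watts(0.1, 30)), "W")           # a bare trickle: ~209 W
```

A shower-rate flow needs kilowatts - two orders of magnitude beyond any CPU - while only a trickle is in CPU territory, which fits the "light duty only" conclusion above.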



posted on Aug, 21 2009 @ 03:27 PM
Wow, thank you all. I am just trying to find ways to reduce my power bills, just because, heh. How about my attic? I live in Florida and it is hot as heck up there. Is there any way I could use that heat? Seems with that much heat you could somehow rig a windmill in your attic using the exchange between cool outside air and sweltering inside air?

I am looking for any suggestions. Solar panels are still too expensive at this point, though I could use them sparingly if there are some good ideas out there.

I like the idea of using the fridge and CPU heat to help maintain the water heater heat. I will have to think about how I could do that.



posted on Aug, 21 2009 @ 03:53 PM
No, a CPU doesn't put out enough heat...and heat is the thing you try to minimize with a CPU.



posted on Aug, 21 2009 @ 03:55 PM
reply to post by fraterormus
 


Hey, what?? They honestly used a processor as a heating element?? They stuck a processor in and it wasn't there to do any binary crunching - just to act as a heating element?

What, could the manufacturers lay their hands on processors cheaper than they could make a heating element? (Admittedly, I'm really not sure what a flash heater element would look like or how it would function.)

Would you have a link for that? I'm intrigued...



posted on Aug, 21 2009 @ 04:00 PM
reply to post by Xeven
 


Solar hot water heaters use heat similar to what is in your attic! You could certainly route your water through the attic to gain some heat. You could also put heat exchangers (i.e., radiators) in the attic or on the roof to get some pretty hot water. It may not be the 120 F you want, but it may be hot enough for showers!


