posted on Jul, 9 2006 @ 03:10 PM
It doesn't make a lot of difference.
I'm going to leave out a lot here in the interest of clarity, so as not to bore you, unless you just WANT to go diving off into reactive power,
j-omega terms, power factor and what have you.
But in short, here's the scoop.
Most stuff takes X amount of power to operate. Take a blender, for instance. Loaded down with something slushy and viscous, say a
milk shake, your Waring might draw 500 Watts. Run from the 120V power line, it will draw a little more than 4 Amps, because power (which is what
Watts measure) is the product of voltage and amperage.
If that motor is rewound to operate in GB and you plug it into the 230V mains there, you're once more going to draw 500 Watts. Only now, at 230V,
it will draw a little more than 2 Amps, since again, power (W) = Volts * Amps.
The power it's drawing is the same. When you change one of the factors of the power, the other has to change to balance out: increase the
volts, decrease the amps, and vice versa.
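The arithmetic is two lines of Python if you want to check it (the 500 Watt figure is just the blender example from above):

```python
# Amps = Watts / Volts, since power (W) = Volts * Amps.
power_w = 500.0  # blender loaded with a thick milkshake

amps_120v = power_w / 120.0  # US line voltage
amps_230v = power_w / 230.0  # GB line voltage

print(f"{amps_120v:.2f} A at 120 V")  # a little more than 4 A
print(f"{amps_230v:.2f} A at 230 V")  # a little more than 2 A
```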
You'll probably lose a little less to I²R (resistive) losses with the 230V appliance, since the current is roughly halved, but the effect won't amount to much.
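To put a rough number on that resistive loss: the loss in the line cord goes as the square of the current. The 0.2-ohm cord resistance below is purely an illustrative assumption, not a measured value for any real appliance.

```python
# I^2 * R loss in the line cord, for an assumed cord resistance.
power_w = 500.0
cord_r = 0.2  # ohms -- illustrative assumption, not a measured value

for volts in (120.0, 230.0):
    amps = power_w / volts
    loss_w = amps ** 2 * cord_r
    print(f"{volts:.0f} V: {amps:.2f} A, {loss_w:.2f} W lost in the cord")
```

Halving the current cuts the cord loss to roughly a quarter, but a few Watts either way is noise next to the 500 W the blender is drawing, which is why it "won't amount to much."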
The 50Hz/60Hz issue means the laminations of the GB motor may need to be a slightly different material, or you may need more of it, or you
might have to nudge the winding inductance a little higher. Universal motors, such as the one in your blender, don't really care much about the
line frequency. Others may require modification.
In general, going up in frequency (to, say, 400Hz) means transformer cores get smaller and lighter, motors shrink, and so on. But the core
losses increase too, so they're a little less efficient. In aircraft (usually) and some military gear (occasionally), 400Hz is used to get smaller,
lighter power supplies. And it makes that annoying 'military equipment room' whine instead of 60-cycle hum.
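The "smaller cores at higher frequency" effect falls straight out of the standard transformer EMF equation, V_rms = 4.44 * f * N * B_max * A_core: for the same voltage, turn count, and peak flux density, the required core cross-section scales as 1/f. A quick sketch (the turn count and flux density below are made-up illustrative values):

```python
# Transformer EMF equation: V_rms = 4.44 * f * N * B_max * A_core.
# Holding voltage, turns, and peak flux density fixed, core area ~ 1/f,
# which is why 400 Hz aircraft gear gets away with smaller, lighter iron.
def core_area(v_rms, freq_hz, turns, b_max_tesla):
    """Required core cross-section (m^2) from the EMF equation."""
    return v_rms / (4.44 * freq_hz * turns * b_max_tesla)

# Illustrative numbers, not a real design: 120 V, 200 turns, 1.2 T.
a_60 = core_area(120.0, 60.0, turns=200, b_max_tesla=1.2)
a_400 = core_area(120.0, 400.0, turns=200, b_max_tesla=1.2)
print(f"400 Hz core needs {a_400 / a_60:.0%} of the 60 Hz core area")  # 15%
```

That 60/400 ratio is the first-order story; in a real design the rising eddy-current and hysteresis losses mentioned above claw some of that saving back.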