MadSci Network: Science History
Query:

Re: US has Electricity at 110V/60Hz, India has it at 230V/50Hz. Why?

Date: Thu Apr 26 16:08:13 2001
Posted By: Dave Lawrence, Staff, Microelectronics, Lawrence Consulting
Area of science: Science History
ID: 981917263.Sh
Message:

NOTE: I tried to send this answer several weeks ago, but I don't believe 
you got it. I've been having trouble with my email.

The short (and not so interesting) answer to your questions is this: the 
voltage and frequency chosen for commercial and residential power 
distribution are somewhat arbitrary. But once they're chosen, 
standardization for a geographic region (country, continent, etc.) is very 
important. It's kind of like having everyone drive on the right or left 
hand side of the road. The choice doesn't matter very much, but agreeing on 
the choice matters a lot!

That said, would any frequency and any voltage be as good as any other? And 
the answer is no. Let's consider the reasons for this by looking at what's 
good and bad about high and low voltage. From there we can zero in on a 
voltage range that's low enough to avoid the bad things about being high 
and high enough to avoid the bad things about being low.  Then we'll do the 
same thing for frequency. 

As most everyone knows, really high voltage is dangerous--deadly dangerous! 
So for safety reasons in a home, you want voltage to be as low as possible. 
Well then why not make it really low, say a volt or two, like a battery? 
The reason involves the relationships between electric voltage, current, 
resistance and power, so let's review them.

When current flows through a wire, the wire heats up as electrical energy 
is turned into heat. This is how electric stoves, toasters, hairdryers etc. 
work. If enough current flows, the wire can even glow, giving off energy in 
the form of light. That's how light bulbs work. The rate that electrical 
energy is being turned into heat and/or light is called power and is given 
by:

P = I*V

where P (power) is in units of watts, I (current through the wire) is in 
amps, and V (voltage difference between one end of the wire and the other) 
is in volts. So a watt is equal to a volt*amp. A 100 watt light bulb could 
be designed to be lit with 50 volts and 2 amps or 110 volts and 0.9 amps or 
220 volts and 0.45 amps, and so on. But they would be different bulbs; a 
220 volt bulb would be very dim if connected to a 110 volt outlet.
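
If you'd like to play with these numbers yourself, here's a small Python sketch of the same arithmetic. The second half peeks ahead to Ohm's law (introduced in the next paragraph) to show why the 220 volt bulb is dim on 110 volts, and it assumes the filament's resistance stays fixed, which is only roughly true for a real filament.

# The P = I*V arithmetic for a 100 watt bulb, using the figures above.

BULB_POWER = 100   # watts

for volts in (50, 110, 220):
    amps = BULB_POWER / volts                  # I = P / V
    print(f"{BULB_POWER} W bulb designed for {volts} V draws {amps:.2f} A")

# Why the 220 V bulb is dim on a 110 V outlet: its filament resistance is
# R = V^2 / P = 220^2 / 100 = 484 ohms, so on 110 volts it dissipates only
# about a quarter of its rated power.
r_filament = 220**2 / BULB_POWER               # ohms
print(f"220 V bulb on a 110 V outlet: {110**2 / r_filament:.0f} W")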

The equation relating the current flowing through a wire (or any object 
with resistance) to the voltage applied across its ends is called Ohm's 
law:

V = I*R

where V and I are the same as above and R is electrical resistance in units 
of ohms. Ohm's law says that for a wire of given resistance, the more 
voltage applied across its ends, the more current will flow.

Now if you substitute Ohm's law into the equation for power you get:

P = (I^2)*R     (that's "P equals I squared times R" in plain English).

Having eliminated voltage, we can now see that as current increases in a 
wire, the power dissipated (that is, the rate that electrical energy is 
converted to heat and light) goes up very fast, namely as the square of the 
current.

That's all the physics we need for now; let's see what it's telling us. 
Suppose in your house there's a 10,000 watt electric stove.  The equation 
for power, P = I*V, says you'll get just as much heat with high current and 
low voltage as with low current and proportionally higher voltage. So the 
stove could be designed for, say, 10 volts and 1000 amps or 1000 volts and 
10 amps; it should get just as hot, just as fast either way. But it 
doesn't; the 1000-volt/10-amp stove gets a lot hotter a lot faster than the 
10-volt/1000-amp stove. The reason is that the stove isn't the only thing 
heating up; the cord from the stove that plugs into the wall is getting hot 
too, and that uses some of the power. And even a big fat copper (in other 
words, low resistance) cord will get hot when a lot of current flows 
through it. That's what the other equation for power, P = (I^2)*R is 
telling us. Even with a small resistance, high current means lots of heat 
because the current is getting squared! So to keep from wasting all your 
power in the cord you want current to be low. But that means voltage has to 
be high, which is dangerous. And that's how residential voltage standards 
were arrived at; 1000 volts is too dangerous, and 10 volts is too 
inefficient for high-power appliances. The balance was struck around 220 
volts, low enough to be safe and high  enough to be efficient with 
high-power appliances (like stoves). Nearly all countries (including the 
US) use 220 volts as the basic service into the house. In the US we also 
use 110 for low power applications, such as light bulbs and electronic 
equipment, where current is low enough for power loss in the cord to be 
negligible. It turns out that these products can be made a little cheaper, 
if designed to use lower voltage. For example, the filament in a 220V/100W 
light bulb would have to be thinner and longer (therefore more expensive to 
fabricate) than the filament in a 110V/100W bulb of equal life expectancy. 
This was a bigger issue in the early days of light bulb manufacturing than 
it is now, but Americans are used to 110V outlets, and changing everything 
to 220V would (for no good reason) scare the heck out of us!
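
To put rough numbers on the cord argument, here's a little Python sketch. The 0.005 ohm cord resistance is just a made-up illustrative value, not something from the original answer.

# Heat wasted in the cord for a 10,000 watt stove, at two design voltages.

STOVE_POWER = 10_000       # watts
CORD_RESISTANCE = 0.005    # ohms (an assumed, illustrative value)

for volts in (10, 1000):
    amps = STOVE_POWER / volts                 # I = P / V
    cord_loss = amps**2 * CORD_RESISTANCE      # P = I^2 * R
    print(f"{volts:>4} V design: {amps:>4.0f} A in the cord, {cord_loss:,.1f} W wasted as heat")

Half the low-voltage stove's power would go into heating its own cord; the high-voltage version wastes only half a watt.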

What about frequency? You'll be glad to know that you've just learned most 
of what you need in order to see where the frequency standards came from. 
Let's follow our same approach and ask: What's wrong with really low 
frequency? What's wrong with really high frequency? Then we'll try to see 
what a good compromise would be.

The lowest possible frequency is 0Hz or direct current (DC). What's wrong 
with that? Why not 110 or 220 volts DC? The answer begins with the same 
reason we found for using higher voltage with the stove. You just need to 
think on a bigger scale. Imagine a big city. It uses lots of electric 
power, so it's kind of like a high-power (VERY high-power) appliance! It 
gets that electric power from a huge power plant (usually several huge 
power plants) located tens or even hundreds of miles away. The power is 
delivered through wires from the plants to the city. So think of the city 
as a big stove, the power lines as a cord, and the power plants as the wall 
socket. Our real stove needed 10,000 watts; a big city might use 10,000  
million watts (10,000 megawatts). That's a million times as much power as 
the stove. If that much power were delivered at 220 volts, the current would 
be more than 45 million amps (I = P/V), and we know that when electric 
power is delivered along a wire, high current means lots of power being 
wasted in the wire (because P = I^2*R). Now 45 million squared is over 2000 
trillion! And remember, we're trying to get 10,000 million watts to the 
city. So even if we could keep R down to a millionth of an ohm (a 
micro-ohm), efficiency would be only a little over 80%:

efficiency = (power to the city)/(power to the city + power wasted)
           = 10,000 megawatts/(roughly 12,000 megawatts), or about 83%
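
Here's that same calculation as a small Python sketch, using the numbers above (10,000 megawatts, 220 volts, a one-micro-ohm line).

# How much power a one-micro-ohm line would waste delivering 10,000 MW at 220 V.

CITY_POWER = 10_000e6      # 10,000 megawatts, in watts
VOLTS = 220
LINE_RESISTANCE = 1e-6     # ohms (the optimistic value from the text)

amps = CITY_POWER / VOLTS                       # I = P / V, about 45 million amps
wasted = amps**2 * LINE_RESISTANCE              # P = I^2 * R, about 2,000 megawatts
efficiency = CITY_POWER / (CITY_POWER + wasted)

print(f"current   : {amps/1e6:.1f} million amps")
print(f"wasted    : {wasted/1e6:,.0f} megawatts")
print(f"efficiency: {efficiency:.0%}")          # about 83%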

What would a 100 mile long, 1 micro-ohm, copper wire look like? The 
equation for resistance of a copper wire is:

R (ohms) = (1.5E-6)*Length /(pi*Radius^2)

where the length and radius are in centimeters. I'll let you figure out the 
radius. But it's way too big to be practical!
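
If you'd like to check your answer, here's one way to do the exercise in Python, using the 1.5E-6 ohm*cm resistivity figure from the formula above.

# Radius of a 100 mile copper wire with a total resistance of one micro-ohm.
import math

RESISTIVITY = 1.5e-6        # ohm * cm, the value used in the formula above
LENGTH_CM = 100 * 160_934   # 100 miles expressed in centimeters
TARGET_R = 1e-6             # ohms, the wished-for resistance

# Solve R = rho * L / (pi * r^2) for the radius r.
radius_cm = math.sqrt(RESISTIVITY * LENGTH_CM / (math.pi * TARGET_R))
print(f"radius: about {radius_cm / 100:.0f} meters")

That works out to a solid copper cylinder tens of meters across. Not practical!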

Fortunately there's a better way, and we know what it is: deliver the power 
(P=I*V) at high voltage and low current instead of low voltage and high 
current. If we increase the voltage by a factor of a hundred, the current 
could be reduced by a factor of 100 for the same 10,000 megawatts of 
generated power. But that reduces the power lost in the lines by a factor 
of 10,000 (because power varies with current SQUARED) so most of the 10,000 
megawatts could actually get to the city! That seems too easy! And you've 
probably figured out the catch: multiplying 220 volts by 100 means 22,000 
volts. We don't want that on the utility pole in front of our house--to say 
nothing of letting it inside!
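
Here's a quick Python check of that factor-of-10,000 claim, reusing the 10,000 megawatt city and the same optimistic one-micro-ohm line.

# Line loss at 220 V versus 100 times that voltage, for the same delivered power.

CITY_POWER = 10_000e6      # watts
LINE_RESISTANCE = 1e-6     # ohms

for volts in (220, 22_000):
    amps = CITY_POWER / volts
    wasted = amps**2 * LINE_RESISTANCE         # P = I^2 * R
    print(f"{volts:>6} V: {wasted/1e6:>9.3f} MW lost in the line")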

So here's how it works. For most of the distance from the plant to your 
house, power is delivered at tens of thousands of volts. That would be 
dangerous if people could get close to it, so the power lines are suspended 
on those huge towers that hold them way up in the air. When the lines get 
to your house, the voltage is reduced (stepped down) to 220 and the current 
is increased (stepped up) from whatever it was to whatever you need. 
Actually it's a little more complicated than that because your house isn't 
the only place the power is going. So the voltage gets stepped down a 
couple of times, first at a substation, to a few thousand volts, 
then at the utility pole in front of your house to 220. There's only one 
cost effective way to "step down" a voltage without wasting power and 
that's with a transformer. These are placed on utility poles close to your 
house. The input to the transformer is high voltage/low current (from the 
power station), and the output is low voltage/high current (to your house). 
There's just one last catch: transformers have no moving parts, and they work 
by electromagnetic induction, which requires a changing magnetic field. With 
no moving parts, the only way to get a changing field is a changing current, 
so a transformer simply doesn't work with constant (direct) current. It has 
to be AC, which rules out 0Hz!
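
For the curious, here's a minimal sketch of the ideal (lossless) transformer relations described above. The 100-to-1 turns ratio is just an illustrative assumption, not a real utility figure.

# An ideal transformer: voltage divides by the turns ratio, current multiplies
# by it, and power in equals power out.

def step_down(v_in, i_in, turns_ratio):
    return v_in / turns_ratio, i_in * turns_ratio

v_out, i_out = step_down(v_in=22_000, i_in=1.0, turns_ratio=100)
print(f"in : 22,000 V at 1 A   = {22_000 * 1.0:,.0f} W")
print(f"out: {v_out:,.0f} V at {i_out:.0f} A = {v_out * i_out:,.0f} W")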

Ok, how about 1Hz? Just kidding! Actually transformers do get more 
efficient with increasing frequency, but by around 20Hz they can already be 
made very efficient. By the way, "efficient" here means that the power out of 
the transformer (to your house) is very nearly equal to the power in (from the 
power station). The reason for going higher than that has to do with light 
bulbs. At 20Hz, oscillations in brightness are noticeable (and annoying!) even 
with incandescent light bulbs. Fluorescent bulbs actually go completely on 
and off with AC, and at 20Hz that flickering would be extremely annoying! The 
flickering becomes imperceptible by around 50Hz, the standard used in many countries. The 
60Hz standard adopted in the US comes from the use of the periodic voltage 
as a timing mechanism for electric clocks (60 minutes in an hour, 60 
seconds in a minute, so 60 cycles in a second).

There doesn't seem to be any advantage to frequencies above 60Hz, and there 
are several disadvantages. They would require that generators turn faster, 
or have more magnetic poles. In other words, they'd be more expensive. Also, wires 
with alternating current flowing through them emit electromagnetic 
radiation, and it turns out that the power radiated away (that is, wasted) 
through this process increases with increasing frequency (this is exactly 
the same physics that makes transformers only work with AC).

Well, I bet you never thought the answer to your question could be so long! 
And since you posted it in the "Science History" area, I should finish with 
a reference to the history of these standards. It's a very interesting 
story of American industry involving perhaps two of the greatest inventors 
of all time, Thomas Edison and Nikola Tesla. You can get started learning 
about it right here at the MadSci Network in the answer to question 
#910978615Sh.




