|MadSci Network: Physics|
Michael,

Voltage is not the key here; power is. Power is measured in watts or sometimes volt-amps (VA). In electrical terms, power is the voltage times the current. US power systems run at a nominal 110 volts, so a 100 watt bulb is designed to draw about 100/110 amperes (amps) of current, or roughly 0.9 amps, while a 20 watt bulb draws only about 0.2 amps (a short worked calculation appears after the note below).

As the current flows through the bulb, it is really many, many electrons moving through the wires. These electrons pass through a coil of wire (the filament) that resists their flow, and the coil heats up. As the wire heats up, it gives off photons of light in the visible part of the spectrum, which we can see, and also in the infrared part of the spectrum, which we can't see but which can heat other things up, like the glass of the bulb.

The filament of the 20 watt bulb has a higher resistance to the flow of electrons, so less current can pass through the coil and less power is turned into heat and light. Its wire is also generally thinner; it still heats up and glows, but fewer photons are produced - that is why the light is dimmer and the bulb runs cooler.

I hope this helped.

Sincerely,
Todd Jamison, Chief Scientist, Observera, Inc.

[note added by MadSci Admin: The power rating of bulbs (in your case 20 and 100 Watts) is the *electrical* power that the bulb operates with. The amount of this energy that actually goes into visible light is only about 2.5% to 5% of the total (electrical) power. They are not efficient!]
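To make the arithmetic above concrete, here is a minimal Python sketch of the same calculation. It assumes the nominal 110 volt supply given in the answer and the 2.5% to 5% visible-light efficiency from the admin's note (the 3.5% midpoint used below is an illustrative assumption, and the function name bulb_figures is invented for this example):

    # Rough bulb arithmetic, assuming a nominal 110 V US supply.
    VOLTAGE = 110.0  # volts (nominal US line voltage, as stated above)

    def bulb_figures(power_watts, efficiency=0.035):
        """Return current drawn, hot-filament resistance, and estimated visible-light output.

        `efficiency` is a rough midpoint of the 2.5%-5% figure from the
        admin note; it is an assumption, not a measured value.
        """
        current_amps = power_watts / VOLTAGE        # P = V * I  ->  I = P / V
        resistance_ohms = VOLTAGE / current_amps    # Ohm's law: R = V / I
        visible_watts = power_watts * efficiency    # only a small fraction becomes visible light
        return current_amps, resistance_ohms, visible_watts

    for watts in (20, 100):
        amps, ohms, visible = bulb_figures(watts)
        print(f"{watts:3d} W bulb: {amps:.2f} A, ~{ohms:.0f} ohm filament, "
              f"~{visible:.1f} W of visible light")

Running this gives roughly 0.18 amps through a filament of about 600 ohms for the 20 watt bulb, versus about 0.91 amps through roughly 120 ohms for the 100 watt bulb, which matches the reasoning above: higher resistance, less current, less power, dimmer light.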
Try the links in the MadSci Library for more information on Physics.