Nonsense About Your House Voltage
An Anonymous E-mail
Not so long ago (11 March 2018) an anonymous e-mail was received here from someone who pretended to understand electricity. This person tried to claim that the current goes down as the voltage goes up, thereby keeping the power (measured in watts) in my appliances constant. This is nonsense, of course, and a complete misstatement of the relevant laws of physics. Here is their e-mail:
Subject: voltage too high Hydro one
Reading your post--Hydro One costing you more with high voltage. Was just looking at your calculations...Hydro One meters are watt meters, meaning voltage and current are measured P=ExI. So when voltage is higher, current is lower, keeping watts the same. Keep studying.
If what this person says were true, then we could conveniently and safely run all of our ordinary 120 volt appliances on 240 volts. Or better yet, run them on a more efficient 550 VAC power line, for even greater transmission line efficiency.
So, why don't we do that? Quite simply, it would not work.
Voltage and Current
Let's analyze what this person is claiming:
Hydro One meters are watt meters — Yes.
meaning voltage and current are measured — Yes.
when voltage is higher, current is lower — Definitely not, and exactly the opposite is true!
keeping watts the same — Absolutely not true.
The claim that higher voltages result in lower electrical current completely ignores the fact that each of your appliances has an inherent resistance (R) to the flow of current, and that this resistance is relatively constant under any particular operating condition. So for any applied voltage, this resistance determines the amount of current flowing through the appliance, not some theoretical "power rating". Given this resistance, Ohm's Law (I = E/R) lets us calculate the power dissipated by your appliance as P = E x I, or more directly as P = E x E/R.
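A few lines of arithmetic make the point concrete. This is a minimal sketch, assuming a fixed resistance of 24 ohms, a hypothetical value roughly that of a 600 W / 120 V heating element; real appliance resistances vary, but the principle is the same.

```python
# Ohm's Law (I = E/R) and the power formula (P = E x I = E x E/R)
# applied to a fixed-resistance appliance at two voltages.

R = 24.0  # ohms; assumed constant for this appliance

for E in (120.0, 240.0):
    I = E / R        # current drawn at this voltage
    P = E * E / R    # power dissipated at this voltage
    print(f"{E:5.0f} V -> {I:4.1f} A, {P:6.0f} W")
```

Doubling the voltage doubles the current (5 A becomes 10 A) and quadruples the power (600 W becomes 2400 W). The current went up with the voltage, not down.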
"Keeping Watts the Same"?
There is no natural device or process in the known universe that "keeps" the wattage or power the same. The power dissipated by an appliance (or any electrical load) is determined entirely by the amount of applied voltage and the inherent, internal electrical resistance of the appliance (simple resistance R for direct current and impedance Z for alternating current). So the power P which is dissipated by the appliance is always equal to E x E/R, or E x E/Z.
And the current I is E/R or E/Z, clearly and directly related to the voltage, not inversely related to the voltage, as this anonymous person tried to claim.
The power rating of an appliance is a simple statement by the manufacturer of the voltage at which the appliance is designed to operate and the current which the appliance will draw at that voltage. The power rating in watts is the product of the design voltage and resulting current, not the other way around.
The Rating of an electrical appliance indicates the voltage at which the appliance is designed to work and the current consumption at that voltage. These figures are usually displayed on a rating plate attached to the appliance, e.g. 230 volts, 3 amperes.
The rating of the appliance is related to the power it consumes. Power is measured in watts and is the product of volts and amperes. The example above would have a rating of 690 watts.
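The rating-plate arithmetic from that example can be checked directly. This sketch uses the quoted figures (230 volts, 3 amperes) and also derives the resistance that such an appliance would imply under Ohm's Law:

```python
# Rating-plate arithmetic: 230 volts, 3 amperes.

E = 230.0  # rated voltage, volts
I = 3.0    # rated current at that voltage, amperes

P = E * I  # rated power, watts
R = E / I  # implied resistance, ohms (Ohm's Law: R = E/I)

print(f"P = {P:.0f} W, R = {R:.1f} ohms")
```

This prints 690 W, exactly as the rating-plate example says. Note the direction of the logic: the resistance is the physical property, and the wattage is merely the product of the design voltage and the current that resistance allows.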
To claim otherwise is a blatant misrepresentation of the science and a neat piece of propaganda because many ordinary people have no way of knowing the truth.
Remember the warnings to use a step-down transformer on your 120V hair dryer when traveling in Europe? That's what this is all about: higher voltage, higher current, higher power.
The only exceptions to all of this are some modern electronic devices that contain "switching-mode" power supplies designed to operate on any AC voltage between, say, 100 and 250 VAC. Check the voltage specification on the label. The charger for your cell phone, for example, probably has this type of power supply. These "switching" power supplies are remarkable devices but your fridge, freezer, well water pump, washing machine, toaster, hot water tank, and hair dryer, among others, do not have this.
Prove it to Yourself
If the claim in the anonymous e-mail were true, then any 12 volt light bulb could easily take double the voltage and not suffer any ill effects. So let's do a little experiment to see if this power industry myth will work. Take any old 12V automotive light bulb, say an old parking lamp, a 12V car battery (you don't even have to take it out of your car), and a 12V car battery charger (just another source of 12V and easier to get than another battery), and connect them all in series, like so.
What happens? The light bulb either burns out immediately or gets very, very bright, right on the threshold of burning out.
Why? Because we applied 24 volts to a bulb that is designed for only 12 volts. With 24V applied to the bulb, it tried to draw twice as much current as it was rated for, thus exceeding its power rating and probably burning up the filament as a consequence. This is exactly why we do not connect our 120V appliances to 240V! While many homes have a clothes dryer that is connected to 240 VAC, you wouldn't want to connect your fridge or dining room lights or your expensive TV set to 240 volts.
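Here is the bulb experiment in numbers. The 6-ohm figure is a hypothetical hot-filament resistance, roughly that of a 24 W parking lamp; a real filament's resistance rises somewhat with temperature, which softens but does not reverse the effect.

```python
# A 12 V bulb at its rated voltage, and then at the 24 V
# applied by the series battery-plus-charger experiment.

R = 6.0  # ohms; assumed hot-filament resistance

for E in (12.0, 24.0):
    I = E / R    # current through the filament
    P = E * I    # power dissipated in the filament
    print(f"{E:4.0f} V -> {I:.1f} A, {P:.0f} W")
```

At 24 V the current doubles (2 A to 4 A) and the power quadruples (24 W to 96 W), four times the bulb's rating, which is why the filament burns up.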
Remember what happened when your neighbour plugged her 120V hair dryer from home into the wall outlet in Europe without using a transformer? Well, Ohm's Law won again.
Not convinced yet? Think about how you know that it's time to replace the batteries in your flashlight. The light has gotten dim, right? So you put in fresh batteries and what happens? The light is bright again! That's because the old batteries didn't have enough voltage to provide enough current for the light to operate at full brightness, but when you put in fresh batteries with more voltage, then more current flowed and the light bulb could operate at its rated power.
Less voltage leads to less current, lower power. More voltage leads to more current, higher power. It is all according to Ohm's Law, first published in 1827 by German physicist Georg Simon Ohm.
Where Does This Myth Come From?
This false claim about decreasing current probably comes from people who work exclusively on electrical transmission lines, those long power lines that cross the countryside going from power generating stations to transformer sub-stations. These lines can be used to deliver known amounts of power (that is, the power at any time can be relatively constant) and the operators want to use a higher voltage so that the current will be lower. With a lower current, there is less loss of power in the lines, the dreaded "I-squared-R loss".
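The transmission-line case is the one situation where the delivered power really is (roughly) fixed and the operator chooses the voltage. This sketch uses illustrative numbers, 1 MW delivered over a line with 2 ohms of conductor resistance, to show why higher transmission voltage pays off:

```python
# Transmission line: delivered power P is fixed, so I = P / E,
# and the line loss is the dreaded I-squared-R loss.

P = 1_000_000.0   # watts to deliver (held constant by the operator)
R_line = 2.0      # ohms of conductor resistance (assumed)

for E in (10_000.0, 100_000.0):
    I = P / E                # current needed at this voltage
    loss = I * I * R_line    # power lost heating the line
    print(f"{E:8.0f} V -> {I:6.1f} A, loss {loss:8.0f} W")
```

Raising the line voltage tenfold cuts the current tenfold and the I-squared-R loss a hundredfold (20,000 W down to 200 W). The crucial difference from your home: here the power is the fixed quantity and the current adjusts, whereas in your house the resistance is fixed and the power adjusts.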
But your home is neither a power transmission line nor a transformer sub-station, and is instead a "load" which requires a constant voltage. Each of your appliances has a characteristic load impedance (resistive and/or reactive) and will respond to a higher voltage by drawing a higher current, just as described above. The so-called power rating of any appliance is only that — a theoretical prediction made by the manufacturer about the power or energy that will be consumed by your appliance when operated at the rated voltage. This "power rating" is not a mathematical or physical constant. If your appliance is operated at a higher voltage than the rated one, then the power and energy ratings will be exceeded.
The Bottom Line
So don't fall for this false claim about the current decreasing as the voltage increases. It is a myth spread by a few people in the power industry to try to quash any criticism of some of the things that they do. As the voltage increases in your home, the current drawn by your appliances will also increase, the power used will increase, and the energy cost appearing on your electricity bill will increase accordingly.
If I haven't explained some part of this very well, please contact me.
Finally, my anonymous correspondent said "Keep studying". Yes, good advice, and in fact I have never stopped. I hope my anonymous correspondent will do the same.