soyuppy wrote: Can someone help elaborate or explain the power loss and voltage problem shown in the attached picture? This is taken from one of the invitational tests.
I understand that P(loss) = I*I*R or P(loss) = V*V/R, but that still does not explain the reasoning behind the answer. Based on the latter equation, P(loss) = V*V/R, if R remains constant, then as V increases, P(loss) would increase, wouldn't it?
Or do we just accept this as a rule stated somewhere in an electrical physics book: whenever the voltage is stepped up by a factor of X, P(loss) decreases by a factor of X*X?
The confusion is over the meaning of V (voltage).
In the question they are increasing the voltage that the power line is carrying (i.e. increasing the line voltage from 11 kV to 33 kV). If the line is still delivering the same power to the customers, then the current will decrease by a factor of 3 (since P = V*I), so P = I*I*R shows that the power lost in the power line will decrease by a factor of 9, assuming the resistance stays the same.
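A minimal numeric sketch of that comparison (Python, with made-up values for the delivered power and line resistance, purely for illustration):

```python
# Hypothetical numbers: the line delivers 1 MW to customers through 2 ohms of line resistance.
P_delivered = 1_000_000.0   # W, power delivered to the customers (same in both cases)
R_line = 2.0                # ohms, resistance of the power line (assumed constant)

for V_line in (11_000.0, 33_000.0):      # line voltage in volts
    I = P_delivered / V_line             # current in the line, from P = V*I
    P_loss = I * I * R_line              # power lost in the line, from P = I*I*R
    print(f"{V_line/1000:.0f} kV: I = {I:.1f} A, line loss = {P_loss:.0f} W")

# 11 kV: I = 90.9 A, line loss = 16529 W
# 33 kV: I = 30.3 A, line loss = 1837 W  -> roughly 1/9 of the 11 kV loss
```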
The problem with using P = V*V/R is that in this equation V refers to the voltage drop between the two ends of the power line, not the voltage the line is carrying. To find the voltage drop along the power line, use V = I*R. This immediately shows that if the current in the power line decreases by a factor of 3, then the voltage dropped along the power line also decreases by a factor of 3, and now you can use P = V*V/R and get the same answer as above.
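Using the same hypothetical numbers as the sketch above, the voltage-drop route lands on the same losses:

```python
R_line = 2.0                           # ohms, same assumed line resistance as above
for I in (90.909, 30.303):             # line currents from the 11 kV and 33 kV cases above
    V_drop = I * R_line                # voltage dropped along the line, from V = I*R
    P_loss = V_drop * V_drop / R_line  # loss via P = V*V/R, where V is the drop, not 11 or 33 kV
    print(f"I = {I:.1f} A: drop = {V_drop:.0f} V, loss = {P_loss:.0f} W")

# Note that the drop (about 182 V or 61 V) is tiny compared with the 11 kV or 33 kV line voltage.
```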
In these types of questions it is usually easier to use P = I*I*R rather than trying to calculate the voltage drop.