8 Replies Latest reply on May 30, 2020 6:42 PM by kkazem

    Power use of transformers, dependent on load?

    ntewinkel

      Hi all,

       

      I'm wondering if the power used by a transformer varies depending on the load put on the transformer.

       

      The specific situation is my 12 volt landscape/deck lighting - I have a transformer on the wall rated to supply a maximum of 120 watts. I recently upgraded to LED lights to replace the old 7 watt bulbs.

       

      Does the transformer now use less power from my house wiring, now that I have LED bulbs that draw less power on the output side?

       

      The reason for the question is that I'm wondering if it would make sense to replace the big transformer with a smaller transformer.

      I recall the electric company going on about wall chargers always consuming power even when they are not charging any device.

       

      ps, I plugged in a power meter (left over from my days as a residential energy advisor) on the 120v side, and measured that the transformer now uses about 0.20 amps (= 24 watts?).
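
      For reference, here is the arithmetic behind that reading as a small Python sketch. The 120 V and 0.20 A come from the measurement above; the power factor is an assumed illustrative value, since a cheap plug-in meter may be reporting volt-amps rather than true watts:

      ```python
      # Back-of-envelope check of the 0.20 A reading on the 120 V side.
      # NOTE: the power factor below is an assumed value, not a measurement.
      mains_voltage = 120.0     # V (RMS)
      measured_current = 0.20   # A (RMS), from the plug-in power meter

      apparent_power = mains_voltage * measured_current   # volt-amps (VA)
      print(f"Apparent power: {apparent_power:.1f} VA")   # 24.0 VA

      # A lightly loaded iron-core transformer draws mostly magnetizing
      # (reactive) current, so the real power can be well below the VA figure.
      assumed_power_factor = 0.3   # hypothetical value for illustration only
      real_power = apparent_power * assumed_power_factor
      print(f"Real power at an assumed PF of {assumed_power_factor}: {real_power:.1f} W")
      ```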

       

      Thanks!

      -Nico

        • Re: Power use of transformers, dependent on load?
          clem57

          12 volts x .2 amps = 2.4 watts. That is not bad, right?

          clem

          • Re: Power use of transformers, dependent on load?
            John Beetem

            Nico wrote: I'm wondering if the power used by a transformer varies depending on the load put on the transformer.

            I would say yes, definitely.  If you feel a wall wart that's plugged into the wall but not to a device, it gets a little warm.  If you plug in the device and turn it on, the wall wart gets quite a bit warmer depending on how much power the device is using.  If I remember my basics, a transformer is primarily a reactive load, which means it stores the energy it gets from the wall as a magnetic field and then returns it as the current alternates.  So if your device is off you're not consuming power -- it's just sloshing back and forth.

             

            However, there are resistive losses from this sloshing, so things do get a little warm.  If you have a device plugged into the secondary, then you're using the energy instead of letting it slosh back.  The transfer of energy from primary to secondary is imperfect, so you get resistive losses that warm up your transformer.
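
            To put a number on the "sloshing": here is a tiny numerical sketch (made-up amplitudes, nothing measured) showing that when the current is 90 degrees out of phase with the voltage the average power over a cycle is essentially zero, while an in-phase current of the same size delivers real watts:

            ```python
            import numpy as np

            # Illustrative amplitudes only, not measurements of any real transformer.
            f = 60.0                                 # mains frequency, Hz
            t = np.linspace(0.0, 1.0 / f, 10_000)    # one full cycle
            v = 170.0 * np.sin(2 * np.pi * f * t)    # ~120 V RMS mains voltage

            i_reactive  = 0.28 * np.sin(2 * np.pi * f * t - np.pi / 2)  # lags by 90 deg
            i_resistive = 0.28 * np.sin(2 * np.pi * f * t)              # in phase

            print("average power, reactive load :", round(np.mean(v * i_reactive), 2), "W")   # ~0 W
            print("average power, resistive load:", round(np.mean(v * i_resistive), 2), "W")  # ~24 W
            ```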

             

            I doubt you'd save anything with a smaller transformer.  It probably has thinner wires with higher resistive losses.

             

            Power isn't my speciality, so I'm happy for others to correct me.

              • Re: Power use of transformers, dependent on load?
                mcb1

                John is pretty close ....

                A wall wart tends to have an AC-to-DC converter, with capacitors and maybe a regulator, so it will consume some power even while there is no load.

                More importantly, the capacitors will age, especially in the later inverter (switch-mode) types that don't use a mains transformer.

                 

                A straight 230 (or 120) volt to 12 volt transformer will have some losses, which tend to show up as core heating.

                Therefore, if you leave it in place it will still consume a very small amount of power, but as a percentage of its full-load rating that amount is very small.

                 

                John is also right about the resistive losses, but you could measure the output voltage to see how much it has climbed now that the load has dropped.

                IMO I wouldn't bother replacing it until you have to.

                 

                For what it's worth, we tried those plug-in power meters to check the loads of various equipment we use, and the results were c..p.

                They seem fine at high loads, but we found them to be out by 100% on the small loads.

                 

                We constructed a device that placed a resistance in the neutral, and used a portable scope to capture the voltage across it, while measuring the mains voltage.

                This also included a means of capturing the first cycle to measure the inrush current, which on some 600 W servers was 44 amps.
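
                In case anyone wants to reproduce that, the post-processing is simple once you have the two captures. A minimal sketch, assuming a hypothetical 0.1 ohm shunt and synthetic stand-in waveforms (not our actual data):

                ```python
                import numpy as np

                # Minimal sketch of turning scope captures into current/power figures.
                # Assumed, hypothetical setup: 0.1 ohm shunt in the neutral, waveforms
                # sampled at 10 kS/s. The arrays below are synthetic stand-ins.
                shunt_ohms = 0.1
                sample_rate = 10_000                       # samples per second
                t = np.arange(0, 0.1, 1.0 / sample_rate)   # 100 ms capture (6 cycles at 60 Hz)

                v_mains = 170.0 * np.sin(2 * np.pi * 60 * t)   # mains voltage capture
                v_shunt = 0.05 * np.sin(2 * np.pi * 60 * t)    # voltage across the shunt

                i = v_shunt / shunt_ohms            # Ohm's law: instantaneous current
                i_rms = np.sqrt(np.mean(i ** 2))    # RMS current
                p_real = np.mean(v_mains * i)       # real (average) power

                # Inrush: look only at the first mains cycle of the capture.
                first_cycle = i[: sample_rate // 60]
                print(f"RMS current : {i_rms:.3f} A")
                print(f"Real power  : {p_real:.1f} W")
                print(f"Peak current in first cycle: {np.max(np.abs(first_cycle)):.2f} A")
                ```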

                 

                Mark

              • Re: Power use of transformers, dependent on load?
                michaelwylie

                Yes, it does.

                 

                To clarify: suppose you have a 120 volt to 12 volt step-down transformer. Slap a 1.2 ohm power resistor on the output, and the load current will be 10 amps; the current into the primary will be about 1 amp. As the secondary current keeps increasing, the I²R losses inside the transformer will increase. The efficiency may stay relatively constant, but the losses will increase. It's subtle.
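
                Rough numbers for that example, with made-up winding resistances just to show how the copper losses scale (not data for any particular transformer):

                ```python
                # Worked numbers for the 120 V -> 12 V example above.
                # The winding resistances are hypothetical, chosen only to show
                # how the I^2*R losses grow as the load grows.
                v_primary, v_secondary = 120.0, 12.0
                turns_ratio = v_primary / v_secondary      # 10:1

                r_primary = 2.0      # ohms, assumed primary winding resistance
                r_secondary = 0.05   # ohms, assumed secondary winding resistance

                for r_load in (12.0, 2.4, 1.2):            # lighter -> heavier load
                    i_sec = v_secondary / r_load           # secondary current (ideal transformer)
                    i_pri = i_sec / turns_ratio            # reflected primary current
                    copper_loss = i_pri**2 * r_primary + i_sec**2 * r_secondary
                    p_out = v_secondary * i_sec
                    print(f"load {r_load:5.1f} ohm: {i_sec:5.1f} A out, "
                          f"copper loss {copper_loss:5.2f} W, output {p_out:5.1f} W")
                ```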

                • Re: Power use of transformers, dependent on load?
                  nazima

                  When the load increases, the transformer's own power loss rises, while the rated VA output capacity stays the same. Transformer losses (depending on the measuring criteria) are usually referred to the secondary side and are not part of the output capacity. With the secondary open-circuit there is no load at all.

                  • Re: Power use of transformers, dependent on load?
                    kkazem

                    Yes, the power used by (not transferred out of) a transformer increases with load.

                    1. The magnetic (core) losses, due to magnetizing the core, do not change with load, only with input voltage (increasing) and with input frequency (increasing).

                     

                    2. The copper losses increase with load. This goes for both the primary and the secondary winding losses.

                     

                    Absolutely, decreasing the lamp wattage load on your transformer reduces both the primary and secondary current. Since the primary and secondary winding resistances are fixed (at any given temperature), the (I^2)*R losses will be less by the square of the reduced current.  
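
                    A concrete sketch of that scaling, using assumed round numbers for the before/after lamp load (not Nico's exact fixture count):

                    ```python
                    # Copper (I^2*R) losses scale with the square of the load current.
                    # Assumed round numbers: ten 7 W bulbs before, ten ~1 W LEDs after.
                    v_out = 12.0
                    p_before = 10 * 7.0    # W of lamp load before the LED swap (assumed)
                    p_after = 10 * 1.0     # W of lamp load after the swap (assumed)

                    i_before = p_before / v_out
                    i_after = p_after / v_out

                    ratio = (i_after / i_before) ** 2
                    print(f"Current drops from {i_before:.2f} A to {i_after:.2f} A")
                    print(f"Copper losses drop to {ratio:.1%} of their old value")
                    # Core (magnetizing) losses stay roughly the same, so the
                    # no-load draw from the wall does not go away.
                    ```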
