
Question about transformers

6885P5H

Hi. I have a question about transformer capacity. If I have a transformer rated for 200VA (~1.74A), and I hook up a 2A load to it, what is gonna happen? Is the load gonna get starved of current? Basically is the capacity a cap? Or will the load still get all the current it wants, function as it should, at the expense of the transformer's health?
 
At that rating/load, probably not much. It might get uncomfortably warm. Depends on the load voltage too.

It's practically no different than putting 2A down a wire rated for 1.75A.
 
It is unlikely that that small an increase will do much. Things are usually designed with some headroom. It is hard to say what the limiting factor is, the core or the wire; in the end it will be the wire. At some point, as you increase the load, the core will saturate. When this happens, the current in the primary goes up quickly and the wire fails, hopefully before starting a fire. Of course, all the current will heat the wire, and all the core losses will heat the core. The insulation between the windings could fail from heat. The question is: is the load really 2 amperes, or is it just rated at 2 amperes and really only draws 1.4 amperes or so?
Dwight
 
I deal more with larger transformers, and I don't have my copy of the NEC in front of me, but for larger transformers at least it's allowed to run one at 125% of the continuous rating. That would be 250 VA here. But there are different rules for smaller transformers, as I recall; I work more with transformers in the 50 kVA to 1,000 kVA range at $dayjob.
 
Just tossing in my $.02 here. You're dealing with small transformers (unless you're talking about 12 kV-class voltages). The tendency of consumer-grade transformer suppliers is to design their products such that the open-circuit voltage is much higher than the nameplate value and drops as the load is increased. That is, a certain minimum current draw is anticipated, so that the secondary winding resistance can compensate a bit.

I discovered this when converting from low-voltage incandescent light strings to LED. With a full incandescent load, the RMS voltage pretty much matched the nameplate spec of, say, 12 Vrms. When the incandescent lamps were replaced by LEDs (AC units with a rectifier and filter fitted into the lamp itself), the transformer voltage rose to about 18 Vrms, which was not so good for the LEDs. A series resistor brought things back into the desired range. Fortunately, I had a good supply of vitreous enamel wirewound power resistors.
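
For anyone wanting to size a similar series resistor, here's a rough back-of-the-envelope sketch in Python. The 18 Vrms and 12 Vrms figures are the ones from above; the 0.5 A load current is purely an assumed value for illustration, not something measured:

```python
# Rough sizing of a series resistor to drop an off-load-high transformer
# output back to nameplate voltage. 18 V / 12 V are from the post above;
# the load current is an ASSUMED example value.

v_open = 18.0    # voltage the transformer delivered into the light LED load (Vrms)
v_target = 12.0  # nameplate voltage wanted at the lamps (Vrms)
i_load = 0.5     # assumed LED string current (A)

r_series = (v_open - v_target) / i_load    # ohms needed to drop the excess
p_resistor = (v_open - v_target) * i_load  # watts the resistor must dissipate

print(f"Series resistance: {r_series:.0f} ohm")  # 12 ohm
print(f"Dissipation: {p_resistor:.1f} W")        # 3.0 W, hence wirewound parts
```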

FWIW, these were not cheap transformers--they were US-branded screw-terminal control units. You'd initially think that using a unit with a much higher nameplate current rating would be a good thing, but it's not the case.

My take is that with a heavier load, the output voltage will drop somewhat, but at only 125% of nameplate rating, you probably won't notice it.
 
If it's an older unregulated linear-type transformer, check the voltage with a multimeter before you plug it into the device, because they have a tendency to drift up over time.

The cause of the voltage drift is the windings shorting together, which can be caused by the 50/60 Hz AC making the coils vibrate and heat, and by the nature of them being tightly wrapped together. The transformer doesn't even need to be powered to drift, because the tight winding wrap will eventually make the varnish migrate.

I found out the hard way some years back when I got my Sega Genesis out of storage. I used the original transformer that came with it, and the console started behaving erratically: games would crash, the console would spontaneously reboot, and eventually it smoked; the LM7805s burned up. I checked the voltage on the transformer, which was supposed to be 9 V and was actually around 17.8 V. Alarmed and curious, I pulled out all of the linear transformers I had and tested them, and sure enough, they were all way out of spec.

I smashed a few of them open to inspect the windings and found that the thin varnish on the copper wire had clumped up in several areas. You could get continuity between a terminal on the transformer and the windings by brushing across them with the other probe.

To make sure it wasn't just my stuff, I had a friend who has literally hundreds of linear transformers of all varieties and ages test his, and sure enough, they were all out of spec. The older they were, the further out they were.
 
One issue here is where (and by whom) the transformers were made. I've got US-manufactured 60+ year old filament transformers that perform just as well as the day they were new. But things weren't manufactured for the throwaway consumer market much back then--and really good transformers were (and still are) expensive.
 
The quality ones are probably wound by hand, or at least by a human operator on a winding machine, with decent-sized wire.

All of the bad transformers I have seen go bad use thin, wispy wire wound so tight that unwinding the coils usually ends up breaking sections of wire where the clumps fused together. I've seen videos of the automated winding machines doing it at a blistering pace.
 
That--and the old ones soaked each winding in varnish. You can tell when you unwind one: bits of extra varnish come off with the wire. Varnish and fishpaper... and that's not including the high-end potted ones.

They don't make 'em like they used to. :(
 
Even back then they didn't always. Zenith really seems to have started the drive to low-cost consumer grade. I don't know that I've ever worked on a Crosley, but I expect those to be worse, and with all the other odd brands back then there should be others, too.
 
I have a 200VA 115-to-230V transformer. I did not know what volt-amperes were when I got it, but I thought 200 sounded like a lot... I was sick and sleep-deprived, so I did not have all my sanity available, hahaha.

So I hooked up a computer to the transformer and it worked; pretty cool, I thought. I then tried another one, which I'm pretty sure was supposed to be working, but it complained about parity errors and failed. That's when I realized 200VA is actually only around 1.74A at 115V, not the crazy rating I had in my head, and I wondered if the computer did not work because it did not get enough current.

So if the transformer is rated 1.74A, and I hook up a 2A computer to it, the computer will still get all the current it wants, and won't malfunction due to power-related issues?
 
200 VA at 115 V = 200 / 115 = 1.74 A.

200 VA at 230 V = 200 / 230 = 0.87 A.

This is theoretical, ignoring losses etc.

So, my question is - when you say you are hooking up a 2A computer to it - what is the voltage of the computer?

Multiply the voltage rating of the computer by the maximum expected current draw of the computer to give you the VA consumed.

If the computer draws 2A at 230V that is 2 * 230 = 460 VA (way in excess of your 200 VA supply).
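
To make the arithmetic concrete, here's a quick sanity check in Python using the numbers from this thread (a 200 VA transformer and a hypothetical 2 A computer):

```python
# Does a given load fit within a transformer's VA rating?
# Numbers are the ones discussed in this thread.

transformer_va = 200.0

def max_current(va, volts):
    """Theoretical maximum current at a given voltage, ignoring losses."""
    return va / volts

print(f"Max current at 115 V: {max_current(transformer_va, 115):.2f} A")  # ~1.74 A
print(f"Max current at 230 V: {max_current(transformer_va, 230):.2f} A")  # ~0.87 A

# A "2 A computer": whether it fits depends entirely on its voltage.
for volts in (115, 230):
    load_va = volts * 2.0
    verdict = "within rating" if load_va <= transformer_va else "overload"
    print(f"2 A at {volts} V = {load_va:.0f} VA -> {verdict}")
```

Either way, a 2 A load exceeds 200 VA at both voltages; at 115 V it is only a 15% overload, while at 230 V it is more than double the rating.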

Dave
 
There is often quite a lot of confusion about how transformers actually work, for example thinking that, as the secondary load increases, it can cause the core to magnetically saturate, when this is not the case. The core losses (hysteresis and eddy currents during the magnetic cycle) remain fairly constant and independent of the load. It works like this:

In the off-load state (voltage applied to the primary and no secondary load), the magnetic flux cycle in the core, and the current that produces it, lags the applied voltage by 90 degrees. If there were no losses, the current would be exactly in phase with the flux; but since there is some resistance, hysteresis, and eddy-current loss (all waste energy as heat), there must be a small component of the current in phase with the applied voltage. So the no-load primary current leads the flux by a small angle and can therefore be split into two components: a "wattless" magnetizing current in phase with the flux, and a loss current in phase with the applied voltage.
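
To make that split concrete, here's a small Python sketch; the no-load current, no-load power, and mains voltage below are assumed illustrative values, not measurements from the thread:

```python
import math

# Split a measured no-load primary current into its "wattless" magnetizing
# component (in phase with the flux, 90 degrees behind the voltage) and its
# loss component (in phase with the voltage). All values are assumed.

v_mains = 230.0     # applied primary voltage (Vrms), assumed
i_no_load = 0.030   # measured no-load primary current (Arms), assumed
p_no_load = 1.2     # measured no-load input power (W), assumed

i_loss = p_no_load / v_mains                 # in-phase (loss) component
i_mag = math.sqrt(i_no_load**2 - i_loss**2)  # quadrature (magnetizing) component

print(f"Loss component:        {i_loss * 1000:.1f} mA")  # ~5.2 mA
print(f"Magnetizing component: {i_mag * 1000:.1f} mA")   # ~29.5 mA
```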

When a load is placed on the secondary, current flows in the secondary winding, and the magnetic field from this acts to reduce the flux set up by the primary winding, not increase it as some people believe. It only takes a small reduction in primary flux to enable full-load current to flow in the primary. Therefore, there is little error in assuming that the transformer's main flux remains constant between no-load and full-load conditions, and the magnetic effect of the secondary current is immediately neutralized by the appearance of a corresponding component in the primary current.

Assuming that, off load, a mains transformer has a sensible magnetizing current and the peak flux is not too high, you can forget about the core when it comes to the maximum load the transformer can tolerate. Any power transformer for a given mains voltage, frequency, and core material should be designed so that the fixed losses (eddy currents and hysteresis) and the off-load primary current are not excessive.

However, the thing that limits a transformer's ability to deliver more than a certain amount of power is the winding DC resistance. These are the copper losses, otherwise known as I²R losses, as they increase with the square of the current. Larger transformers simply have thicker wire and lower copper losses for any given load. When you overload a transformer, its winding temperature will simply rise higher than it was designed for and the transformer will get hotter. How much of that is acceptable depends on how conservative the initial design was and on other factors, including ventilation and the quality of the enamel wire and insulation used.

(If you want, you can measure the DC resistance of the primary and secondary windings. To get a single total resistance, you can mathematically transform one winding's resistance into the other by multiplying it by the square of the turns ratio (the impedance ratio). Then, with the known load current I, you can calculate the power loss in the windings as I²R. How much the transformer heats up with that power dissipated in it is a little like a heat-sink calculation; it depends on the average degrees C per watt of the entire transformer body.)
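
Here's what that calculation looks like in Python; every component value below is an assumed illustrative number, not data from a real transformer:

```python
# Estimate copper (I^2 R) loss by reflecting the primary winding resistance
# to the secondary side via the square of the turns ratio, then estimate
# the temperature rise. All numbers below are illustrative assumptions.

n = 230.0 / 12.0      # turns ratio, assuming a 230 V -> 12 V transformer
r_primary = 8.0       # primary winding DC resistance (ohms), assumed
r_secondary = 0.05    # secondary winding DC resistance (ohms), assumed
i_load = 2.0          # secondary load current (A)

# Reflect the primary resistance to the secondary side (divide by n squared)
# so everything is expressed where the load current flows.
r_total = r_secondary + r_primary / n**2

copper_loss = i_load**2 * r_total   # I^2 R, in watts

theta = 15.0                        # assumed degrees C per watt for the body
print(f"Total winding resistance (secondary side): {r_total:.3f} ohm")
print(f"Copper loss at {i_load} A: {copper_loss:.2f} W")
print(f"Estimated temperature rise: {copper_loss * theta:.1f} C above ambient")
```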

The above remarks of course ignore leakage reactance effects which could be a whole other topic.
 
The core saturation current goes down with heat, and the inductance goes down as the core gets hotter. The heat is I²R, as you state, but at some point, if the wire and the insulation hold up, the inductance will no longer be high enough to support 60 Hz. The primary current will then increase significantly and the wire fails.
Dwight
 
Yes, and some transformers sail closer to the edge; a hot core is a very bad sign. It really shows up when some (especially vintage) transformers designed for 60 Hz are run on 50 Hz: the off-load primary current and core heating can be severe. I once had one in a vintage TV set where this problem and the radiated magnetic fields were so bad that they disturbed the CRT beam, and that was the set's original 60 Hz transformer. A modern replacement transformer designed for 50 Hz, of the same size and power rating, had something like 1/5 the off-load primary current on 50 Hz.
 