
Thread: Question about transformers

  1. #11
    Join Date
    Feb 2015
    Location
    Québec, Canada
    Posts
    271

    Default

    I have a 200VA 115 to 230V transformer. I did not know what volt-amperes were when I got it, but I thought 200 sounded like a lot... I was sick and sleep-deprived, so I did not have all my sanity available, hahaha.

    So I hooked up a computer to the transformer and it worked, which I thought was pretty cool. I then tried another one, which I'm pretty sure was supposed to be working, but it complained about parity errors and failed. That's when I realized 200 VA at 115 V is actually only about 1.74 A, not the crazy rating I had in my head, and I wondered if the computer failed because it did not get enough current.

    So if the transformer is rated for 1.74 A and I hook up a 2 A computer to it, will the computer still get all the current it wants, and not malfunction due to power-related issues?

  2. #12

    Default

    Short answer: yes.

  3. #13
    Join Date
    Jun 2012
    Location
    UK - Worcester
    Posts
    2,630

    Default

    200 VA at 115V = 200/115 = 1.74A.

    200 VA at 230V = 200/230 = 0.87A.

    This is theoretical - ignoring losses etc.

    So, my question is - when you say you are hooking up a 2A computer to it - what is the voltage of the computer?

    Multiply the voltage rating of the computer by the maximum expected current draw of the computer to give you the VA consumed.

    If the computer draws 2A at 230V that is 2 * 230 = 460 VA (way in excess of your 200 VA supply).
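
    A quick sketch of that arithmetic in Python, if it helps - the function name is mine, and it treats the computer as a simple load with no power-factor or inrush effects:

        # VA drawn by a load vs. the transformer's nameplate rating.
        def va_drawn(volts, amps):
            return volts * amps

        transformer_va = 200           # nameplate rating
        print(va_drawn(115, 1.74))     # ~200 VA - right at the limit on the 115 V side
        print(va_drawn(230, 0.87))     # ~200 VA - right at the limit on the 230 V side
        print(va_drawn(230, 2.0))      # 460 VA  - way over the 200 VA rating

    If the 2 A figure is at 115 V instead, that is 230 VA - still over the rating, just not by as much.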

    Dave

  4. #14

    Default

    There is often quite a lot of confusion about how transformers actually work - for example, thinking that increasing the secondary load can cause the core to magnetically saturate, when this is not the case. The core losses (hysteresis and eddy currents during the magnetic cycle) remain fairly constant and independent of the load. It works like this:

    In the off-load state (voltage applied to the primary and no secondary load), the magnetic flux cycle in the core, and the current that produces it, lags the applied voltage by 90 degrees. If there were no losses, the current would be in phase with the flux, but since there is some resistance, hysteresis and eddy current loss (all of which waste energy as heat), there must be a small component of the current in phase with the applied voltage. So the no-load primary current leads the flux by a small angle and can therefore be split into two components: a "wattless" magnetizing current in phase with the flux, and a loss current in phase with the applied voltage.
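
    As a minimal sketch of that split (the measured figures below are made-up illustrative values, not from any particular transformer): the loss current is the no-load input power divided by the applied voltage, and the magnetizing current is the remaining quadrature component.

        import math

        # Off-load (no secondary load) measurements - illustrative values only.
        v_primary = 115.0    # applied primary voltage, V rms
        i_no_load = 0.060    # no-load primary current, A rms
        p_no_load = 2.5      # no-load input power (core loss plus a little copper loss), W

        # Component in phase with the voltage (the loss current)...
        i_loss = p_no_load / v_primary
        # ...and the "wattless" magnetizing component in phase with the flux.
        i_mag = math.sqrt(i_no_load**2 - i_loss**2)

        print(f"loss current ~ {i_loss * 1000:.0f} mA, magnetizing current ~ {i_mag * 1000:.0f} mA")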

    When a load is placed on the secondary, current flows in the secondary winding, and the magnetic field from this acts to reduce the flux set up by the primary winding - not increase it, as some people believe. It only takes a small reduction in primary flux to enable full-load current to flow in the primary. Therefore, there is little error in assuming that the transformer's main flux remains constant between no-load and full-load conditions, and it is the case that the magnetic effect of the secondary current is immediately neutralized by the appearance of a corresponding component in the primary current.

    Assuming that, off load, a mains transformer has a sensible magnetizing current and the peak flux is not too high, you can forget about the transformer core when it comes to the maximum load the transformer can tolerate. All power transformers for a given mains voltage, frequency and core material should be designed so that the fixed losses from eddy currents and hysteresis, and the off-load primary current, are not excessive.

    However, the thing that limits a transformer's ability to deliver more than a certain amount of power is the winding DC resistance. These are the copper losses, otherwise known as I²R losses, as they increase with the square of the current. Larger transformers can simply have thicker wire and lower copper losses for any applied load. When you overload a transformer, its winding temperature will simply go higher than it was designed for and the transformer will get hotter. How much of that is acceptable depends on how conservative the initial design was, and on other factors including ventilation, the quality of the enamel wire and insulation used, etc.

    (If you want, you can measure the DC resistance of the primary and secondary windings. To get one total resistance, you can mathematically refer one winding resistance to the other by multiplying it by the square of the turns ratio (the impedance ratio). Then, with the known load current I, you can calculate the power loss in the windings as I²R. How much the transformer heats up with that power dissipated in it is a little like a heat-sink calculation; it depends on the average degrees C per watt of the entire transformer body.)
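
    A rough sketch of that copper-loss sum in Python - every figure below (resistances, turns ratio, load current, degrees C per watt) is a made-up illustrative value, not a measurement:

        # Refer the primary DC resistance to the secondary side using the
        # square of the turns ratio, then compute the I^2 R copper loss.
        n = 230.0 / 115.0                 # secondary : primary turns ratio (step-up)
        r_primary = 1.5                   # primary winding DC resistance, ohms
        r_secondary = 6.0                 # secondary winding DC resistance, ohms
        r_total = r_secondary + r_primary * n**2

        i_load = 0.87                     # full-load secondary current at 230 V, A
        p_copper = i_load**2 * r_total    # I^2 R loss in the windings, W

        theta = 5.0                       # assumed overall thermal resistance, deg C per watt
        print(f"copper loss ~ {p_copper:.1f} W, temperature rise ~ {p_copper * theta:.0f} deg C")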

    The above remarks, of course, ignore leakage reactance effects, which could be a whole other topic.
    Last edited by Hugo Holden; June 1st, 2019 at 04:21 PM.

  5. #15

    Default

    The core saturation current goes down with heat, and the inductance goes down as it gets hotter. The heat is I²R as you state, but at some point, if the wire and the insulation hold up, it will reach the point where the inductance is not high enough to support 60 Hz. The primary current will then increase significantly and the wire fails.
    Dwight

  6. #16

    Default

    Quote Originally Posted by Dwight Elvey
    The core saturation current goes down with heat, and the inductance goes down as it gets hotter. The heat is I²R as you state, but at some point, if the wire and the insulation hold up, it will reach the point where the inductance is not high enough to support 60 Hz. The primary current will then increase significantly and the wire fails.
    Dwight
    Yes, and some transformers sail closer to the edge, and a hot core is a very bad sign. It really shows up when some (especially vintage) transformers designed for 60 Hz are run on 50 Hz: the off-load primary current and core heating can be severe. I once had one in a vintage TV set where this problem and the radiated magnetic fields were so bad that they disturbed the CRT beam, and it was the original 60 Hz transformer for the set. A replacement modern transformer designed for 50 Hz, of the same size and power rating, had something like 1/5 the off-load primary current on 50 Hz.
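
    A minimal sketch of why 50 Hz operation is harder on a 60 Hz core, using the standard transformer EMF equation V_rms = 4.44·f·N·A·B_peak (the turns count and core area below are made-up illustrative numbers):

        # Peak core flux density from the transformer EMF equation:
        #   V_rms = 4.44 * f * N * A * B_peak
        # N (primary turns) and A (core cross-section, m^2) are illustrative values.
        def b_peak(v_rms, freq_hz, turns=300, core_area=1.1e-3):
            return v_rms / (4.44 * freq_hz * turns * core_area)

        b60 = b_peak(115.0, 60.0)    # the design operating point
        b50 = b_peak(115.0, 50.0)    # the same winding on 50 Hz mains
        print(f"B_peak: {b60:.2f} T at 60 Hz vs {b50:.2f} T at 50 Hz ({b50 / b60:.2f}x)")

    The same winding sees about 20% more peak flux on 50 Hz, which is what pushes a tightly designed 60 Hz core toward saturation and sends the off-load primary current (and the stray field) way up.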

  7. #17
    Join Date
    Jan 2007
    Location
    Pacific Northwest, USA
    Posts
    31,196
    Blog Entries
    20

    Default

    Ferroresonant transformers can also be a big problem if they're "tuned" to the wrong line frequency.
