I repaired a medium-sized UPS rated at 3KW and, more recently, a second, larger unit rated at 16KW.

 3000VA UPS

 

 A few weeks ago (September 2018) an Uninterruptible Power Supply was dropped off at the Low Cost Repair Centre. Although I'd suggested that most problems could be fixed with a new set of batteries, these hadn't arrived with the equipment. I checked over the UPS but didn't see much wrong. I did notice a discoloured choke inside the box, but what can go wrong with a choke? So I left it and returned the equipment, suggesting new batteries were probably needed.

It arrived back a few days later, again sans batteries, so I removed the darkened choke and fitted a stout copper link in its place. Whilst the main board was detached I decided to test all the electrolytic capacitors. To do this meant removing about a dozen small circuit boards. This was no easy task as all the boards were soldered into position. Hours later, and much to my surprise I'd given scores and scores of the electrolytics a clean bill of health. The designers had apparently done an excellent job.

Despite all my efforts I'd been unable to find any information about the equipment so I was still on a learning curve.

After reassembling everything I returned the thing for the second time and awaited results.
 

 I heard back that there was still a problem, so I spoke to the site engineer. I suggested buying a new UPS, but then discovered a massive jump in price from a 1.5KVA model to a 3KVA one, so I agreed to see the UPS for a third time, this time insisting on getting the battery pack so I could run some proper tests.

A clue to the problem was that he'd reported that smoke had come from the UPS case after a short time of running on batteries.
 

 Above is the main circuit board which I removed from the chassis. Dotted around the board perimeter are the daughter boards, some of which carry extremely complex circuitry. The main board carries two major areas of high-power circuitry.

I've already removed a blackened choke and replaced this with copper wire (centre right) and below you can see where solder connections have degraded from heat. I've also removed a component adjacent to the bad choke which I'll come to later.
 

 Here's a better view. You can see that the missing part is a capacitor. When looking over the board I hadn't noticed this, instead vaguely thinking it was a large relay as only the top view was visible and the markings hidden.

 

 

 

 

 

 Above are the parts removed for testing.

 Below is another circuit board which is fitted on a metal cover under the lid. The other, smaller board was screwed to the lid and carries a battery condition display and some LEDs.
 

 Because I'd now got the battery pack, I could do some proper tests. On the right (above) are all the UPS connections, which are brought out to a simple choc block. Not having any documentation, I was unsure of two of the connections. I noticed track visible through the board that appeared to connect the brown/blue wires (bottom right) to a relay, so I initially made the wrong assumption that these wires indicated the state of the equipment, i.e. closed meant ON and open meant OFF. However, the wires are marked "Control Switch", so I checked for a voltage across them and found 60 volts DC. To prevent a mistake causing any damage I first tried a 270 ohm resistor across the pair and found only a small current flowed, so I then shorted the wires with a 15 ohm resistor and confirmed that the pair acts as an on/off switch.

 

 Below is the battery pack. It uses eight 12 volt 7Ah batteries connected in series. I noted that the Fastons don't fit the battery terminals, so the battery pack is not the original fitted by the manufacturer. That can at the back is from another project dating from 1917...

 

 

Powering the unit from 240 volt mains produced a start-up routine which is apparently a battery test. Most types of UPS do this and the nature of the test will reveal any problem with the battery pack. After a short period an alarm sounded and, judging by the display, which now showed a red LED, the batteries had been determined to be not up to scratch. I've found in the past, with other UPS equipments, that a bad battery can be obviously so, looking bloated or split, or can look quite normal but presumably not reach the correct voltage when delivering a specified current. Whatever the test, bad batteries seem to be reliably diagnosable, but are those above really bad? The report of smoke made me think a fault was present that somehow resulted in a false bad-battery warning.

Although a simple test isn't going to prove this one way or another, it might give me a clue. I measured the voltage of each battery with the UPS off, then on. Each battery looked pretty well nominal, with none showing more than a fraction of a volt above or below the average, so I wondered how the UPS would behave if the mains was turned off. I connected a 60 watt lamp to the mains output and poked my 15 ohm resistor into the choc block. Much to my surprise the lamp immediately came on.

I tried a few times and each time I got 220 volts AC from the unit so I decided to leave it on and see what happened. At this point I'll mention the purpose of this UPS. It's to provide a mains voltage for a short period, if the local mains fails, in order to allow lift passengers to get out of a lift which would otherwise be stuck. Hence the large 3KVA output, which is necessary to provide a lot of power for a short period to get the lift to a floor and safety.

Being winter, I wasn't in the workshop but in our conservatory, which was much warmer. After 20 minutes a slight hot smell had grown into a definite burning smell and my XYL called out that I should find out where it was coming from. I already knew the answer, so I turned off the UPS, lifted the lid and gingerly felt around for something hot. There are a couple of large heatsinks and, to my surprise, both were cold.... but I could feel heat emanating from something. It was one of the two large chokes (I'd already replaced a blackened one, thinking it might be shorting internally). In fact it was incredibly hot, so time to do some calculations.

First, a 60 watt light bulb draws about 270mA from 220 volt mains. I hadn't yet noticed how the choke was connected, but clearly 270mA through a pair of 1mm wires in parallel shouldn't worry it. I'd already measured its resistance (not easy) and found it was roughly 35 milliohms. Losses at 270mA should be around 2.5 milliwatts, so the heat certainly couldn't be explained by the choke sitting in a mains feed delivering a mere 60 watts. So what about the battery connections? Well, the batteries are connected in series to provide around 96 volts, so supplying the same 60 watts from 96 volts means roughly 625mA. The loss increases by a factor of about five, to around 14 milliwatts. No way can the choke get that hot from 14 milliwatts.
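
Putting numbers on it (a quick Python sketch; the 35 milliohm winding resistance is my own rough measurement and the 60 watt lamp is the only load):

    # I-squared-R loss in the choke for the two candidate currents
    R_choke = 0.035                  # measured winding resistance, ohms (roughly)

    I_mains = 60.0 / 220.0           # 60 W lamp fed at 220 V -> about 0.27 A
    I_batt  = 60.0 / 96.0            # same 60 W supplied from the 96 V battery string -> about 0.63 A

    for label, i in (("mains feed", I_mains), ("battery feed", I_batt)):
        p = i * i * R_choke
        print(f"{label}: {i*1000:.0f} mA -> {p*1000:.1f} mW")

Either way the copper loss is a few milliwatts at most, nowhere near enough to account for a choke too hot to touch.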

A UPS takes in mains, rectifies it to something like 320 volts DC and chops it at a figure usually between 20 and 30KHz then transforms this to generate a lower AC voltage. This is rectified to produce a supply voltage for an inverter and a voltage for charging the batteries which are kept trickle charged until required.
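
As a rough check on that 320 volt figure (a sketch assuming 230 volt nominal mains and a simple bridge rectifier into a reservoir capacitor; the real figure will sag a little under load):

    import math

    V_rms  = 230.0                   # nominal single-phase mains
    V_peak = V_rms * math.sqrt(2)    # about 325 V at the rectifier
    V_bus  = V_peak - 2 * 1.0        # minus two diode drops in the bridge, before ripple
    print(f"peak {V_peak:.0f} V, DC bus roughly {V_bus:.0f} V")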

When mains fails, the UPS uses its batteries to produce a mains supply. Various relays perform the switchover from the public mains supply to that generated by the UPS. With luck, and good design of course, any mains powered equipment will see a loss of mains power for only a very short time as the changeover relays are energised.

Why on earth would a choke carrying such a low current get so hot then? Clearly the UPS is working, and a 60 watt load on its output surely shouldn't heat a choke as much as that. It took 20 minutes for the choke to cool sufficiently to be comfortably touched, and therein lies a clue.

Below, overheated solder connections to the choke.

 

Because the UPS can handle 3KVA it's possible that somewhere in the unit there's a fault which is drawing a huge current, but not enough to cause the equipment to stop functioning. The contradiction to this is the absence of heat anywhere else in the equipment. Let's say the choke is drawing enough current to make it get very hot... let's say the dissipation is 40 watts. The current necessary to produce 40 watts in 35mohm is about 34 amps. 20 watts would be produced by 24 amps, but surely those sort of currents would raise the temperature of a heatsink, yet both are stone cold.
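
For the record, the currents quoted above come straight from P = I²R, using the 35 milliohms I'd measured (a one-off Python sketch):

    import math

    R = 0.035                        # choke winding resistance, ohms
    for P in (40, 20):               # assumed dissipation, watts
        I = math.sqrt(P / R)
        print(f"{P} W in {R*1000:.0f} milliohms needs about {I:.0f} A")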

The answer of course is nothing to do with DC current (see the capacitor's rating below). The two chokes are connected in series and initially one was burnt; then removing it caused the second one to burn. A DC current through two chokes in series would produce about the same heating effect in both, but only one appeared to suffer. Adjacent to the chokes is a capacitor (see below) which, from its size and shape, I'd assumed was a relay. I checked it in-circuit with my ESR meter and found it measured 1.44uF with zero ohms ESR, but removing it revealed previously hidden markings which showed it should have been 10uF.
 
   

  Above, the fried choke and the pristine-looking duff 1.44uF capacitor.

 

The two chokes and the capacitor form a filter. At this point I hadn't traced exactly where they fit into the UPS circuitry, but what's happening is that a ripple is present which is being inadequately filtered by the duff capacitor. The ripple component of the current passing through the first choke is heating not the copper coil but the ferrite core of the choke. The core gets hotter and hotter, to the extent that the copper coil reaches a temperature sufficient to burn the enamel insulation. The second choke might have suffered the same fate, but by the time the current leaves the first choke the ripple has been absorbed by its core and isn't bad enough to cause excessive heating. Removing the first choke and replacing it with a jumper wire allowed the ripple to reach the second choke, and this was the one I'd found to be incredibly hot. Below... removing the black component showed it not to be a relay but a capacitor.

 

 Above: from the position of the blue and brown wires, which are mains in and mains out, the filter appeared to sit in the internally generated mains output, its function being to remove roughness from the 50Hz output so that a decent sine-wave results. I traced the circuit and, sure enough, the filter is in the mains output.

My guess is the missing 8uF or so of capacitance is in some way associated with a measured reduction of the battery charging voltage which is confusing the battery test routine. Once the capacitor has been replaced, hopefully the batteries will pass their test.

Going back to some theory (for the purists: leaving out "j"): what's the impedance of a perfect new 10uF capacitor at 50Hz? Answer: 318 ohms.

The current drawn from a 220 volt RMS output will be about 0.7A RMS, and as the mains swings periodically positive and negative the current through the capacitor swings in sympathy. In practice the output from the UPS will not be a perfectly symmetrical sinewave so there may be a residual current. However, the purpose of the capacitor is to help filter noise, and this noise will be related to the method used by the equipment to generate the mains output. I imagine that, to get the best efficiency, the transistors fitted to the giant heatsink are switched on and off very rapidly, which means that harmonics (mainly odd) of the 50Hz output voltage will be present. The 10uF capacitor presents a lower impedance to these harmonics than to 50Hz: for example 106 ohms at 150Hz, 16 ohms at 1KHz and 1.6 ohms at 10KHz. Presumably the UPS designers calculated the effect of the 10uF capacitor together with the pair of chokes to eliminate as much noise as possible from the output, making it look like a nice clean sinewave within commercial budget constraints.
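
Those impedances all come from Xc = 1/(2πfC). Here's the sum for a healthy 10uF part and for the duff 1.44uF one (a Python sketch, ESR ignored):

    import math

    def xc(f_hz, c_farads):
        """Capacitive reactance, ignoring ESR (and the j, for the purists)."""
        return 1.0 / (2 * math.pi * f_hz * c_farads)

    for f in (50, 150, 1000, 10000):
        print(f"{f:>5} Hz: 10uF = {xc(f, 10e-6):6.1f} ohms, 1.44uF = {xc(f, 1.44e-6):7.1f} ohms")

At 1.44uF every figure is roughly seven times higher, so the duff capacitor does a far poorer job of shunting those harmonics away from the output.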

As a capacitor degrades not only does its capacitance reduce but its internal resistance rises also. A degraded capacitor in many stressful applications will rapidly expire and usually turns into a resistor, goes open circuit or sometimes just explodes. The example below didn't get that far but I've seen the remains of many that have ostensibly vanished leaving shredded aluminium and packing material plastered around the inside of the equipment.

I'm afraid that capacitor reliability is going to suffer more and more due to pressure to reduce physical size. Bearing in mind that internally generated heat and ambient temperature are the key factors determining the life of a capacitor, the smaller the package the lower the reliability. Maybe this explains the size of the replacement I used for this repair? It's 50% larger, perhaps because the suppliers only deal in components that don't result in customers' complaints. Nowhere could I find a replacement that matched the (smaller) size of the one I removed.

After reassembling the UPS I didn't test it. Instead I waited until my customer arrived to collect it. I turned it on and after a short time switched on the battery pack. The battery test routine ran and declared 50% capacity (which is about right as I'd run the batteries for at least 20 minutes a few days back), then it gave green lights (no red alarm LED and no annoying bleep). Success... so I unplugged the mains supply and the 60 watt lamp immediately lit up. After a minute all was OK so I turned off the equipment and loaded it on the wheelbarrow.

I understand there are a few more of these equipments on site. I wonder how long they will last?

 Before leaving the subject of UPS equipments, I'll mention mine. It's a small thing rated at 1.5KVA using two 7Ah lead acid batteries and used to power my computer, display and a small desk lamp. I've had it for several years and it's saved me a lot of bother, as here in the New Forest mains power isn't very reliable. A few months ago our mains power dropped out and the UPS took up the job of looking after my computer. After a few minutes the power returned and all was well, but after a short time we again lost power. After a few seconds it reappeared, and a few minutes later the same thing happened again. After something like ten minutes I was aware of a strange smell, not unlike a steam train. This gradually worsened and I wandered around the house looking for its source. I traced it to my UPS; a clue was that my computer had turned off. I lifted out the UPS and detached its case. Inside, the two batteries were red hot and their sides were split, with H2S fizzing out of the splits.

Later I checked the circuit board and found three of the power FETs had gone short-circuit placing raw AC across the batteries which of course had quickly failed. I fitted three new FETs and two new batteries and much to my surprise the UPS was now working normally. I'd already rung the people responsible for providing our mains supply and put it to them that the mains voltage must have risen and blown up my UPS. It took three phone calls before I was promised a cheque for £40 to cover the cost of new parts.

 

Now for a much bigger equipment, a 16KW UPS

 Trimod 16KW UPS

 

 Above is a view of the top front of the equipment, which stands a little under 5 feet tall.

Below are rear views with all panels detached for access and the modules removed.

 

 

 

 

 From the rear you can see the connections for the 6 power modules, removed and shown below.

 

 

Above a view of one of the two filter boards and below two of the four battery trays.

 

 
 

 Overall control of the equipment is via this small panel on the front door cabled to the circuit board pictured below.
 

 
 

 

 So, what's wrong with the UPS? I opted to receive the whole rack as I'd gained enough experience with the smaller 3KW UPS to realise that I'd need the batteries and filters. I'd also checked beforehand and found the control panel permits a set of diagnostic checks. Unfortunately the UPS weighs over 3 hundredweight, or more than 150kg, which makes repair a tedious business.

I removed the control PCB shown above and noticed it had been repaired before with two capacitors looking different to the remainder. All bar one are tiny surface mounted types which are not too reliable in circuitry where heat is involved, such as in a chopper power supply or in regulator circuitry. I removed all the capacitors and found all the surface-mount types ranged from open circuit to having a very high ESR.

I then worked out the general principles behind operation.

The equipment runs from single-phase 230 Volt mains and is designed and configured to produce a 400 Volt 3-phase mains supply backed up by twenty 12 Volt batteries. To generate its 16KW of power some six power modules are fitted. Control of the UPS is exercised via a small display unit fitted in the door and cabled to the control PCB. It seems that battery voltage is fed to a chopper power supply on the top left of the PCB (above), and I'd guess that in normal operation the batteries are connected in series and are trickle-charged from the power modules. This being so, I decided to feed the PCB from a 240 Volt DC PSU. A clue to this voltage is that the input capacitor is rated at 450 Volts.
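
A quick sanity check on those voltages (a sketch; the 13.6 volt per-battery float figure is a typical value I'm assuming, not something I measured):

    n_batteries = 20
    v_nominal   = 12.0
    v_float     = 13.6    # typical float-charge voltage for a 12 V sealed lead acid battery (assumption)

    print(f"string nominal: {n_batteries * v_nominal:.0f} V")   # 240 V
    print(f"string on float: {n_batteries * v_float:.0f} V")    # about 272 V, comfortably under the 450 V capacitor rating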

Having already tested the PCB before swapping the capacitors I'd expected it to burst into life having fitted 12 new ones but that wasn't the case. As I had a spare chopper chip I first swapped the 3844B SO8 surface-mounted device as I've found these can fail if associated capacitors go bad. A new chip failed to bring the PCB to life so I looked further. All the surface-mount diodes and resistors etc tested OK but I found a rather unusual 6-leg SOIC chip marked "AB" which I eventually decoded as a Ricoh DC/DC converter type RP500N212A rated at 6.5 volts input and 2.1 volts out. As this is a rare thing with no immediate chance of replacement but with little chance of it being bad, I decided to trace the circuit to see exactly why this strange device was fitted. I measured the input pin at 27 volts. This is fed from HT via a chain of high value chip resistors (5 x 1.2Mohm) ending with a 27 Volt zener diode marked "K4". Clearly there's more to this than meets the eye and sure enough the input ground plus the Enable input were connected to long meandering tracks weaving in and out via tiny plated-through holes and ending with a set of four tiny chips marked "A4". These are tracked to a connector marked "LCD display". The DC/DC converter input ground was routed via a 47Kohm resistor to its long track, with a capacitor smoothing the voltage between this and the Enable track. My guess is the designer chose to use the odd Ricoh chip to remotely control Power on/off from the UPS front panel. As I can see no relays used in the equipment (so far) this may have been an innovative approach to achieve better reliability than using a simple relay.

 Having established the likely presence of a remote on/off switching feature, the next step was to place the PCB on top of the rack where I could plug in the cable from the front panel display. To do this entailed making up a 20 foot cable to feed the 250 Volt HT supply from the bench PSU. Much to my relief, having plugged in the display and turned on the HT supply, the on/off button turned on the display. This resulted in an error message because none of the batteries or power modules were fitted, nor were the remainder of the cables to the controller PCB plugged in. Pressing the on/off button again turned off the display. So far so good....

The next step is to resolve a battery problem.

Despite being advised that the batteries had been replaced and were only a few months old, none had a terminal voltage greater than 5.2 Volts. For sealed lead acid batteries this is not good, but nevertheless I decided to charge them to see if they're recoverable; otherwise it will mean replacement at a cost of around £20 each, or £400 for the set of 20 (it sounds a lot of money but it's actually less than 3% of the £14,000 price of the equipment). Not only were the batteries flat, nearly all the Faston connectors were sprung open so that their grip was virtually non-existent. This is rather puzzling, as the "new batteries" would never have worked even with a working controller PCB.
 

 

 50% of the batteries being charged. These batteries are rated at 12 Volts 7Ah and are fitted five to a tray. I believe the rack cabling is arranged to provide 240 Volts from the 20 batteries. I have a 12/24V charger which gives me the option of charging two trays of batteries at a time as 5 pairs. After a couple of days I'd managed to get all twenty batteries to read over 12 volts. I then tightened their jumpers and reconnected their leads, which are arranged to provide a group of four plus a single battery per tray; the four singles form a fifth group spread across the trays, and the five groups in series make the complete set of twenty provide 240 volts (4 x 48V + 1 x 48V = 240V). Maybe there's a test circuit checking each set of four.
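
The arithmetic behind that grouping, as I read it from the lead arrangement (a sketch):

    v_battery  = 12                        # volts per battery
    group_of_4 = 4 * v_battery             # 48 V per complete group
    trays      = 4
    groups     = trays + 1                 # one full group per tray plus a fifth made from the four singles
    print(f"{groups} groups x {group_of_4} V = {groups * group_of_4} V")   # 240 V from twenty batteries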

I slid the four battery trays into place and checked the voltage at the power cable going into the controller board. It measured 240 Volts, so I connected up the various leads. One goes to the display held on the door, one to the batteries and another monitors the 3-phase AC outputs from the power modules. There is also a connection to the AC mains input, so everything can be monitored by the controller.

If everything works, without a mains supply and without plugging in the power modules, I should be able to see a display. Pressing the ON button brought up the display as I'd hoped and I'd started to use the scrolling buttons when suddenly the display went off, and try as I might nothing would persuade it to re-illuminate. Maybe the battery voltage had dropped suddenly? No... the voltage was still 240V, so I tried the high voltage DC supply, but that didn't bring up the display either...

I considered the options. Kicking the thing into touch was my first choice, but that meant a huge cost to someone, so I looked at possible reasons for the failure. I decided to power the board from 240 volts on the bench and check the standby voltage developed on the PCB. This I'd measured as 27 volts during initial tests, but now it was missing... what had been a stable 27 volts across the zener now measured only 115mV. Maybe a resistor had failed... but no, all seemed OK. Checking the various parameters, I reckoned the current through five 1.2Mohm resistors fed from 240V to the 27 volt zener would be (240V-27V)/6M, or about 35 microamps; each resistor would dissipate about 1.5mW and the zener diode less than 1mW. There's no way any of these parts are stressed, so my first thought was that the chopper chip enable pin was drawing enough current to kill any voltage at the DC-DC converter, so I removed the 3844B. Still no voltage, so maybe the rare Ricoh chip had failed? I removed it (I had to use the hot-air gun as the chip is only 2.9mm x 1.6mm and has 6 legs) and checked once more... Still no voltage other than a mere 115mV across the zener diode, so I removed that and checked its resistance. The chip is marked K4 which in my reference book is used on 17 different chips, but I think it's an MMSZ52354. It correctly measured as a diode in its forward direction, but oddly it read only 0.3 volts in reverse, making it act like a conducting zener diode with just my multimeter test voltage.

I hunted around and found a BZT03-C27 which I soldered across the surface mounting pads and tried again. Thankfully 27 volts was now present, so I refitted the Ricoh chip and the chopper chip and tried again. The 27 volts was still present and, moving the PCB over to the UPS, still powered by the HT supply, I pressed the ON button and the display came to life. I'd imagined the 27 volt supply was responsible for powering the display, but of course this is impossible given the tiny currents involved, so obviously what's happening is this: once the ON button is pressed, the Ricoh chip is turned on when its enable pin (which goes to the display panel) picks up a fraction of the 27 volt feed. The Ricoh chip converts whatever voltage is present between its input power pin and input ground to 2.1 volts between its output and output ground. This is connected to the chopper chip enable, causing it to activate the circuitry which establishes the various voltages for powering the controller board. The controller microprocessor then establishes contact with the display mounted on the door.
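
For tidiness, here are those zener-chain figures worked out in one place (a sketch; resistor and zener values as read off the board):

    V_ht    = 240.0            # HT feed, volts
    V_z     = 27.0             # zener voltage
    R_each  = 1.2e6            # ohms
    n       = 5                # resistors in the chain

    I       = (V_ht - V_z) / (n * R_each)   # about 35 uA through the chain
    P_res   = I * I * R_each                # about 1.5 mW per resistor
    P_zener = V_z * I                       # just under 1 mW in the zener

    print(f"I = {I*1e6:.1f} uA, per resistor {P_res*1e3:.2f} mW, zener {P_zener*1e3:.2f} mW")

Nothing in the chain is remotely stressed, which is what made the missing 27 volt rail so puzzling until the zener itself turned out to be leaky.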

 

 

 The chopper chip is quite large being 4.8mm x 3.8mm, but the Ricoh DC-DC converter is a mere 2.9mm x 1.6mm. The 27 volt zener diode is hidden behind a capacitor.

 

 The only zener diode I had to hand was this bead type. As I have no real idea why the original surface-mount diode failed, hopefully the larger replacement will fare better.

You'll note the five series-connected resistors. Because chip resistors generally have only a limited working-voltage rating, a common trick is to use several in series. This also shares out the dissipation, so each part runs well within its rating.
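
As an illustration of the benefit (a sketch; the comparison with a single resistor of the same total value is hypothetical, and the dissipation figures are for the 240 volt standby condition):

    V_drop = 240.0 - 27.0              # volts dropped across the chain feeding the 27 V zener
    n      = 5
    I      = V_drop / (n * 1.2e6)      # chain current, about 35 uA

    print(f"single 6 Mohm resistor: {V_drop:.0f} V across it, {V_drop * I * 1e3:.1f} mW")
    print(f"each of {n} in series:   {V_drop / n:.0f} V across it, {V_drop / n * I * 1e3:.2f} mW")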

This pcb also carries 3-phase mains monitoring circuits which again use resistor chains like this.

 Having connected the repaired controller using the HT supply, and having scrolled around the system details, I switched off, disconnected the HT supply and installed the battery trays. I then pressed the ON button and it worked. Now, the UPS is being operated by its own batteries.

At this point I decided to call it a day. The UPS is rated at 16KW so, under full load, the single-phase mains input might have to supply some 16000W/240V, or roughly 67A. I've no idea what sort of auto-test procedure might be triggered once the six power modules are fitted and mains is connected, but as my mains supply is rated at only 13A max it will be prudent to return the equipment to site, where a suitable mains supply is available. Further tests can then take place to determine whether the batteries are up to the job...
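
The sum behind that decision (a sketch; 240 volt input assumed and converter efficiency ignored):

    P_out = 16000.0     # rated output, watts
    V_in  = 240.0       # single-phase input, volts
    print(f"full-load input current roughly {P_out / V_in:.0f} A")   # about 67 A, against a 13 A bench supply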

 

 pending
