I've found a lot of computers coming in with faults that a few years ago would have been unheard of.
The newer ATX PSU labels carry better and better ratings, yet the units are physically smaller than ever. They fail regularly. The older, bigger ones with less power printed on their labels seem to go on forever.
Inside the new PSUs the components are getting smaller and smaller for the same ratings.
Is this the reason they're failing?
If a capacitor gives up, it's sometimes impossible to find a replacement physically small enough to fit in the case.
Is it that margins are getting squeezed?
Are manufacturers asking too much of the component suppliers?
Not just component problems though. I still occasionally find the odd PSU with resistors inadequately rated for their job. The wattage is OK, but in the front end of the PSU, where there are voltages in excess of 320 volts, some designers still use resistors with totally inadequate voltage ratings. Of course these fail prematurely.
In fact, it's not just mains variation: to some extent whatever domestic appliance is being operated by your next-door neighbour may influence the peak voltages in the front end of your power supply. Whenever the spin on his washing machine drum changes direction, a thump of power is induced into the local mains supply. The waveform of the voltage entering your computer is no longer a pure sine wave as intended by the generating company; it may be a lop-sided affair with large switching transients. The smaller of these may be dealt with by tiny protective devices across the power supply input, but sometimes not. Sometimes the poor thing will end up charred beyond recognition, and sometimes cracked into pieces. Once this protector has given up, any repetition of the event will pass through the diode bridge and into the smoothing capacitor, where chance may play its part and end-of-life follow quite quickly.
The real reason for a reducing PSU MTBF (mean time between failures.. a way of averaging out over lots and lots of similar things) is a combination of causes...
Yes.. physical size is being reduced, pressing component manufacturers to cut margins on their specifications.
Prices are being squeezed in line with increased competition. This means pennies saved on components as margins are cut back.
BUT... in the UK the mains supply isn't what it used to be. Pressure from Government to cut prices has meant cutting costs, and cutting costs has meant reducing staff. This has resulted in less preventative maintenance and hence more random power cuts, more mains fluctuations and, much more seriously, fewer local power conditioners (sub-stations.. those humming masses of grey pipes, or a little brick building with a lightning bolt on its door sitting in the middle of some housing estates), with the result that your mains voltage isn't as tightly controlled. Basically the mains voltage has been allowed to rise. Why? Because less tolerance in the voltage means less expense.
The power station doesn't send out 230 volts. If it did, there certainly wouldn't be 230 volts at your mains socket. Because of voltage drop along the distribution cables, extra voltage has to be added to deal with the losses. Loss is least at higher voltages, so the primary transmission may be at over 100,000 volts. As this is too high for most people's electric doorbells and suchlike, it must be transformed into something more manageable and more useful. Progressively the voltage is transformed downwards until it's at the right level for domestic users. Of course the actual voltage at the local mains socket is still determined by the local demand for current, and as this may vary from next to nothing to huge amounts when that football match on TV gets to half time and thousands of kettles are switched on, some means of governing the voltage must exist. Feedback of the amount of current being consumed is therefore used at the local sub-station. In turn this equipment feeds back its own demand to the equipment further back in the chain, and so on. Reduce the number of such installations and you'll cut costs, but of course with the end result that mains variation will increase.
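The saving from high-voltage transmission is easy to put numbers on: cable loss is I-squared-R, and for a fixed power the current falls as the voltage rises, so loss falls with the square of the voltage. A quick sketch (the load and cable figures here are invented purely for illustration):

```python
def line_loss_fraction(power_w, volts, cable_ohms):
    """Fraction of the sent power lost as heat in the cable (I squared times R)."""
    current = power_w / volts   # for a given power, current falls as voltage rises
    return (current ** 2 * cable_ohms) / power_w

# The same 1 MW of demand sent down the same 5-ohm cable run:
low = line_loss_fraction(1_000_000, 11_000, 5)     # local distribution voltage
high = line_loss_fraction(1_000_000, 132_000, 5)   # grid transmission voltage
print(f"11 kV: {low:.1%} lost; 132 kV: {high:.3%} lost")
```

Twelve times the voltage means one hundred and forty-four times less waste heat in the cable, which is the whole reason for those 100,000-volt-plus transmission lines.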
Most mains supplies are well above the nominal rating of your computer power supply! Look at the labelling. It will probably say "230 volts". What is your supply voltage? Probably around 245 or more. Hence reduced reliability from the stressed components. A few years ago a PSU may have been rated at up to 265 volts. I don't think this applies any longer to most. Before the Government allowed the voltage to be plus 10%/minus 6% it was plus/minus 6%. Did they think about the consequences of their action (said to allow harmonisation with the rest of Europe)? I think not.
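For the record, here's the arithmetic on those tolerance bands, assuming the old UK nominal of 240 volts:

```python
def mains_band(nominal_v, plus_pct, minus_pct):
    """Lowest and highest voltages permitted for a given nominal and tolerance."""
    return nominal_v * (1 - minus_pct / 100), nominal_v * (1 + plus_pct / 100)

old_uk = mains_band(240, 6, 6)    # the old 240 V plus/minus 6% band
new_uk = mains_band(230, 10, 6)   # the "harmonised" 230 V +10%/-6% band
print(round(old_uk[0], 1), "to", round(old_uk[1], 1))   # 225.6 to 254.4
print(round(new_uk[0], 1), "to", round(new_uk[1], 1))   # 216.2 to 253.0
```

Note that the permitted ceiling is over 20 volts above the 230 printed on the label, which is exactly why a supply sitting at 245 or more is perfectly "in spec" while quietly stressing your components.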
What about electrical equipment manufactured for use on the continent (I mean Europe of course, or for Europhiles...the rest of Europe)?
In France and Germany, equipment such as TV sets will be marked 220 volts. Also light bulbs and computers and suchlike are designed to run at 220 volts. Plug one of these European leads into a UK mains socket at 2am and there will be a significant mismatch. In practical terms.. the life of a 220 volt item will be markedly less when operated in the UK. How much? It's hard to say. It depends on the margins built into the particular design. Plug into UK mains a device intended for US mains, nominally 115 volts at 60 cycles, and the lifetime may be measured in milliseconds, or up to a minute if you're lucky. It's all a matter of degree. One of my customers bought a "Millennium Clock" in Florida. This was a large circular thing covered with LEDs and had a count-down to 2000. When he plugged it into the mains on his return it barely counted one second before illuminating hundreds of LEDs in one last gasp, then extinguishing and expiring in a puff of smoke.
Not very satisfactory you might say. But what about the situation in England, say in 1930? Such was the totally haphazard growth of mains, from its conception in late Victorian times to the mid-30s, that a person might be faced with a mains socket supplying AC or DC. If it was AC it might be anything from say 100 volts in Brompton and Kensington to 250 volts in Durham. DC supplies were just as varied, ranging from around 100 volts at Herstmonceux to a toe-tingling 500 volts in parts of Eccles. AC wasn't always 50 cycles either. It varied across the country from 25 cycles to 100 cycles, presumably depending on the history of the particular power station and its initial use... as sometimes lower frequencies were considered best for some jobs and higher for others. Even the simple task of purchasing a light bulb must have been fraught with difficulty; no wonder gas lighting was so popular for so long.
Wireless set designers were forced to offer their products in such a way as to enable a specific user to tailor the voltage settings and cope with either type of supply from his mains. Hence we see the term "AC/DC". These sets did not use a transformer (useless on DC mains) but relied on a large ballast resistor carrying lots of tapping points to iron out the difference between the lowest and the highest voltage to be met in practice. AC mains sets had a more economical (cooler) means of catering for voltage differences: a set of tapping points on the mains transformer primary, usually selected via a rotatable 2-pin plug. Even as late as the 60s, AC/DC sets for 200 to 250 volts were still being made.
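The ballast arrangement is simple Ohm's law. Assuming a hypothetical set with 120 volts of series valve heaters drawing 0.1 amps (figures invented for illustration), each tapping point just selects a different series resistance:

```python
def dropper_ohms(mains_v, heater_chain_v, heater_current_a):
    """Series ballast resistance needed to drop the mains down to the heater chain."""
    return (mains_v - heater_chain_v) / heater_current_a

# Each tapping point on the ballast selects a different series resistance:
for mains in (200, 220, 250):
    r = dropper_ohms(mains, 120, 0.1)
    watts = (mains - 120) * 0.1   # the dropped voltage is simply burned off as heat
    print(f"{mains} V tap: {r:.0f} ohms, wasting {watts:.0f} W")
```

All of that wasted wattage comes out as heat inside the cabinet, which is why the transformer-tapped AC sets ran so much cooler.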
It's not only computer PSUs: the small PSUs for things like scanners and printers are also failing more regularly. These are usually marked 230 volts, and their small physical size, coupled with locations where heat dissipation is well nigh impossible, means that they will surely fail if their designers do not provide an adequate margin. Unfortunately "margin" in this sense invariably equates to heat, and heat invariably equates to failure.
If you don't believe me, what's the modification being recommended by lots of people who service satellite receivers? Take off the lid and add a small electric fan to cool the chips.
A side effect of increased mains voltage is reduced life for your light bulbs. When a lamp is designed there's a trade-off between brightness and life: the brighter a lamp for a given voltage and wattage, the shorter its life. That's been a fact known since lamps were first invented!
Today the mains is 230 volts plus 10% or minus 6%. Except for rare occasions when load is very high, that means PLUS. In times of lightly loaded mains the voltage is going to be at least 253 volts, and may be even more when tolerancing is poor. A modern light bulb is marked "240 volts", so its lifetime, calculated at 1,000 hours, will in practice be a lot less than that. Continental electrical goods are rated at 220 volts, so beware when plugging, say, French equipment into UK mains! If it has a lamp in it, it'll be very bright but not for long. Strangely, many of those new bayonet-fitted long-life fluorescents are marked 220-240 volts. Longer life in Europe and shorter life surely in the UK? At least they'll be brighter here.
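A commonly quoted rule of thumb in lighting handbooks is that incandescent lamp life varies as roughly the inverse 13th power of applied voltage (the exact exponent varies from source to source, so treat the numbers as indicative only). Taking that at face value:

```python
def relative_life(rated_v, actual_v, exponent=13):
    """Rule of thumb: filament lamp life scales as (rated / actual) to the ~13th power."""
    return (rated_v / actual_v) ** exponent

# A 1,000-hour bulb marked 240 V, run on a lightly loaded 253 V UK supply:
uk_bulb = 1000 * relative_life(240, 253)      # roughly half the rated life
# A continental 220 V bulb plugged into the same supply:
french_bulb = 1000 * relative_life(220, 253)  # well under a fifth of it
print(round(uk_bulb), round(french_bulb))
```

A few percent of overvoltage costing half the bulb's life is exactly why the Eccles-to-Brompton voltage lottery of the 1930s made bulb-buying such a gamble.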
Back to computer PSUs... I've had a few PSUs that appear to be perfect, yet the computer stubbornly refuses to work. This is because at initial switch-on, in a fraction of a second, the PSU output is checked by the motherboard and, if found wanting, is turned off. Usually the 5 volt rail, say, will drop a fraction of a volt under load, and this is enough to deem it bad. Make the "power good" line active and the PSU will come on quite happily with voltages looking normal, but this test is not as stringent as the computer's own. The usual cause of the problem is poor smoothing capacitors running at way above their designers' ratings. If the capacitors are low in value, or if their impedance rises, the PSU output integrity will be compromised and it won't work.
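As a sketch of the kind of check involved (the roughly plus-or-minus 5% window on the main rails follows the ATX spec; real hardware does this with analogue comparators, not software, so this is illustration only):

```python
NOMINALS = {"+3.3V": 3.3, "+5V": 5.0, "+12V": 12.0}

def rails_ok(measured, tolerance=0.05):
    """True only if every rail sits within tolerance of its nominal value."""
    return all(
        abs(measured[rail] - nominal) <= nominal * tolerance
        for rail, nominal in NOMINALS.items()
    )

# Looks fine on the bench, lightly loaded:
rails_ok({"+3.3V": 3.31, "+5V": 4.98, "+12V": 11.90})   # True
# Tired smoothing capacitors: the 5 V rail sags under load at switch-on:
rails_ok({"+3.3V": 3.31, "+5V": 4.70, "+12V": 11.90})   # False
```

A rail that sags only when the full load hits at switch-on will pass every multimeter test you make afterwards, which is why these PSUs "appear to be perfect".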
I must have replaced four hard drives in a couple of weeks. These weren't very old either. A few months ago I fitted a new drive under guarantee. No surprise why it had failed, though: the owner had set defrag to operate every day as a matter of routine. The poor hard drive had just worn itself out!
The latest problems...
Couldn't read because the circuit board had failed (Fujitsu.. I contacted them about a new circuit board, as the data was needed urgently, but they just couldn't have cared less). I made sure the new drive wasn't a Fujitsu.
Made a noise like a marble in a tin bucket when spinning up. A Seagate.
Just couldn't be recognised except with cable select set instead of master or slave. A Conner.
Couldn't be set as master but could be read as a slave. Another Conner.
See elsewhere in my scribblings about the GREAT Fujitsu scandal!
Not very often do I get a faulty motherboard. A Pentium Pro board wouldn't accept more than 16Mbyte of memory. A socket 7 board from a Tiny computer lost its graphics facilities.
Lots and lots of these have been replaced over the last few years. They seem to last no more than 2 to 3 years now. I have a suspicion it may be their lasers getting tired. Question... is the laser switched on continuously? Is it switched on only when a disk is present? Is it switched on only when a read is requested? Maybe the ones that fail most frequently do so because the laser is on too much? The same holds good for domestic CD players. I'm forever fitting new lasers to machines not much more than 2 or 3 years old.
With computer CDROMs.. it's not worth repairing these because a new model (with a much better spec) costs far less than the likely cost of repair.
These things, used in PSUs and atop the processor chip, often start to get noisy with advancing years but quieten down as they warm up. Don't be fooled though, because when they quieten they sometimes spin too slowly, and when it's a non-Intel chip needing maximum cooling the fan won't keep the temperature down and the chip r u n s  s l o w .
Because of consumer grumbles much quieter fans are now around. Time will tell whether these are any good.