Hewlett Packard 431C Power Meter
I was given one of these meters some time ago in a very sorry state. The problem
was probably due to storage in damp conditions: the markings
on the dial had started to peel off. The dial was printed by
a transfer method whereby a very thin film of material is stuck
to the surface. The film had decomposed and broken up, especially
where it was blank, and the peeling was completely obstructing
the pointer, which was stuck at zero.
Below is an image of one in
good condition, and here's a technical manual. There are two main
variants: if yours has a serial number other than those shown,
use the first link.
The instrument requires
a specific lead and thermocouple probe, which may be plugged in
at the front or alternatively at the back. As with all sensitive
power meters, the probe has a relatively small tolerance in terms
of input power. It can measure (at full-scale deflection) 10
microwatts to 10 milliwatts and is accurate (with the appropriate
probe) from 10 MHz to 40 GHz. If you would like to know this in
voltage terms, it will give you a reading of about 0.1 on its
most sensitive scale when the probe sees 7 millivolts RMS. The
switch on the lower right is set for the specific probe connected
to the Thermistor Mount socket.
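That 0.1 figure is easy to check with P = V²/R. A quick sketch of the arithmetic (assuming the meter's nominal 50 ohm input, and the 10 microwatt range as the most sensitive):

```python
v_rms = 0.007        # 7 millivolts RMS at the probe
r_in = 50.0          # nominal 50 ohm input impedance (an assumption here)
p_watts = v_rms ** 2 / r_in   # P = V^2 / R
full_scale = 10e-6   # the 10 microwatt (most sensitive) range

print(p_watts)               # about 9.8e-07 W, i.e. roughly 1 microwatt
print(p_watts / full_scale)  # about 0.098 -- the "reading of about 0.1"
```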
The maximum power
reading for this 478A probe, printed in red, is expressed as 30MW
average power, which actually means 30 mW RMS. The black ring is
part of the N-connector input socket.
For some very odd reason Hewlett
Packard designers (or more likely their Drawing Office?) have
used a capital "M" which normally means "Mega"
instead of lower case "m" which is "milli".
They also inadvertently used upper case "DBM" when
they should have used "dBm". The documentation accompanying
the equipment and written by more knowledgeable Technical Authors
is correct.
The probe is calibrated on the
reverse side from 0.01 GHz to 10 GHz, showing a curve from which
a correction factor can be selected on the front panel.
In case you're not familiar
with dBs, the designers calibrated the meter and switch in both
dBm and mW. Again there's a clash between the front panel and
meter markings: the former is half right but the latter 100% wrong.
In the picture above you
can see the degraded meter scale on my example. It's just about
usable.
Above is a picture
showing the construction of the instrument. The big space is
for a NiCad battery.
Just as British made stuff is
obviously British, US-made stuff is obviously American, often
using duralumin metalwork and always with coarse-threaded screws.
Now a point on practical measurements.
You'll see that the power meter has a range switch and a meter
scale marked in dBm and mW. These scales are equivalent, so 0 dBm
= 1 milliwatt.
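The two scales are tied together by the standard decibel-milliwatt relation, which can be sketched like this (an illustration; the function names are mine, not HP's):

```python
import math

def dbm_to_mw(dbm):
    """dBm is decibels relative to 1 mW, so 0 dBm = 1 mW."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    """Inverse conversion: milliwatts back to dBm."""
    return 10 * math.log10(mw)

print(dbm_to_mw(0))     # 1.0  (mW)
print(dbm_to_mw(-10))   # 0.1  (mW) -- one 10 dB range down
print(mw_to_dbm(10))    # 10.0 (dBm)
```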
As the probe efficiency is frequency
dependent you can check the curve on the back of the probe and
set the centre knob to correspond with the frequency at which
you're working.
Now, for power readings to be
fully meaningful, you need to know the resistances associated
with the scale markings. In this meter the input impedance is
50 ohms, so 0 dBm means 1 milliwatt dissipated in 50 ohms, and that
means there should be a voltage V across the 50 ohm input
(where V = the square root of the product 50 x 1 mW) of 0.2236 volts
to give you a display of 0dBm. As we're using RMS or "root
mean square" measurements, this means that a little under
a quarter of a volt RMS present across the load resistor makes
the pointer show a reading of 0dBm.
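The 0.2236 V figure drops straight out of P = V²/R rearranged to V = √(P×R); a quick sketch of the arithmetic, assuming the 50 ohm input:

```python
import math

p_watts = 1e-3   # 0 dBm = 1 milliwatt
r_load = 50.0    # meter input impedance, ohms
v_rms = math.sqrt(p_watts * r_load)   # V = sqrt(P * R)

print(round(v_rms, 4))   # 0.2236 -- a little under a quarter of a volt RMS
```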
Say we wish to connect a typical
signal generator to the power meter with a coax cable. The signal
generator will have an output impedance which is likely to be
about 50 ohms. Setting the output level to 0dBm on the generator
places a voltage of 0.2236 volts at its output socket. If we measure
this voltage with a suitable high impedance voltmeter we should
see a little under a quarter of a volt AC; however, if we now connect
the power meter to this socket we place 50 ohms across the output.
As power depends on voltage and current (power = volts x amps)
we have to consider any changes in current flow. If the generator
output has a 50 ohm matching resistor in series with its output
connector, the current will pass through this and the 50 ohm input
resistance of the power meter, so half the output power will
be dissipated in the generator and half in the power meter. Similarly,
if the output resistance of the generator is shunting its output,
the source will now see 25 ohms and the power will be
split 50/50 between the two 50 ohm impedances.
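The 50/50 split means the meter receives only half the power the generator's level control implies, and half power is very nearly a 3 dB drop:

```python
import math

# Halving the power: 10 * log10(0.5) is about -3 dB,
# which is why a generator set to 0 dBm can read -3 dBm on the meter.
print(10 * math.log10(0.5))   # about -3.01 dB
```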
This is really annoying. You
see the generator output indicated as 0dBm yet -3dBm is
displayed on the power meter scale. To get around this you can
use a series resistor between the generator and the power meter
(the resistor needs to be reactance-free at the frequency being
used, so you should not use a wirewound type or a carbon film
type with a spiral track). If you calculate the attenuation
due to the resistor you can take account of this in your measurements.
If you place 450 ohms between the generator and power meter you
increase the load from 50 ohms to 500 ohms. A tenth of the power
indicated at the signal generator is being passed to the power
meter (this is -10dB), so a reading of 0dBm on the power meter
means you have +10dBm registered on the generator. This is actually
an approximation because the generator actually sees a 45 ohm
load instead of 50 ohms. The output voltage is likely to be 90%
of that indicated, so the power will be less than indicated. Increase
the attenuation to 20dB by using a larger series resistor of 1531
ohms and you get a big improvement, or 97% of the indicated power
at the generator output. A larger series resistor of 4950 ohms
gives you a load of 49.5 ohms, which is even better if you can
tolerate an attenuation of 30dB.
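The arithmetic behind the first example can be sketched as follows. With a series resistor feeding the meter's 50 ohm input the same current flows through both, so the power splits in proportion to resistance. The exact dB figures for the larger resistors depend on how you model the generator's output stage, so treat this as an illustration of the 450 ohm case rather than a definitive pad design:

```python
import math

R_METER = 50.0   # meter input impedance, ohms

def meter_power_fraction(r_series):
    """Fraction of the power delivered into the series chain that ends
    up in the meter (series circuit: P is proportional to R)."""
    return R_METER / (r_series + R_METER)

f = meter_power_fraction(450.0)
print(f)                    # 0.1 -- a tenth of the delivered power
print(10 * math.log10(f))   # -10.0 dB, as in the 450 ohm example
```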
It's important to know lots
of seemingly minor points about a measuring instrument if you're
interested in really accurate numbers. For example, if you connect
a voltmeter to the terminal on the rear of this power meter to
see what's driving the display the operating manual says you
need to make adjustments to the readings if your voltmeter has
an input impedance of less than 10 Mohms (that's 10 Megohms not
10 milliohms).
I found this tester to be very
useful for measuring small amounts of power (for
example when testing my home-brew noise source), but unfortunately
the meter let me down twice when peeling paint jammed the needle.
I dismantled it and fixed it before deciding to replace the meter
scale. I unscrewed the scale from the meter and scanned it. Using
PhotoShop I managed not only to clean it up and replace missing
numbers, but also to change the irksome lettering: not "DBM"
but dBm, and certainly not "MW" but mW. I also removed
the old numbers and swapped to a cleaner font, as well as taking off
the decimal point from the mW readings and using "10"
instead of 1 so that the scale now reads 0 to 10. I made
a slight error without thinking and failed to give the mW scale
a red background. I'd vaguely imagined that yellow was correct
but later changed the drawing to correct this. It's not perfect
but does at least show some age...
How did I change the scale? A little
tricky, as the scale is glued very firmly to the mirror-finish
backplate. I laid the metal plate, scale uppermost, on a cooker
hotplate. It got pretty hot and the scale started to bubble,
after which it just peeled off. I printed the new scale on a
standard laser printer and cut it to shape. I used a small-bladed
scalpel to cut the curved slot for the mirror and lightly sprayed
the rear surface with a mounting adhesive. The paper wetted, but
after about 30 seconds it dried and I was able to fit the paper
to the metal plate.
Using this meter can be
slightly puzzling because the range switch doesn't immediately
line up with the meter scales. The mW scale is OK (better since
I changed the max reading to 10 rather than 1) but the dBm scale
is calibrated 0 to -10 when the range switch is marked +10, +5,
0 etc. Of course you interpret the +10 marking as +10 to 0, +5 as
+5 to -5, and 0 as 0 to -10.
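In other words, the actual level is just the range-switch setting plus the (0 to -10) scale reading. A small sketch of that bookkeeping (the function name is mine):

```python
def actual_dbm(range_setting, scale_reading):
    """range_setting: the switch position (+10, +5, 0, ...).
    scale_reading: the needle position on the 0 to -10 dBm scale."""
    return range_setting + scale_reading

print(actual_dbm(10, 0))    # 10 -- full scale on the +10 range
print(actual_dbm(10, -3))   # 7
print(actual_dbm(0, -10))   # -10 -- bottom of the 0 range
```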