

Sooner or later most amateurs, or radio enthusiasts generally just messing around with radio equipment, will decide to make measurements. Dealing with radio frequencies is different to dealing with DC, where simple Ohm's Law rules. AC measurements are affected by capacitance, not just resistance, and the higher the frequency the greater the significance of even small capacities. AC measurements are also affected by inductance, so when one designs simple test equipment the capacitive and inductive effects of construction must be considered. The higher the frequencies, the more one must consider these effects. Read on to see details...

Another problem you'll meet when making measurements is input and output impedance. You may not even realise that impedances are important, and that test equipment markings are meaningless without reference to impedance. Many RF equipments are designed to have an input impedance of 50 ohms, and one can often see this impedance as a 50 ohm resistance using an ohmmeter. A well designed equipment will have only a very tiny input capacity and input inductance at its input socket, commensurate with the operating frequency, and ideally the input impedance will be 50 ohms across its whole operating range. Often the quality of the matching, or how close the input impedance stays to the DC resistance, defines the performance of an equipment.

A spectrum analyser displays power amplitude with respect to frequency. When the key parameter, which is power, is measured and the reading drops to half the true value (the 3dB point) it is said to be outside specification, so the reliable operating range (or bandwidth) of the equipment lies between the points where the reading has dropped by 3dB, the term bandwidth usually defining the lower and upper limits of frequency (such as 100Hz to 1GHz). To confuse matters, in the case of an oscilloscope, which displays the amplitude of an input voltage with respect to time, our 3dB points no longer describe "half". 
Power is proportional to the square of voltage, so the square becomes x2 in the language of decibels, changing 3dB to 6dB. An equipment calibrated in power levels in dB will have a specification described by 3dB for a reading of half the true input, but to describe a reduction of displayed voltage by half takes 6dB, unless it's really expensive, when the dB figure is reduced or dropped in favour of a low percentage figure. To confuse matters even more you might find the bandwidth of an oscilloscope described not as -6dB but as -3dB. The minus sign means "less" or "reduced by". What the manufacturer means here is that the voltages displayed at the upper and lower frequencies of the quoted bandwidth are 70% or 0.7 of their true values (incidentally I'm glibly using whole numbers, but dBs are worked out from logarithms, so 3 is really 3.010 etc and 70% is really 70.7 etc%... the figures, like Pi, go on for ever). For example, on a 100MHz scope a 10 volt RMS signal at 100MHz will appear on the screen as about 7 volts RMS, or about 19.8 volts peak-to-peak. In other words, describing an oscilloscope in terms of 3dB equates to 70% rather than 50% trace amplitude at the extremes of its bandwidth, whereas a spectrum analyser, reading power, really does display half at its 3dB points.

Incidentally, I've described equipment performance as though it only depends on the quality of its input impedance, but of course having made an oscilloscope with an input spec to 100MHz the internal design needs to support this. For example, an oscilloscope which is good for 1GHz will have much more exotic circuitry than one for 100MHz and may therefore be an order of magnitude (x10) more expensive. An oscilloscope defined as a 100MHz scope can be used with confidence up to 100MHz, but remember that at this point the amplitude of the display will be only 50% or maybe 70% of the voltage being measured, depending on whether the spec means bandwidth to 6dB or 3dB. 
Present a signal above 100MHz and you will see a display which is lower in amplitude than it should be by an indeterminate amount; in fact the entire design of the scope is based on the specification, so the display will rapidly fall in accuracy past its spec. The key reason for the maximum operating frequency of many equipments is the departure of input impedance from DC resistance, whether through stray input capacitance or inductance. DC is effectively a very low frequency, but measuring the input with an ohmmeter might fail to show 50 ohms, particularly when a blocking capacitor is in series with the input. With professional laboratory equipment 3dB points are abandoned and a figure of 1% or 3% etc is more common. Before departing from this topic, I'll just mention oscilloscope probes. Using a probe will essentially negate the manufacturer's equipment specification, substituting for it the spec of the probe. This statement also applies to the use of coax cables plugged into the input socket of the equipment. For more on this topic see "Making practical measurements" below...

Anyway, back to first principles... If you try to calculate the effects of capacity and inductance at a particular frequency you will need to understand some basic mathematics, and also the basic units of capacity and inductance. The unit of capacitance is the Farad, which is a spectacularly large unit when working with radio frequencies, so we often talk about picofarads (or "puff"). One farad = 1,000,000,000,000 picofarads. Similarly the unit of inductance is the Henry, and we usually work with millihenries or microhenries in radio. One Henry = 1,000 millihenries or 1,000,000 microhenries. The origins of these units, which may help to explain why they are so large, go back to the days of frock-coated scientists working with large brass physics lab apparatus, huge coils and big glass jars for capacitors. 
In fact the unit for capacity was the Jar** and it was only after WW1 that the British Navy stopped using the Jar for measurement of capacity. Oddly, the unit of resistance measurement, the Ohm, is pretty useful as it stands.

** The name Jar comes from the method of making a capacitor, which was to line the inside and outside of a glass container with metal foil.

So what is the effect of, say, a 30pF shunt capacitance at various frequencies? Unfortunately, as soon as I try to keep things simple the wheel comes off. Note here that the term "resistance" that I'll be using is not strictly accurate. The term should be "impedance", which differs from resistance by including so-called complex components, such that combining the impedance of a capacitor and an inductor at AC is not straightforward arithmetic. For example, combining the two will result in tuning effects or resonances. However, for the purpose of my explanation I'll stick to resistance. Maybe later I'll discuss impedances and "imaginary" numbers. Suffice it to say, if the resistance of a coil is say 50 ohms and the associated capacitance is also 50 ohms you will get a strong interaction or resonance effect.

Just a note in passing... this interactive effect is often used by designers to artificially improve some equipments by lifting a rapidly dropping input impedance, right at the working limit, back to a figure nearer the low frequency design impedance. Judicious use of a capacitor and a small coil can boost input impedance to artificially extend the upper working frequency of a wattmeter, for example. A sort of last gasp before the thing isn't usable.

You can see that 30pF has a dramatic effect at frequencies over 100MHz, where it will halve the amplitude of a signal, but is not too bad below 10MHz. 
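The shunting effect of a capacitor can be checked from the standard formula for capacitive reactance, Xc = 1/(2 pi f C). Here's a minimal Python sketch of that calculation (the frequencies chosen and the function name are my own, not from the original table):

```python
import math

def reactance_ohms(c_farads, f_hz):
    """Capacitive reactance: Xc = 1 / (2 * pi * f * C)."""
    return 1.0 / (2 * math.pi * f_hz * c_farads)

C = 30e-12  # the 30pF shunt capacitance discussed above
for f in (1e6, 10e6, 100e6):
    print(f"{f/1e6:6.0f} MHz : {reactance_ohms(C, f):8.1f} ohms")
```

At 100MHz the 30pF works out at roughly 53 ohms, comparable with the 50 ohm system impedance, which is why the signal is roughly halved there; at 10MHz it is around 530 ohms and far less troublesome.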











There are a few measurement methods, the two most commonly used being voltage (if you're using an oscilloscope, for example) and power expressed in dBm (the controls on a spectrum analyser are often marked in dBm). Either can be used, but the dB (decibel) method relating to power is most popular nowadays. Slightly puzzling is how one understands or expresses things in dB. Remember Ohm's Law: Power = V times I, but also V squared divided by R. Current measurements are awkward, so voltage readings are generally made***. Basically, dB measurement is based on ratios. The term "dBm" relates a measurement to 1 milliwatt of power, so +10dBm means 10mW.

As an aside: you may be familiar with the term "3dB points" used to describe filters. Why choose 3dB? That's because 3dB is really the ratio "2". When we say 3dB we are also saying "half", which now sounds more sensible. In fact it's not precisely 3dB but roughly 3.01dB, because the log base 10 of 2 is not an exact number. As you can guess from what I've just mentioned, dB expresses measurements using base 10 logarithms. This is clever as it turns large numbers of zeroes into simple digits. Log base 10 of 10 is 1, log base 10 of 100 is 2 and log base 10 of 1000 is 3. The equation for expressing power is 10 log base 10 of the ratio so, for 1 watt, which is 1000 milliwatts (ratio = 1000), the figure is 30dBm. When we use voltage rather than power to express things the dB equation is 20 log base 10 of the ratio. This is because power is proportional to voltage squared, and a "square" in log base 10 counts as x2.

Generally, RF equipment uses a standard input or output impedance of 50 ohms**, so we can say that 1 Watt of power represents about 7 volts. This is from the Ohm's Law expression Power = V squared divided by R or, put another way, V = the square root of the product of R and W. 
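The dBm and voltage relationships above are easy to sanity-check with a few lines of Python (a sketch of the arithmetic just described; the function names are mine):

```python
import math

def watts_to_dbm(p_watts):
    """dBm = 10 * log10(P / 1mW)."""
    return 10 * math.log10(p_watts / 1e-3)

def rms_volts(p_watts, r_ohms=50.0):
    """From P = V^2 / R, so V = sqrt(P * R)."""
    return math.sqrt(p_watts * r_ohms)

print(watts_to_dbm(0.010))  # +10dBm is 10mW
print(watts_to_dbm(1.0))    # 1 Watt (ratio 1000) is 30dBm
print(rms_volts(1.0))       # about 7 volts RMS across 50 ohms
```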
Sometimes 75 ohms is applicable, for example in older British equipment, or sometimes balanced or non-standard impedances are met. To work with these needs some thought and maybe some extra matching hardware.

At this point I'll digress a little. One method of making measurements is to use a multimeter, but these tend to be used for low frequencies rather than radio frequencies. A multimeter set to AC will show you a voltage reading which is ideally RMS. Not all voltmeters do this, but the better quality ones do. RMS means Root Mean Square and is the standard method of expressing an alternating voltage. Why? Because at any instant a typical alternating voltage could be virtually anything from, say, minus 100 to plus 100, so RMS provides an understandable way of indicating a voltage. The RMS voltage is defined as that having the same heating effect as a DC voltage of the same magnitude when connected to a resistor. Looked at on an oscilloscope, an alternating voltage of the type I'm discussing is generally sinusoidal, but beware: what you'll see on a scope has transitions from minimum to maximum, and the only easy way to measure this voltage is peak-to-peak, or perhaps (given a graticule) from the midway point to the top. RMS is the peak voltage divided by the square root of 2 (where the peak is measured from the midway level to the very top of the wave). If you calculate the sinusoidal voltage across a 50 ohm load for 1 Watt you get roughly 7 volts RMS. This is the same as 10 volts peak across 50 ohms, or 20 volts peak-to-peak across 50 ohms. When you express peak power or peak-to-peak power you get different figures: 1 Watt RMS = 2 Watts peak = 8 Watts peak-to-peak. Quite a difference in the numbers, and in fact open to misleading claims by transmitter or amplifier manufacturers.

** UK television aerials, and lots of related test gear, use 75 ohms as standard, and lots of early UK RF test equipment will have 75 ohm inputs. 
*** One current measuring technique which is relatively simple is a hot-wire aerial current meter but, because the aerial impedance may be ill-defined, this is generally used for tuning checks rather than accurate power measurement. 
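The RMS/peak/peak-to-peak figures quoted above can be worked through numerically. This short Python sketch (my own illustration, assuming the 50 ohm load discussed in the text) reproduces the 1 Watt numbers:

```python
import math

R = 50.0                       # standard system impedance
p_rms = 1.0                    # 1 Watt dissipated in the load
v_rms = math.sqrt(p_rms * R)   # ~7.07 V RMS
v_peak = v_rms * math.sqrt(2)  # 10 V peak (midway level to the top)
v_pp = 2 * v_peak              # 20 V peak-to-peak
p_peak = v_peak ** 2 / R       # 2 Watts "peak power"
p_pp = v_pp ** 2 / R           # 8 Watts "peak-to-peak power"
print(v_rms, v_peak, v_pp, p_peak, p_pp)
```

The factor-of-eight gap between 1 Watt RMS and 8 Watts "peak-to-peak" is exactly the room for misleading advertising mentioned above.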



Because of the definition of dBm we must be sure when making measurements that 50 ohms is maintained throughout the test circuitry. Another reason for maintaining 50 ohms concerns the effects of mismatching. An RF signal should travel in one direction through the circuit, and it will do this if everything is matched. For example, connect a length of coax cable to a 50 ohm generator output socket and what happens to the RF signal? When it reaches the open end it sees a high resistance and bounces, or reflects, back down the cable to the generator output socket. If the cable length is related to the wavelength of the RF signal a standing wave will be produced, but given an indeterminate cable length the RF signal will be changing continuously in amplitude. Try to make a measurement of voltage at the end with a high impedance meter and the voltage could be virtually any value. Even a small mismatch in a circuit can result in odd voltage readings. A perfectly matched system will allow you to make meaningful measurements; otherwise the best you can expect are relative values.

For practical measurements an oscilloscope is the easiest instrument to use as it generally has a high impedance input. This means that you can measure voltages without unduly affecting the circuit, but not necessarily so... see the effect of a 30pF oscilloscope input in the resistance table above. Also, an oscilloscope with a probe can have a different characteristic to that printed at the oscilloscope input terminal. I'll pause here for thought. Some scopes have a 50 ohm BNC or PL259 input connector, but don't be fooled into thinking the input is 50 ohms. Take my newest scope, the Instek GDS1102U, for example. This has an input impedance of 1Mohm + 15pF and is rated at DC-100MHz (-3dB). The input capacity of 15pF equates to 106 ohms at 100MHz, so you can see the input isn't simply 1Mohm. 
To help deal with the capacitive shunting effect one would use a probe rather than just connect a BNC lead to the input (which would, of course, itself add capacitance). A quick word about the probe that comes with the GDS1102 scope, as it's likely to be similar to others. It's designed for connecting to an input of 1Mohm with a shunt capacitance of between 10 and 35pF. These types of probe incorporate a trimmer which lets you balance the probe to a scope: typically, whilst watching a square wave trace, the trimmer lets you sharpen the rising and falling edges. A slide switch lets you select either x1 or x10 attenuation, but the spec tells you the 1:1 probe is good for DC to only 6MHz, although the scope itself is good up to 100MHz, which I find strange. However, the 10:1 setting is good for DC to 100MHz. Still puzzling over the reduced spec of the Instek probe, I checked the equivalent offering from Tektronix. Sure enough, they also quote DC to 6MHz at 1x, and the reason is clear as they quote an input capacitance of 110pF which, from the resistance table above, tells me that at 10MHz this would have a shunting effect of 144 ohms, making the scope probe miles worse than the 1Mohm DC figure. At 10x Tektronix quote DC to 200MHz, 10Mohm with 17pF (the latter having a resistance of around 50 ohms at 200MHz). Clearly, for radio measurements one should always use the 10x setting on a scope probe, and even then expect detuning due to the 17pF capacity.

Surely there must be a way of making measurements without messing up tuning. Let's consider a probe that could do this. It could be designed in such a way as to (almost) isolate the effect of the measurement process, for example if its input impedance is so high that it has perhaps only a 1% effect or less on the circuit being measured. See my active probe. This was designed to protect my spectrum analyser from damage from voltages met in valve equipment. It has a 1Mohm resistor as its probe and virtually no input capacity. 
This introduces us to the use of a spectrum analyser. A spectrum analyser is an alternative measurement instrument, but it will have an input rated at 50 ohms AND will have a relatively low tolerance of power or voltage. You certainly do not want the analyser input to act as a transmitter dummy load, so the ideal method of using the analyser is to make a small sampling device that limits the voltage applied to its input socket to, say, 100mV. Looking at actual numbers, my spectrum analyser is the DSA815TG, whose datasheet says the maximum CW power handled at its input socket is 20dBm, or a mere 2.24 volts RMS, above which protection circuits come into play. Damage will be sustained at 30dBm which, from the tables below, is a mere 7 volts RMS. I'm currently testing my old Yaesu FT480 transceiver (and later, a Microwave Modules linear power amplifier). As the FT480, even in low power mode, can push out between 2 and 5 watts, and the MM linear can push out 100 Watts, it is essential to think very hard BEFORE connecting a spectrum analyser for measurements.

In the following I describe the design of an attenuator, or sampler, which provides the means of monitoring a transmitter so that adjustments or power measurements can be made. You'll see that this can be used with an oscilloscope or spectrum analyser, but the results will need some explanation. Below is a table showing the calculations involved in the design of an attenuator suitable for sampling the voltage across a 50 ohm dummy load. To give practical results the attenuator must not unduly modify the system impedance of 50 ohms, so it will have the form of a potentiometer or divider: R1 in series with the voltage being measured, and R2 connecting the junction of the two to ground. R1 will be high enough to buffer the effect of the attenuator and R2 will be 50 ohms to ensure correct measurements. As for matching the spectrum analyser input, see further explanation below. 
A simple T-connector is inserted between the SWR meter and the dummy load such that there is no resistance between the two and the FT480 output sees the 50 ohm dummy load. The output from the T-connector connects to the new attenuator or sampler. This attenuator has two resistors: R1 linking the input socket to the output socket and R2 shunting the output socket to ground (the coax shield). R2 is selected as 50 ohms to match the system impedance and R1 is selected to place 100mV across R2 at the rated maximum power input in the table. For example, to display a level of 100mV on the spectrum analyser screen when 2 Watts is running into the dummy load requires R1 to be 4.950Kohm. Coincidentally this provides exactly 40dB of attenuation. As you can see below, nothing special by way of resistor power rating is required.

The columns in the tables show: power in dummy load; product of load and power; load voltage; load voltage divided by 100mV (where 100mV is the desired sample voltage for the spectrum analyser); attenuation; input resistor R1 (output resistor R2 = 50 ohms); power dissipated in R1 and R2. The attenuation column is calculated from 20 x log base 10 of the ratio of input voltage to output voltage. This is the same as 10 x log base 10 of the ratio of load power to sample output power, the latter being 0.2mW (100mV across 50 ohms). 
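The R1 figure for each power level follows mechanically from the divider arithmetic just described. A hedged Python sketch of that calculation (function names are my own):

```python
import math

R2 = 50.0          # output/termination resistor
V_SAMPLE = 0.1     # desired 100mV across R2

def r1_for_power(p_watts, r_load=50.0):
    """Series resistor giving V_SAMPLE across R2 at p_watts in the load."""
    v_load = math.sqrt(p_watts * r_load)   # RMS volts across the dummy load
    ratio = v_load / V_SAMPLE              # required voltage division
    return (ratio - 1) * R2                # divider: Vout = Vin * R2/(R1+R2)

def attenuation_db(p_watts, r_load=50.0):
    """20 * log10 of input voltage over output voltage."""
    v_load = math.sqrt(p_watts * r_load)
    return 20 * math.log10(v_load / V_SAMPLE)

print(r1_for_power(2.0))      # 4950 ohms, i.e. 4.950Kohm for 2 Watts
print(attenuation_db(2.0))    # exactly 40dB
```

Two watts into 50 ohms gives 10 volts RMS, a division ratio of 100, hence R1 = 99 x 50 = 4950 ohms and 20log10(100) = 40dB, matching the worked example above.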







So, as you can see from the above tables, to test the FT480 will require an attenuator of 60dB to cope with the output from the linear, or 50dB barefoot. I made one in a small extruded aluminium box using a 39Kohm resistor and a 12Kohm + 100Kohm parallel combination in series, with a small 51 ohm resistor at the output. This worked OK. During the various tests, using an FT480 set to 145MHz FM, I found that these BNC connectors are not very good. Some were a little wobbly and you could see variations in the RF output when the leads were moved. I used a Bird Model 43 wattmeter and a Welz dummy load. The latter wasn't very good at 2m, and its spec suggests this.

Now a word of explanation about making accurate measurements with the spectrum analyser. The attenuator/sampler described above includes an output resistor of 50 ohms (R2). This is fine for making measurements on an oscilloscope or a high impedance device because the sampler design (with an input of 100 volts) puts 100mV across R2. However, if one connects the sampler to a spectrum analyser having a 50 ohm input, R2 will be shunted by this, so the voltage divider is modified and the output resistance will effectively be 25 ohms. Attenuation-wise the sampler will now be 66dB rather than 60dB, as its output voltage is halved (from the equation 20Log10(2), which gives 6). If one wants a real 60dB attenuator in front of the spectrum analyser one would omit the termination resistor, just leaving in place the 49.95Kohm series resistor. A practical option might be to add a small toggle switch so the attenuator/sampler can be used in front of a scope or analyser. I'll try this and see what effect the switch has. An alternative is to add a second BNC T-piece and put a 50 ohm termination on the T.

Above, I mentioned an active probe designed to buffer a spectrum analyser from high voltages. 
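The 60dB-becomes-66dB effect of the analyser's own 50 ohm input can be confirmed with the divider formula. A small Python check, using the resistor values quoted above (the function name is mine):

```python
import math

R1 = 49950.0   # series resistor of the nominal 60dB sampler
R2 = 50.0      # sampler's built-in termination

def divider_atten_db(r1, r_shunt):
    """Attenuation of the divider: 20 * log10((r1 + r_shunt) / r_shunt)."""
    return 20 * math.log10((r1 + r_shunt) / r_shunt)

print(divider_atten_db(R1, R2))       # 60dB into a high impedance scope

# The analyser's 50 ohm input in parallel with R2 gives 25 ohms...
r_eff = (R2 * 50.0) / (R2 + 50.0)
print(divider_atten_db(R1, r_eff))    # ~66dB, as described above
```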
This device has a fundamentally different use to the attenuator/sampler, as it is designed to deal with relatively small RF signals which might be present in valve circuits where DC voltages are pretty high (circa 250 volts). The sampler's job is to syphon off a tiny but accurately measurable part of the power being fed to a dummy load (or an aerial for that matter). Basically, part of the RF current supplied by the transmitter is channelled through the input circuitry of the measuring equipment, whether it be a 50 ohm resistor to which a scope probe connects or the 50 ohm input circuit of a spectrum analyser. As long as the active probe has a reasonably flat frequency response the actual signal level isn't that important, as one is generally looking at only the relative response of circuits to frequency.

Now some pictures of measurements. This shows the FT480 set to 145MHz connected to a Welz dummy load with the Bird wattmeter on the right. The sampling device carrying a 49.95Kohm series resistor is shown on the left. The dummy load is working at about its frequency limit, being rated at up to 150MHz. 































The FT480 is capable of 10 watts RF output but has an output stage which is governed to some extent by aerial matching, so that in the event of a bad mismatch the power will be limited to prevent damage. I used a discone aerial, which is my general purpose aerial, not ideally suited for 2 meter operation. As I have a Marconi TF1020A/5MI absorption power meter, which incorporates a 50 ohm dummy load and a meter indicating 150W or 300W, I decided to use this in place of the Welz device. Before doing this I had moved to testing my Microwave Modules 2 meter linear amplifier (MML144/100), an early model which can be driven by a 10 watt transceiver and is capable of delivering 100 Watts output. The first job was to resurrect a huge transformer-driven 12 volt power supply and, having got this working, connect it to the MML144/100, which was coupled to the FT480. I connected the Bird 43 wattmeter fitted with a 100-200MHz 100W insert, initially together with the Welz dummy load, and having set the FT480 to FM, pressed the transmit switch. Things seemed to work OK and the Bird wattmeter showed 100W forward power. Connecting the spectrum analyser via its 60dB sampler gave me a picture showing a large signal at 2m and a fairly large signal at the second harmonic; not what I'd expected. The indicated levels were 50dBm and 22dBm above the noise baseline. These must be signals of 100 Watts at 145MHz and 160mW at 290MHz so, although the harmonic looks large because of the log scale, it isn't too bad after all. I then connected up the TF1020A, as I'd read it worked up to 500MHz and should provide a better 50 ohm load. However, it indicated only 25 Watts on its 150 Watt range and about half this on its 300 Watt range, whilst the Bird was indicating 100 Watts full scale, so some investigation is necessary. 
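Converting those dBm readings back to watts is easy to verify; this quick Python check (my own) reproduces the figures quoted for the fundamental and the second harmonic:

```python
def dbm_to_watts(dbm):
    """P = 1mW * 10^(dBm / 10)."""
    return 1e-3 * 10 ** (dbm / 10)

print(dbm_to_watts(50))   # 100 Watts at 145MHz
print(dbm_to_watts(22))   # ~0.158 W, i.e. roughly the 160mW second harmonic
```

The harmonic sits 28dB below the fundamental, a power ratio of about 630, which is why it looks deceptively large on the log-scaled analyser display.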











