ADMITTING YOUR
SUSCEPTANCE TO MY RESISTANCE TO IMPEDANCE

If I thought about
that title for just a little while longer, I might be able to come up with a
ribald limerick about impedance and reactance, susceptance and admittance. But that may not be as great an idea as it
seemed at first, so instead let's carry on!

I think most of us
deal with impedance all the time, maybe to the point that we've stopped
thinking about some of the basic questions.
A few columns ago, I pointed out that our so-called 600-ohm balanced
audio standard apparently originated when pole-mounted telegraph wires were
re-used for telephone transmission. An
old story tells us that fifty and seventy-five ohm RF transmission lines came
along because that's what you got when you used standard sizes of copper tubing
to make coaxial cables. So we owe our
selection of 50- and 75-ohm cables at least partly to the plumbing
industry? More on that
later. Who amongst us remembers
the 230-ohm balanced "open-wire" transmission lines that were used
before high-power co-ax became available?

And what do we mean
when we say that a chunk of co-ax is 50 ohms?
Some smart apple is going to reply that it means that's the cable's **characteristic
impedance**. But what exactly does
that mean? If you measure between the
centre conductor and the shield of that co-ax with an ohmmeter, it will read
open circuit, and it will measure close to that at audio frequencies. I daresay if you measured its impedance at a
few gigahertz with a bridge, you might find that the
cable's impedance was close to a short circuit.

Well, the reactive
components of a coaxial cable are the series inductance (L) of the inner
conductor, and the shunt capacitance (C) between the inner conductor and the
shield. Then there's the series resistance (R)
of the conductors, and the shunt conductance (G), the tiny leakage through the
insulation between the inner and outer conductor. (Despite my title, that
leakage term is a conductance; susceptance is actually the reactive part of
admittance.) So if we look at the whole spectrum of RF
frequencies, there is a broad range where the
characteristic impedance holds true. And
I guess that's why it's called the "characteristic" impedance. Ignoring the two resistive components, the
simplified formula for calculating the characteristic impedance is the square
root of L/C. And there are formulas to
calculate impedance based on the ratio of the diameters of the inner and outer
conductor. Here's where it gets
interesting: in actual practice, we find that cable attenuation increases
faster with increasing frequency than the simple L/C formula would lead us to
expect. This turns out to be because of **skin
effect**, which causes R to increase with the square root of frequency, until
it can't be ignored in our simplified formula. The obvious way to reduce skin effect (and
that attenuation) is to increase the surface area of the inner conductor by
increasing its diameter. But this will
cause the characteristic impedance of the cable to drop, so that to pass a
certain power of signal, greater current will be required, which increases
losses due to resistance, and eventually we reach a point where we're not
improving anything this way. By
continued experimentation, we find that there is an optimum ratio of outer to
inner conductor diameter that minimizes cable attenuation, and it's about
3.6:1, which for an air-dielectric line gives an impedance of… 77 ohms, close
enough that 75 became the standard. If instead
you optimize for the peak power a given size of cable can safely
carry, you end up way down around… 30 ohms, and 50 ohms turns out to be a
sensible compromise between power handling and loss. So
there you have it: where signal losses must be minimized, 75 ohms is your best
bet. In transmission, where we're more
concerned about how much power we can safely crank down our lines, 50 ohms
turns out to be the wise choice.
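These trade-offs are easy to check numerically. Here's a minimal sketch in Python, using the standard lossless-line formulas for a coax's per-metre L and C and the textbook proportionalities for skin-effect conductor loss and breakdown-limited peak power. The 3.6:1 diameter ratio and the dielectric constant of 2.25 are illustrative values, roughly what you'd find in a polyethylene-insulated cable; by these proportionalities, the pure loss optimum sits near 77 ohms and the pure peak-power optimum nearer 30, with 50 splitting the difference:

```python
import math

MU0 = 4e-7 * math.pi     # permeability of free space, H/m
EPS0 = 8.854e-12         # permittivity of free space, F/m

def coax_z0(ratio, eps_r=1.0):
    """Characteristic impedance from geometry: Z0 = sqrt(L/C),
    using the standard per-metre L and C of a lossless coaxial line.
    ratio = outer conductor diameter / inner conductor diameter."""
    L = MU0 / (2 * math.pi) * math.log(ratio)           # H/m
    C = 2 * math.pi * EPS0 * eps_r / math.log(ratio)    # F/m
    return math.sqrt(L / C)

# Solid polyethylene (eps_r ~ 2.25) at a 3.6:1 diameter ratio:
print(round(coax_z0(3.6, eps_r=2.25)))   # ~51 ohms

# Now sweep the ratio for an air-dielectric line with a fixed outer
# diameter, varying only the inner conductor.
ratios = [1.1 + 0.01 * i for i in range(500)]

# Skin-effect conductor loss goes as (1/d_inner + 1/d_outer) / ln(ratio);
# with d_outer fixed, that is proportional to (1 + ratio) / ln(ratio).
loss = [(1 + x) / math.log(x) for x in ratios]

# Breakdown-limited peak power goes as ln(ratio) / ratio**2.
power = [math.log(x) / x ** 2 for x in ratios]

lowest_loss = ratios[loss.index(min(loss))]
highest_power = ratios[power.index(max(power))]
print(round(coax_z0(lowest_loss)))     # ~77 ohms, ratio near 3.6:1
print(round(coax_z0(highest_power)))   # ~30 ohms, ratio near 1.65:1
```

Note that 50 ohms falls between the two optima, which is one common telling of how it became the general-purpose standard.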

Sometimes it's
reassuring to find out that some standard is what it is for good scientific
reasons, and not due to the whims of someone trying to figure out what size of
pipe to connect to your bathroom.