Shorting out a transmission line
Inside the coax cable are two conductors carrying current, the inside of
the shield and the outside of the center conductor. The current on one
of those conductors travels to the antenna, and an equal current returns
on the other conductor.
At the point where you insert the pin, the current has two possible
paths: it can continue down the cable as it normally does, or it can
return to the other conductor via the pin. The fraction which goes each
way is determined by the impedance of each path. A pin is electrically
very short at frequencies at which the coax can be effectively used, so
it has negligible reactance. Assuming that it's making good contact with
both the shield and center conductor -- which it might not be -- the
resistance will also be small. So it makes a good RF short circuit.
Therefore a large fraction of the current will return via the pin rather
than continuing down the cable. So the first effect is a large reduction
in the power that reaches the antenna to be radiated.
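Just to put rough numbers on the split (a back-of-the-envelope sketch in
Python; the 0.05 ohm pin resistance and the 50 ohm impedance looking
toward the antenna are assumed values, and this ignores how the changed
load alters the total power the transmitter delivers):

# Current divider at the pin: the pin and the line onward to the antenna
# are two parallel paths sharing the same voltage, so the current divides
# in inverse proportion to the impedances. Values are assumptions.
R_pin = 0.05      # assumed pin + contact resistance, ohms
Z_onward = 50.0   # assumed impedance looking toward the antenna (matched line)

frac_to_antenna = R_pin / (R_pin + Z_onward)   # ~0.001
frac_via_pin = Z_onward / (R_pin + Z_onward)   # ~0.999

print("current continuing to the antenna: %.4f" % frac_to_antenna)
print("current returning via the pin:     %.4f" % frac_via_pin)

With those numbers only about a tenth of a percent of the current carries
on toward the antenna, which is why the radiated signal drops so drastically.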
What will happen to the transmitter? That depends on the transmitter and
where the pin is inserted. If the cable didn't have any loss and the pin
had zero resistance, the transmitter would see a pure reactance. That
is, what it would see would look like a pure L or C, with the value and
sign depending on the pin's position relative to the transmitter. In
practice, the pin will have some resistance and the cable will have some
loss, so the transmitter will also see some amount of resistance, the
amount again depending on the pin position, as well as the cable loss
and pin resistance. I suspect that most modern 100-watt-class solid-state
transceivers would just shut down their output stage and not be
permanently damaged, but I'd rather not experiment with my own rig. The
result might be more spectacular with a tube-type linear with a
pi-network output. But again, it would depend on the design of the
transmitter and the particular impedance it sees.
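If anyone wants to see the numbers, here's a quick sketch using the
standard lossless-line formula Zin = Z0*(ZL + jZ0*tan(bl))/(Z0 + jZL*tan(bl)).
The 50 ohm line and 0.05 ohm pin resistance are assumed values, and I'm
treating the pin alone as the termination, since its resistance swamps the
parallel path onward to the antenna:

import math

def z_in(z0, z_load, length_wavelengths):
    # Input impedance of a lossless line of characteristic impedance z0,
    # terminated in z_load, seen length_wavelengths back from the termination.
    t = math.tan(2 * math.pi * length_wavelengths)
    return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

Z0 = 50.0     # assumed 50 ohm coax
R_PIN = 0.05  # assumed pin + contact resistance, ohms

for frac in (0.1, 0.2, 0.25, 0.3, 0.4):
    z = z_in(Z0, R_PIN, frac)
    print("pin %.2f wavelengths from the rig: Zin = %8.2f %+10.1fj ohms"
          % (frac, z.real, z.imag))

# With a perfect short (R_PIN = 0) the real part vanishes and
# Zin = j * Z0 * tan(bl): a pure reactance whose sign and size depend
# only on where the pin sits along the line.

Note the quarter-wave case, where the near-short is transformed into a
very high resistance at the rig end.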
Roy Lewallen, W7EL
Paul Burridge wrote:
I recall a story from many years ago - possibly an urban myth -
where some guy stuck a pin through a ham's coax feeder and thereby
took him off the air/blew up his rig, etc. Given that RF shorts are a
totally different kettle of fish from DC shorts, I'm just wondering
how technically feasible this reported act of sabotage is.
I'm no expert on transmission lines, but it strikes me that the
efficacy of such a stunt depends to a great extent on the point in the
line where the pin is inserted relative to the wavelength of the
transmitted signal. We all know short and open stubs are used as
matching elements at the higher frequencies, so it's implicit that
just sticking a pin in anywhere isn't necessarily going to adversely
affect the efficiency of an antenna system, unless one hits a node at
the frequency of operation. IOW, you won't successfully short out coax
at RF unless you stick the pin in at an appropriate point. Of course,
I might be full of crap on this one, as antennas have never been my
strong point. Can anyone enlighten me?
btw: this is for academic discussion only! I've no beef against any
amateur and have been one myself for over 20 years.