February 22nd 07, 03:43 AM, posted to rec.radio.amateur.antenna
From: jimlux@earthlink.net
Subject: Mobile antenna Placement

What is the minimum distance between an 11 meter antenna and a 10 meter
antenna--both mobile?


Fred Wall
NF8W


Half a wavelength...
Dave WD9BDZ


That's a positive and unqualified response. So I'm curious -- how did
you arrive at that figure?
Roy Lewallen, W7EL


Just recently looked it up in the ARRL 2006 Handbook.


Dave


One might ask, minimum for what? Coupling between antennas so that
the receiver of one radio isn't destroyed by the other? That would
depend on the power outputs and the receiver input power limit.
Interaction so that the pattern isn't affected? Aside from patterns
on mobile installations not being particularly normal anyway, it would
depend on what's connected to the feedpoint of the "parasitic"
antenna. And, is that a free space half wavelength or far enough that
the phase of the near field has flipped 180 degrees?
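
Just to put numbers on the receiver-protection criterion, here's a quick
back-of-the-envelope sketch (Python), assuming a 100 W transmitter and a
front end that shouldn't see more than about 1 V rms into 50 ohms -- the
same figures used further down in this post; the exact limits are only
illustrative.

# Rough isolation requirement between the two antennas, assuming a
# 100 W transmitter and a receiver that should see no more than
# ~1 V rms into its 50 ohm input (illustrative numbers only).
import math

P_tx = 100.0                # transmit power, W
V_max = 1.0                 # tolerable rms voltage at the receiver, V
R_in = 50.0                 # receiver input impedance, ohms

P_rx_max = V_max**2 / R_in  # ~20 mW allowed at the receive antenna port
isolation_db = 10 * math.log10(P_tx / P_rx_max)
print(f"allowed {P_rx_max*1e3:.0f} mW -> need ~{isolation_db:.0f} dB "
      f"of antenna-to-antenna isolation")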

And, as a practical matter, half a wavelength on these bands is about
5 1/4 to 5 1/2 meters, or 17-18 ft. That's a fair distance apart for
most cars.
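
For what it's worth, that figure is easy to check; a short sketch
(Python), with 27.2 and 28.4 MHz picked as representative spot
frequencies for the two bands:

# Half wavelength on the 11 m and 10 m bands; the spot frequencies
# (27.2 and 28.4 MHz) are just representative choices.
c = 299.792458e6            # speed of light, m/s
for band, f in (("11 m", 27.2e6), ("10 m", 28.4e6)):
    half_wave = c / f / 2
    print(f"{band}: lambda/2 = {half_wave:.2f} m = {half_wave/0.3048:.1f} ft")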


So, let's surmise some practical details and try to help Fred out...
Say he wants to avoid putting more than 1 volt RMS into the receiver,
and he wants to run 100 W on his 10m rig. Let's not worry about precise
tuning or the grotty details of the car... just two vertical monopoles
2.5 m long, 2 meters apart, with a 50 ohm load on one and the source on
the other.

With 100 W (about 60 volts and about 1.7 A into the 36 ohm Z), you'll
get roughly 24 volts across the 50 ohm load -- that's more than 10 watts
into the front end, which will probably cook it. Even if you ran 1 watt,
you'd still get about 2.4 V... So 2 m is a bit close if you want to run
100 W.
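
If anyone wants to reproduce a number in that ballpark without firing up
NEC, here's a minimal sketch (Python) using the textbook induced-EMF
mutual-impedance formula for two parallel, side-by-side half-wave
dipoles, halved for quarter-wave monopoles over a ground plane. Treating
the 2.5 m whips as resonant quarter-wave monopoles at 28 MHz and
ignoring coupling back into the transmit element are my simplifications,
not something stated above; it lands within a few volts of the 24 V
figure.

# Estimate the voltage coupled into a 50 ohm load on a nearby
# quarter-wave monopole, 2 m away, with 100 W on the other one.
# Mutual impedance from the induced-EMF formula for side-by-side
# half-wave dipoles (sinusoidal current assumed), halved for
# monopoles over a ground plane.
import numpy as np
from scipy.special import sici   # sici(x) returns (Si(x), Ci(x))

f = 28e6                         # 10 m band (assumed spot frequency)
lam = 299.792458e6 / f           # wavelength, ~10.7 m
k = 2 * np.pi / lam
d = 2.0                          # spacing between the whips, m
L = lam / 2                      # equivalent dipole length for the formula

u0 = k * d
u1 = k * (np.sqrt(d**2 + L**2) + L)
u2 = k * (np.sqrt(d**2 + L**2) - L)
Si = lambda x: sici(x)[0]
Ci = lambda x: sici(x)[1]
R21 = 30 * (2 * Ci(u0) - Ci(u1) - Ci(u2))
X21 = -30 * (2 * Si(u0) - Si(u1) - Si(u2))
Z21 = 0.5 * (R21 + 1j * X21)     # monopole mutual impedance
Z22 = 36.5 + 21.25j              # self impedance of a thin quarter-wave monopole
ZL = 50.0                        # receiver treated as a plain 50 ohm load

I1 = np.sqrt(100.0 / 36.0)       # ~1.7 A for 100 W into ~36 ohms (coupling back
                                 # to the driven element ignored, as in the text)

# Two-port relation: V2 = Z21*I1 + Z22*I2, with the load forcing V2 = -ZL*I2
I2 = -Z21 * I1 / (Z22 + ZL)
print(f"|Z21| ~ {abs(Z21):.0f} ohm, load voltage ~ {abs(I2) * ZL:.0f} V rms")

Run it with different values of d to see how the spacing trades off
against the coupled voltage.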