#1
Hello,

I recently bought a Comet B-10 mobile antenna. However, I noticed it's only rated for 50 watts, and my Icom IC-208H puts out 55 watts. I traded the B-10 in for a Comet-Maldol EX-104B, which is only a couple of inches longer than the B-10; the bases look almost identical. The EX-104B is rated for 100 watts. Anyone have any idea why these two antennas are rated so differently for power handling?

--Ed G--
#2
Hi Ed,

If it's rated for 50 watts, your 55-watt transmitter will work just fine. You will lose 3 dB in your cable and connectors, and only deliver 30-40 watts to the antenna.

Generally speaking, for any antenna from a reputable antenna provider, the more metal in the sky, the better. But an inch or two won't make enough difference to be discernible. In fact, 10 dB is how much it takes for a mobile listener, listening to NBFM, to notice the difference. TEN DB! That means you have to change from 1 watt to 10 watts for me to hear the difference.

As always, I say: work on the antenna first, receiver second, transmitter third. You are doing the right thing.

All the best,
Dave

Ed wrote: Hello, I recently bought a Comet B-10 mobile antenna. However, I noticed it's only rated for 50 watts, and my Icom IC-208H puts out 55 watts. [snip] --Ed G--
#3
I'd really be interested in learning where that 10 dB figure came from.

I admittedly have very little experience in using NBFM, but 10 dB seems awfully large to make a perceptible difference. I recall from a communications course that FM detection has a threshold effect -- below a certain S/N ratio, the noise effectively multiplies the modulation, rather than adding to it as it does in low-S/N AM. This is described by Carlson in _Communication Systems_ as "mutilation" of the modulation. It seems from his analysis that it would take much less than 10 dB to cross the threshold, that is, to go from noisy but recognizable modulation to badly distorted, "mutilated" audio.

In a quick experiment with an NBFM receiver and a signal generator set for 5 kHz deviation, I could easily tell the difference in quieting resulting from each 3 dB change in signal level, when the signal was below full quieting. Subjective examination of the output signal on a scope showed a transition from a recognizable but noisy sine wave to apparent noise only, with a signal level change of only 2 dB.

So, what's the basis for the 10 dB figure?

Roy Lewallen, W7EL

Dave Bushong wrote: . . . In fact, 10 dB is how much it takes for a mobile listener, listening to NBFM, to notice the difference. TEN DB! That means you have to change from 1 watt to 10 watts for me to hear the difference. . . .
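Roy's 2 dB and 3 dB steps are easy to put in perspective against the 10 dB claim with plain power-ratio arithmetic (nothing thread-specific assumed here):

    # Power ratio corresponding to a change of x dB.
    def power_ratio(db):
        return 10 ** (db / 10)

    for db in (2, 3, 10):
        print(f"{db} dB is a {power_ratio(db):.2f}x change in power")
    # 2 dB -> 1.58x, 3 dB -> 2.00x, 10 dB -> 10.00x

So the steps Roy could hear in his experiment correspond to power ratios of roughly 1.6x and 2x, far smaller than the 10x that a 10 dB threshold would imply.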
#4
On Wed, 15 Oct 2003 23:09:29 GMT, Dave Bushong wrote:

Hi Ed, If it's rated for 50 watts, your 55-watt transmitter will work just fine. You will lose 3 dB in your cable and connectors, and only deliver 30-40 watts to the antenna. [snip] All the best, Dave

How do you figure a 3 dB loss? A mobile installation 'typically' uses RG-58, which at 2 meters has about 4.5 dB of loss per 100 feet, and most mobile antennas come with 15-20 feet of cable. As I see it, that's about 1 dB of loss (or thereabouts) for the cable, and I've seen connector loss figures hover around 0.5 dB. At 1.5 dB of total loss, losing roughly a quarter of the signal is not unreasonable, which would put it at about the 40 watts you mention.

I don't [totally] argue your conclusion -- just your 3 dB assertion. Am I missing something?

Howard
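Howard's estimate is straightforward to reproduce; here's a short sketch using his own figures (20 feet of cable at 4.5 dB/100 ft, plus his admittedly folklore 0.5 dB connector figure):

    # Scale per-100-ft attenuation to the actual run length.
    def cable_loss_db(length_ft, db_per_100ft):
        return length_ft / 100 * db_per_100ft

    cable = cable_loss_db(20, 4.5)       # 0.9 dB for 20 ft of RG-58
    total = cable + 0.5                  # ~1.4 dB with the connector guess
    fraction_lost = 1 - 10 ** (-total / 10)
    print(f"{total:.1f} dB total, {fraction_lost:.0%} of the power lost")
    # -> 1.4 dB total, 28% lost -- about the quarter of the signal he estimates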
#5
What kind of connector has 0.5 dB loss at 2 meters? And what's the loss mechanism? Is there some kind of connector out there filled with carbon or something? Who made the measurements and how?

The ARRL Antenna Book shows RG-58/U and RG-58B/U (plain copper center conductor) as having just under 6 dB/100' of attenuation at 2 meters, and RG-58A/U and RG-58C/U (tinned copper) as about 6.5 dB/100'. I checked a 100' piece of RG-58C/U in my junk box and found it to be 5.6 dB/100' at 146 MHz. So I'd expect 20 feet or so to have just over 1 dB of attenuation -- almost certainly not enough to notice, except perhaps just barely if you were right at the noise level. Certainly it wouldn't be noticeably improved by using some other kind of cable.

Oh, and that measurement was made with BNC connectors on both ends. The loss of those connectors shouldn't be measurable except with extremely sensitive equipment.

Roy Lewallen, W7EL

Howard wrote: How do you figure a 3 dB loss? A mobile installation 'typically' uses RG-58 which at 2 meters has about 4.5 dB loss per 100 feet... [snip]
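Roy's point that the cable choice barely matters over a mobile-length run can be checked with the three attenuation figures he gives (the ARRL values are approximate readings from the book, alongside his own measurement):

    # Loss of a 20-ft run for each RG-58 attenuation figure at 2 meters.
    figures = {
        "RG-58/U, RG-58B/U (ARRL)": 6.0,     # dB per 100 ft, approximate
        "RG-58A/U, RG-58C/U (ARRL)": 6.5,
        "RG-58C/U (Roy's measurement)": 5.6,
    }
    for name, db_per_100 in figures.items():
        loss_db = 20 / 100 * db_per_100
        delivered = 100 * 10 ** (-loss_db / 10)
        print(f"{name}: {loss_db:.2f} dB, {delivered:.0f}% delivered")

All three land between about 1.1 and 1.3 dB -- a spread of roughly a fifth of a dB, which no listener would notice.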
#6
Roy,

I don't have any "real" reference to the connector loss, just folklore. I took the cable loss from the same source as you -- if there is error, chalk it up to sloppy reading of the graph 8-} I do see that we both quoted the same 1 dB of cable loss, so that tells me I'm on track. I did re-read my 'guesstimate' of power loss in the cable, and it does look like my math is off a bit. I agree with what you say regarding the loss not being enough to notice, and thanks for steering me straight on the connector loss.

Personally, I've never tried to measure connector loss, nor have I gone obsessive-compulsive about changing everything to BNC or N connectors; I just don't see any return on the effort for 2 meters. I use PL-259s, with the exception of antennas that have N connectors. I've never fretted over the "folklore" impedance-bump issue either -- I just concentrate on making a solid connection and not melting the dielectric when soldering the braid to the shell, then follow with good weatherproofing.

Howard

On Thu, 16 Oct 2003 23:19:57 -0700, Roy Lewallen wrote: What kind of connector has 0.5 dB loss at 2 meters? And what's the loss mechanism? [snip]
#7
Thanks for the info.

There are really only two primary mechanisms that can cause connector loss, and they're the same as the ones causing transmission line loss -- conductor resistance and dielectric loss. The conductor loss won't be much more than for the same length of coax, since the conductor diameter is as large as for coax. Admittedly, nickel plating can be quite resistive, but the distance is very short. I've seen some UHF (PL-259/SO-239) connectors with phenolic insulation, which isn't the greatest, but again, it's there for only a very short distance, so the overall loss won't be much at all. As a connector insulation, Teflon is virtually lossless up through UHF. A well-designed connector is just a piece of transmission line, with about the same loss characteristics.

As for "impedance bumps", poorly designed connectors like UHF types will show a small impedance change over their length. Again, though, the distance is so short that the overall effect on SWR is negligible until you get very high in frequency -- the effect is just the same as a small shunt C or series L of good quality inserted at that point. And this has no effect on connector loss. I remember being surprised at finding UHF connectors used in a commercial 450 MHz radio I had. But whatever impedance change they presented was simply compensated for by the transmitter output and receiver input networks. Likewise, if they're at the antenna, a slight adjustment of the antenna matching system (e.g., gamma match) or antenna length will compensate. Good quality connectors like BNC, TNC, N, and so forth have a very nearly constant 50 ohm impedance through them if properly assembled. This isn't generally true of right-angle connectors and adapters, but for most amateur purposes the impedance change they present isn't of any consequence, either.

I recall seeing a disassembled cheap UHF adapter -- female-female, as I recall -- which contained a steel spring as the center conductor. Now, something like that might have measurable loss even at HF. So I can't say that all adapters, even the very junkiest ones, have negligible loss. But you don't need to be concerned about the loss of adapters having even a modicum of quality. And it's unlikely you'll be able to measure the loss of any connector through VHF at least, and above that only with sensitive equipment.

Finally, let me caution against interpreting "insertion loss" or "mismatch loss", often specified for connectors and adapters, as representing actual dissipative loss, like coax loss. It doesn't -- it's something else. People sometimes read or hear that a connector has a "mismatch loss" of such-and-such, then repeat that number as being the dissipative loss you'd get by putting the connector into a system. That's a mistake. I've posted quite a bit about this from time to time in the past, so anyone who's interested should be able to find out more by doing a Google search on this newsgroup for "mismatch loss".

Roy Lewallen, W7EL

Howard wrote: Roy, I don't have any "real" reference to the connector loss, just folklore. [snip]
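To make Roy's closing caution concrete, here is a minimal sketch of the standard mismatch-loss formula (the SWR values are just examples, not measurements from the thread):

    import math

    # "Mismatch loss" from an SWR: the power not accepted by the load on
    # the first pass. As Roy notes, this is NOT dissipated the way coax
    # loss is; in a low-loss system the reflected power is largely
    # re-reflected and eventually delivered.
    def mismatch_loss_db(swr):
        gamma = (swr - 1) / (swr + 1)    # reflection coefficient magnitude
        return -10 * math.log10(1 - gamma ** 2)

    for swr in (1.2, 1.5, 2.0):
        print(f"SWR {swr}: mismatch loss {mismatch_loss_db(swr):.2f} dB")
    # SWR 1.2 -> 0.04 dB, SWR 1.5 -> 0.18 dB, SWR 2.0 -> 0.51 dB

Even taken at face value these numbers are tiny, and, as Roy explains, quoting them as if they were dissipative loss overstates the case further.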