#1
Over several decades of radio hamming on the HF bands, I have noted that the
measured SWR of all the antennas I have mounted (Yagis, dipoles) varies
slightly when the feedline length is changed by several meters. For 100W of
forward power, the reflected power could vary somewhat, e.g. from 2W to 5W or
so, measured on a Bird wattmeter. This behavior would seem to contradict the
theory, according to which SWR is independent of feedline length (as long as
the cable attenuation remains constant).

Clearly the measured SWR change cannot be due to the change in feedline
attenuation: at HF, adding or cutting a few meters of cable yields a very
small change in attenuation and hence a negligible impact on measured SWR.

Reading here and there, the most common theory explaining this phenomenon is
that, in the presence of RF on the coaxial cable braid, the SWR meter reading
is influenced by the feedline length. I am not too convinced of that
explanation, also because I have invariably experienced the measured SWR
variation with every antenna I have had, and I have never had hot-braid
problems.

In that regard, I got an idea that could explain the phenomenon, at least
part of it. Reading coaxial cable data sheets, I noted that manufacturers
typically give a small tolerance on cable impedance (2 to 3 ohms). Let us
then assume that the feedline cable has a 53-ohm impedance, whilst the Bird
wattmeter is 50 ohms sharp. If the 53-ohm cable is terminated in, e.g., a
75-ohm (purely resistive) antenna, the real SWR on the line would be
75/53 = 1.42, independently of feedline length (if the attenuation variation
with length is neglected). But the impedance seen by the wattmeter obviously
varies with the feedline length, and it can easily be calculated that the
resulting apparent SWR, read on the 50-ohm wattmeter, varies from a maximum
of 1.5 (when the feedline length is an integer multiple of a half wavelength)
down to a minimum of 1.33 (when the feedline length is an odd multiple of a
quarter wavelength). For 100W of forward power, the reflected power varies
from about 4W down to about 2W.

Repeating the exercise with an e.g. 85-ohm load, the apparent SWR measured on
the 50-ohm wattmeter would vary from 1.7 down to 1.51 (reflected power
varying from 7W down to 4W). You can easily convince yourself that such
variation is due only to the assumed 3-ohm difference in cable impedance.
With older cables having a nominal 52-ohm impedance, instead of 50, the
situation could be even more evident.

Any comment would be appreciated.

73
Tony I0JX
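A minimal Python sketch of the calculation described in the post, assuming a
lossless 53-ohm line, a purely resistive 75-ohm load and a wattmeter
referenced to exactly 50 ohms; the impedance values are the hypothetical ones
from the post, not measurements of any particular cable.

# Cross-check of the numbers above. Assumed values (not from any specific
# data sheet): lossless 53-ohm line, purely resistive 75-ohm load, meter
# referenced to exactly 50 ohms, 100 W forward power.
import numpy as np

Z0_LINE, Z0_METER, ZL, P_FWD = 53.0, 50.0, 75.0, 100.0

def z_in(zl, z0, beta_l):
    # Input impedance of a lossless line of electrical length beta_l (radians)
    return z0 * (zl + 1j * z0 * np.tan(beta_l)) / (z0 + 1j * zl * np.tan(beta_l))

def swr(z, zref):
    gamma = abs((z - zref) / (z + zref))
    return (1 + gamma) / (1 - gamma)

beta_l = np.linspace(0.001, np.pi, 2001)      # 0 .. half a wavelength of line
indicated = swr(z_in(ZL, Z0_LINE, beta_l), Z0_METER)

print(f"real SWR on the 53-ohm line: {swr(ZL, Z0_LINE):.2f}")   # constant, ~1.42
print(f"indicated SWR on the 50-ohm meter: {indicated.min():.2f} .. {indicated.max():.2f}")
for s in (indicated.min(), indicated.max()):
    gamma = (s - 1) / (s + 1)
    print(f"  SWR {s:.2f} -> reflected power {P_FWD * gamma**2:.1f} W")

Running it reproduces the figures quoted: the real SWR on the line stays near
1.42 regardless of length, while the indicated SWR swings between about 1.33
and 1.50, i.e. roughly 2 W to 4 W of reflected power at 100 W forward.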
#2
"Antonio Vernucci" wrote in message
. .. Along several decades of radio hamming on the HF bands, I noted that the measured SWR of all the antennas I have mounted (Yagis, dipoles) slightly varies when the feedline length is changed by several meters. For 100W of forward power, the reflected power could vary somewhat, e.g. from 2W to 5W or so, measured on a Bird wattmeter. This behavior would seeem to deny the theory, according to which SWR is independent of feedline length (as long as the cable attenuation remains constant). Clearly the measured SWR change cannot be due to the change in the feedline attenuation as, at HF, adding or cutting a few meters of cable would yield a very small change in attenuation and hence a negligible impact on measured SWR. Reading here and there, the most common theory explaining such phenomenon is that, in presence of RF on the coaxial cable braid, the SWR meter reading is influenced by the feedline length. I am not too convinced of that explanation, also because I have invariably experienced the measured SWR variation phenomenon with all antenna I have had, and I never had hot braid problems. At that regard I got an idea that could explain the phenomenon, at least part of it. Reading coaxial cable data sheet, I noted that manufacturers typically give a small tolerance on cable impedance (2 to 3 ohm). Let us then assume that the feedline cable has a 53-ohm impedance, whilst the Bird wattmeter is 50 ohm sharp. If the 53-ohm cable is terminated on an e.g. 75-ohm (purely resistive) antenna, the real SWR on the line would be 75/53=1.41 independently of feedline length (if the attenuation variation with length is neglected). But the impedance seen by the wattmeter obviously varies with the feedline length, and it can be easily calculated that the seen impedance range results in an apparent SWR, on the 50-ohm wattneter, reading that varies from a maximum of 1.5 (when feedline length is an even multiple of half wavelenght) down to a minimum of 1.33 (when feedline length is an odd multiple of wavelenght quarters). For 100W of forward power, the reflected power varies from about 4W down to about 2W. Repeating the exercise with an e.g. 85-ohm load, the apparent SWR measured on the 50-ohm wattmeter would vary from 1.7 down to 1.51 (reflected power varying from 7W down to 4W). You can get easily convinced that such variation is only due to the assumed 3-ohm difference in cable impedance. With older cables having a nominal 52-ohm impedance, instead of 50, the situation could get even more evident. Any comment would be appreciated. 73 Tony I0JX The Bird actually measures a combination of capacitive coupled voltage and inductively coupled current. There is a app note on the Bird website. Find: "Straight Talk About Directivity". |
#3
Antonio Vernucci wrote:

> I noted that the measured SWR of all the antennas I have mounted (Yagis,
> dipoles) varies slightly when the feedline length is changed by several
> meters.

If the characteristic impedance of the feedline differs from the
characteristic impedance of the calibrated SWR meter, the indicated SWR will
vary with length of feedline.

If the feedline is lossy, the transmitted signal SWR will decrease between
the antenna and the transmitter. All feedlines have a certain amount of loss.

If common-mode current is present on the SWR meter case, the SWR reading will
vary because the meter has no fixed ground reference.

--
73, Cecil, IEEE, OOTC, http://www.w5dxp.com
#4
It's important to know and keep in mind that the SWR meter doesn't actually
measure the SWR on the feedline. So its reading doesn't prove or disprove
anything about how the SWR on a feedline changes with length. What the meter
effectively measures is the impedance seen at that point. It's calibrated in
such a way that if it's connected to a transmission line of exactly 50 ohms
impedance, the indicated SWR will be the SWR on the line. Otherwise, the
indicated SWR won't be the actual line SWR.

There are at least three things which can cause an indicated SWR variation
with line length:

1. The feedline Z0 isn't exactly 50 ohms. The Z0 of coax easily varies +/- 5
ohms from nominal, and sometimes closer to +/- 10 -- it's seldom exactly 50.
If you connect a perfect 50 ohm load to your transmitter via a 45 ohm line,
the impedance seen by the transmitter will change with line length.
Consequently, the SWR meter reading will also change. The actual SWR on the
line will not, except as dictated by loss, described next.

2. The feedline has loss. The SWR will improve as the line becomes longer due
to line loss. If the line is long enough to be very lossy, the transmitter
will see nearly the line's Z0 regardless of what load is connected to the
other end. The actual SWR on the line will be greatest at the load,
decreasing as you get farther away.

3. There is current on the outside of the coax shield (common mode current).
When this happens, the feedline becomes part of the antenna. Consequently,
changing the feedline length actually changes the effective antenna length,
which in turn changes the feedpoint impedance.

Roy Lewallen, W7EL

Antonio Vernucci wrote:

> Over several decades of radio hamming on the HF bands, I have noted that
> the measured SWR of all the antennas I have mounted (Yagis, dipoles) varies
> slightly when the feedline length is changed by several meters. [...]
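Roy's second point can be put in numbers with a short sketch. Assuming only
matched-line loss is at work (the loss figures below are arbitrary examples,
not taken from any cable spec), the reflected wave is attenuated over two
passes of the line, so the SWR seen at the meter end drops toward 1:1 as the
line gets lossier.

# Sketch of how matched-line loss alone lowers the SWR reading at the meter.
def swr_at_input(swr_load, matched_loss_db):
    # The reflected wave travels the line twice (down and back),
    # so its amplitude is reduced by 2 * matched_loss_db.
    gamma_load = (swr_load - 1) / (swr_load + 1)
    gamma_in = gamma_load * 10 ** (-2 * matched_loss_db / 20)
    return (1 + gamma_in) / (1 - gamma_in)

for loss_db in (0.1, 0.5, 1.0, 3.0, 10.0):
    print(f"{loss_db:4.1f} dB matched loss: load SWR 2.0 reads as "
          f"{swr_at_input(2.0, loss_db):.2f}")

At HF, adding or removing a few meters of coax changes the matched loss by
only a few hundredths of a dB, which is why this effect alone cannot explain
the variation Tony reports; a very long, lossy run, however, will read close
to 1:1 whatever the load.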
#5
On Mon, 1 Jun 2009 22:50:52 +0200, "Antonio Vernucci" wrote:

> Over several decades of radio hamming on the HF bands, I have noted that
> the measured SWR of all the antennas I have mounted (Yagis, dipoles) varies
> slightly when the feedline length is changed by several meters. For 100W of
> forward power, the reflected power could vary somewhat, e.g. from 2W to 5W
> or so, measured on a Bird wattmeter. This behavior would seem to contradict
> the theory, according to which SWR is independent of feedline length (as
> long as the cable attenuation remains constant).

Hi Antonio,

Yes, this is the accepted wisdom.

> Clearly the measured SWR change cannot be due to the change in feedline
> attenuation: at HF, adding or cutting a few meters of cable yields a very
> small change in attenuation and hence a negligible impact on measured SWR.

A good point.

> Reading here and there, the most common theory explaining this phenomenon
> is that, in the presence of RF on the coaxial cable braid, the SWR meter
> reading is influenced by the feedline length.

This, too, is accepted wisdom.

> I am not too convinced of that explanation, also because I have invariably
> experienced the measured SWR variation with every antenna I have had, and I
> have never had hot-braid problems.

You ARE describing a hot braid problem. At the slight shift of 3W out of
100W, it is a small problem by the same degree (nothing you would notice by
other indications).

> In that regard, I got an idea that could explain the phenomenon, at least
> part of it.

Well, I am going to skip that quote to cut to the chase.

What you describe is called mismatch uncertainty. It exists in a cable that
has a nominal Z that matches neither the load nor the source. Depending upon
the amount of mismatch at each end, you have a zone of confusion between
those ends that will result in as many different readings as you have
insertion points to measure at.

As most modern transmitters have a source Z of 30 to 70 Ohms, you might note
a very, very small perturbation when the other end of ANY line is mismatched
- but I doubt it. To provoke this condition into revealing readings that are
significantly beyond the range of error requires mismatches at both ends on
the order of 3:1. You don't describe that.

More likely your problem is Common Mode currents - what you call hot braid.
One test is to use a snap-on choke and slide it along the line and note if
the SWR meter reading moves in concert with the hand motion (or the reading
simply shifts by the addition of the choke).

73's
Richard Clark, KB7QHC
#6
On 1 Jun, 22:50, "Antonio Vernucci" wrote:

> Over several decades of radio hamming on the HF bands, I have noted that
> the measured SWR of all the antennas I have mounted (Yagis, dipoles) varies
> slightly when the feedline length is changed by several meters. [...]

Hello Antonio,

In my opinion (when dealing with actual antennas) it can be:

1. Your coaxial cable is part of the antenna (common mode current). Changing
the length changes the common mode impedance. You can rule this out by
sliding some large ferrites along the cable close to the VSWR meter, or by
changing the grounding a bit and watching the difference (if present).

2. The bridge inside your VSWR meter is not perfect. You can check this by
connecting known impedances (for example a 56 Ohm resistor versus a 44.6 Ohm
resistor, and 100 Ohms versus 25 Ohms).

3. The cables you are using are not exactly 50 Ohms. I think your analysis is
right: when you have cable with a slightly different Z0, the reading depends
on length. Of course, when you extend with a good 50 Ohm cable (directly
connected to the meter), the reading should not change. I did the math too
and also found VSWR = 1.7 and VSWR = 1.51 for an 85 Ohm load connected to
cable with Z0 = 53 Ohms. I didn't expect such a difference for just a 3 Ohm
deviation from 50 Ohms.

4. Harmonics in the final amplifier (I hope that is not the reason).

Best regards,
Wim PA3DJS
www.tetech.nl
remove abc in case of pm
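A quick closed-form cross-check of the figures Wim and Tony quote, under the
same assumptions (lossless 53-ohm line, purely resistive load, meter
referenced to exactly 50 ohms): a half-wave section of line repeats the load
at the meter, a quarter-wave section transforms it to Z0^2/ZL, and those two
cases bound the indicated SWR.

# Hypothetical values from the discussion, not from a cable data sheet.
Z0_LINE, Z0_METER = 53.0, 50.0

def swr_on_50(r):
    gamma = abs(r - Z0_METER) / (r + Z0_METER)
    return (1 + gamma) / (1 - gamma)

for zl in (75.0, 85.0):
    z_half = zl                      # n * half-wave line: meter sees the load itself
    z_quarter = Z0_LINE ** 2 / zl    # odd quarter-wave line: meter sees Z0^2 / ZL
    print(f"ZL = {zl:.0f} ohm: indicated SWR "
          f"{swr_on_50(z_quarter):.2f} .. {swr_on_50(z_half):.2f}")

This prints roughly 1.34 .. 1.50 for the 75 Ohm load and 1.51 .. 1.70 for the
85 Ohm load, matching the values worked out earlier in the thread.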
#7
"Antonio Vernucci" wrote in
: Along several decades of radio hamming on the HF bands, I noted that the measured SWR of all the antennas I have mounted (Yagis, dipoles) slightly varies when the feedline length is changed by several meters. For 100W of forward power, the reflected power could vary somewhat, e.g. from 2W to 5W or so, measured on a Bird wattmeter. This behavior would seeem to deny the theory, according to which SWR is independent of feedline length (as long as the cable attenuation remains constant). Tony, You may find my thoughts at http://www.vk1od.net/transmissionlin...splacement.htm of interest. Owen |
#8
On Jun 1, 1:50 pm, "Antonio Vernucci" wrote:

> Over several decades of radio hamming on the HF bands, I have noted that
> the measured SWR of all the antennas I have mounted (Yagis, dipoles) varies
> slightly when the feedline length is changed by several meters. [...]

There is a very good possibility that your analysis is correct. I see the
same effect, and in fact it's of particular concern to me right now, because
I'm putting what effectively is an SWR meter into production, and it's
important that we have a test setup that accurately measures the performance.
I've been specifically concerned that the test setup, as currently
configured, may have trouble because the connecting cables may not be close
enough to 50 ohms.

As others have said, IF there is a problem with RF on the outside of the
line, any variation in observed SWR is most likely because the change in line
length has changed the load the other end of the coax is seeing, NOT because
the meter is directly responding to the "outside" RF. The meter measures
transmission line current and transmission line voltage, and the line itself
is all the reference it needs to do that. Direct response to RF on the
outside of the line could result from poor construction of the meter, but I
wouldn't expect that from a Bird.

One other possibility that I haven't seen mentioned is that the impedance of
the line is not constant along its length. With line of good construction
that hasn't been abused, the variation should be small. You can detect it by
running a network analyzer sweep of just the line, across a broad frequency
range. But a line with polyethylene dielectric (and especially one with foam
polyethylene) that's gotten too hot -- perhaps because of high power at high
SWR -- can have the center conductor go "off-center" and change the
impedance.

If the effect you are seeing is the result of a line that's not quite the
same impedance that the meter is calibrated to (which itself may be
noticeably different from 50 ohms), you could plot the change in indicated
SWR as a function of line length and see it vary in a smooth and predictable
manner. Most likely, though, what you're seeing is the sum of several
effects, and the variation in indicated SWR or reflected power may not be all
that smooth. Very often when I expect to see a nice smooth spiral centered on
one point on my network analyzer's Smith chart display, what I see is a
spiral that follows along some arc, because of various imperfections.
(Sometimes it's fun to try to figure out just what the imperfections are...)

Cheers,
Tom
#9
> It's important to know and keep in mind that the SWR meter doesn't actually
> measure the SWR on the feedline. So its reading doesn't prove or disprove
> anything about how the SWR on a feedline changes with length.

Hi Roy,

Yes, that is the reason why I was talking of "apparent" or "measured" SWR,
whilst the real SWR does not vary with line length.

> 1. The feedline Z0 isn't exactly 50 ohms. [...] If you connect a perfect 50
> ohm load to your transmitter via a 45 ohm line, the impedance seen by the
> transmitter will change with line length. Consequently, the SWR meter
> reading will also change. The actual SWR on the line will not, except as
> dictated by loss, described next.
>
> 2. The feedline has loss. The SWR will improve as the line becomes longer
> due to line loss. [...]
>
> 3. There is current on the outside of the coax shield (common mode
> current). When this happens, the feedline becomes part of the antenna. [...]

I would say that in case no. 1 the meter measures an apparent SWR, whilst in
case no. 2 it measures the real SWR existing at the measurement point. I am
not sure what it measures in case no. 3.

Regards,
Tony I0JX
#10
> If the characteristic impedance of the feedline differs from the
> characteristic impedance of the calibrated SWR meter, the indicated SWR
> will vary with length of feedline.
>
> If the feedline is lossy, the transmitted signal SWR will decrease between
> the antenna and the transmitter. All feedlines have a certain amount of
> loss.
>
> If common-mode current is present on the SWR meter case, the SWR reading
> will vary because the meter has no fixed ground reference.

Hi Cecil,

In the first case the meter measures an apparent SWR, whilst in the second
case it measures the real SWR (occurring at the measurement point). For the
third case, I am unable to figure out whether the meter reads an apparent or
the real SWR.

73
Tony I0JX