#1
I plan to use my 26 ga. 75 meter dipole (stealth antenna) -- now fed with 100 W -- with my new amplifier, with 800-1000 W output. Is that too small a gauge? How much current can I expect it to take? Even if it doesn't get too hot, will it be significantly less efficient?
#2
![]() "gibberdill" wrote in message ... I plan to use my 26 ga. 75 meter dipole (stealth antenna)-- now with 100 W-- with my new amplifier, with 800-1000 W output. Is that too small a gauge? How much current can I expect it to take? Even if it doesn't get too hot, will it be significantly less efficient? Ohms law is gonna put 20 amps on that wire (1000 watts / 50 ohms) I would expect it to fail - probably at the connection points. I think it would heat up, become brittle, and break. If it was mine, I would want to upgrade to 16 or 14 ga. before putting a kw. -- I've always used 100 watts or less, so this is not from experience - just my opinion. Hal --- Outgoing mail is certified Virus Free. Checked by AVG anti-virus system (http://www.grisoft.com). Version: 6.0.799 / Virus Database: 543 - Release Date: 11/20/2004 |
#3
Hal Rosser wrote:
> Ohms law is gonna put 20 amps on that wire (1000 watts / 50 ohms)

P = I*I * R, so I = sqrt(P / R) = 4.47 A

bart
#4
And then you want to think about modes. If you are operating SSB or CW, the average power will be much less than 1000 watts.

Bart Rowlett wrote:
> P = I*I * R, so I = sqrt(P / R) = 4.47 A
#5
> > Ohms law is gonna put 20 amps on that wire (1000 watts / 50 ohms)
>
> P = I*I * R, so I = sqrt(P / R) = 4.47 A

Thanks for the correction -- don't know what I was thinking at 1:30 AM. P equals I-squared R, **of course**. I was treating it like volts * amps = P, and I was apparently confusing R with volts or amps.
#6
Thanks all.
"Hal Rosser" wrote in message .. . Ohms law is gonna put 20 amps on that wire (1000 watts / 50 ohms) P = I*I * R I = sqrt(P / R) = 4.47 A Thanks for the correction - Don't know what I was thinking at 130 am P equals I-square R ***of course** I was treating it like volts * amps = P and I was apparently confusing R with Volts or Amps --- Outgoing mail is certified Virus Free. Checked by AVG anti-virus system (http://www.grisoft.com). Version: 6.0.799 / Virus Database: 543 - Release Date: 11/19/2004 |
#7
Of course, Bart's math is fine. Seems like there's a bit more to the story. The current may be less because the actual antenna impedance may be a bit higher than 50 ohms, but 50 is probably a reasonable figure for calculations, to be on the safe side. And Chuck's right that the heating may be much less because of the mode of operation. But... what current would be too much?

The RF resistance of 26 AWG wire at 4 MHz is about 125 ohms/1000 feet, or about 3.06 times the DC resistance, at 20 C. The DC current needed to get the same heating as a 4 MHz RF current would then be about sqrt(3.06) = 1.75 times the RF current -- close to 8 amps, using Bart's calculated RF current. (It's actually a bit less as the wire gets hot and its resistance goes up, because the skin depth increases with increased resistance, but use 1.75 times for a worst case.)

Since the fusing current (the current at which a wire in a nominal 20 C ambient will melt) is 20.5 A for 26 AWG copper, you're very unlikely to melt the wire, but it very well might get pretty hot, depending on actual operating conditions.

Suggestion: suspend a foot or so of 26 AWG like you used in your antenna, with a weight on the bottom end to put similar tension in the wire to what you have in your antenna. Run a DC current through it, and see if anything bad happens (like getting too hot, stretching too much, or whatever). Use a DC current so that the heating will be similar to what you'd get worst case with your antenna setup. Or use RF, if it's easy for you to do that... DC would be easy for me. Expect to get very similar heating if you use about 1.75 times as much DC current as the RF current will be. So try about 8 amps for a worst-case test.

Cheers,
Tom
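Tom's bench-test arithmetic can be sketched as follows (a Python illustration, not from the thread; the 125 ohm/1000 ft RF figure is Tom's, and the 40.8 ohm/1000 ft DC figure is a standard handbook value for 26 AWG copper at 20 C):

```python
import math

R_DC_PER_1000FT = 40.8   # ohms/1000 ft, 26 AWG copper at 20 C (handbook value)
R_RF_PER_1000FT = 125.0  # ohms/1000 ft at 4 MHz, per Tom's post (skin effect)

# RF resistance is about 3.06x the DC resistance.
ratio = R_RF_PER_1000FT / R_DC_PER_1000FT

# RMS RF current for 1000 W into 50 ohms (Bart's figure).
i_rf = math.sqrt(1000.0 / 50.0)

# DC current that produces the same I^2*R heating as the RF current:
# scale by sqrt(ratio), giving roughly 1.75 * 4.47 = 7.8 A, i.e. ~8 A.
i_dc_equivalent = math.sqrt(ratio) * i_rf

print(round(ratio, 2), round(i_dc_equivalent, 1))
```

This is the reasoning behind Tom's "about 8 amps DC for a worst-case test": equal heating means equal I^2*R, so the test current scales with the square root of the resistance ratio.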
#8
Very nice, Tom.

Can we keep going with that? Using the 8 A DC assumption and the 40 ohm/1000 ft DC resistance of 26 AWG, we find that approximately 2.56 watts is dissipated in the first foot (less in subsequent feet). This assumes the current is linearly distributed in that foot.

To put it into intuitive perspective (not as neat as your proposed experiment, but a potential substitution of armchair hand-waving for actual work), we can say that the wire is dissipating about a quarter of a watt per inch near the center of the dipole. Comparing that with the heat generated by a quarter-watt resistor at 1/4 watt dissipation (larger diameter, shorter length), we are not likely to get third-degree burns from an incandescent antenna wire. Hi.

73,
Chuck
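Chuck's per-foot estimate works out like this (a Python sketch using the 8 A and 40 ohm/1000 ft figures from the posts above; not from the thread itself):

```python
I_DC = 8.0                    # amps: Tom's worst-case equivalent DC current
R_PER_FOOT = 40.0 / 1000.0    # ohms/ft: 26 AWG DC resistance from Chuck's post

# Dissipation in the first foot at the feedpoint, where current is highest.
p_per_foot = I_DC ** 2 * R_PER_FOOT   # about 2.56 W

# Spread over 12 inches: roughly a quarter watt per inch.
p_per_inch = p_per_foot / 12.0

print(round(p_per_foot, 2), round(p_per_inch, 2))
```

A quarter watt per inch of bare wire in free air is comfortably survivable, which is the intuition behind Chuck's quarter-watt-resistor comparison.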
#9
gibberdill wrote:
> I plan to use my 26 ga. 75 meter dipole (stealth antenna) -- now fed with
> 100 W -- with my new amplifier, with 800-1000 W output. Is that too small
> a gauge? How much current can I expect it to take? Even if it doesn't get
> too hot, will it be significantly less efficient?

I've seen the other answers to this post, but no one asked this question: why are you going to put a kW into a stealth antenna? Before you do, you should do these three things:

1) be sure that you have minimized all possible losses from your current antenna setup;
2) be sure that you have a receiver that will pick a gnat out of the noise (better use DSP), because you will be loud and everyone who answers you will think you can hear them;
3) minimize any man-made noise sources in your local area (sort of related to #2).

Remember, if you can't hear 'em, you can't work 'em.

All the best, and 73,
Dave
kz1o
#10
It depends on whether you are in Alaska, in mid-winter, at midnight, or in New Mexico, at midsummer, at noon.

The temperature rise will not do any damage, not even to birds. The reduction in efficiency due to the increase in wire resistance will be insignificant. Think in terms of 0.001 S-units. Just turn up the power to 1 kW and see what happens, if anything. It won't fall down.

The antenna length might increase by some fraction of one percent due to thermal expansion, but your tuner will not notice any difference. The difference is about the same as that between summer and winter.

Remember that the wire has a very great surface area, over a length of many feet, to dissipate the antenna losses - and in a cooling breeze. It's only the relatively small antenna losses, roughly 100 watts, which have to be dissipated - not the full 1000 watts. Most of the loss and temperature rise occurs in the middle half of the antenna. It is calculable but not worth the trouble.

If it is enamelled magnet wire, the enamel will probably not be affected. Enamel is accustomed to being in prolonged warm places.

The probability of doing any damage to anybody's property is less than winning the national lottery with only one ticket. You may consider making your antenna even more stealthy. In general, the diameter of antenna wires is decided only on the grounds of durability in adverse weather conditions. Here in the UK we are quite fortunate - until Earth Warming tornadoes become more frequent.

----
Reg, G4FGQ.