In rec.radio.amateur.antenna gareth wrote:
> "Sal M. O'Nella" wrote in message ...
>> "gareth" wrote in message ...
>>> As we all know, the atmosphere greatly affects the propagation of
>>> radio waves, with all the various layers, and the effect of the Sun
>>> and sunspots on propagation through the atmosphere.
>>>
>>> Is it therefore not beyond the bounds of possibility that this same
>>> atmosphere affects the initial propagation of radio waves away from
>>> our antennae, and that this somehow is the reason why short antennae
>>> are poor radiators compared to antennae of significant (1/4 lambda)
>>> fractions of a wavelength?
>>>
>>> I know that I have attempted to discuss this before and been met by
>>> the hidebound rednecks of Yankland, but it is a question of interest
>>> to me, and not a troll.
>>
>> =================================================
>> I doubt if distant conditions affect the origin.
>
> I intended the atmosphere immediately adjacent to the antennae.
>
>> I am only a talented amateur, but I think that with an antenna the
>> wavelength is best matched by the antenna aperture. This is not the
>> case with short antennas. What do you think?
>
> The standing wave caused by reflection from the open end of a short
> antenna will not cover a full quarter cycle, and therefore the
> radiation must be reduced accordingly.

Nope, and easily shown to be false.

> But that fits in with my opinion that the atmosphere / environment /
> luminiferous aether / or whatever needs in some way to be excited or
> twisted by the EM field of the antenna, hence my suggestion that you
> quoted above.

Your opinion was disproved about 100 years ago.

--
Jim Pennino
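[Editor's note: the conventional explanation alluded to above — not spelled out anywhere in the thread — is that an electrically short antenna has a very low radiation resistance, so ordinary conductor and loading-coil losses swallow most of the transmitter power; no atmospheric or aether effect is needed. A minimal sketch, using the standard small-dipole approximation Rr = 20·pi^2·(l/lambda)^2 and an assumed, illustrative 1-ohm loss resistance:]

```python
import math

def short_dipole_rr(length_over_lambda):
    """Radiation resistance (ohms) of an electrically short dipole,
    Rr = 20 * pi^2 * (l/lambda)^2, valid only for l << lambda."""
    return 20 * math.pi**2 * length_over_lambda**2

# A dipole one-tenth of a wavelength long:
rr_short = short_dipole_rr(0.1)          # about 2 ohms
# A half-wave dipole, by contrast, has Rr of roughly 73 ohms.
# With an assumed 1 ohm of conductor/coil loss, the short antenna's
# radiation efficiency is Rr / (Rr + Rloss):
efficiency = rr_short / (rr_short + 1.0)
print(f"Rr = {rr_short:.2f} ohms, efficiency = {efficiency:.0%}")
```

The same 1-ohm loss against 73 ohms would cost a half-wave dipole almost nothing, which is why full-size antennas outperform short ones without any appeal to the surrounding air.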