Question on re-radiated field
Can someone please confirm or deny the following arguments?
Let us have:
- a transmitting system operating at any given frequency
- and a metal bar, located far away from the transmitter, whose electrical length is exactly half a wavelength at the operating frequency.
An induced RF current will flow in the bar. This RF current produces a re-radiated field that adds to the field generated by the transmitter.
Two questions:
- what are the amplitude and phase shift of the re-radiated field with respect to those of the field generated by the transmitter? My instinctive answer would be same amplitude (in the absence of ohmic losses) and 180 degrees. The total field (transmitted + re-radiated) at the metal bar would therefore be zero.
- how does the total field change as one moves away from the bar? I would say that while the field generated by the transmitter varies very slowly with the distance from the bar (the transmitter is assumed to be very far away), the re-radiated field varies rapidly (also because one is initially in the near field); see the numerical sketch after these questions. In conclusion, the farther we move away from the bar, the smaller the contribution of the re-radiated field to the total field. That should be the reason why, in a Yagi antenna, a parasitic element cannot be placed too far away from the driven element.
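
To make the second point concrete, here is a minimal Python sketch of the idea, under two assumptions of my own: the re-radiated field of the bar is approximated by the broadside field of a Hertzian (infinitesimal) dipole rather than the exact half-wave current distribution, and its amplitude is normalised to equal the incident field at an arbitrary reference distance r_ref (the choices of f and r_ref are illustrative only and say nothing about the true coupling strength).

import numpy as np

f   = 100e6                 # operating frequency in Hz (arbitrary choice)
lam = 3e8 / f               # wavelength
k   = 2 * np.pi / lam       # wavenumber

def reradiated(r):
    """Complex broadside E-field of a Hertzian dipole, up to a constant.

    Contains the 1/r (radiation), 1/r^2 (induction) and 1/r^3
    (quasi-static) terms, so the rapid near-field variation close to
    the bar shows up explicitly.
    """
    kr = k * r
    return (np.exp(-1j * kr) / r) * (1.0 + 1.0 / (1j * kr) - 1.0 / kr**2)

E_inc = 1.0 + 0j            # incident field, essentially constant locally
r_ref = 0.05 * lam          # arbitrary normalisation distance
A     = E_inc / abs(reradiated(r_ref))

print(" r/lambda   |E_reradiated|/|E_inc|")
for r_wl in (0.05, 0.1, 0.15, 0.25, 0.5, 1.0, 2.0, 5.0):
    ratio = abs(A * reradiated(r_wl * lam)) / abs(E_inc)
    print(f"  {r_wl:6.2f}        {ratio:8.4f}")

The printed ratios drop very steeply within the first fraction of a wavelength (where the 1/r^2 and 1/r^3 terms dominate) and then fall off as roughly 1/r, while the incident field stays essentially constant, which is the behaviour my second argument relies on.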
Thanks and 73
Tony I0JX