Impedance mismatch into FET preamp
Hi Fred and all....
It's not due to desense. When I tune the cavities up properly, there's about
a 10 dB noise figure on the preamp (receiver is 0.8 uV sensitive). Then when it
transmits, the noise figure doesn't change.
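For reference, here's a rough back-of-the-envelope check in Python of how those
two numbers relate. The 15 kHz noise bandwidth and the 50 ohm input are
assumptions on my part, not measurements, so treat the result as ballpark only:

import math

# Rough sanity check: convert the figures above into dBm.
# Assumptions (not from the measurements): 50 ohm input, 15 kHz noise bandwidth.
Z0 = 50.0          # ohms, assumed system impedance
bw_hz = 15e3       # Hz, assumed IF noise bandwidth
nf_db = 10.0       # dB, preamp noise figure quoted above
v_sens = 0.8e-6    # volts, quoted receiver sensitivity

# Sensitivity expressed as power into 50 ohms, in dBm
p_sens_dbm = 10 * math.log10((v_sens ** 2 / Z0) / 1e-3)

# Thermal noise floor kTB (-174 dBm/Hz at room temperature) plus the NF
noise_floor_dbm = -174 + 10 * math.log10(bw_hz) + nf_db

print("0.8 uV into 50 ohms      = %.1f dBm" % p_sens_dbm)
print("noise floor (kTB + NF)   = %.1f dBm" % noise_floor_dbm)
print("margin above noise floor = %.1f dB" % (p_sens_dbm - noise_floor_dbm))

That puts 0.8 uV roughly 13 dB above the calculated noise floor, which is at
least in the right ballpark for a 12 dB SINAD measurement if the assumed
bandwidth is anywhere near right.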
When I tune the notches off slightly so the preamp starts to desense, the
noise figure drops at first, then climbs steeply as the preamp is pushed
well into desense.
I'm going to retune the cavities for minimum SWR at some point anyway (we've
got problems with the frequency and a deaf repeater is actually quite a good
idea at the moment, so it's not an issue), and see what that does.
I just wanted to understand why it's doing this: does the FET's noise figure
increase with an impedance mismatch on the input, so that pushing it into
desense reduces the amount of noise from the preamp?
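As background, the textbook picture is that a FET preamp's noise figure depends
on the source impedance it sees, and is only at its minimum when the source
presents the optimum reflection coefficient. A quick Python sketch of the
standard two-port noise relation, with made-up noise parameters (Fmin, Rn and
Gamma_opt below are illustrative, not measured for this preamp):

import cmath
import math

# Standard two-port relation: noise figure vs. source impedance,
#   F = Fmin + (4*Rn/Z0) * |Gs - Gopt|^2 / ((1 - |Gs|^2) * |1 + Gopt|^2)
# The noise parameters below are illustrative only, NOT measured values
# for this preamp's FET.
Z0 = 50.0
Fmin_db = 1.0                                   # assumed minimum noise figure, dB
Rn = 10.0                                       # assumed noise resistance, ohms
Gopt = 0.3 * cmath.exp(1j * math.radians(45))   # assumed optimum source gamma

def nf_db(Zs):
    # Noise figure (dB) seen with a purely resistive source impedance Zs
    Gs = (Zs - Z0) / (Zs + Z0)
    Fmin = 10 ** (Fmin_db / 10)
    F = Fmin + (4 * Rn / Z0) * abs(Gs - Gopt) ** 2 / (
        (1 - abs(Gs) ** 2) * abs(1 + Gopt) ** 2)
    return 10 * math.log10(F)

for Zs in (50.0, 105.0, 25.0):
    print("Zs = %5.1f ohms -> NF about %.2f dB" % (Zs, nf_db(Zs)))

With numbers like those, moving the source from 50 ohms to 105 ohms only shifts
the noise figure by a fraction of a dB, so a modest mismatch by itself wouldn't
seem to explain a large change.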
It's just something I've never come across before, and I'd be interested in a
technical explanation for it.
Thanks all,
Sam
"Fred McKenzie" wrote in message
...
In article , "Samuel Hunt" wrote:
Am I correct in thinking that if there is an impedance mismatch (say the
input circuit was tuned into a 50 ohm load, then was finally presented with
say a 105 ohm load), then the preamp's noise figure would increase and the
FET would start to become artificially noisy?
Then if the FET was pushed into overload by an adjacent strong signal, the
noise would then start to reduce.
I have this odd situation on a repeater, and I can only put it down to
mis-tuned cavities. I've never come across it before though, where the
repeater actually IMPROVES in sensitivity when the transmitter keys up
(?!?!), so I would like to know what's going on before I fix it.
Sam-
There may be some measurable increase in a preamp's noise output, but a 2
to 1 mismatch will probably not cause such a serious degradation in its
noise figure.
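For reference, the 105 ohm case mentioned above works out like this (a quick
Python check, assuming a purely resistive 105 ohm source in a 50 ohm system):

import math

# Quick check of the 50 ohm vs 105 ohm case mentioned above
# (assumes a purely resistive 105 ohm source in a 50 ohm system).
Z0, Zs = 50.0, 105.0
gamma = abs((Zs - Z0) / (Zs + Z0))       # reflection coefficient magnitude
vswr = (1 + gamma) / (1 - gamma)         # standing wave ratio
mismatch_loss_db = -10 * math.log10(1 - gamma ** 2)

print("gamma = %.2f, VSWR = %.1f:1, mismatch loss = %.2f dB"
      % (gamma, vswr, mismatch_loss_db))

That comes to about a 2.1:1 SWR and roughly 0.6 dB lost to reflection, which by
itself is a small effect, consistent with the point above about a 2 to 1
mismatch.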
Perhaps I misunderstand you. When the noise reduces due to overload, that
is NOT the same as an improvement in sensitivity. That is commonly
referred to as desensitization (desense). An amplifier's gain is reduced
to zero when it is in saturation (or oscillating).
Inadequate or mistuned cavities can cause desense. It can also be caused
by poor shielding, often from poor co-axial cable. You might benefit if
you switched to double-shielded cable. Also look for poorly attached
connectors.
73, Fred, K4DII