February 28th 06, 02:57 AM, posted to rec.radio.amateur.homebrew
Fred McKenzie
 
Impedance mismatch into FET preamp

In article , "Samuel
Hunt" wrote:

> Am I correct in thinking that if there is an impedance mismatch (say the
> input circuit was tuned into a 50 ohm load, but was finally presented
> with a 105 ohm load), then the preamp's noise figure would increase and
> the FET would start to become artificially noisy?
>
> Then if the FET was pushed into overload by an adjacent strong signal,
> the noise would start to reduce.
>
> I have this odd situation on a repeater, and I can only put it down to
> mis-tuned cavities. I've never come across it before, though, where the
> repeater actually IMPROVES in sensitivity when the transmitter keys up
> (?!?!), so I would like to know what's going on before I fix it.


Sam-

There may be some measurable increase in a preamp's noise output, but a 2
to 1 mismatch will probably not cause such a serious degradation in its
noise figure.
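
As a rough illustration (the numbers below are assumed, not measured on
any particular preamp), here is a short Python sketch that works out the
VSWR and mismatch loss for a 105 ohm source on a 50 ohm input, and
estimates the noise figure penalty from the standard two-port noise
equation:

# Sketch with assumed noise parameters, using
#   F = Fmin + (4*Rn/Z0) * |Gs - Gopt|^2 / ((1 - |Gs|^2) * |1 + Gopt|^2)
# Fmin, Rn and Gopt below are typical-looking guesses, NOT data for any
# particular device.
import math

Z0 = 50.0          # system impedance, ohms
ZS = 105.0         # source impedance actually presented to the FET, ohms

gamma_s = (ZS - Z0) / (ZS + Z0)
vswr = (1 + abs(gamma_s)) / (1 - abs(gamma_s))
mismatch_loss_db = -10 * math.log10(1 - abs(gamma_s) ** 2)

Fmin_db = 0.8      # minimum noise figure, dB (assumed)
Rn = 10.0          # equivalent noise resistance, ohms (assumed)
gamma_opt = 0.0    # optimum source reflection coefficient (assumed at 50 ohms)

Fmin = 10 ** (Fmin_db / 10)
F = Fmin + (4 * Rn / Z0) * abs(gamma_s - gamma_opt) ** 2 / (
    (1 - abs(gamma_s) ** 2) * abs(1 + gamma_opt) ** 2)

print(f"VSWR          : {vswr:.2f}:1")
print(f"Mismatch loss : {mismatch_loss_db:.2f} dB")
print(f"Noise figure  : {10 * math.log10(F):.2f} dB (vs {Fmin_db} dB matched)")

With these assumed values the noise figure only rises from about 0.8 dB
to about 1.2 dB, which is why a 2:1 mismatch by itself should not make
the preamp dramatically noisy.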

Perhaps I misunderstand you. When the noise drops because the preamp is
overloaded, that is NOT the same as an improvement in sensitivity. It is
commonly referred to as desensitization (desense). An amplifier's gain
for a weak desired signal falls toward zero when a strong signal drives
it into saturation (or when it breaks into oscillation).
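
To see why a strong adjacent signal lowers, rather than raises, the gain
for a weak signal, here is a sketch using a simple memoryless third-order
model; the gain and intercept numbers are assumed, illustrative values
only:

# For y = a1*x + a3*x^3 (a3 < 0 in a compressive amplifier), a blocker
# of amplitude A_b reduces the small-signal gain seen by a weak desired
# signal to roughly a1 + (3/2)*a3*A_b^2 -- classic desensitization.
import math

gain_db = 20.0       # small-signal gain (assumed)
iip3_dbm = -5.0      # input third-order intercept (assumed)

a1 = 10 ** (gain_db / 20)

def dbm_to_vpeak(p_dbm, r=50.0):
    # dBm -> peak volts across r ohms
    return math.sqrt(2 * r * 1e-3 * 10 ** (p_dbm / 10))

# Relate a3 to IIP3 via A_iip3^2 = (4/3)*|a1/a3|
a_iip3 = dbm_to_vpeak(iip3_dbm)
a3 = -(4.0 / 3.0) * a1 / a_iip3 ** 2

for blocker_dbm in (-60, -40, -30, -20, -10):
    a_b = dbm_to_vpeak(blocker_dbm)
    g_eff = a1 + 1.5 * a3 * a_b ** 2          # gain seen by the weak signal
    print(f"blocker {blocker_dbm:+4d} dBm -> small-signal gain "
          f"{20 * math.log10(max(g_eff, 1e-12)):6.2f} dB")

With these numbers the gain barely moves for weak blockers, but drops
several dB once the adjacent signal approaches the intercept point,
which is what desense looks like from the receiver's side.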

Inadequate or mistuned cavities can cause desense. It can also be caused
by poor shielding, often from poor coaxial cable. You might benefit from
switching to double-shielded cable. Also look for poorly attached
connectors.
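
For a feel of how much isolation the cavities and shielding have to
provide, here is a back-of-the-envelope budget with assumed example
numbers (50 W transmitter, typical FM receiver figures), not
measurements from Sam's repeater:

# Two things must be suppressed at the receiver input: the transmitter's
# wideband noise on the receive frequency, and the transmitter carrier
# itself.  All numbers below are assumptions for illustration.
import math

tx_power_w = 50.0
tx_power_dbm = 10 * math.log10(tx_power_w / 1e-3)

rx_bw_hz = 15e3            # receiver noise bandwidth (assumed)
rx_nf_db = 4.0             # receiver noise figure (assumed)
noise_floor_dbm = -174 + 10 * math.log10(rx_bw_hz) + rx_nf_db

# (1) TX sideband noise on the RX frequency (assumed -140 dBc/Hz at the
#     repeater split) must end up below the receiver noise floor.
tx_noise_dbc_hz = -140.0
tx_noise_dbm = tx_power_dbm + tx_noise_dbc_hz + 10 * math.log10(rx_bw_hz)
noise_isolation_db = tx_noise_dbm - noise_floor_dbm

# (2) The TX carrier must stay below the level that starts to block the
#     receiver (assumed -30 dBm at the preamp input).
blocking_level_dbm = -30.0
carrier_isolation_db = tx_power_dbm - blocking_level_dbm

print(f"RX noise floor         : {noise_floor_dbm:.1f} dBm")
print(f"TX noise on RX freq    : {tx_noise_dbm:.1f} dBm")
print(f"Noise isolation needed : {noise_isolation_db:.0f} dB")
print(f"Carrier isolation      : {carrier_isolation_db:.0f} dB")

With these assumptions both terms come out near 77 dB of required
isolation, roughly the order of rejection a properly tuned duplexer is
expected to supply.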

73, Fred, K4DII