In article , "nick" writes:
> If a (let's say 12v) relay is rated for 10 amps @ 110 volts,
> it will take 1100 watts.
It will take 1100 watts at 110 volts. But if you lower the voltage to, say, 50
volts, and the contacts are still rated at 10 amps, it can only handle 500
watts. Etc.
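
A minimal sketch (Python) of that contact-rating arithmetic. The V_MAX and
I_MAX values just mirror the relay discussed here; a real relay's datasheet
is the authority:

    # Contact ratings from the example above; the datasheet governs in practice.
    V_MAX = 110.0  # maximum contact voltage, volts
    I_MAX = 10.0   # maximum contact current, amps

    def max_switched_power(volts):
        """Largest load the contacts can switch at a given operating voltage.

        The current rating caps the load: P = V * I_MAX.
        """
        if volts > V_MAX:
            raise ValueError("operating voltage exceeds the contact rating")
        return volts * I_MAX

    print(max_switched_power(110.0))  # 1100.0 W
    print(max_switched_power(50.0))   # 500.0 W
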
> Will it take 1000 watts of RF?
> I would think not, but I don't know why.
Depends entirely on the RF characteristics - impedance, SWR, etc.
The trick is to exceed neither the voltage nor current ratings of the contacts.
In a 50 ohm RF application with low SWR (less than 2:1), that relay is probably
limited by its contact voltage rating, not its current rating. That works out
to about 200 watts, since P = V^2/Z = (110 V)^2 / 50 ohms is roughly 240 watts
at the full voltage rating, less once you leave some headroom for SWR.
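
The back-of-the-envelope check behind that estimate, as a Python sketch. It
assumes the 110 V rating can be treated as an RMS limit on a matched 50 ohm
line, which a given datasheet may or may not support (peak vs. RMS and RF
derating vary by part):

    Z0 = 50.0          # ohms, transmission-line impedance
    V_RATING = 110.0   # contact voltage rating, volts (assumed RMS)
    I_RATING = 10.0    # contact current rating, amps

    p_voltage_limited = V_RATING**2 / Z0   # P = V^2/Z  -> 242 W
    p_current_limited = I_RATING**2 * Z0   # P = I^2*Z  -> 5000 W

    # The smaller figure governs; the voltage rating is the bottleneck here,
    # and ~242 W minus some SWR headroom lands near the 200 W figure.
    print(min(p_voltage_limited, p_current_limited))
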
73 de Jim, N2EY