In article k.net,
robert casey wrote:
One on-line coax-loss calculator indicates about 144 dB of loss in
RG-59, at 2.4 GHz over an 800-foot span.
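Working backward from that figure, it comes out to roughly 18 dB per 100
feet for RG-59 at 2.4 GHz. A quick sanity check in Python (a rough sketch
only, assuming the loss scales linearly with cable length):

    # Assumed ~18 dB per 100 ft of RG-59 at 2.4 GHz, which is what the
    # calculator's 144 dB / 800 ft figure implies.
    loss_per_100ft_db = 18.0
    run_length_ft = 800.0
    total_loss_db = loss_per_100ft_db * run_length_ft / 100.0
    print(total_loss_db)   # -> 144.0 dB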
Add in the antenna losses at both ends, and I have real doubts as to
whether a usable signal will result. It *might* work if the 802.11
radios were plugged directly into the coax, with no antennas involved,
but that does not sound feasible.
If you get a pair of preamp/power amp modules, like these:
http://cgi.ebay.com/2-4GHz-802-11b-1-Watt-WiFi-amplifier-signal-booster_W0QQitemZ5845608031QQcmdZViewItem?hash=item5845608031&_trksid=p3286.c50.m20.l1116
and use them at both ends of that long run of coax, it might work.
Even that might be marginal.
One site I Googled stated a WiFi receiver sensitivity of -76 dBm. A
typical WiFi card has a transmitter output of around 15 dBm, and
access points may be up in the 20 dBm range. That'll allow for only
about 100 dB of attenuation between transmitter and receiver before
you can't get a good connection any more... and the estimate for the
RG-59 coax was 45 dB worse than that.
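As a back-of-envelope link budget (using only the figures quoted above,
not measured values):

    # Allowable path loss = TX power minus RX sensitivity.
    tx_power_dbm = 15.0          # typical WiFi card (an AP may run ~20 dBm)
    rx_sensitivity_dbm = -76.0   # sensitivity figure from one Googled site
    cable_loss_db = 144.0        # RG-59 estimate over 800 ft at 2.4 GHz

    allowable_loss_db = tx_power_dbm - rx_sensitivity_dbm   # 91 dB (96 dB with a 20 dBm AP)
    shortfall_db = cable_loss_db - allowable_loss_db         # ~48-53 dB short of closing the link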
Boosting the power to 1 watt will give you between 10 and 15 dB of
additional signal... still far short of the 45 dB of additional power
and/or sensitivity needed for a connection.
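To put the 1-watt amplifier in the same units (dBm = 10*log10(power in mW)):

    import math

    amp_out_dbm = 10.0 * math.log10(1000.0)   # 1 W = 1000 mW -> 30 dBm
    gain_over_card_db = amp_out_dbm - 15.0     # +15 dB over a 15 dBm card
    gain_over_ap_db = amp_out_dbm - 20.0       # +10 dB over a 20 dBm AP
    # Still leaves roughly 30-35 dB of the ~45 dB shortfall uncovered.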
Running 2.4 GHz over thin cable is just sorta silly... especially when
there's a decades-old, well-tested, and very reliable cable-based
technology which will give equal or better data rates due to much
lower attenuation at lower frequencies.
--
Dave Platt AE6EO
Friends of Jade Warrior home page:
http://www.radagast.org/jade-warrior
I do _not_ wish to receive unsolicited commercial email, and I will
boycott any company which has the gall to send me such ads!