I know this question is very vague, but I still wonder if there is an answer
to it.

Is there a theoretical formula that links antenna gain in dB to the increase
in maximum range of a wireless signal? I understand that in theory an RF
signal propagates indefinitely, but I cannot figure out whether there is a
correlation between antenna dB gain and signal range at a given frequency.
Common sense says the range will increase as the antenna gain improves, but
by how much?
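The closest I have come is the free-space (Friis) model, which says received
power falls off as 1/d^2; if that is right, then with receiver sensitivity
held fixed, G extra dB of gain should stretch the range by a factor of
10^(G/20), i.e. roughly doubling it per 6 dB. A quick sketch, assuming pure
free space and line of sight (the function name is mine):

import math

def range_multiplier(extra_gain_db: float) -> float:
    """Factor by which free-space range grows for extra antenna gain in dB.

    Friis: P_rx is proportional to 1/d^2, so with the receiver's
    sensitivity held fixed, d_new / d_old = 10 ** (extra_gain_db / 20).
    """
    return 10 ** (extra_gain_db / 20)

# Example: replacing a 2 dBi antenna with an 8 dBi one (+6 dB)
print(range_multiplier(6.0))  # ~2.0, i.e. roughly double the range

Is that the right way to think about it, or does frequency change the
picture?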

A formula for the electric field goes something like E = 9500 * power /
distance (I might be wrong about this), but it does not take into account
the frequency of the signal.
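The free-space version I have seen elsewhere is E = sqrt(30 * P * G) / d
(volts per metre, with P in watts, G as linear gain, d in metres), which
also has no frequency term; if I understand correctly, frequency only
enters on the receive side through the effective aperture
A_e = G * lambda^2 / (4 * pi). A sketch of both, again assuming free space
(the names are mine):

import math

def field_strength(p_tx_watts: float, gain_linear: float, d_m: float) -> float:
    """Free-space far-field strength in V/m: E = sqrt(30 * P * G) / d."""
    return math.sqrt(30.0 * p_tx_watts * gain_linear) / d_m

def effective_aperture(gain_linear: float, freq_hz: float) -> float:
    """Receive aperture in m^2: A_e = G * lambda^2 / (4 * pi).

    This is where frequency comes in: the same field strength yields
    less received power at higher frequencies for a fixed antenna gain.
    """
    wavelength = 3e8 / freq_hz  # speed of light / frequency
    return gain_linear * wavelength ** 2 / (4 * math.pi)

# Example: 100 mW into a 6 dBi antenna (G ~ 4) measured at 100 m
print(field_strength(0.1, 4.0, 100.0))  # ~0.035 V/m
print(effective_aperture(4.0, 2.4e9))   # ~0.005 m^2 at 2.4 GHz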

P.S. I am not concerned with terrain configuration, weather conditions, or
other such factors at this time.
