
Fade Margin in Data Radios


First, what is fade margin? It’s the difference between the actual received signal level and the radio’s receiver sensitivity (the minimum signal level at which it can decode data). The higher the fade margin, the more reliable the data radio link.

 

Any radio has an engineered receiver sensitivity, measured in dBm (decibels relative to 1 milliwatt). This should be specified in the radio’s data sheet or brochure. The older Trio MR450 data radio, for example, has a sensitivity of -106 dBm at a 9600 bps over-the-air data rate, while the QR450 has a sensitivity of -113 dBm at 8 kbps. Sensitivity degrades (the figure becomes less negative) as the data rate goes up.

 

The link between two radios can experience an event called “fading,” in which changing atmospheric conditions cause the signal level to drop. This can be quite significant at higher frequencies. And if there is any noise (e.g. from atmospheric conditions or electrical equipment) or interference (from other radios nearby), the radio needs a stronger signal to hear the data well.

 

Under typical conditions, for radios using modulation types such as CPM (continuous phase modulation), the wireless industry has generally specified a fade margin of at least 20 dB to provide a reliable path. For example, if a system is operating at 24 kbps over-the-air, and the sensitivity at that data rate is -107 dBm, all paths must be designed such that the actual received signal level is -87 dBm or higher.
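
To make that arithmetic concrete, here is a quick Python sketch of the fade margin check using the example figures above (the function names and the 20 dB constant are just illustrative, not from any particular path design tool):

```python
# Fade margin check using the example figures above (illustrative only).
REQUIRED_FADE_MARGIN_DB = 20.0  # common industry rule of thumb

def fade_margin_db(received_dbm: float, sensitivity_dbm: float) -> float:
    """Fade margin = received signal level minus receiver sensitivity."""
    return received_dbm - sensitivity_dbm

def path_is_reliable(received_dbm: float, sensitivity_dbm: float) -> bool:
    """True if the path meets the 20 dB fade margin rule of thumb."""
    return fade_margin_db(received_dbm, sensitivity_dbm) >= REQUIRED_FADE_MARGIN_DB

# 24 kbps example: sensitivity is -107 dBm, so -87 dBm received gives exactly 20 dB.
print(fade_margin_db(-87.0, -107.0))    # 20.0
print(path_is_reliable(-92.0, -107.0))  # False (only 15 dB of margin)
```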

 

When testing wireless paths using software such as Pathloss or Radio Mobile, if the required fade margin is not achieved, the designer may choose any of several options:

  • Increase antenna height to get the antenna above obstacles
  • Select antennas with higher gain
  • Reduce the radio data rate to get better receiver sensitivity
  • Add a repeater between the sites

 

Background information:

Sensitivity of a new data radio is tested at the factory by pushing a constant data stream through a pair of radios in a carefully controlled lab environment. The data stream entering one radio is compared with the data stream exiting the other radio as the signal attenuation between the two is increased. Eventually the signal becomes weak enough that errors begin to appear. The software tool tracks these errors, and the test ends when it finds that (on average) 1 bit in a million is wrong. The signal level at this point is stated as the radio’s sensitivity at that data rate.
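
As a rough sketch of the pass/fail arithmetic in that test (the bit comparison here is simplified; real test sets stream data continuously):

```python
# Simplified bit error rate (BER) check, mirroring the sensitivity test above.
BER_THRESHOLD = 1e-6  # "1 bit in a million"

def bit_error_rate(sent_bits, received_bits) -> float:
    """Fraction of received bits that differ from the bits that were sent."""
    errors = sum(1 for s, r in zip(sent_bits, received_bits) if s != r)
    return errors / len(sent_bits)

# Example: 1 flipped bit in 2,000,000 -> BER of 5e-7, still within the threshold.
sent = [0] * 2_000_000
received = sent.copy()
received[12_345] = 1
print(bit_error_rate(sent, received))                   # 5e-07
print(bit_error_rate(sent, received) <= BER_THRESHOLD)  # True
```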

 

A “1 bit in a million” bit error rate is typically stated as a 1×10⁻⁶ BER. Note that some manufacturers have in the past stated the radio’s sensitivity at a 1×10⁻⁴ BER. This is only 1 bit in 10,000 instead of 1 in a million, so the sensitivity appears better. Not really a fair comparison.

 

A value of 0 dBm is 1 mW. Transmit power is typically specified in positive numbers, e.g. +40 dBm is 10 watts. But receive power (coming in from the antenna) is measured in negative numbers. For example, -70 dBm is a common value. For each 10 dB decrease in received signal level, the power in milliwatts is reduced by a factor of 10. So -70 dBm is a very tiny amount of power...  0.0000001 of a milliwatt, or 100 picowatts !!
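
And for completeness, the dBm/milliwatt conversion itself as a small Python sketch (standard formulas, nothing radio-specific):

```python
import math

def dbm_to_milliwatts(dbm: float) -> float:
    """Convert a power level in dBm to milliwatts."""
    return 10 ** (dbm / 10)

def milliwatts_to_dbm(milliwatts: float) -> float:
    """Convert a power level in milliwatts to dBm."""
    return 10 * math.log10(milliwatts)

print(dbm_to_milliwatts(0))     # 1.0 mW
print(dbm_to_milliwatts(40))    # 10000.0 mW, i.e. 10 watts
print(dbm_to_milliwatts(-70))   # 1e-07 mW, i.e. 100 picowatts
print(milliwatts_to_dbm(1e-7))  # -70.0
```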

 

Joel Weder
Telemetry Solutions Specialist
Schneider Electric
Administrator

Re: Fade Margin in Data Radios

Great and useful info, thanks for sharing 🙂