Wireless performance and hot air

This post was prompted by a Twitter exchange started last night with Glenn Fleishman on the subject of how heat affects wireless communications. His initial tweet was:

I’m not getting the love. How does high temperature prevent Wi-Fi signal propagation?

followed by others,

Ok, the stupid “temperature affects Wi-Fi” story? I read the report. The report is talking about all tower-based wireless comm (1/2)

and it says that heat could affect transmission distance (implies this), but also that heat might damage towers/backhaul (2/2)

It referenced an article in The Telegraph which quotes Caroline Spelman, the UK’s Environment Secretary, saying:

The signal from wi-fi cannot travel as far when temperatures increase. Heavy downfalls of rain also affect the ability of the device to capture a signal.

The first statement is misleading and even FUD-like, while the second is true – I have personally witnessed a statewide TETRA network go down (over 200 base stations) when a heavy thunderstorm sat on top of the SwMI (the network’s central switching and management infrastructure), with rain curtains killing all outbound wireless links (hint: star topologies with a central switching node are a no-no).

Quoting the report itself, four things are notable:

The first being:

Location/density of wireless masts may become sub-optimal as wireless transmission is dependent on temperature.

Cellular service and hot air

The phrase above, in my view, points at wireless services that rely on frequency re-use as part of their inherent design, such as cellular telephony and TETRA. These networks are called cellular for a reason: they are designed as individual cells whose shape and coverage fit together like pieces of a puzzle. A graphic explains this much better:

Here, we are using four frequencies to serve a total of eight cells. As can be seen, no neighboring cells use the same frequency, so spectrum usage efficiency is doubled in this particular case. In designing cellular networks, one can increase the density of cells by decreasing their coverage, and thus increase the capacity of the network. In a city it’s common to find cells covering 300 m, whereas in rural areas I’ve been registered on cells as far as 27 km away.
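
To put rough numbers on that trade-off, here is a minimal back-of-the-envelope sketch in Python (the coverage area and per-cell capacity figures are my own assumptions, not taken from the post or the report) showing how shrinking the cell radius multiplies the number of cells, and hence the capacity, that the same handful of frequencies can serve:

    import math

    def cells_needed(area_km2, cell_radius_km):
        """Approximate number of hexagonal cells needed to cover an area."""
        cell_area = 3 * math.sqrt(3) / 2 * cell_radius_km ** 2  # area of one hexagonal cell
        return math.ceil(area_km2 / cell_area)

    AREA_KM2 = 100.0      # assumed area to cover
    USERS_PER_CELL = 50   # assumed simultaneous users one cell can carry

    for radius_km in (27.0, 5.0, 0.3):  # rural, suburban, dense urban radii
        n = cells_needed(AREA_KM2, radius_km)
        print(f"radius {radius_km:5.1f} km -> {n:4d} cells -> {n * USERS_PER_CELL} users")

Capacity grows roughly with the inverse square of the cell radius, which is why operators keep shrinking cells where demand is dense.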

The coverage of each cell is configured by careful handling of parameters such as power output and sectorization using panel antennas. The following graphic illustrates how sectorization and power management help shape a cell’s coverage (source: Washington University in St. Louis).
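
To illustrate the power side of that shaping, here is a small sketch using a generic log-distance path-loss model (the 900 MHz carrier, -100 dBm handset sensitivity and propagation exponent are assumptions for illustration, not real planning figures):

    import math

    def cell_edge_km(tx_power_dbm, rx_sensitivity_dbm, freq_mhz, exponent=3.5):
        """Distance at which the received level drops to the handset sensitivity,
        assuming free-space loss out to 1 km and a log-distance falloff beyond it."""
        loss_at_1km_db = 32.44 + 20 * math.log10(freq_mhz)  # free-space loss at 1 km, in dB
        max_loss_db = tx_power_dbm - rx_sensitivity_dbm     # loss budget the link can absorb
        return 10 ** ((max_loss_db - loss_at_1km_db) / (10 * exponent))

    # Assumed figures: 900 MHz carrier, -100 dBm handset sensitivity.
    for tx_dbm in (43, 37, 30):  # roughly 20 W, 5 W and 1 W
        print(f"{tx_dbm} dBm -> cell edge at ~{cell_edge_km(tx_dbm, -100, 900):.1f} km")

Backing off a few dB of transmit power pulls the cell edge in noticeably, which, combined with sectorized antennas, is how the coverage puzzle pieces are cut to shape.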

With all this in mind, how could heat possibly affect the arrangement of cells in a way that makes their location and density “sub-optimal”? Take a look at this:

The heat output from the chimney is causing visible light to refract, as the density of the air above it changes with temperature. This is an extreme example, but it illustrates the idea behind the report’s claims. Visible light is simply electromagnetic radiation at a much higher frequency than radio, so if it can be affected by air density, so can lower-frequency wireless signals. To make the point clear:

Heat doesn’t affect electromagnetic radiation (wireless signals); density does.
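
To tie the density argument to numbers, radio engineers express the effect through radio refractivity, which depends on pressure, temperature and humidity. Below is a minimal sketch using the commonly cited ITU-R P.453 expression (the pressure, humidity and temperatures are illustrative values I picked, not measurements):

    def radio_refractivity(pressure_hpa, temp_c, vapour_pressure_hpa):
        """Radio refractivity N in N-units; the refractive index of air is n = 1 + N * 1e-6."""
        t_kelvin = temp_c + 273.15
        return 77.6 / t_kelvin * (pressure_hpa + 4810.0 * vapour_pressure_hpa / t_kelvin)

    # Same pressure and humidity, two air temperatures (illustrative values).
    for temp_c in (15.0, 35.0):
        n_units = radio_refractivity(1013.0, temp_c, 10.0)
        print(f"{temp_c:4.1f} C -> N = {n_units:5.1f} (n = {1 + n_units * 1e-6:.6f})")

Hotter, less dense air has a lower refractivity, and it is the resulting gradient between air layers, not the temperature itself, that bends a signal’s path.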

A very interesting observation can be found in Rob Flickenger’s Wireless Hacks, published by O’Reilly:

This displays radio data for a one-mile link, averaged over several days. You can see that in the middle of each day, the signal drops by as much as 6 dB, while the noise remains steady … The repeating pattern we see indicates the effect of thermal fade.

What they did was log the signal strength of a one-mile wireless link over several days, and they noticed that during periods of higher temperature (and thus lower air density) the signal strength dropped considerably.
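
For a sense of what a 6 dB fade means in linear terms (plain dB arithmetic, not data from the book), a quick conversion:

    def db_change_to_power_ratio(db):
        """Convert a change expressed in dB to a linear power ratio."""
        return 10 ** (db / 10)

    for drop_db in (3, 6, 10):
        ratio = db_change_to_power_ratio(-drop_db)
        print(f"a {drop_db} dB fade leaves {ratio:.0%} of the received power")

A 6 dB midday fade therefore means the link is working with roughly a quarter of the power it enjoys at night.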

Thermal fading could end up being a problem in cellular networks in two ways. First, refraction by density variations could bend the path of wireless signals, causing trouble not through changes in range but through co-channel interference. Second, decreased air density due to increased temperature could open coverage ‘holes’ at the edges of cells; if those are compensated for simply by increasing output power, the extra power could then cause interference with other cells as the air density shifts again.
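
A toy carrier-to-interference calculation makes the second point concrete (the received levels below are invented for illustration; real networks juggle many interferers, handover margins and power control loops):

    # Assumed received levels at a handset sitting in a neighboring re-use cell.
    wanted_dbm = -90.0    # signal from its own serving cell
    leakage_dbm = -110.0  # co-channel leakage from our cell on a normal day

    print(f"Before the boost: C/I = {wanted_dbm - leakage_dbm:.0f} dB")

    # Our cell raises its power by 6 dB to patch a heat-induced coverage hole;
    # the leakage into the re-use cell rises by the same 6 dB.
    print(f"After a +6 dB boost: C/I = {wanted_dbm - (leakage_dbm + 6):.0f} dB")

The hole in our cell gets patched, but the re-use cell’s carrier-to-interference margin quietly drops from 20 dB to 14 dB.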

Let’s move on to the second one:

Reduced stability of foundations and tower structures.

Frankly, having studied materials engineering, I fail to see how an increase of even a few degrees could compromise a tower structure.

Third:

Increased damage to above ground transmission infrastructure.

This ties in with the previous one, the only obvious issue being the increased ventilation requirements for wireless infrastructure. They have cellular networks in Dubai, which is way hotter than the UK will ever be…

And finally:

Possible reduced quality of wireless service.

In turn, this ties in with the first issue – the fading problems discussed above can indeed affect the quality of cellular networks.

However

Your home Wi-Fi will NOT be affected by thermal fading significantly enough for you to notice. Temperature gradients inside a home are not capable of bending your Wi-Fi signal towards the neighbor and away from your laptop.

The report, while mentioning possible issues in wireless services, never returns to them, and fails to provide any deeper analysis of the issues listed, or possible solutions.

Finally, Glenn later said:

Range, not re-use.

The problem, in my view, is with systems that re-use frequencies. On a single-site system, you can just increase power to compensate for any losses due to density variations, and you’re done. On a cellular system, you simply cannot do that. I feel Glenn acknowledges this when he tweets:

@alfwatt Anyway, she said Wi-Fi, but the report is talking about cellular infrastructure.

Cheers!

2 Responses to “Wireless performance and hot air”

  1. BubuXP May 8, 2012 at 14:18 #

    Summer is coming and I was asking myself whether the higher temperatures could affect the range of my wireless router.
    I assumed that higher temperatures increase the range, because years ago in Sicily I could pick up Libyan and Egyptian TV signals with my TV. And this happened only on the hottest days, mainly in summer (I don’t know if it still happens, because I don’t have a TV anymore).
    How was that possible, if high temperatures are not supposed to affect radio signals directly, but instead just make the air less dense?

    • Mike May 8, 2012 at 14:43 #

      The effect you observed is called tropospheric ducting, which takes place at the lower frequencies (VHF and UHF) used by TV and consists (in simple terms) of radio waves being bent back towards the ground by layers in the troposphere. This can happen on hot days, but it also depends on other factors – here you can see forecasts for Europe:

      http://www.dxinfocentre.com/tropo_eur.html

      The fact that TV stations often transmit at kilowatts of power, versus the few watts used in cellular networks, also helps get their signals further.
