NIST researchers studied the effects of trees on millimeter waves, which are planned for use in 5G communication. Credit: N. Hanacek/NIST
This is the problem confronting cell network designers, who must embrace both the benefits and the shortcomings of a new class of signals that 5G will use: millimeter waves. Not only can these waves carry more information than conventional transmissions do, but they also occupy a portion of the broadcast spectrum that communication technologies seldom use, easing a major concern in an age when broadcasters vie for portions of spectrum like prospectors staking out territory.
However, millimeter waves also have drawbacks, including a limited ability to penetrate obstacles. Those obstacles include buildings, but also the trees that dot the landscape. Until recently, little was known about how trees affect millimeter-wave propagation. And just as few of us would want to imagine a landscape without greenery, few designers could plan a network around it without that fundamental detail.
The National Institute of Standards and Technology (NIST) has set out to solve this problem by measuring trees’ effect on millimeter waves. The effort could make a profound difference in our next-generation devices’ ability to see the 5G antennae that may soon sprout.
The 5G era will feature wireless communication not only between people but also between devices connected to the Internet of Things. Demand from cell customers for faster downloads, and from gamers for lag-free network response, has spurred the wireless industry to pursue speedier, more effective communication. Not only could our current devices and services work more effectively, but new ones could become possible: Autonomous vehicles, for example, will depend on quick network response to function.
“We will be able to do new things if our machines can exchange and process information quickly and effectively,” said Nada Golmie, head of NIST’s Wireless Networks Division in the Communications Technology Laboratory. “But you need a good communication infrastructure. The idea is to connect, process data in one place and do things with it elsewhere.”
Millimeter waves, which are new turf for the wireless industry, could be part of the solution. Their wave crests are just a few millimeters apart — a very short distance compared with radio waves, which can be several meters long. Their frequencies are very high, between 30 and 300 gigahertz, or billions of wave crests per second. Compared with conventional radio transmissions, which occupy the kilohertz (AM) and megahertz (FM) ranges, new 5G signals will be very high frequency indeed — something like a bird tweeting at the upper range of human hearing compared with radio’s deep, low bass.
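As a quick back-of-the-envelope check (not part of the NIST study), the standard relation wavelength = speed of light ÷ frequency shows why this band earns its name; the short Python sketch below uses only the 30-to-300-gigahertz range quoted above.

```python
# Minimal sketch: why 30-300 GHz signals are called "millimeter" waves.
# wavelength = c / f, where c is the speed of light in a vacuum.
C_M_PER_S = 299_792_458  # speed of light, meters per second

for f_ghz in (30, 300):                       # band edges quoted above
    f_hz = f_ghz * 1e9                        # gigahertz -> hertz
    wavelength_mm = C_M_PER_S / f_hz * 1000   # meters -> millimeters
    print(f"{f_ghz:>3} GHz -> crests about {wavelength_mm:.0f} mm apart")
# Prints roughly 10 mm at 30 GHz and 1 mm at 300 GHz.
```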
It is millimeter waves’ high frequency that makes them both tantalizing as data carriers and hard to harness. On the one hand, more wave crests per second means the waves can carry more information, and our data-hungry era craves that capability for faster downloads and network responses. On the other, high-frequency waves have trouble traveling through obstructions. Anyone who has passed near a house or car whose occupants are playing loud dance music knows that the throbbing bass frequencies are most of what reaches the outdoors, not the treble of a lilting soprano.
For 5G networks, the obstructing wall can be no more than an oak leaf. For that reason, NIST scientists embarked on a somewhat unusual task in September 2019: They set up measurement equipment near trees and shrubs of different sizes around the agency’s Gaithersburg, Maryland, campus. The study continued for months, in part because they needed seasonal perspective.
“The tree study is one of the few out there that looks at the same tree’s effect on a particular signal frequency through different seasons,” Golmie said. “We couldn’t only do the survey in the winter, because things would have changed by summer. It turns out that even the shape of leaves affects whether a signal will reflect or get through.”
The team worked with the wireless community to develop the mobile equipment needed to take the measurements. The researchers focused it on single trees and aimed millimeter-wave signals at them from a range of angles and positions, to simulate waves coming from different directions. They measured the loss, or attenuation, in decibels. (Each 10 dB of loss is a reduction by a factor of 10; a 30 dB attenuation means the signal is reduced by a factor of 1,000.)
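To make that decibel arithmetic concrete, here is a minimal sketch (illustrative only, not code from the study) of the standard conversion from dB of loss to a linear power ratio:

```python
# Standard conversion: a loss of x dB divides signal power by 10^(x / 10).
def db_loss_to_factor(db: float) -> float:
    """Return the factor by which signal power is reduced."""
    return 10 ** (db / 10)

print(db_loss_to_factor(10))  # 10.0   -> one-tenth of the power survives
print(db_loss_to_factor(30))  # 1000.0 -> one-thousandth survives
```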
For one type of leafy tree, the European nettle tree, the average attenuation in summer was 27.1 dB, dropping to 22.2 dB in winter when the tree was bare. Evergreens blocked more of the signal: their average attenuation was 35.3 dB, a number that did not change with the season.
(For comparison, the team also looked at different types of building materials. Wooden doors, plasterboard walls and interior glass showed losses of up to 40.5 dB, 31.6 dB and 18.1 dB, respectively, while exterior building materials exhibited even larger losses, up to 66.5 dB.)
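Applying that conversion to the figures quoted above gives a feel for how little power survives each obstacle; the snippet below simply reuses the numbers reported in this article.

```python
# Illustrative only: linear power reduction implied by the quoted dB losses.
quoted_losses_db = {
    "European nettle tree, summer": 27.1,
    "European nettle tree, winter": 22.2,
    "Evergreen, year-round": 35.3,
    "Wooden door (up to)": 40.5,
    "Plasterboard wall (up to)": 31.6,
    "Interior glass (up to)": 18.1,
    "Exterior materials (up to)": 66.5,
}

for obstacle, db in quoted_losses_db.items():
    print(f"{obstacle}: power cut by a factor of ~{10 ** (db / 10):,.0f}")
# e.g., 27.1 dB is a factor of ~513; 66.5 dB is a factor of ~4.5 million.
```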
While NIST’s contributions to the 5G network development effort could end up as ubiquitous as trees themselves, for most of us they will be considerably less visible. The measurements the team made are intended mainly for companies that create models of how different objects affect millimeter waves. Part of the effort was a collaboration with Ansys Inc., which used the measurement data NIST shared to tune its tree simulation models, the kind cell companies use to plan out their networks of antennas in detail.
“Most models don’t include measurement-based information about trees,” said NIST’s David Lai, one of the scientists who conducted the study. “They might simply say that for a given tree-like shape, we should expect a certain amount of signal loss. We want to improve their models by providing accurate measurement-based propagation data.”
NIST’s collaboration with Ansys contributed to guidance issued by the International Telecommunication Union (ITU), the organization that creates guidelines for telecom standards. The results now appear as a new section on trees in ITU’s Recommendation ITU-R P.833-10, a publication that serves as a reference for those who develop signal propagation models.
“Our goal is to get these measurements in front of the entire wireless community,” Golmie said. “We hope this effort will help the entire marketplace.”
Source/Credit: National Institute of Standards and Technology