Can You Hear Me? How Antenna Choice and Spectrum Allocation Determine Cellular IoT Success


By Jamie Moss | 3Q 2019 | IN-5536

Different bandwidth allocations are differently suited to cellular Low Power Wide Area (LPWA) services for the Internet of Things (IoT). In the interests of customer experience and quality of service, carriers should consider specific terminal equipment and network planning issues, to make the best use of their radio spectrum assets.



A Logical Assertion


Low Power Wide Area (LPWA) radio technologies exist to solve cost, power consumption, and coverage problems. All three features are integral to LPWA, and the latter two are definitional. Here we consider the third: coverage. Improved coverage allows for better in-building and underground penetration over the widest possible area, for a more reliable, business-critical service. Narrowband Internet of Things (NB-IoT)’s three enhanced coverage modes allow an extra 20 decibels (dB) of link budget, while LTE-M’s two modes provide up to 15 dB. These modes are transmission repetition techniques that, when activated, work at the expense of throughput, latency, and power consumption: throughput is reduced, while latency and power consumption are increased.
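As a rough illustration of what that extra link budget buys, a log-distance path-loss model (a common textbook simplification, not from the original text) says received power falls by 10·n·log10(d) for path-loss exponent n, so an extra Δ dB of link budget multiplies range by 10^(Δ/10n). The exponent values below are assumed for illustration:

```python
import math


def range_extension_factor(extra_link_budget_db: float,
                           path_loss_exponent: float) -> float:
    """Range multiplier from extra link budget under a log-distance
    path-loss model, where loss grows by 10*n*log10(distance)."""
    return 10 ** (extra_link_budget_db / (10 * path_loss_exponent))


# Assumed illustrative exponents: ~2 in free space, ~3.5 in dense urban clutter.
for tech, gain_db in [("NB-IoT", 20.0), ("LTE-M", 15.0)]:
    urban = range_extension_factor(gain_db, 3.5)
    print(f"{tech}: +{gain_db:.0f} dB -> ~{urban:.1f}x range in urban clutter")
```

Under these assumptions, NB-IoT’s 20 dB buys roughly a 3.7x range gain in urban clutter, which is why the modes are so effective at reaching basements and meter cupboards, and also why they cost so much repetition to achieve.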

A logical assertion is that carriers with sub-gigahertz spectrum are the most naturally placed to offer cellular LPWA services; carriers with the lowest frequencies will be able to do so most effectively at the network level, and carriers with a single, larger frequency block that do not need to piece together multiple spectrum allocations will be able to do so most effectively at the terminal equipment level. NB-IoT has 28 defined deployment bands, while LTE-M has 29. Of these, LTE-M has 12 sub-gigahertz bands, not including duplicate sub-bands, while NB-IoT has 11. This is according to the latest 3rd Generation Partnership Project (3GPP) Technical Specification (TS 36.101).

Not All Carriers Are Equal


Some carriers are predisposed to achieving greater success than others in developing their LTE-M and NB-IoT connection business. Carriers that own sub-gigahertz frequencies can achieve greater coverage with less expense, creating a larger total addressable market of enterprises to sell to and use cases to serve, especially if their spectrum allocation is lower than that of their rivals. However, there then exists the challenge of correctly providing customers with access to their networks through mobile transceivers that are properly tuned. All Industrial, Scientific, and Medical (ISM) band-based Sigfox and LoRaWAN connections can reliably work with the same single-band antenna, creating a level of transmit/receive reliability that cellular does not have, bandwidth contention notwithstanding. This leaves a further question of what to do about the mid-band frequency allocations for the cellular IoT, which constitute the majority of all the bands standardized by the 3GPP for use by LTE-M and NB-IoT.

Effectiveness and efficiency permeate the IoT stack, from end users to their technology suppliers. For each actor to benefit, wastage of their respective resources must be minimal, while percentage margins and volume gains must be maximized. Carriers with lower cellular IoT frequencies have signals that will propagate further, requiring fewer cell towers (properly known as eNodeBs) for geographic coverage. This is important because the low revenue that will be generated per cellular LPWA connection requires LPWA transceivers to reuse existing mobile broadband cell tower locations rather than require additional infill sites of their own, and to piggyback on as few of the existing sites as possible, so as to necessitate the minimum number of cell tower upgrades and, ergo, the least possible expense.
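The frequency-versus-tower-count trade-off can be made concrete with the standard free-space path loss formula, under the simplifying assumption of equal link budgets and free-space scaling (real deployments see even stronger effects from clutter). The band pairing below (~700 MHz versus ~2,100 MHz) is chosen purely for illustration:

```python
import math


def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB for distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44


def relative_tower_count(freq_a_mhz: float, freq_b_mhz: float) -> float:
    """Approximate ratio of towers needed for coverage at freq_a versus
    freq_b: with equal link budgets, cell radius scales as 1/f, so covered
    area scales as 1/f^2 and tower count scales as f^2."""
    return (freq_a_mhz / freq_b_mhz) ** 2


print(f"FSPL at 10 km, 700 MHz:  {fspl_db(10, 700):.1f} dB")
print(f"FSPL at 10 km, 2100 MHz: {fspl_db(10, 2100):.1f} dB")
print(f"Towers needed at 2100 MHz vs 700 MHz: ~{relative_tower_count(2100, 700):.0f}x")
```

Tripling the frequency costs roughly 9.5 dB of path loss at any given distance, and under this model a 2,100 MHz network would need on the order of nine times as many sites as a 700 MHz network for the same geographic coverage, which is exactly the economics the paragraph above describes.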

For terminal equipment, antennas for low frequencies are the hardest to design and the most expensive to use while, for multi-band, low-frequency networks, multiple antennas are required, further increasing terminal equipment costs. A dangerous temptation here is for module manufacturers to use general-purpose wideband antennas instead. This is risky because wideband antennas are less effective at accurately resolving receptions for specific frequencies, especially at the edge of a given band. This lack of tuning can cause connectivity reliability issues for the end user, necessitating the more frequent use of enhanced coverage modes, which more rapidly diminishes battery life. This makes wideband antennas a false economy. For mid-band frequencies, which are less ideal for offering cellular LPWA, antenna design is ironically easier and cheaper, with general-purpose wideband antennas being adequate.

A Question of Reception


Carriers need to be careful to ensure that enterprises do not have sub-optimal experiences on their cellular LPWA networks. Patchy coverage and poor reliability during Proof of Concept (PoC) exercises must be avoided. Such results could easily cause potential customers to choose different LPWA technologies or network operators, or to abandon their IoT connectivity plans altogether due to uncertainty and/or a lack of confidence. Consequently, it may not be enough for modules to simply be certified to work on a carrier’s network; it may be necessary to ensure that modules use antennas that are deliberately optimized for the LTE-M and NB-IoT frequency bands that each carrier supports.

General-purpose broad-spectrum antennas may struggle to perform effectively at lower frequency ranges. Still, the propagation characteristics of those same frequencies make them best suited to supporting LPWA device requirements. The carriers that use sub-gigahertz ranges for cellular LPWA should be best equipped to serve the market with the greatest ease. They must not hamstring themselves by offering enterprises terminal hardware that fails to do justice to the networks and spectrum in which they have invested so much time and money. To this end, carriers must work closely with module vendors to precisely match end user equipment performance to their networks. Module vendors must in turn carefully choose their antenna component suppliers, as well as the module models that they present to each carrier for certification.

Carriers operating cellular LPWA networks in the gigahertz range will be easily served by a greater variety of broad-spectrum modules, but must pay careful attention to network coverage. Gigahertz spectrum is less ideal for supporting LTE-M and NB-IoT, because the higher the frequency, the shorter the distance a signal travels. For these carriers, network planning for effective cellular LPWA services is vital: where do carriers need coverage, and by when must it be in place? Carriers may need to decide to target only urban and suburban opportunities, even though actively assessing individual opportunities is a far more difficult and intensive exercise than operating a nationwide network of universal utility; this is why Vodafone has taken a nationwide-or-nothing stance for its NB-IoT availability. So, while LTE-M’s and NB-IoT’s enhanced coverage features are a powerful fallback, they should be held in reserve as much as possible so as not to compromise the other definitional quality of the technology: low power consumption. The correct high-quality terminal antenna choices will help with this.