Competition vs. Complementarity
White Paper by Georg Serentschy
13 January 2020
DISCLAIMER: This document does not claim to give a complete and detailed picture of the ongoing discussion. Rather, essential discussion and conflict areas are briefly described in order to provide an overview of the state of the debate with the least bias possible. I am grateful for the advice and input offered by various subject matter experts while preparing this paper.
This document describes some basic challenges for both the WiFi community and the mobile industry arising from the use of the 6 GHz band. It also discusses why WiFi does not play a significant role in the 60 GHz band. Topics such as the advantages and disadvantages of WiFi6 and 4/5G under illustrative usage scenarios, the role of latency times and the degree of reality of marketing promises are also addressed. One of the challenges lies in the fact that most of the literature is more or less biased because authors often act as advocates for either the mobile industry or the WiFi community. Ultimately it is a battle over spectrum, a scarce resource.
For all stakeholders in this area, critical fields of action and regulatory challenges lie ahead:
The process of allocating the 6 GHz frequency band to unlicensed radio local area networks (RLANs) is the center of the regulatory debate. Although various stakeholders contribute to this process, the European Commission is the entity responsible for allocating spectrum in the European Union.
The technical work to evaluate the feasibility of the allocation is carried out by the Electronic Communications Committee (ECC) under the European Conference of Postal and Telecommunications Administrations (CEPT). This work was mandated in December 2017.
Results relating to the first task of the mandate were published in May 2019 as ECC Report 302.1
Results on the second task, covering the development of harmonized technical conditions, are expected to be published during the first half of 2020. This report will be critical for the coexistence of WiFi and 5G. The final report will serve as the basis for the European Commission’s implementing decision, which will define the conditions for the usage of the 5925–6425 MHz frequency band under WiFi and other unlicensed technologies.
The high level at which decisions are to be taken indicates how diverse and challenging the policy and regulatory issues facing the various stakeholder groups are, as well as how important expert discussions are for helping all stakeholders to navigate reasonably safely through these issues.
To sum up, 5G and WiFi6 are complementary, but they are not the same thing. There is some convergence of technology features, which is the reason why this white paper is titled “WiFi Goes Cellular”. Both are key wireless developments, each of them serving specific segments of the market, but with some overlap. WiFi6, as the important development under user control, is all about greater efficiency, more spectrum, lower power consumption and faster data transmission.
In this field, many issues are a “work in progress”, and we can expect dynamic development as the standardization process moves forward.
IEEE, the Institute of Electrical and Electronics Engineers, is the standard development organization (SDO) that creates and maintains the WiFi standard, initially named 802.11. The standardization of WiFi6 is expected to be completed within the first six months of 2020 (if there are no major political upheavals).
Standard            Theoretical speed (max)    Frequency bands
802.11              1 and 2 Mbps               2.4 GHz
802.11b             1, 2, 5.5 and 11 Mbps      2.4 GHz
802.11n (WiFi4)     Up to 600 Mbps             2.4 GHz + 5 GHz
802.11ac (WiFi5)    Up to 6.93 Gbps            5 GHz
802.11ax (WiFi6)    Up to 9.6 Gbps             2.4 GHz + 5 GHz + 6 GHz
At the center of this is the question of how the coexistence of mobile networks and WiFi can be achieved. Coexistence has several dimensions (temporal, spatial and spectral), which should also be addressed in the course of standardization. The problem of coexistence has been exacerbated by the mobile industry’s desire to operate 5G in license-free frequency bands. Mobile technologies such as LTE LAA2 or more generally LTE U3 (for more details, see Appendix I) must comply in Europe with the “listen before talk” (LBT) principle, as they share the existing license-free 5 GHz band with WiFi. This means that LTE LAA and LTE U are not primary users of the spectrum and must therefore comply with the LBT principle.
Since WiFi is massively used at global scale, there is an urgent demand for additional frequency bands to expand deployment of such license-exempt technologies. But coexistence in the 6 GHz band is more complicated because it is a new license-free frequency band that, without existing incumbent usage, may be used by a variety of user groups under various technologies. Due to this fact, the LBT principle cannot simply be transferred to the 6 GHz band.
A prominent example of the conflict between the mobile industry and the WiFi community is seen in connected and autonomous driving. The European Commission (EC) and parts of the automotive industry have long supported WiFi as a rapidly available and proven communication platform.4 Germany and other Member States of the EU rejected plans of the European Commission to use WiFi (standard 802.11p) as a communication platform for autonomous driving and are in favor of 5G instead. This resistance ultimately forced the EC into a U-turn.5
Dedicated Short Range Communication (DSRC), in Europe called ITS-G5, is based on the automotive-specific WLAN standard IEEE 802.11p. The WiFi Community argued that mobile networks were not developed for independent communication between user devices without an overarching network. ITS-G5 is a standard for vehicle networking based on the WLAN standard IEEE 802.11p aka pWLAN and works in the 5.9 GHz band. IEEE 802.11p was specified in 2010 and is based on the outdated IEEE 802.11a WLAN specification. Concepts such as Vehicle-to-Everything (V2X), Car2X communication (C2X) and Cooperative Intelligent Transport Systems (C-ITS) envisage networked cars as communicating with each other, with other road users and with traffic technology in their immediate environment.
However, WLAN technology has some serious disadvantages for this scenario. IEEE 802.11p bridges only a few hundred meters and unlike mobile communications technology, there is no overarching network infrastructure to connect road users to a central control unit. This would have to be set up specifically for this purpose. Because the braking distance for vehicles on the road can be longer than the range of the WiFi radio signals, the use of IEEE 802.11p would in any case be limited.
Cellular-V2X (C-V2X) is based on the LTE standard and, in future, on 5G. The “mobile technology camp”, on the other hand, relies on LTE C-V2X and sees more disadvantages in the WiFi-based ITS-G5. Compared with the older ITS-G5 standard based on IEEE 802.11p, LTE C-V2X is a more modern alternative for communication between vehicles and is considered more powerful in terms of range and availability. C-V2X builds on an already established and widely used technology that enables both network-based and direct communication (LTE device-to-device). Worldwide, the tendency is toward the newer C-V2X.
Interestingly, the car manufacturers (OEMs) are not aligned with each other: VW, for example, planned to equip the new Golf with WLAN technology; BMW Group, Daimler, Audi, PSA and Ford, on the other hand, favor C-V2X. The Chinese government is also backing C-V2X and from 2025 wants to allow only vehicles with the corresponding technology on board. The telecommunications industry, however, stands united behind C-V2X.
In applications such as the provision of connectivity in a sports stadium with 50,000 to over 100,000 visitors, almost every visitor today is equipped with a smartphone. As an alternative to the traditional cellular connectivity, which would pose a great challenge in such a scenario, a WiFi6 network operated for example by the sports stadium (or the sports club) would be an interesting alternative, not least because of the attractive additional revenue streams for the operator. Possibilities here include entertainment, advertising, sports betting, a video-supported view of the pitch, augmented reality applications, offloading mobile connectivity through distributed antenna systems (DAS) and many more.
A few years ago, the mobile industry hoped that femtocells would provide the solution to the capacity problems in mobile networks. These are small mobile radio stations that are connected via the wired internet and designed to enable small-scale, broadband coverage with UMTS/HSDPA and HSUPA locally. In Germany, Vodafone and Telekom introduced femtocells in 2013 but switched them off four years later. In the meantime, both companies are focusing on offloading data traffic from mobile networks via fixed-network customers’ WiFi access points (“WiFi offloading”).
WiFi offers high local data rates, but only within a very limited range. Traditionally, mobile telephony had lower data rates, but wider coverage at a higher price. According to the Cisco Visual Networking Index, about two-thirds of wireless data traffic is transmitted via WiFi and only one-third via mobile networks.6 This is no surprise, as mobile internet via mobile providers is often more expensive (roaming outside of the EU is also an issue for most users) as well as congested, and often supports only limited volumes, as only a fraction of customers enjoy truly unlimited data plans.
Mobile networks are now poised to catch up in volume and, with 5G, are about to increasingly adopt the WiFi concept of very small radio cells with wide transmission channels – albeit at a higher technical and financial cost.
With the introduction of any new technology, laboratory results achieved under ideal test conditions are often converted into full-bodied marketing promises that give rise to high expectations; this is the case here, too. Overly optimistic generalizations, driven by marketing concerns, have been made not only in relation to the data transmission rates but also latency levels.
Up to 400 MHz of modulation bandwidth is provided for 5G data transmission over super high frequency bands (SHF, or centimeter waves)7 and extremely high frequency bands (EHF, or millimeter waves).8 Though both can cover only very short distances, the large bandwidth available for millimeter and centimeter waves does deliver capacity. Yet, due to the short physical ranges and quasi-optical propagation properties of these frequency bands, massive technical effort, including highly efficient transmission systems with sharply bundled antennas, will be needed to achieve a range of only a few hundred meters at acceptable transmission rates. Extremely high frequency ranges are therefore not suitable for covering wide areas. The complex relationship between capacity, range and frequency is often summed up in a simplified formula: spectrum below 1 GHz is often called “coverage spectrum” (able to effectively cover large areas with large cells) and spectrum above 1 GHz “capacity spectrum”, able to transmit more data in a given time using small cells. The boundary between the two is not exactly defined. This simplified rule should give pause to anyone who hopes to cover underserved areas with capacity spectrum.
WiFi currently uses license-free spectrum within the 2.4 GHz, 5 GHz and 60 GHz bands. The corresponding wavelengths are 12 cm, 6 cm and 0.5 cm. Due to the very short range of millimeter waves, hardly any WLANs are operated in the 60 GHz band.
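The wavelengths quoted above follow directly from the relation λ = c/f. A minimal sketch of the conversion, using the three WiFi bands named in the text:

```python
C = 299_792_458  # speed of light in m/s


def wavelength_cm(freq_ghz: float) -> float:
    """Free-space wavelength in centimeters for a carrier frequency in GHz."""
    return C / (freq_ghz * 1e9) * 100


# The three WiFi bands mentioned above:
for band in (2.4, 5.0, 60.0):
    print(f"{band:4.1f} GHz -> {wavelength_cm(band):.1f} cm")
```

The results (12.5 cm, 6.0 cm, 0.5 cm) match the rounded figures in the text.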
Although 60 GHz WiFi has existed for years and offers up to 2.16 GHz-wide transmission channels for delivering data at Gigabit speeds, it is still hardly used by end users. It is simply too expensive, the market for 60 GHz WiFi devices is very small and the line-of-sight range is at best a few dozen meters. Very short range, high susceptibility to interference, strong attenuation and pronounced quasi-optical transmission characteristics make for operating conditions that are much less favorable than those in the 5 GHz WiFi band; the latter, at lower cost and with channels up to 160 MHz in width, also supports high transmission rates but is far less susceptible to interference.
For the same reason, 5 GHz WiFi in turn has less favorable physical transmission characteristics when compared with 2.4 GHz WiFi. 5 GHz WiFi does not reach as far as 2.4 GHz WiFi but offers a wider frequency band, allowing wider channels and thus higher speeds. The interference range of WiFi access points is shorter in the 5 GHz band, and since fewer WLANs overall use the wider frequency band, there is usually less interference. This is an advantage in densely populated areas. 2.4 GHz has a slightly longer range, but this means that the many WLANs in city centers interfere more with each other. Apart from that, 2.4 GHz is used by many other applications that potentially interfere with one another, while there are only four 2.4 GHz WiFi channels in total that do not overlap (1, 5, 9, 13).
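Why exactly channels 1, 5, 9 and 13 do not overlap can be checked from the 2.4 GHz channel plan: channel centers sit at 2407 + 5·n MHz and each channel is 20 MHz wide. A small sketch:

```python
def center_mhz(channel: int) -> int:
    """Center frequency of a 2.4 GHz WiFi channel (channels 1-13)."""
    return 2407 + 5 * channel


def channels_overlap(a: int, b: int, width_mhz: int = 20) -> bool:
    """Two 20 MHz channels overlap if their centers are closer than one channel width."""
    return abs(center_mhz(a) - center_mhz(b)) < width_mhz


# Channels 1, 5, 9 and 13 are spaced exactly 20 MHz apart, so they do not overlap:
quad = [1, 5, 9, 13]
print(all(not channels_overlap(a, b) for a in quad for b in quad if a != b))  # True
# Neighboring channels such as 1 and 3 do overlap:
print(channels_overlap(1, 3))  # True
```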
The 5G millimeter range at 26 GHz has a wavelength that, at just over one centimeter, is only about twice as long as 60 GHz WiFi. mmWave2 – as it is called in 5G marketing language – therefore has hardly any better properties than 60 GHz WiFi.
Even rain, snow and fog strongly attenuate signals above 10 GHz. At a wavelength of 1 cm, raindrops can no longer be neglected as sources of interference. Also, free-field attenuation (attenuation of signals over distance, even without interfering objects) is physically higher at shorter wavelengths.
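The rising free-field attenuation at shorter wavelengths can be illustrated with the standard free-space path-loss formula, FSPL(dB) = 20 log10(d_km) + 20 log10(f_GHz) + 92.45. The 100 m link distance below is an illustrative assumption, not a figure from the text:

```python
import math


def fspl_db(distance_km: float, freq_ghz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in GHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45


# Loss over the same 100 m link at three of the bands discussed in this paper:
for f in (2.4, 26.0, 60.0):
    print(f"{f:4.1f} GHz over 100 m: {fspl_db(0.1, f):5.1f} dB")
```

Moving from 2.4 GHz to 60 GHz costs roughly 28 dB over the same distance, before rain, fog or obstacles are even considered.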
As far as range and susceptibility to interference are concerned, WiFi can therefore compete with 5G at extremely short wavelengths. They use similar frequency ranges, and both are therefore rather unsuitable for covering a large area. But what about latency? After all, 5G advertising promises extremely short delays.
The frequently cited latency of less than one millisecond (ms) is a particularly daring promise. It is said to be indispensable for autonomous vehicles. The conflict between marketing promises and reality begins with the fact that, strictly speaking, there is no such thing as “real time” transmission in a packet-based network, only “near real time”. With mmWave 5G, 1 to 4 ms latency is theoretically possible thanks to very short time slots, but only between 5G devices within one radio cell. This limitation creates challenges for autonomous driving based on dedicated short-range radio in the extreme millimeter range.
Autonomous cars will not be calling home with 1 ms latency in the future either. Although the speed of light is 300,000 kilometers per second, to achieve a ping with a round-trip latency of 1 ms, the transmission distance must not exceed 150 km due to the propagation time of the signals alone. To achieve even this, all other processing steps, guard intervals, time slots and interfaces would have to run with zero delay, which is unrealistic.
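The 150 km bound quoted above follows from the propagation delay alone; a minimal sketch:

```python
C_KM_S = 299_792.458  # speed of light in km/s


def max_one_way_km(rtt_ms: float) -> float:
    """Upper bound on one-way distance if the entire round-trip budget
    were spent purely on signal propagation (zero processing time)."""
    return C_KM_S * (rtt_ms / 1000.0) / 2


print(f"{max_one_way_km(1.0):.0f} km")  # ~150 km for a 1 ms round trip
```

Any real network spends most of that budget on processing, queuing and framing, so the usable radius is far smaller still.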
In 2019, latencies of 8 to 12 ms were in fact measured in a mmWave 5G network operated by Verizon in Chicago. Instead of 10 or 20 Gbps, the download rate was 80 to 900 Mbps, while the upload rate was a sobering 12 to 57 Mbps. Round-trip latency was correspondingly around 25 ms.9 These are values that the most recent LTE Advanced Pro generation already achieves.
There are possibilities for improving the transmission quality of short waves. However, they are technically challenging and expensive. This would require up to 64×64 MU MIMO10 transmit/receive chains per 5G antenna sector. Covering all densely populated areas in a country with a dense network of such complex 5G mobile base stations would be far too costly. This would require the acquisition and development of a large number of expensive new sites.
Even if the terminal devices were then connected by radio at Gigabit speeds to the base stations, many countries often lack a pervasive and fast backbone infrastructure with fiber optics in the background to be able even to supply the 5G base stations with the promised Gigabit bandwidths.
11. Mobile operators will not deploy networks to the technical maximum: “Not all 5G is created equal”
It is therefore questionable how much bandwidth 5G will ultimately deliver to consumers, and where. Rollout stages in the construction of LTE base stations differ in performance level and cost. An LTE base station can support a downlink rate anywhere from a maximum of 10 Mbit/s up to just under 3,000 Mbit/s, depending on its expansion stage. Why should mobile operators deploy their LTE base stations to meet higher standards than are absolutely necessary or economically viable? (For some more details see Appendix I.)
This economic logic will have an even stronger bearing on the deployment of 5G. For 5G as well, there are various and thus varyingly expensive rollout stages. As with LTE, it cannot be assumed that operators will waste their money and expand 5G everywhere for its own sake. Without regulatory pressure through rollout obligations attached to spectrum awards (in auctions), even less would have happened in the development and deployment of LTE than was necessary.
The fixed element of a mobile network is also an important element to consider. To deliver such ambitious performance, 5G will require a fiber backbone and it is far from clear how this essential prerequisite will be achieved.
In a recent FierceWireless article “2020 Preview: 5G marketing remains messy”11 the author highlights what she calls “confusion among consumers” as US operators launch initial 5G service and predicts that 2020 could see additional growing pains when different flavors of next-generation technology hit the masses. US carriers have pledged to deliver more widespread 5G coverage and enhanced benefits, but their respective approaches and spectrum holdings mean 5G could be “different things to different people”.12
Verizon plans to continue its millimeter-wave 5G deployments,13 expanding the mobile service it dubs “Ultra-Wideband 5G” beyond the 30 markets launched in 2019. Earlier this year, Verizon’s consumer group chief executive Ronan Dunne pointed out the differences between 5G using high-band spectrum and 5G using low- or mid-band frequencies, saying “not all 5G is created equal.”14
In everyday digital life, LTE and the new small, expensive 5G radio cells have to compete economically with WiFi6, the current sixth WLAN generation IEEE 802.11ax. A closer look reveals that especially in comparison with WiFi6, the announcement of a “universal” 5G technology revolution is apparently a partisan overpromise.
In the marketing competition with the mobile network operators, the Wi-Fi Alliance15 has meanwhile followed suit in terms of language and now also talks about WiFi generations instead of the clumsy IEEE numerical codes. WiFi6 (IEEE 802.11ax) is already on the market and, compared to its predecessor WiFi5 (IEEE 802.11ac), offers greater range, higher data rates approaching the double-digit gigabit range, more efficient use of the electromagnetic spectrum and extensions for connecting energy-saving IoT devices more effectively.
Exactly the same features are used to advertise 5G compared to older LTE generations. However, much of what in 5G is praised as a stunning technical innovation is already included in the final stages of LTE.
Technically, the differences between the current WiFi6 and 5G are much smaller than one might think. Both 5G and WiFi6 offer MU (multi-user) MIMO, orthogonal frequency division multiple access (OFDMA) and quadrature amplitude modulation (QAM) with up to 1024 symbol states (1024-QAM, 10 bits per symbol) on the radio side.
With MIMO devices, several data streams are transmitted in parallel, with one and the same device transmitting or receiving streams simultaneously over one radio channel using several transceivers and the same number of antennas. This takes advantage of the fact that transmission conditions can differ slightly between antenna locations (spatial diversity). MU-MIMO is an extension of MIMO in which several terminal devices are radioed simultaneously with different data streams. For example, an 8×8 MU-MIMO device can simultaneously transmit two data streams to four 2×2 MU MIMO-enabled terminal devices (for more technical details, see also the recommended tutorials mentioned in Annex II).
Additionally, two organizations, representing the 5G industry (NGMN, “5G and beyond”)16 and the WiFi industry (Wireless Broadband Alliance)17 respectively, have recently published a joint white paper that explores the importance of RAN convergence between the two technologies.18
MU-MIMO has been used in mobile communications since newer LTE generations were introduced. With 5G, the number of MU-MIMO streams in the base station increases to up to 64×64 in the maximum expansion stage, which is why the 5G industry speaks of “massive MU-MIMO”. WiFi6, in comparison, has so far reached a maximum of 8×8 MU-MIMO streams.
However, the maximum number of possible parallel MU-MIMO streams for WiFi6 and 5G on the base station side does not mean that a single terminal can actually receive 64×64 or 8×8 MU-MIMO data streams. Many terminal devices are not even equipped with MIMO, much less MU-MIMO, but even only 2×2 MU-MIMO is an improvement.
Only for the super high (SHF) or extremely high frequency (EHF) bands with centimeter or millimeter waves is it physically possible to accommodate four or even eight MU-MIMO antennas in the housing of a tablet or large smartphone. MIMO antenna systems require a distance of at least one third of the wavelength between the individual antennas. For the 2.4 GHz WLAN band with a 12 cm wavelength, that amounts to 4 cm.
For 5G terminals, 4×4 MU-MIMO is currently the state of the art for the SHF and EHF bands. A total of 16 such 5G terminals could simultaneously receive 4×4 MU-MIMO streams each from a 5G system with a 64×64 MU-MIMO antenna system. Physics therefore also limits the integration of MU-MIMO in mobile terminal devices for the later switchover from LTE to 5G in the longer wavelength mobile radio bands, which serve as coverage frequencies for areas outside conurbations. The wavelength in the 700 and 800 MHz LTE bands is 40 cm. Smartphones and tablets simply lack the physical space for a practical MU-MIMO antenna setup with decimeter waves, the wavelength required for area coverage.
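The spacing arithmetic above can be sketched quickly. The λ/3 rule and the band figures are from the text; the 15 cm device edge is an illustrative assumption for a large smartphone or small tablet:

```python
C = 299_792_458  # speed of light in m/s


def min_spacing_cm(freq_ghz: float) -> float:
    """Minimum MIMO antenna spacing: one third of the wavelength, in cm."""
    return C / (freq_ghz * 1e9) / 3 * 100


for f_ghz in (0.7, 2.4, 26.0):
    spacing = min_spacing_cm(f_ghz)
    # How many antennas fit in a row along a hypothetical 15 cm device edge:
    fits = int(15 // spacing) + 1
    print(f"{f_ghz:5.1f} GHz: spacing {spacing:5.2f} cm, ~{fits} antennas per 15 cm edge")
```

At 700 MHz the required spacing alone nearly exhausts the device edge, which is exactly the text’s point that decimeter waves leave no room for a practical MU-MIMO array in handsets.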
The number of antennas in the terminal devices cannot therefore be increased arbitrarily. On the base station side, both with 5G and WiFi6, many MU-MIMO antennas can and may be used. With the number of antennas distributed within the space on the side of the base stations, the statistical probability increases that one or more parallel radio beams with a good signal-to-noise ratio to the terminal device exist. Users genuinely benefit from this, provided operators invest accordingly in the technology.
Multi-antenna systems on the base station side increase the quality of radio coverage because of improved reception due to multi-path propagation. This is also the case if the terminal device is not capable of MU-MIMO or MIMO. In addition, 5G and WiFi6 systems require beamforming. This involves controlling antenna elements individually in a way similar to electronically steerable directional antennas, able to target and transmit to end devices.
With 5G and WiFi6 in practice, expanding base stations in this way does not result in the extremely high data rates forecast in theory; these are reached only under ideal (“lab”) conditions at the shortest distance. According to the Shannon-Hartley law,19 1024-state quadrature amplitude modulation (1024-QAM) requires a mathematical signal-to-noise ratio of at least a factor of 1023, and even more in practical technical implementations. Even minimal deviations in phase synchronization and only minor noise variance cause transmission errors in the decoding of bits and bytes. If the signal-to-noise ratio is not good enough, hardly anything remains of the forecast data rates.
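The factor of 1023 follows from the Shannon-Hartley theorem: to carry log2(1024) = 10 bits per symbol per Hz, the capacity formula C = B·log2(1 + S/N) demands S/N ≥ 2^10 − 1. A minimal sketch:

```python
import math


def min_snr_linear(qam_order: int) -> float:
    """Shannon-Hartley lower bound on the linear signal-to-noise ratio
    needed to carry log2(M) bits per symbol per Hz."""
    bits_per_symbol = math.log2(qam_order)
    return 2 ** bits_per_symbol - 1


snr = min_snr_linear(1024)
print(f"1024-QAM: S/N >= {snr:.0f} (about {10 * math.log10(snr):.1f} dB)")
# By comparison, robust 4-QAM needs only a factor of 3 (~4.8 dB):
print(f"4-QAM:    S/N >= {min_snr_linear(4):.0f}")
```

This is why even small amounts of extra noise force the link to fall back from 1024-QAM to far lower modulation orders.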
With WiFi5, the predecessor of WiFi6, chip manufacturer Broadcom already pushed quadrature modulation from 256-QAM to 1024-QAM using a non-standard extension. However, competitor Qualcomm doubted the practical benefits. Now 1024-QAM is standard for WiFi6 and 5G. In fact, it does not add that much in practice. Still, increased theoretical transfer rates look good in product advertising.
Under more realistic conditions, however, with poor signal-to-noise ratios, 1024-QAM is constantly switched down to lower, more robust modulation orders, finally delivering, for example, only 2 bits per symbol (4-QAM) instead of 10. If the base station is equipped with an excellent MU-MIMO system, this improves coverage and thus the signal-to-noise ratio. To save power and costs, however, not all 5G stations will offer the maximum expansion of 64×64 streams, but only 32×32, 16×16 or 8×8. The current situation with LTE, where base stations are operated without MIMO or only with 2×2 MIMO for cost reasons, will foreseeably repeat itself with 5G.
There are therefore also critical voices that see a huge gap between what 5G is promised to become and what it is likely to be. Also, some representatives of the WiFi community criticize the weak points of 5G compared with WiFi, especially the radiation levels (which can be critical in some jurisdictions with very low EMC limits) and the openness of networks.
In terms of radio technology, both standards differ above all in the available frequency bands and the permitted transmission power levels, taking into account antenna gain and equivalent isotropically radiated power (EIRP). For WiFi in the 2.4 GHz band, only up to 0.1 W effective transmission power is permitted. A typical 5G base station works with up to 40 W effective transmission power per directional beam. 5G networks also have the privilege of using exclusive and undisturbed wavelength ranges that do not need to be shared with other radio applications.
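The gulf between these two power limits is easier to see on the logarithmic dBm scale commonly used in radio planning. A quick sketch, using the power values quoted above:

```python
import math


def dbm(power_watt: float) -> float:
    """Convert transmit power in watts to dBm (decibels relative to 1 mW)."""
    return 10 * math.log10(power_watt * 1000)


print(f"WiFi @ 2.4 GHz (0.1 W): {dbm(0.1):.0f} dBm")
print(f"5G base station (40 W): {dbm(40.0):.1f} dBm")
# A 400-fold power ratio corresponds to roughly 26 dB:
print(f"Difference: {dbm(40.0) - dbm(0.1):.0f} dB")
```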
These differences are primarily of a regulatory nature. However, the automatic handover between base stations and automatic authentication via SIM are essential for mobile radio. WiFi hotspots, on the other hand, are usually open only to a small group of users, without automatic authentication. Only in exceptional cases is automatic handover or automatic authentication available for professionally organized WiFi networks. WiFi offerings are therefore currently small-scale and tailored to a small group of users.
They could, for example, be networked with automatic handover, at greater expense, according to the principle of the meshed free radio hotspots in Germany20. However, due to the unsuitable frequency ranges and lower transmission power, WLAN cannot cover large areas in the same way as LTE or 5G. Then again, mobile networks using 5G over the SHF and EHF bands do not offer this advantage either.
It will take some time before today’s LTE bands are refarmed for 5G use. It is also questionable how antenna systems with 64×64 antennas, for example, could be technically implemented in these longer wavelength frequency bands. It has long been technically possible to upgrade any older LTE base station to LTE Advanced Pro with 8×8 MIMO.
There is an ongoing regulatory process to allocate the 6 GHz frequency band to unlicensed radio local area networks (RLANs). Although there are various contributors to the process, the European Commission, as the entity responsible for allocating spectrum in the European Union, is also responsible for the 6 GHz band.
The technical work to evaluate the feasibility of the allocation is being carried out by the Electronic Communications Committee (ECC) under the European Conference of Postal and Telecommunications Administrations (CEPT). This work was mandated in December 2017. Results of the first task under the mandate were published in May 2019 as ECC Report 302.21 A report on the second task, to cover the development of harmonized technical conditions, is expected to be published during the first half of 2020.
This report will be critical for the coexistence objective that was highlighted at the beginning of this paper. Based on the final report, the European Commission will issue an implementing decision to define the conditions for usage of the 5925–6425 MHz frequency band with WiFi and other unlicensed technologies.
To conclude, 5G and WiFi6 are complementary, but they are not the same thing. There is some convergence of technology features, which is the reason why this white paper is titled “WiFi Goes Cellular”. Additionally, two organizations, representing the 5G industry (NGMN, “5G and beyond”)22 and the WiFi industry (Wireless Broadband Alliance)23 respectively, have recently published a joint white paper that explores the importance of RAN convergence between the two technologies.24
Both 5G and WiFi6 are key wireless developments, each serving specific segments of the market, but with some overlap. WiFi6, as the important development under user control, is all about greater efficiency, more spectrum, lower power consumption and faster data transmission.
This article shows how diverse and challenging the policy and regulatory issues facing the various stakeholder groups are, as well as how important expert discussions are for helping all stakeholders to navigate reasonably safely through these issues.
LTE networks are carrying an increasing amount of data. Although cells can be made smaller to help accommodate this traffic, that is not a complete solution, and more spectrum is needed.
One approach is to use unlicensed spectrum alongside the licensed bands. Known in 3GPP as LTE License Assisted Access (LTE-LAA) or more generally as LTE Unlicensed (LTE U), it enables access to unlicensed spectrum especially in the 5GHz industrial, scientific and medical (ISM) band.
There is a considerable amount of unlicensed spectrum available around the globe. These bands are used globally to provide unlicensed access for short range radio transmissions. These bands, called industrial, scientific and medical (ISM) bands, are allocated in different parts of the spectrum and are used for a wide variety of applications including microwave ovens, WiFi, Bluetooth and much more.
The frequency band of most interest in LTE-U / LTE-LAA is the 5 GHz band. Here there are several hundred MHz of spectrum bandwidth available, although the exact bands available depend upon the country in question.
In addition to the basic frequency limits, the use of the 5 GHz bands for applications such as LTE-U or LTE-LAA carries some regulatory requirements.
One of the main requirements for access to these frequencies is that of being able to coexist with other users of the band – a method of clear channel assessment (CCA) or listen before talk (LBT) is required. This often means that instantaneous access may not always be available when LTE-U is implemented.
Permitted power levels also differ depending on the country and the part of the band being used. Typically, between 5150 and 5350 MHz there is a maximum power limit of 200 mW and operation is restricted to indoor use only, while higher frequencies often allow power levels up to 1 W.
The use of LTE-U / LTE-LAA was first introduced in Release 13 of the 3GPP standards. Essentially, LTE-U is built upon the carrier aggregation capability of LTE Advanced that has been deployed since around 2013. Carrier aggregation seeks to increase the overall bandwidth available to user equipment by enabling the use of more than one channel, either in the same or another band.
There are several ways in which LTE-U can be deployed:
Downlink only: This is the most basic form of LTE-U and it is similar in approach to some of the first LTE carrier aggregation deployments. In this mode, the primary cell link is always located in the licensed spectrum bands. The LTE eNodeB also performs most of the operations necessary to ensure that reliable service is maintained and that the channel is free, so as not to cause interference to other users.
Uplink and downlink: Full TDD LTE-U operation with the user equipment having an uplink and downlink connection in the unlicensed spectrum requires the inclusion of more features.
FDD / TDD aggregation: LTE-CA allows the use of carrier aggregation mixes between FDD and TDD. This provides for much greater levels of flexibility when selecting the band to be used for LTE-LAA operation within unlicensed spectrum.
LTE-U relies on the existing core network for the backhaul and other capabilities such as security and authentication. As such, no changes are needed to the core network. Some changes are needed to the base station to enable it to accommodate the new frequencies and also incorporate the capabilities required to ensure proper sharing of the unlicensed frequencies. In addition, the handsets or UEs will need to have the new LTE-U / LTE-LAA capability incorporated to allow access to LTE over these additional frequencies.
One of the great fears that many have is that the use of LTE-U will swamp the unlicensed 5 GHz spectrum and that WiFi, along with other users of these frequencies, will suffer.
The LTE-U system is being designed to overcome this issue. Using a listen-before-talk (LBT) solution, all users should be able to coexist without any undue levels of interference.
There will be cases where LTE-U operation and WiFi use different channels. Under these circumstances, there will be only minimal levels of interference.
It is also possible to run LTE-U and WiFi on the same channel. Under these conditions, both are able to operate, although with a lower data throughput. It is also possible to place a “fairness” algorithm into the eNodeB to ensure that the WiFi signal is not unduly degraded and is still able to support a good data throughput.
10 MU-MIMO: Multi-user MIMO is a set of “Multiple Input and Multiple Output” (MIMO) technologies for wireless communication, in which a set of users or wireless terminals, each with one or more antennas, communicate with each other. In contrast, single-user MIMO considers a single multi-antenna transmitter communicating with a single multi-antenna receiver. MU-MIMO can leverage multiple users as spatially distributed transmission resources.
19 https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theorem; in information theory, the Shannon–Hartley theorem determines the maximum rate at which information can be transmitted over a communication channel of a specified bandwidth in the presence of noise.