Ensuring Proper Communication in Environments with High Interference
Demand for high-volume data streams in modern wireless communication systems is growing rapidly. To meet ever-higher throughput requirements within unchanged bandwidth limitations, long-term evolution (LTE) technology has become a popular replacement for data transfer over 2G/3G communication networks. Although 5G is gaining ground in big cities and throughout the developed world, LTE is still the primary cellular standard in most countries around the globe. Its popularity is driven largely by the low cost and high performance it delivers: LTE can reach a raw bit rate of 300 Mbps in the downlink using advanced MIMO configurations, and voice over LTE (VoLTE) enables voice transmission.
LTE’s importance is further reinforced by the fact that 2G and 3G services are being switched off in many parts of the developed world. As a result, the 4G LTE network has become the default fallback system for emergency scenarios.
Beyond serving as the standard of choice for commercial networks, LTE is also often used to broadcast emergency information during natural disasters and national crises.
However, LTE has vulnerabilities that are a matter of concern: an LTE network can be taken down completely, or communication can be blocked at least partially, whether intentionally or unintentionally. Some defined LTE bands are prone to coexistence issues with S-band radar frequencies, such as those used by air traffic control (ATC) and air traffic surveillance (ATS) radars that scan the horizon out to a range of 500 km. In addition, at the lower end of the frequency spectrum, LTE has coexistence issues in the ultra high frequency (UHF) bands.
A clear understanding of LTE technology and its vulnerabilities is especially important for commercial, civil-governmental, and defense applications. This article highlights the areas of the LTE network most susceptible to interference and jamming, presents possible countermeasures, and explores coexistence issues. Our goal is to provide a solid foundation for the use of LTE technology in devices for commercial, civil-governmental, and military applications.
Jamming Techniques
Wireless communication systems are not deployed in an ideal environment. Their channels are subject to unwanted interference from other services operating in adjacent frequency bands, as well as to deliberate jamming attempts, both of which degrade network performance. In this section, we’ll discuss conventional jamming techniques as well as newer, smarter, and more power-efficient ones.
Barrage Jamming
Barrage jamming (BJ) is the most basic jamming technique and is effective when there is no prior knowledge of the network. The entire spectrum of the target signal is jammed by transmitting band-limited noise, so the signal-to-noise ratio (SNR) decreases over the entire bandwidth. BJ is the least power-efficient jamming method; it requires a lot of power and therefore serves as a baseline for comparing the efficiency and effectiveness of other forms of jamming. More information on BJ analysis can be found in [2]. Figure 1 presents the spectrum for a BJ attack.
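As an illustrative sketch (not a standards-accurate model), the following Python snippet models BJ as AWGN spread evenly across all subcarriers of a simplified 20 MHz downlink grid and reports the resulting signal-to-jammer ratio; the subcarrier count and power values are assumptions chosen only for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_subcarriers = 1200          # assumed: 20 MHz LTE downlink grid (100 RBs x 12 subcarriers)
signal_power = 1.0            # normalized per-subcarrier signal power
jammer_total_power = 600.0    # assumed total jammer power (linear units)

# QPSK symbols on every subcarrier (unit average power)
symbols = (rng.choice([-1.0, 1.0], n_subcarriers)
           + 1j * rng.choice([-1.0, 1.0], n_subcarriers)) / np.sqrt(2)

# Barrage jamming: the jammer spreads its power evenly over the whole band
jam_power_per_sc = jammer_total_power / n_subcarriers
jamming = np.sqrt(jam_power_per_sc / 2) * (rng.standard_normal(n_subcarriers)
                                           + 1j * rng.standard_normal(n_subcarriers))

received = symbols + jamming   # every subcarrier is hit by the same jamming power density

sjr_db = 10 * np.log10(signal_power / jam_power_per_sc)
print(f"Signal-to-jammer ratio: {sjr_db:.1f} dB on all {n_subcarriers} subcarriers")
```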
Partial Band Jamming
Partial band jamming (PBJ) is a technique in which a certain portion of the system bandwidth is targeted and jammed by transmitting additive white Gaussian noise (AWGN) over that specific bandwidth. When the power of the jamming signal is constant, the effectiveness of the jamming depends directly on the ratio of the jamming bandwidth to the signal bandwidth. More information on PBJ can be found in [1, 2]. The part of the spectrum affected by PBJ is shown in Figure 1.
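This trade-off can be illustrated with a short sketch using the same assumed values as above: with the total jammer power held constant, concentrating it on a smaller fraction of the band raises the noise density on the jammed subcarriers in inverse proportion to that fraction.

```python
import numpy as np

n_subcarriers = 1200         # assumed 20 MHz LTE downlink grid
signal_power = 1.0           # normalized per-subcarrier signal power
jammer_total_power = 600.0   # same assumed jammer power budget as before

for rho in (1.0, 0.5, 0.1):  # fraction of the band that is jammed (rho = 1.0 is barrage jamming)
    n_jammed = int(rho * n_subcarriers)
    jam_power_per_sc = jammer_total_power / n_jammed
    sjr_jammed_db = 10 * np.log10(signal_power / jam_power_per_sc)
    print(f"rho = {rho:0.1f}: {n_jammed:4d} subcarriers jammed, "
          f"SJR on jammed subcarriers = {sjr_jammed_db:5.1f} dB")
```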
Single-Tone Jamming
In single-tone jamming (STJ), a single high-powered impulse of AWGN is transmitted to jam only a narrow band of interest. In the LTE downlink, only individual subcarriers can be jammed using the STJ technique.
Figure 1 shows the effect of STJ on the spectrum. STJ can be considered a special case of PBJ; a more analytical investigation can be found in [2]. STJ requires knowledge of the target system’s carrier frequency in order to jam the target signal.
Multi-Tone Jamming
Multi-tone jamming (MTJ) is another form of PBJ. Unlike STJ, multiple equally powered jamming tones are transmitted in order to jam multiple frequency subcarriers within the LTE bands. An MTJ attack is highly effective when there is a power limitation on the transmit side, since the available power is concentrated on a few tones rather than spread over the whole band; however, under a strict transmit power limit, increasing the number of jamming tones decreases the power available to each individual tone. A detailed analysis of the effect of MTJ on orthogonal frequency-division multiplexing (OFDM) can be found in [3].
Figure 1 shows an illustration of an MTJ attack on the spectrum. In MTJ, knowledge of the target system’s carrier frequency is required.
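The power trade-off mentioned above can be illustrated with simple arithmetic; the power budget and tone counts below are assumed values. With a fixed transmit power budget, each doubling of the number of jamming tones costs roughly 3 dB of power per tone.

```python
import numpy as np

jammer_total_power_dbm = 30.0          # assumed fixed transmit power budget (1 W)
tone_counts = [1, 4, 16, 64, 256]      # candidate numbers of jamming tones

for n_tones in tone_counts:
    # Splitting the budget evenly across tones: per-tone power drops by 10*log10(n_tones)
    per_tone_dbm = jammer_total_power_dbm - 10 * np.log10(n_tones)
    print(f"{n_tones:3d} tones -> {per_tone_dbm:5.1f} dBm per tone")
```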
Asynchronous Off-Tone Jamming
There are two types of asynchronous off-tone jamming (AOTJ): single off-tone jamming and multiple off-tone jamming. The operational concept is to transmit asynchronous off-tones that are not perfectly periodic or that are offset from the subcarrier (sampling) frequencies. As a result, the energy is smeared from the true frequency into the adjacent frequency bins, creating inter-carrier interference (ICI) in the OFDM signal at the receiver [1].
In addition, the side lobes of the jamming signal’s spectrum (a sinc function) are not aligned with the OFDM subcarriers; because of the frequency offset, they have non-zero components at the receiver’s sampling instants, which is a further source of ICI. One advantage of AOTJ is that the jamming signal does not need to be frequency-matched to the target signal, nor does it require any channel state information (CSI). AOTJ demonstrates superior performance compared to BJ, STJ, and MTJ. Examples of the two types of AOTJ can be seen in Figure 1.
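This smearing effect can be reproduced with a few lines of Python (a toy illustration with an assumed small FFT size, not an LTE-accurate model): a tone placed exactly on an FFT bin stays confined to that bin, while a tone offset by a fraction of the subcarrier spacing leaks energy into neighboring bins, which the OFDM receiver sees as ICI.

```python
import numpy as np

n_fft = 64                       # assumed small OFDM FFT size for illustration
n = np.arange(n_fft)

on_bin  = np.exp(2j * np.pi * 10.0 * n / n_fft)   # tone exactly on subcarrier 10
off_bin = np.exp(2j * np.pi * 10.4 * n / n_fft)   # tone offset by 0.4 subcarrier spacings

for name, tone in (("on-bin", on_bin), ("off-bin", off_bin)):
    spectrum = np.abs(np.fft.fft(tone)) / n_fft
    # Energy that does not land on the nearest subcarrier shows up as ICI on its neighbours
    leaked = 1.0 - spectrum[10] ** 2 / np.sum(spectrum ** 2)
    print(f"{name:>7} tone: fraction of energy leaked into other bins = {leaked:.2f}")
```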
Pilot Tone Jamming and Pilot Tone Nulling
In pilot tone jamming, the jammer must be perfectly synchronized with the target signal, which is achieved by observing the communications between all parties in the network. The jamming vector Z then has elements Zi = 0 on the non-pilot subcarriers and Zi = qi on the pilot tones, where qi is an independent and identically distributed (i.i.d.) AWGN sequence [4]. If this AWGN sequence is transmitted coherently on all pilots simultaneously, the noise is not averaged out by linear combining at the receiver.
In the case of pilot tone nulling, knowledge of the channel is also required. The jammer transmits a channel-corrected, π-radian phase-shifted copy of the pilot tone, which cancels the original pilot tone at the receiver and thus degrades network performance.
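As a minimal sketch of the nulling idea, the snippet below assumes the jammer is perfectly synchronized and knows both the base-station-to-UE and the jammer-to-UE channel coefficient on every pilot subcarrier (strong assumptions that are hard to meet in practice); under those assumptions the transmitted signal cancels the pilots exactly at the UE.

```python
import numpy as np

rng = np.random.default_rng(1)

n_pilots = 8
pilots = np.exp(1j * np.pi / 4) * np.ones(n_pilots)   # assumed known pilot symbols
h_bs  = (rng.standard_normal(n_pilots) + 1j * rng.standard_normal(n_pilots)) / np.sqrt(2)
h_jam = (rng.standard_normal(n_pilots) + 1j * rng.standard_normal(n_pilots)) / np.sqrt(2)

# Channel-corrected, pi-phase-shifted copy of the pilot: cancels the pilot at the UE
jam_tx = -(h_bs / h_jam) * pilots

received_pilots = h_bs * pilots + h_jam * jam_tx
print("Residual pilot power at the UE:", float(np.sum(np.abs(received_pilots) ** 2)))
```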
Coexistence with Other Services
Coexistence of LTE and S-Band Radar
Air traffic control (ATC) radar, military air traffic surveillance (ATS) radar, and meteorological radar operate in the S-band frequency range. 4G communication systems such as LTE also operate in the same frequency range. Testing and measuring their coexistence is essential, since performance degradation of mobile devices and networks has been demonstrated.
Table 1 lists the LTE frequency bands for frequency division duplex (FDD) and time division duplex (TDD) modes of operation. Bands 1, 4, 7, 10, 22, 23, and 30 lie fairly close in frequency to operational S-band radar systems.
LTE base stations (eNodeB) may be disturbed by radar systems. Depending on the ATC or ATS radar, a power of up to 7000 MW EIRP is transmitted. The blocking requirements of the LTE base station (BS) and user equipment (UE) must be assessed against these figures, taking the distance of the BS or UE from the radar into account. TS 36.141 defines the blocking performance requirement for wide-area base stations as described in Table 1.
The UE may be located even closer to a radar system. According to [5], the out-of-band blocking parameters are defined as shown in Table 2.
In 3GPP TS 36.521-1 [5], the test purpose of “TC 7.6.2 Out-of-band blocking” is described as an “unwanted CW [continuous wave] interfering signal falling more than 15 MHz below or above the UE receive band, at which a given average throughput shall meet or exceed the requirement…”. The minimum conformance requirement specifies a throughput of “≥95% of the maximum throughput of the reference measurement channel.”
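To put these interference levels into perspective, the following back-of-the-envelope sketch estimates the power an LTE receiver would see from a nearby radar using free-space path loss only. The EIRP, carrier frequency, distances, and the blocking level used for comparison are assumptions for illustration; real scenarios also depend on antenna patterns, pulse duty cycle, and propagation conditions.

```python
import numpy as np

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB."""
    return 20 * np.log10(distance_km) + 20 * np.log10(freq_mhz) + 32.44

radar_eirp_dbm = 10 * np.log10(7e9 * 1e3)   # assumed 7000 MW EIRP expressed in dBm (~128 dBm)
freq_mhz = 2800.0                           # assumed S-band radar carrier
blocking_limit_dbm = -15.0                  # assumed UE out-of-band blocking level, illustration only

for distance_km in (1.0, 10.0, 50.0):
    rx_dbm = radar_eirp_dbm - fspl_db(distance_km, freq_mhz)   # 0 dBi receive antenna assumed
    margin = blocking_limit_dbm - rx_dbm
    print(f"{distance_km:5.1f} km: received {rx_dbm:6.1f} dBm, "
          f"margin to assumed blocking level {margin:6.1f} dB")
```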
Several measurements have shown that S-band radar can disturb LTE networks, for example through performance degradation in the form of lower throughput, indicated by an increasing block error rate (BLER). A throughput reduction on its own may not seem like a major drawback. However, spectral efficiency, power, and cost are of significant importance to any mobile network operator, so disturbance from other signals is of great interest.
Unlike mobile communications, radar is not defined by a global specification. Thus, many different systems using different waveforms, frequencies, and bandwidths are deployed and operate nearly autonomously to detect their particular kind of target. For a radar engineer, bandwidth is a key design parameter, since it defines range resolution. Depending on the radar, bandwidth can range from nearly zero (just a carrier frequency, as in CW radar used to measure radial velocity) up to several GHz for high-resolution range measurements (e.g., ultra-wideband [UWB] radar).
The 2.7 GHz to 2.9 GHz frequency band is primarily allocated to aeronautical radio navigation, i.e., ground-based fixed and transportable radar platforms for meteorological purposes and aeronautical radio navigation services. The operating frequencies of these radars are assumed to be uniformly distributed throughout the S-band [6]. The two frequency bands for mobile communication and aeronautical radio navigation are very closely located, so the coexistence problem also needs special attention.
The application note 1MA211 [6] describes the coexistence problem in more detail. It discusses the potential issues between S-band radar systems and LTE signals from base stations and mobile devices operating close to the radar, addresses the frequency allocation of these systems, explains the performance degradation or malfunction that can be expected, and describes test and measurement solutions for interference testing of radar and LTE networks.
Coexistence with LTE in Critical Environments
In critical environments such as hospitals, it is also important to ensure the coexistence of LTE with other wireless transmissions. The radio frequency (RF) environment of hospitals is very crowded, with many potential sources of interference, including wireless patient monitoring devices, wireless biosensors, smart TVs, etc. In addition, medical staff, patients, and guests in this environment typically introduce additional transmitters into the mix, such as smartwatches, smartphones, and wireless headphones. As a result, WLAN, Bluetooth®, and other mobile standards such as LTE or 5G are simultaneously in operation in a single environment. Therefore, network operators and manufacturers from both the mobile radio and the medical sector have a vital interest in preventing potential interference by performing in-depth testing of their products.
WLAN and Bluetooth® radio communication services operate in the license-free 2.4 GHz ISM band and have a high density of devices in most urban and suburban operating environments. LTE band 40 lies very close to the lower end of the ISM band, and LTE band 7 follows its upper end, albeit with somewhat more separation (see Figure 3). In addition, 5G New Radio (NR) Frequency Range 1 (FR1, 410 MHz to 7125 MHz) overlaps with the LTE frequency spectrum and in some cases even shares the same band numbers. 5G uses these frequencies for the ultra-reliable low-latency communications required for telemedicine applications.
LTE also operates in frequency bands that are already used by existing 3G networks. Moreover, additional ranges are available, such as the 2.5 GHz to 2.7 GHz band (Europe/Asia) and the 700 MHz band (USA). LTE bands 5, 12, 13, 14, 17, 19, and 20 overlap with digital TV bands and should be checked for vulnerabilities wherever digital TV services are still in operation. In this coexistence scenario, the digital TV transmitter may act as an interferer to the LTE cellular system. Depending on the spectrum situation, either the LTE base station receiver or the LTE terminal receiver could be affected. If the LTE system and the digital TV system operate in different frequency bands, this coexistence scenario will never be a co-channel scenario. A more detailed discussion of the issue can be found in [8].
In-Device Interference and Coexistence
With the ever-growing use of various wireless technologies and services, user equipment typically contains multiple radio transceivers operating simultaneously in accordance with standards such as LTE, Wi-Fi, Bluetooth, and global navigation satellite systems (GNSS). In-device coexistence interference therefore becomes a concern: multiple transceivers and antennas in extremely close proximity within the same device couple with each other and can act as interferers.
The extreme proximity of co-located radios, driven by the small form factor of user equipment, and the scarcity of spectrum are the main causes of this problem. When radio technologies within the same device operate on adjacent or sub-harmonic frequencies, the interference power from out-of-band emissions of one radio’s transmitter can be much higher than the desired signal at a co-located radio’s receiver. This situation is known as in-device coexistence interference.
Figure 4 shows a situation in which user equipment supports multiple standards and the LTE signals suffer interference from co-located radio transceivers. Wi-Fi does not interfere with GPS but does interfere with LTE bands 7 and 41.
Mitigation Techniques
As discussed in the previous sections, various jamming techniques and sources of unwanted interference can degrade the performance of an LTE communication system. This is important for civil-governmental as well as military communication systems, which must remain robust under both unintentional interference and hostile jamming. With the discussed techniques in mind, several mitigation schemes already exist or suggest themselves.
Jamming Mitigation
One of the most basic ways to mitigate unwanted interference is to rely on RF techniques such as sufficient filtering or isolation. Unfortunately, current state-of-the-art filter technology cannot provide sufficient interference rejection, so better mitigation schemes are needed.
Popular interference and jamming mitigation schemes include frequency division multiplexing (FDM)-based solutions, time division multiplexing (TDM)-based solutions, transmit power control solutions, and frequency hopping solutions.
FDM-Based Solution
The basic idea is to move LTE or ISM signals away from an interfering band in the frequency domain. This can be done by performing an inter-frequency handover within E-UTRAN or by removing secondary cells (SCells) from the set of serving cells, as shown in Figure 5.
TDM-Based Solution
The basic idea behind TDM-based solutions is shown in Figure 6. This solution relies on avoiding overlap of signal transmissions in the time domain. In LTE, the discontinuous reception (DRX) mechanism can provide TDM patterns for scheduling LTE transmissions.
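The idea can be sketched as a simple scheduling pattern. The subframe counts below are assumptions and do not correspond to the actual DRX configuration parameters in the 3GPP specification; the point is only that LTE reception is confined to the DRX on-duration while ISM (e.g., Wi-Fi/Bluetooth) activity is scheduled into the remaining subframes, so the two never overlap in time.

```python
# Simplified TDM pattern: one entry per 1 ms subframe over one assumed DRX cycle
drx_cycle_subframes = 40      # assumed DRX cycle length (ms)
on_duration_subframes = 10    # assumed DRX on-duration (ms)

# LTE reception is confined to the on-duration; the rest of the cycle is left to ISM radios
schedule = ["LTE" if sf < on_duration_subframes else "ISM"
            for sf in range(drx_cycle_subframes)]

print("".join("L" if s == "LTE" else "-" for s in schedule))
print(f"LTE active {schedule.count('LTE')} ms, ISM window {schedule.count('ISM')} ms per cycle")
```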
Transmit Power Control Solution
This solution relies on reducing the power of the transmitted signal (LTE or ISM) to mitigate interference at the other receiver. Figure 7 shows a graphical depiction of the solution. Reducing the transmit power also means a reduction in the size of the coverage area.
Furthermore, in some cases, the UE can autonomously deny ISM transmission in order to protect important LTE signaling (e.g., radio resource control [RRC] connection configuration).
Frequency Hopping (FH) Solution
Frequency hopping (FH) solutions are widely used to mitigate the effects of hostile jamming. FH is mainly limited by the collision effect, and the spectral efficiency of conventional FH systems is low. To improve spectral efficiency, a space-time coded, collision-free frequency hopping scheme based on the OFDM framework can be combined with a secure subcarrier assignment algorithm, in which each user hops to a different set of subcarriers in a pseudo-random manner at the beginning of each symbol period. Different users always transmit on non-overlapping sets of subcarriers, making the FH scheme collision-free.
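A minimal sketch of the collision-free assignment step is shown below (the space-time coding and the security of the assignment are omitted; the subcarrier pool size, user count, and keyed pseudo-random generator are assumptions). At every OFDM symbol, all users derive the same pseudo-random permutation of the subcarrier pool from a shared key and take disjoint slices of it, so no two users ever share a subcarrier.

```python
import numpy as np

n_subcarriers = 48        # assumed size of the shared subcarrier pool
n_users = 4
per_user = n_subcarriers // n_users
shared_key = 2024         # stands in for the secret hopping key shared by all legitimate users

def hop(symbol_index: int) -> list:
    """Disjoint subcarrier sets for all users during one OFDM symbol."""
    # Every user derives the same permutation from the shared key and the symbol index
    rng = np.random.default_rng([shared_key, symbol_index])
    perm = rng.permutation(n_subcarriers)
    return [perm[u * per_user:(u + 1) * per_user] for u in range(n_users)]

for sym in range(3):
    sets = hop(sym)
    assert len(np.unique(np.concatenate(sets))) == n_subcarriers   # no collisions by construction
    print(f"symbol {sym}: user 0 -> subcarriers {np.sort(sets[0])}")
```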
Frequency hopping has also been considered where significant additional bandwidth is available. However, it is difficult to overcome the impact of active jamming, especially when jammers exploit the inherent properties of media access control (MAC) layer protocols. A mitigation scheme known as subcarrier-level radio agility addresses this. It is based on the observation that a jamming signal will likely experience different levels of fading on different OFDM subcarriers. As a result, some subcarriers may not be significantly affected by the malicious power emission, and as long as a transceiver pair knows which subcarriers these are, they can be used temporarily for legitimate packet transmissions.
A framework is therefore created that allows a transceiver pair to exchange information about the subcarriers on which the jamming signal experiences significant fading. Once such subcarriers are identified, the maximum allowable transmit power is assigned to them, and they are used for packet transmissions to increase the probability of successful delivery, thereby increasing long-term throughput while under active jamming.
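The selection step can be sketched as follows, assuming Rayleigh fading on the jammer-to-receiver channel and an arbitrary fade threshold; the signalling needed to share this information between transmitter and receiver is not shown. Subcarriers on which the jamming signal is estimated to be in a deep fade are selected and assigned the maximum allowable transmit power.

```python
import numpy as np

rng = np.random.default_rng(7)

n_subcarriers = 64
max_tx_power = 1.0                 # maximum allowable per-subcarrier transmit power (normalized)
fade_threshold_db = -10.0          # assumed: treat the jammer as "faded" below this gain

# Assumed Rayleigh fading on the jammer-to-receiver channel, as estimated at the receiver
jammer_gain = (np.abs(rng.standard_normal(n_subcarriers)
                      + 1j * rng.standard_normal(n_subcarriers)) ** 2) / 2
jammer_gain_db = 10 * np.log10(jammer_gain)

# Keep only subcarriers where the jamming signal experiences a deep fade
usable = np.where(jammer_gain_db < fade_threshold_db)[0]

# Assign the maximum allowable power to the usable subcarriers, zero elsewhere
tx_power = np.zeros(n_subcarriers)
tx_power[usable] = max_tx_power

print(f"{len(usable)} of {n_subcarriers} subcarriers usable under jamming: {usable}")
```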
Coexistence Problem Mitigation Techniques
Different approaches can mitigate disturbances to radar and 4G base stations. One is to reduce transmit power at the base station and the radar; another is to increase the frequency separation or the distance between the two services. However, these approaches reduce the maximum range of the radar and the coverage of the base station, and frequency selection may be impossible due to technical restrictions. Another option is to avoid pointing mobile service base station antennas toward the S-band radar. Finally, improving receiver selectivity, filtering transmitter signals, and reducing unwanted spurious emissions on both sides also enables coexistence.
The latter is the most straightforward mitigation measure, on both the radar and the base station side. At the radar, receiver saturation and intermodulation can be avoided by placing a blocking filter in front of the low noise amplifier (LNA). At the base station, a filter can be placed on the transmitter close to the antenna to suppress out-of-band LTE emissions in the spurious domain. Furthermore, a revision of the ETSI 3GPP technical specifications TS 136.101 (for user equipment) and TS 136.104 (for base stations) is recommended. Currently, these standards impose relatively relaxed power limits on spurious emissions in non-protected bands, while the limits are much more stringent in protected bands. Because the S-band (and the L-band) are used for security and safety services, a more stringent maximum power level for spurious emissions should be defined.
In any case, the test and measurement of radar, LTE base stations, and user equipment is necessary to confirm spectral emission masks and prove robustness against other co-existing signals [6]. Off-the-shelf test & measurement equipment and dedicated test systems to characterize susceptibility to interference and jamming exist and can aid in the development of more robust communication equipment or in designing more efficient targeted jamming scenarios.
Integration of 4G LTE with Tactical Network
Advanced communication technology is a key component of military success. The integration of the 4G LTE network allows the dissemination of secured mission command data, imagery, streaming video, and voice transmission between dismounted soldiers and command centers. The availability of real-time, complete situational awareness of the surrounding area gives combat soldiers a clear advantage. Military mobile communications must keep up with the innovations in the commercial space. LTE offers lower latency, faster speeds, and a more efficient architecture than the latest wireless military network technology when it comes to two-way communication.
A military 4G LTE network can be made mobile by installing the base stations on a moving vehicle, on an unmanned aerial vehicle (UAV, commonly referred to as a drone) overhead, or even on satellites operating at UHF (300 MHz to 3 GHz). Streaming video feeds from individual endpoints and UAV cameras can be transmitted securely over this 4G network. Depending on the frequency band, LTE supports terminals moving at up to 350 km/h (220 mph) or 500 km/h (310 mph).
4G LTE makes it possible for the military to set up beyond-line-of-sight radio communication at low cost. The low frequency bands (e.g., 700 MHz) make deployment in rural areas possible, as the signal travels farther and provides better in-building coverage; fewer base stations are therefore required to serve the same area. In urban areas, on the other hand, 700 MHz carries a higher risk of capacity issues, as there are more users per cell. Typically, higher frequencies (such as 2.6 GHz) are used for small cells (micro, pico, femto, etc.) to increase system capacity in hotspot areas. Users are handed over to these cells to free up resources on the macro cell; the small cells essentially form an overlay to the macro layer, which typically uses lower frequencies to provide wide-area coverage.
With 3GPP Release 12, two essential features were added to the LTE standard. First, there is device-to-device (D2D) communication. Here, two or more devices can directly communicate with each other, using uplink spectrum (FDD mode) at certain periodically occurring moments in time or uplink subframes (TDD mode). This feature is defined for in-coverage scenarios, where a base station still serves these devices, and out-of-coverage scenarios, where no network is available. Second, there is group communication on top of D2D, which, for instance, enables these devices to establish voice communication throughout the group using the D2D functionality.
With Release 13, the standard was enhanced even further to support, for instance, mission-critical push-to-talk (MCPTT) services usable by all types of terminals, from popular smartphones to ruggedized devices. These and other features and applications are of interest for public safety. When an emergency, disaster, or other unexpected event occurs, communication infrastructure is particularly important and plays a vital role. In many instances, the terrestrial communication infrastructure, especially core network functionality, can be seriously compromised and fail to provide reliable communication for rescue teams. In such cases, isolated E-UTRAN operation, also part of Release 13, can be an effective solution: this feature enables local routing of communication (i.e., via the base station only) when the interface to the core network is damaged or unavailable.
All in all, the features incorporated in Releases 12 and 13 make LTE an interesting candidate as the underlying technology for next-generation battlefield communications.
Conclusion
This article is intended to point out vulnerabilities of LTE and LTE-Advanced. We’ve discussed a number of commonly used jamming techniques as well as more recently developed “smart” approaches: barrage jamming, partial band jamming, single-tone jamming, multi-tone jamming, asynchronous off-tone jamming, and pilot tone jamming and nulling. Although every jamming scheme has its own advantages and disadvantages, asynchronous off-tone jamming has been shown to be more efficient, in terms of figure of merit, than the other schemes.
This article has also reviewed unwanted interference and jamming mitigation schemes. We’ve offered a few solutions, including frequency division multiplexing-based solutions, time division multiplexing-based solutions, transmit power control-based solutions, and the popular frequency hopping-based solution.
We’ve also addressed the coexistence of LTE with S-band frequencies and in critical environments such as hospitals. The coexistence of LTE and S-band radar is extremely critical and requires constant attention because air traffic control and air traffic surveillance radars operate in the S-band. Coupling of transmitted LTE power into a radar’s receiver may raise the noise floor and result in a failure to detect an object in the sky.
We have identified the vulnerabilities of the technology and shared strategies and techniques to address them. It goes without saying that user equipment and the eNodeB need to be designed more robustly. Both are used in security-relevant applications and should be designed to be “self-aware” of interference and jamming and programmed to take action to maintain undegraded communication.
Testing and measurement are key components in all steps of the development and maintenance process of LTE and LTE-Advanced systems and devices, ensuring proper communication even in environments with high interference.
References
- C. Shahriar, S. Sodagari, R. McGwier, and T. C. Clancy, “Performance impact of asynchronous off‑tone jamming attacks against OFDM,” 2013 IEEE International Conference on Communications (ICC), Budapest, Hungary, 2013, pp. 2177-2182, doi: 10.1109/ICC.2013.6654850.
- J. Luo, J. Andrian, and C. Zhou, “Bit error rate analysis of jamming for OFDM systems,” Wireless Telecommunications Symposium, WTS 2007, pp. 1-8, April 2007.
- S. Chao, W. Ping, and S. Guozhong, “Performance of OFDM in the presence of multitone jamming,” 2012 IEEE Symposium on Robotics and Applications (ISRA), Kuala Lumpur, Malaysia, 2012, pp. 118-121, doi: 10.1109/ISRA.2012.6219135.
- T. C. Clancy, “Efficient OFDM Denial: Pilot Jamming and Pilot Nulling,” 2011 IEEE International Conference on Communications (ICC), Kyoto, Japan, 2011, pp. 1-5, doi: 10.1109/icc.2011.5962467.
- Rohde & Schwarz, White Paper PD 3683.3965.52, “Wireless coexistence testing based on IoT device application use cases.”
- Rohde & Schwarz, Application Note 1MA211, “Coexistence Test of LTE and Radar systems.”
- 3GPP TS 36.211, “Physical Channels and Modulation (Rel. 8).” https://www.3gpp.org
- Z. Hu, R. Susitaival, Z. Chen, I-K. Fu, P. Dayal, and S. Kumar Baghel, “Interference Avoidance for In-Device Coexistence in 3GPP LTE-Advanced: Challenge and Solutions,” IEEE Communication Magazine, November 2012.