Machine Learning for Hardware-Constrained Wireless Communication Systems

Description

Millimeter wave (mmWave) and massive multiple-input multiple-output (MIMO) systems are intrinsic components of 5G and beyond. These systems rely on beamforming codebooks for both initial access and data transmission. Current beam codebooks, however, are not optimized for the given deployment, which can incur noticeable performance loss. To address these problems, this dissertation proposes three novel machine learning (ML) based frameworks for site-specific analog beam codebook design. In the first framework, two special neural network-based architectures are designed for learning environment- and hardware-aware beam codebooks through supervised and self-supervised learning, respectively. To avoid explicitly estimating the channels, the second framework develops a deep reinforcement learning-based architecture. The proposed solution significantly relaxes the system requirements and is particularly interesting in scenarios where channel acquisition is challenging. Building on this, the third framework develops a sample-efficient online reinforcement learning-based beam codebook design algorithm that learns how to shape the beam patterns to null the interfering directions, without requiring any coordination with the interferers. In the last part of the dissertation, the proposed beamforming framework is further extended to tackle the beam focusing problem in near-field wideband systems. Specifically, the developed solution can achieve beam focusing without knowing the user position and can account for unknown and non-uniform array geometries. All the frameworks are numerically evaluated, and the simulation results highlight their potential for learning site-specific codebooks that adapt to the deployment. Furthermore, a hardware proof-of-concept prototype based on mmWave phased arrays is built and used to evaluate the developed online beam learning solutions in realistic scenarios.
The learned beam patterns, measured in an anechoic chamber, confirm the performance gains of the developed framework. Altogether, these results highlight a promising ML-based beam/codebook optimization direction for practical, hardware-constrained mmWave and terahertz systems.
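As a toy illustration of the self-supervised, hardware-aware codebook idea described above, the sketch below learns a single analog beam under a phase-shifter (unit-modulus) constraint by gradient ascent on the average beamforming gain over sample channels from one deployment. The array size, channel model, learning rate, and iteration count are illustrative assumptions, not the dissertation's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16                                    # number of antennas (assumed)
num_ch = 200                              # sample channels from one "deployment"

# Toy site-specific channels: steering vectors clustered around one direction.
directions = rng.normal(loc=0.5, scale=0.02, size=num_ch)   # sin(AoA) values
n = np.arange(N)
H = np.exp(1j * np.pi * np.outer(directions, n))            # (num_ch, N)

def mean_gain(theta, H):
    """Average beamforming gain |w^H h|^2 with w_n = e^{j theta_n}/sqrt(N)."""
    a = H @ np.exp(-1j * theta)
    return np.mean(np.abs(a) ** 2) / N

theta = rng.uniform(0.0, 2.0 * np.pi, N)  # random phase-shifter settings
g0 = mean_gain(theta, H)

lr = 0.2
for _ in range(500):
    a = H @ np.exp(-1j * theta)
    # Analytic gradient of the average gain w.r.t. each phase shifter.
    grad = (2.0 / N) * np.imag(np.conj(a)[:, None] * H * np.exp(-1j * theta)).mean(axis=0)
    theta += lr * grad                    # gradient-ascent update on the phases

g1 = mean_gain(theta, H)
print(f"mean gain before: {g0:.2f}, after: {g1:.2f}")
```

Because the beam is parameterized directly by its phase shifts, the learned codeword always respects the analog hardware constraint, which is the key point of the hardware-aware formulation.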
Date Created
2023

Blockage Prediction for Millimeter Wave Communication Systems

Description


This work addresses the following four problems: (i) Will a blockage occur in the near future? (ii) When will this blockage occur? (iii) What is the type of the blockage? And (iv) what is the direction of the moving blockage? The proposed solution utilizes deep neural networks (DNNs) as well as non-machine-learning (ML) algorithms. At the heart of the proposed method is identifying special patterns in the received signal and sensory data before the blockage occurs ("pre-blockage signatures") and inferring future blockages from these signatures. To evaluate the proposed approach, real-world datasets are first built for both in-band mmWave systems and LiDAR-aided mmWave systems based on the DeepSense 6G structure. In particular, for the in-band mmWave system, two real-world datasets are constructed -- one for an indoor scenario and the other for an outdoor scenario. DNN models are then developed to proactively predict the incoming blockages for both scenarios. For LiDAR-aided blockage prediction, a large-scale real-world dataset that includes co-existing LiDAR and mmWave communication measurements is constructed for outdoor scenarios. Then, an efficient LiDAR data denoising (static cluster removal) algorithm is designed to remove the dataset noise. Finally, a non-ML method and a DNN model that proactively predict dynamic link blockages are developed. Experiments using the in-band mmWave datasets show that the proposed approach can successfully predict the occurrence of future dynamic blockages (up to 5 s ahead) with more than 80% accuracy in the indoor scenario. For the outdoor scenario with highly mobile vehicular blockages, the proposed model can predict the exact time of a future blockage with less than 100 ms error for blockages happening within the next 600 ms. The proposed method can also predict the size and moving direction of the blockages.
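As a rough sketch of the static-cluster-removal idea (not the dissertation's actual denoising algorithm), the snippet below voxelizes each LiDAR frame into a grid and drops points whose cell is occupied in most frames, keeping only dynamic returns. The cell size, occupancy threshold, and toy 2-D point clouds are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 2-D LiDAR frames: a static wall plus one object moving across the scene.
wall_x = np.full(20, 5.1)
wall_y = -1.9 + 0.25 * np.arange(20)
static = np.stack([wall_x, wall_y], axis=1)
frames = []
for t in range(30):
    moving = np.array([[1.0 + 0.1 * t, 0.1]])                  # drifting object
    pts = np.vstack([static + rng.normal(0.0, 0.01, static.shape), moving])
    frames.append(pts)

def remove_static_clusters(frames, cell=0.25, occ_ratio=0.5):
    """Drop points whose grid cell is occupied in >= occ_ratio of all frames."""
    counts = {}
    cells_per_frame = []
    for pts in frames:
        cells = np.floor(pts / cell).astype(int)
        cells_per_frame.append(cells)
        for key in {tuple(c) for c in cells}:                  # count once per frame
            counts[key] = counts.get(key, 0) + 1
    thresh = occ_ratio * len(frames)
    cleaned = []
    for pts, cells in zip(frames, cells_per_frame):
        keep = np.array([counts[tuple(c)] < thresh for c in cells])
        cleaned.append(pts[keep])
    return cleaned

cleaned = remove_static_clusters(frames)
print(len(cleaned[0]), "dynamic point(s) kept per frame")  # wall removed, mover kept
```

The intuition matches the text: static background clusters reappear in (nearly) every frame, so frame-level occupancy statistics separate them from moving blockage candidates without any learning.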
For the co-existing LiDAR and mmWave real-world dataset, the LiDAR-aided approach is shown to achieve above 95% accuracy in predicting blockages occurring within 100 ms and more than 80% accuracy for blockages occurring within one second. Further, for the outdoor scenario with highly mobile vehicular blockages, the proposed model can predict the exact time of a future blockage with less than 150 ms error for blockages happening within one second. In addition, the method achieves above 92% accuracy in classifying the blockage type and above 90% accuracy in predicting the blockage moving direction. The proposed solutions can potentially provide an order-of-magnitude saving in network latency, highlighting a promising approach for addressing the blockage challenges in mmWave/sub-THz networks.
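A minimal sketch of the pre-blockage-signature idea, assuming a synthetic dataset in which receive-power windows that precede a blockage show a gradual dip; a single-layer logistic classifier trained by gradient descent stands in for the DNN models described above, and all data shapes and hyperparameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
T, n_train, n_test = 10, 400, 200        # window length and split sizes (assumed)

def make_batch(n):
    """Toy mmWave receive-power windows; half carry a pre-blockage dip signature."""
    y = rng.integers(0, 2, n)
    X = 1.0 + 0.05 * rng.normal(size=(n, T))
    X[y == 1] -= np.linspace(0.0, 0.5, T)  # gradual drop before the blockage hits
    return X, y

Xtr, ytr = make_batch(n_train)
Xte, yte = make_batch(n_test)

# Logistic "will a blockage occur?" classifier on the observation window.
w, b = np.zeros(T), 0.0
lr = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(Xtr @ w + b)))   # predicted blockage probability
    grad_w = Xtr.T @ (p - ytr) / n_train       # cross-entropy gradient
    grad_b = (p - ytr).mean()
    w -= lr * grad_w
    b -= lr * grad_b

pred = (1.0 / (1.0 + np.exp(-(Xte @ w + b)))) > 0.5
acc = (pred == yte).mean()
print(f"held-out accuracy: {acc:.2f}")
```

The learned weights end up emphasizing the later samples of the window, i.e., the classifier picks out exactly the kind of temporal signature the text describes, albeit on idealized data.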

Date Created
2022