A Review on Machine Learning Assisted Handover Mechanisms for Future Generation IoT Networks

Vaidehi Bakshi, Rakesh Kumar

Abstract

Machine learning and deep learning algorithms have been widely explored to identify potential avenues for optimizing future generation IoT networks. One such avenue is a data-driven model for initiating handover among multiple access techniques such as FDM, OFDM, OTFS and NOMA. The volume of data generated under the variable channel conditions typical of IoT applications is extremely large, and hence conventional rule-based mechanisms do not achieve high accuracy for handover problems in IoT and wireless ad hoc networks. With the increasing data handling capability of machine learning and deep learning models, handover decisions based on channel metrics such as the fading factor, received SNR and error rates can be implemented, removing the need for conventional handover mechanisms in software defined networks. Multiple machine learning and deep learning models have been proposed for initiating handovers in IoT applications; these are reviewed and discussed in this paper. The salient features of each approach are highlighted and potential research gaps are identified, thereby paving the path for future research in the domain.
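To make the data-driven handover idea concrete, the sketch below trains a supervised classifier that maps per-interval channel metrics (received SNR, fading factor, error rate) to a binary handover decision. This is an illustrative example, not a method from the surveyed works: the feature ranges, the labeling rule and the synthetic data are assumptions made purely for demonstration, and any ML or DL model discussed in the paper could replace the classifier used here.

```python
# Illustrative sketch (not from the paper): map channel metrics to a
# binary handover decision with a supervised classifier.
# All data below is synthetic and the labeling rule is an assumption.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(seed=0)
n_samples = 5000

# Hypothetical channel metrics per measurement interval:
# received SNR (dB), fading factor, and bit error rate.
snr_db = rng.uniform(-5, 30, n_samples)
fading_factor = rng.uniform(0, 10, n_samples)
ber = 10 ** rng.uniform(-6, -1, n_samples)

X = np.column_stack([snr_db, fading_factor, ber])

# Synthetic ground-truth rule used only to generate labels for this sketch:
# trigger a handover when link quality is poor (low SNR, deep fading, high BER).
y = ((snr_db < 5) | ((fading_factor < 2) & (ber > 1e-3))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

print("Handover decision accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In a deployed system the labels would come from measured link outcomes rather than a hand-written rule, and the classifier could be swapped for any of the deep learning models reviewed in the paper.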
