Deep Learning-based Channel Estimation for Massive MIMO Systems: A Review of Recent Advancements and Future Directions

Authors

  • Tawadod Ali Hussein, Department of Electrical Engineering, College of Engineering, University of Tikrit, Tikrit, Iraq

DOI:

https://doi.org/10.65204/djes.v3i1.508

Abstract

Massive Multiple-Input Multiple-Output (MIMO) systems are a key enabling technology for 5G and beyond, promising notable gains in spectral and power efficiency. However, realizing their full capacity is hindered by the challenge of acquiring accurate channel state information (CSI) for a very large number of antennas. Traditional methods such as least squares (LS) and minimum mean square error (MMSE) estimation are often computationally intensive or require prior statistical knowledge, and they struggle with the high-dimensional, real-time nature of today's wireless channels. Over the past three years, Machine Learning (ML) and Deep Learning (DL) have emerged as powerful paradigms for overcoming these limitations. Recent research has demonstrated remarkable performance: DL models have achieved a 30% reduction in normalized mean square error (NMSE) compared to conventional linear estimators in low signal-to-noise ratio (SNR) environments. In addition, hybrid DL architectures based on recurrent neural networks (RNNs) or convolutional neural networks (CNNs) have shown average improvements of 15-25% in bit error rate (BER) for time-varying channel tracking. These advances are driven by ML architectures designed to learn complex channel structures and to mitigate problems such as pilot contamination, leading to more scalable, efficient, and robust systems.
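To make the baseline estimators named in the abstract concrete, the following is a minimal NumPy sketch (not from the paper) contrasting LS and linear MMSE (LMMSE) channel estimation on a toy pilot-based model. All parameters here (pilot length, SNR, the exponential channel correlation) are illustrative assumptions; the point is that LMMSE exploits prior channel statistics that LS does not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: N pilot positions with known unit-power BPSK pilots.
N = 64
snr_db = 5.0
noise_var = 10 ** (-snr_db / 10)

# Assumed Rayleigh channel with exponential correlation (covariance R).
idx = np.arange(N)
R = 0.9 ** np.abs(idx[:, None] - idx[None, :])
L = np.linalg.cholesky(R)
h = L @ (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

# Received pilots: y = h * x + n.
x = rng.choice([-1.0, 1.0], size=N)
n = np.sqrt(noise_var / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
y = h * x + n

# LS estimate: invert the pilots directly -- cheap, but noise-limited.
h_ls = y / x

# LMMSE estimate: filter the LS estimate with the channel covariance.
# Requires prior statistics (R, noise_var) and a matrix inverse, which is
# exactly the cost/knowledge burden the abstract attributes to MMSE.
W = R @ np.linalg.inv(R + noise_var * np.eye(N))
h_mmse = W @ h_ls

def nmse(est):
    return np.mean(np.abs(est - h) ** 2) / np.mean(np.abs(h) ** 2)

print(f"LS NMSE:    {nmse(h_ls):.3f}")
print(f"LMMSE NMSE: {nmse(h_mmse):.3f}")
```

At low SNR the LMMSE estimate has markedly lower NMSE than LS, which is the gap that the DL estimators surveyed in this review aim to close without requiring explicit knowledge of R and the noise variance.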

Published

2026-03-22