Erasing Noise in Signal Detection with Diffusion Model: From Theory to Application

Wang, Xiucheng, Zheng, Peilin, Cheng, Nan

arXiv.org Artificial Intelligence 

In this paper, a signal detection method based on the denoising diffusion model (DM) is proposed, which outperforms the maximum likelihood (ML) estimation method that has long been regarded as the optimal signal detection technique. Theoretically, a novel mathematical theory for intelligent signal detection based on stochastic differential equations (SDEs) is established, demonstrating the effectiveness of the DM in reducing the additive white Gaussian noise (AWGN) in received signals. Moreover, a mathematical relationship between the signal-to-noise ratio (SNR) and the timestep in the DM is established, revealing that for any given SNR a corresponding optimal timestep can be identified. Furthermore, to address the potential out-of-distribution issue of inputs to the DM, a mathematical scaling technique is employed that allows a trained DM to handle signal detection across a wide range of SNRs without any fine-tuning.

Xiucheng Wang, Peilin Zheng, and Nan Cheng are with the State Key Laboratory of ISN and School of Telecommunications Engineering, Xidian University, Xi'an 710071, China.

Signal detection plays a critical role in digital baseband transmission, since it estimates which symbols were transmitted by the sender from the noisy received signal. The performance of signal detection therefore directly impacts the symbol error rate (SER) of data transmission, which in turn determines the error-free transmission rate, also known as the Shannon threshold [1]. As a result, numerous signal detection techniques have been developed to minimize the SER and bring the transmission rate as close as possible to the Shannon threshold.
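As a point of reference for the DM-based detector, the classical ML detector under AWGN with equiprobable symbols reduces to minimum-distance (nearest-constellation-point) decisions. The sketch below illustrates this baseline; the 4-QAM constellation, SNR, and sample count are illustrative choices, not values taken from the paper.

```python
import numpy as np

# Minimal sketch of classical ML detection under AWGN: for equiprobable
# symbols and Gaussian noise, the ML rule reduces to choosing the
# constellation point closest to each received sample (minimum Euclidean
# distance). The 4-QAM constellation, SNR, and sample count below are
# illustrative assumptions, not values from the paper.
rng = np.random.default_rng(0)

constellation = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)  # unit-energy 4-QAM
tx_idx = rng.integers(len(constellation), size=10_000)
tx = constellation[tx_idx]

snr_db = 10.0                                    # assumed SNR for this example
noise_var = 10 ** (-snr_db / 10)                 # E|s|^2 = 1, so sigma^2 = 1 / SNR
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(tx.shape) + 1j * rng.standard_normal(tx.shape))
rx = tx + noise

# ML (minimum-distance) decision: pick the nearest constellation point per sample.
rx_idx = np.argmin(np.abs(rx[:, None] - constellation[None, :]), axis=1)
ser = np.mean(rx_idx != tx_idx)
print(f"Empirical SER at {snr_db} dB SNR: {ser:.4f}")
```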
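For the DM-based approach summarized in the abstract, the claimed SNR-timestep correspondence can be made concrete under the standard DDPM forward process x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps: for a unit-power signal, the SNR implied at step t is alpha_bar_t / (1 - alpha_bar_t), which can be inverted to pick the timestep matching a given channel SNR. The sketch below assumes a linear noise schedule, an illustrative choice rather than the schedule used in the paper.

```python
import numpy as np

# Sketch of the SNR-timestep correspondence for a standard DDPM forward
# process x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps.
# For a unit-power x_0, the effective SNR at step t is
# alpha_bar_t / (1 - alpha_bar_t), so a received signal at a known SNR can
# be mapped to the closest timestep. The linear beta schedule and its
# endpoints below are illustrative assumptions, not taken from the paper.
T = 1000
betas = np.linspace(1e-4, 2e-2, T)               # assumed linear noise schedule
alpha_bar = np.cumprod(1.0 - betas)

def timestep_for_snr(snr_db: float) -> int:
    """Return the timestep whose implied forward-process SNR is closest to snr_db."""
    snr_linear = 10 ** (snr_db / 10)
    snr_t = alpha_bar / (1.0 - alpha_bar)        # implied SNR at each timestep
    return int(np.argmin(np.abs(snr_t - snr_linear)))

print(timestep_for_snr(10.0), timestep_for_snr(0.0), timestep_for_snr(-5.0))
```

A natural companion step, stated here as an assumption rather than a quotation of the paper's scaling technique, is to rescale the received signal by sqrt(alpha_bar_t) so that its statistics match the forward-process marginal at the selected timestep before running the reverse (denoising) step.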