Compensation of nonlinear signal distortions is a key challenge in ensuring high throughput and extending the reach of modern fiber-optic communication systems. Nonlinear effects significantly limit transmission quality, especially as signal power increases and signals propagate over long distances. The development of accurate, robust, and computationally efficient compensation methods that incorporate modern machine learning approaches is therefore particularly relevant in optical telecommunications engineering. This paper presents a comparative analysis of efficient nonlinear distortion compensation algorithms for fiber-optic communication systems based on machine learning methods. Modifications of the classical digital backpropagation (DBP) method are discussed: learned DBP (LDBP), enhanced DBP (EnDBP), a perturbation-based model (PBM), a hybrid PB-DBP scheme, and an approach using convolutional neural networks (CNNs). Using experimental data from a laboratory prototype of a 2000-km optical transmission line, the capabilities of parameter learning are demonstrated, and the methods are compared in terms of compensation accuracy and computational complexity. The methods yield a significant increase in signal-to-noise ratio and make it possible to optimize the trade-off between accuracy and the load on the digital signal processing (DSP) module.
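To illustrate the baseline that the learned variants build on, the following is a minimal single-channel DBP sketch using the split-step Fourier method: the received signal is propagated through a virtual fiber whose dispersion and Kerr nonlinearity have inverted signs. All parameter values (span count, span length, β₂, γ, sampling rate) are illustrative assumptions, not the paper's experimental settings, and sign conventions may need adjusting to match a given simulator.

```python
import numpy as np

def dbp_compensate(signal, n_spans=25, span_len_km=80.0, steps_per_span=1,
                   beta2=-21.7e-27, gamma=1.3e-3, fs=64e9):
    """Minimal single-channel digital backpropagation (illustrative sketch).

    Undoes chromatic dispersion (beta2, s^2/m) and Kerr nonlinearity
    (gamma, 1/(W*m)) by split-step propagation with inverted signs.
    Parameter defaults are typical textbook values, not measured ones.
    """
    n = len(signal)
    omega = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / fs)  # angular-frequency grid
    dz = span_len_km * 1e3 / steps_per_span            # step size [m]
    # Linear step: inverse dispersion phase applied in the frequency domain
    lin = np.exp(0.5j * beta2 * omega ** 2 * dz)
    x = np.asarray(signal, dtype=np.complex128)
    for _ in range(n_spans * steps_per_span):
        x = np.fft.ifft(np.fft.fft(x) * lin)               # undo dispersion
        x = x * np.exp(-1j * gamma * np.abs(x) ** 2 * dz)  # undo Kerr phase
    return x
```

LDBP-style methods replace the fixed `lin` filters and nonlinear phase coefficients with parameters optimized from data, which is what allows the accuracy/complexity trade-off discussed above to be tuned.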
