A Survey of Applied Machine Learning Techniques for Optical OFDM based Networks
Open archive: Preprint, working paper, ...
Published by HAL CCSD
26 pages, 4 figures and 4 tables. In this survey, we analyze the latest machine learning (ML) techniques for optical orthogonal frequency division multiplexing (O-OFDM)-based optical communications. ML has been proposed to mitigate channel and transceiver imperfections. For instance, ML can improve signal quality under a low modulation extinction ratio, and it can tackle both deterministic and stochastically induced nonlinearities such as parametric noise amplification in long-haul transmission. The proposed ML algorithms for O-OFDM can in particular tackle inter-subcarrier nonlinear effects such as four-wave mixing and cross-phase modulation. In essence, these ML techniques could benefit any multi-carrier approach (e.g., filter bank modulation). Supervised and unsupervised ML techniques are analyzed in terms of both O-OFDM transmission performance and computational complexity, with a view to potential real-time implementation. We specify the conditions under which an ML algorithm should perform classification, regression, or clustering. The survey also discusses open research issues and future directions toward ML implementation.