Coupled Transceivers-Fiber Nonlinearity Compensation Based on Machine Learning for Probabilistic Shaping System

Thanh Tu Nguyen, Tingting Zhang, Elias Giacoumidis, Abdallah Ali, Mingming Tan, Paul Harper, Liam P. Barry, Andrew D. Ellis

Research output: Contribution to journal › Article

Abstract

In this paper, we experimentally demonstrate the combined benefit of artificial neural network-based nonlinearity compensation and probabilistic shaping for the first time. We demonstrate that the scheme compensates not only for the transceiver's nonlinearity, enabling the full benefits of shaping to be achieved, but also for the combined effects of transceiver and fiber propagation nonlinearities. The performance of the proposed artificial neural network is demonstrated at 28 Gbaud for both 64-QAM and 256-QAM probabilistically shaped systems and compared to that of uniformly distributed constellations. Our experimental results demonstrate: the expected performance gains for shaping alone; an additional SNR gain of up to 1 dB in the linear region; and an additional mutual information gain of 0.2 bits per channel use in the constellation-entropy-limited region. In the presence of coupled transceiver and fiber-induced nonlinearities, an additional mutual information enhancement of ∼0.13 bits/symbol is experimentally observed for a fiber link of up to 500 km with the aid of the proposed artificial neural network.
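For readers unfamiliar with neural-network equalization of the kind described in the abstract, the sketch below shows a generic feed-forward equalizer that maps a window of received symbols to a corrected output symbol. It is a minimal, hypothetical illustration only: the paper's actual network architecture, input features, and training procedure are not given in this record, so the window size, hidden-layer sizes, training split, and the toy distorted 16-QAM data used here are all assumptions.

```python
# Hypothetical sketch of ANN-based nonlinearity compensation (not the authors' code).
# A small fully connected network is trained on known "pilot" symbols to map a
# sliding window of distorted received symbols back to the transmitted symbol.
import numpy as np
from sklearn.neural_network import MLPRegressor

def windowed_features(rx, half_window=2):
    """Stack real/imag parts of a (2*half_window+1)-symbol window per received symbol."""
    n = len(rx)
    pad = np.pad(rx, half_window, mode="edge")
    feats = np.empty((n, 2 * (2 * half_window + 1)))
    for k in range(n):
        win = pad[k:k + 2 * half_window + 1]
        feats[k] = np.concatenate([win.real, win.imag])
    return feats

# Toy stand-in for measured pilot data (assumption): 16-QAM symbols distorted by a
# memoryless cubic nonlinearity plus additive noise.
rng = np.random.default_rng(1)
levels = np.array([-3.0, -1.0, 1.0, 3.0])
tx = rng.choice(levels, size=5000) + 1j * rng.choice(levels, size=5000)
rx = (tx + 0.05j * tx * np.abs(tx) ** 2
      + 0.1 * (rng.normal(size=5000) + 1j * rng.normal(size=5000)))

X = windowed_features(rx)
y = np.column_stack([tx.real, tx.imag])

# Small feed-forward network mapping the received window to the corrected symbol.
ann = MLPRegressor(hidden_layer_sizes=(32, 16), activation="tanh",
                   max_iter=500, random_state=0)
ann.fit(X[:4000], y[:4000])      # train on the "pilot" portion
pred = ann.predict(X[4000:])     # equalize the remaining symbols
mse = np.mean(np.sum((pred - y[4000:]) ** 2, axis=1))
print(f"Post-ANN mean squared error: {mse:.3f}")
```

In an experiment such as the one reported above, this kind of equalizer would be trained on known transmitted sequences and applied after the standard receiver DSP; scikit-learn's MLPRegressor is used here purely for brevity and is not the framework used in the paper.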
Original language: English
Journal: Journal of Lightwave Technology
Early online date: 7 Oct 2020
DOIs
Publication status: E-pub ahead of print - 7 Oct 2020

Bibliographical note

This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/

Funding: This work was supported by the UK EPSRC - Grants EP/S003436/1
(PHOS), EP/S016171/1 (EEMC) and EP/R035342/1 (TRANSNET).

