Knowledge Distillation Applied to Optical Channel Equalization: Solving the Parallelization Problem of Recurrent Connection

Sasipim Srivallapanondh, Pedro J. Freire, Bernhard Spinnler, Nelson Costa, Antonio Napoli, Sergei K. Turitsyn, Jaroslaw E. Prilepsky

Research output: Chapter in Book/Published conference output › Conference publication

Abstract

To circumvent the non-parallelizability of recurrent neural network (RNN)-based equalizers, we propose knowledge distillation to recast the RNN into a parallelizable feedforward structure. The latter shows a 38% latency reduction while degrading the Q-factor by only 0.5 dB.
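The core idea can be illustrated with a toy numerical sketch (not the authors' implementation): a recurrent filter must be evaluated sequentially, whereas a feedforward filter over a sliding input window is fully parallelizable. Knowledge distillation here means fitting the feedforward "student" to reproduce the recurrent "teacher's" outputs rather than the true labels. The linear channel, tap count, and least-squares fit below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Teacher": a toy recurrent equalizer, y[n] = a*y[n-1] + b*x[n].
# Each output depends on the previous output -> inherently sequential.
def teacher_rnn(x, a=0.6, b=1.0):
    y = np.zeros_like(x)
    for n in range(len(x)):
        y[n] = a * (y[n - 1] if n > 0 else 0.0) + b * x[n]
    return y

# Sliding windows of past inputs (zero-padded at the start).
# A feedforward model over these windows has no inter-step dependency,
# so all outputs can be computed in parallel.
def windows(x, taps):
    xp = np.concatenate([np.zeros(taps - 1), x])
    return np.lib.stride_tricks.sliding_window_view(xp, taps)

# Knowledge distillation: regress the student's taps onto the
# teacher's outputs (teacher targets, not ground-truth labels).
x = rng.standard_normal(5000)
y_teacher = teacher_rnn(x)
taps = 12
X = windows(x, taps)
w, *_ = np.linalg.lstsq(X, y_teacher, rcond=None)

# "Student": a feedforward (FIR) filter applied to all windows at once.
y_student = X @ w
mse = np.mean((y_student - y_teacher) ** 2)
print(f"distillation MSE: {mse:.2e}")
```

Because the teacher's impulse response decays geometrically, a short feedforward window mimics it closely; the same trade (a small accuracy loss for a parallelizable structure) is what the paper quantifies as 0.5 dB of Q-factor for 38% lower latency.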
Original language: English
Title of host publication: Proceedings of 2023 Optical Fiber Communications Conference and Exhibition (OFC)
Publisher: IEEE
ISBN (Electronic): 9781957171180
DOIs
Publication status: Published - 5 Mar 2023
Event: 2023 Optical Fiber Communications Conference and Exhibition (OFC)
Duration: 5 Mar 2023 – 9 Mar 2023

Publication series

Name: 2023 Optical Fiber Communications Conference and Exhibition (OFC)
Publisher: IEEE

Conference

Conference: 2023 Optical Fiber Communications Conference and Exhibition (OFC)
Abbreviated title: OFC
Period: 5/03/23 – 9/03/23

Bibliographical note

Funding Information:
Acknowledgements: This work has received funding from the EU Horizon 2020 program under the Marie Skłodowska-Curie grant agreement No. 956713 (MENTOR) and 813144 (REAL-NET). SKT acknowledges the support of the EPSRC project TRANSNET (EP/R035342/1).

Publisher Copyright:
© 2023 The Author(s).
