Log-log growth of channel capacity for nondispersive nonlinear optical fiber channel in intermediate power range

I.S. Terekhov, A.V. Reznichenko, Ya A. Kharkov, S.K. Turitsyn

Research output: Contribution to journal › Article

Abstract

We consider a model nondispersive nonlinear optical fiber channel with additive Gaussian noise. Using the Feynman path-integral technique, we find the optimal input signal distribution maximizing the channel's per-sample mutual information at large signal-to-noise ratio in the intermediate power range. The optimal input signal distribution allows us to improve previously known estimates of the channel capacity. We calculate the output signal entropy, the conditional entropy, and the per-sample mutual information for Gaussian, half-Gaussian, and modified Gaussian input signal distributions. We demonstrate that in the intermediate power range the capacity (the per-sample mutual information for the optimal input signal distribution) is greater than the per-sample mutual information for the half-Gaussian input signal distribution considered previously as the optimal one. We also show that the capacity grows as log log P in the intermediate power range, where P is the signal power.
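The abstract's central claim is the log log P scaling of capacity. As a purely illustrative sketch (the power values and use of natural logarithms are assumptions, not taken from the paper), the snippet below contrasts this scaling with the familiar log P high-SNR growth of a linear Gaussian channel:

```python
import math

# Illustrative only: contrast ordinary logarithmic capacity growth
# (linear Gaussian channel, C ~ log P at high SNR) with the much slower
# log log P scaling reported for the nonlinear channel in the
# intermediate power range. P is in arbitrary units with P >> 1.
def log_growth(p: float) -> float:
    return math.log(p)

def loglog_growth(p: float) -> float:
    return math.log(math.log(p))

for p in (1e2, 1e4, 1e8, 1e16):
    print(f"P = {p:.0e}:  log P = {log_growth(p):6.2f},  "
          f"log log P = {loglog_growth(p):.2f}")
```

Squaring the power doubles log P but adds only log 2 ≈ 0.69 to log log P, which is why a log log P capacity grows so slowly with launch power.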

Original language: English
Article number: 062133
Number of pages: 15
Journal: Physical Review E
Volume: 95
Issue number: 6
DOIs: 10.1103/PhysRevE.95.062133
Publication status: Published - 26 Jun 2017

Fingerprint

channel capacity; optical fiber; mutual information; entropy; conditional entropy; Feynman path integral; Gaussian noise; signal-to-noise ratio

Bibliographical note

© APS

Cite this

@article{80b41db40a9b455db7be2f022ca19952,
title = "Log-log growth of channel capacity for nondispersive nonlinear optical fiber channel in intermediate power range",
abstract = "We consider a model nondispersive nonlinear optical fiber channel with additive Gaussian noise. Using the Feynman path-integral technique, we find the optimal input signal distribution maximizing the channel's per-sample mutual information at large signal-to-noise ratio in the intermediate power range. The optimal input signal distribution allows us to improve previously known estimates of the channel capacity. We calculate the output signal entropy, the conditional entropy, and the per-sample mutual information for Gaussian, half-Gaussian, and modified Gaussian input signal distributions. We demonstrate that in the intermediate power range the capacity (the per-sample mutual information for the optimal input signal distribution) is greater than the per-sample mutual information for the half-Gaussian input signal distribution considered previously as the optimal one. We also show that the capacity grows as log log P in the intermediate power range, where P is the signal power.",
author = "I.S. Terekhov and A.V. Reznichenko and Kharkov, {Ya A.} and S.K. Turitsyn",
note = "{\textcopyright} APS",
year = "2017",
month = "6",
day = "26",
doi = "10.1103/PhysRevE.95.062133",
language = "English",
volume = "95",
journal = "Physical Review E",
issn = "1539-3755",
publisher = "American Physical Society",
number = "6",

}

Log-log growth of channel capacity for nondispersive nonlinear optical fiber channel in intermediate power range. / Terekhov, I.S.; Reznichenko, A.V.; Kharkov, Ya A.; Turitsyn, S.K.

In: Physical Review E, Vol. 95, No. 6, 062133, 26.06.2017.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Log-log growth of channel capacity for nondispersive nonlinear optical fiber channel in intermediate power range

AU - Terekhov, I.S.

AU - Reznichenko, A.V.

AU - Kharkov, Ya A.

AU - Turitsyn, S.K.

N1 - © APS

PY - 2017/6/26

Y1 - 2017/6/26

N2 - We consider a model nondispersive nonlinear optical fiber channel with additive Gaussian noise. Using the Feynman path-integral technique, we find the optimal input signal distribution maximizing the channel's per-sample mutual information at large signal-to-noise ratio in the intermediate power range. The optimal input signal distribution allows us to improve previously known estimates of the channel capacity. We calculate the output signal entropy, the conditional entropy, and the per-sample mutual information for Gaussian, half-Gaussian, and modified Gaussian input signal distributions. We demonstrate that in the intermediate power range the capacity (the per-sample mutual information for the optimal input signal distribution) is greater than the per-sample mutual information for the half-Gaussian input signal distribution considered previously as the optimal one. We also show that the capacity grows as log log P in the intermediate power range, where P is the signal power.

UR - http://www.scopus.com/inward/record.url?scp=85021408281&partnerID=8YFLogxK

UR - https://journals.aps.org/pre/abstract/10.1103/PhysRevE.95.062133

U2 - 10.1103/PhysRevE.95.062133

DO - 10.1103/PhysRevE.95.062133

M3 - Article

VL - 95

JO - Physical Review E

JF - Physical Review E

SN - 1539-3755

IS - 6

M1 - 062133

ER -