Improving detection in optical communications using all-optical reservoir computing

Apostolos Argyris, Julián Bueno, Miguel C. Soriano, Ingo Fischer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution book

Abstract

During the last decade, reservoir computing (RC) has proven to be a powerful machine learning concept that uses recurrent networks to process sequential information streams [1]. Recently, the RC concept was drastically simplified to reduce hardware requirements, making machine learning more readily applicable in actual physical systems [2]. RC has been implemented in photonic configurations and has efficiently processed analog optical signals with simple schemes and minimal training overhead. In particular, the realization based on a semiconductor laser with time-delayed optical feedback has demonstrated the potential of all-optical machine learning, utilizing signals with transient states in the GHz regime [3]. In this work, we exploit the properties of a small-scale RC to improve the detection of very-low-quality lightwave communication signals. The RC has a limited number of virtual nodes (N = 33) and a total internal time delay of 1.65 ns to allow fast post-processing. Its design follows an all-optical configuration, including a discrete-mode semiconductor laser (SL) emitting at 1550 nm and an optical feedback cavity, as presented in [3]. We numerically simulate the lightwave communication signals by considering an on-off keying modulated binary stream at 10 Gb/s that has undergone 2000 km of fiber transmission (20 segments of 100-km spools, including optical amplification and dispersion compensation). At such distances, direct threshold detection results in BER values higher than 0.1. Even when applying offline post-processing with a ridge regression training algorithm, which considers the patterns of the detected bits within the complete detection period, the BER improves only slightly, to 0.031. When we employ the RC, the transmitted signal is optically injected into the RC's SL, resulting in a nonlinearly transformed output that depends directly on the operating conditions of the reservoir.
After training the reservoir with the previously considered regression algorithm, we find many operating conditions for which bit stream recovery improves significantly (Fig. 1a). In particular, when the memory properties of the reservoir are incorporated in the training process, i.e. by taking into account the transformed patterns of the previous bits, BER values lower than 10⁻⁴ are obtained (Fig. 1b).
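The training idea described in the abstract, a linear ridge-regression readout over the reservoir's virtual-node responses that can optionally include the responses to previous bits (the reservoir's "memory"), can be sketched generically. The snippet below is a minimal stand-in: it replaces the semiconductor-laser reservoir and the 2000-km fiber model with a simple tanh node nonlinearity and a toy inter-symbol-interference channel, so all parameters and the channel coefficients are illustrative assumptions, not the paper's model.

```python
# Sketch: ridge-regression readout over N=33 nonlinear node responses per
# bit, with optional "memory" taps from previous bits. The reservoir is a
# generic tanh expansion, NOT the semiconductor-laser model of the paper.
import numpy as np

rng = np.random.default_rng(0)

N_NODES = 33      # virtual nodes per bit, as in the abstract
N_BITS = 20000
MEMORY = 2        # previous bits included in the readout features

# Toy "channel": an OOK bit stream smeared by inter-symbol interference
# from the two preceding bits, plus additive noise (illustrative only).
bits = rng.integers(0, 2, N_BITS)
received = (bits + 0.6 * np.roll(bits, 1) + 0.3 * np.roll(bits, 2)
            + 0.1 * rng.standard_normal(N_BITS))

# Stand-in reservoir: expand each received sample into N_NODES nonlinear
# node responses via a fixed random input mask.
mask = rng.uniform(-1, 1, N_NODES)
states = np.tanh(np.outer(received, mask))           # (N_BITS, N_NODES)

# Readout features: node states of the current bit and the MEMORY
# previous bits, plus a constant bias column.
feats = np.hstack([np.roll(states, k, axis=0) for k in range(MEMORY + 1)])
feats = np.hstack([feats, np.ones((N_BITS, 1))])

# Ridge regression on the first half: w = (X^T X + lam I)^-1 X^T y.
train = slice(MEMORY, N_BITS // 2)
test = slice(N_BITS // 2, N_BITS)
X, y = feats[train], bits[train].astype(float)
lam = 1e-3
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Threshold the trained readout on the held-out half and measure BER.
decisions = (feats[test] @ w > 0.5).astype(int)
ber = np.mean(decisions != bits[test])
print(f"readout BER with memory={MEMORY}: {ber:.4f}")
```

On this toy channel, the memory taps matter for the same reason as in the abstract: the interference on each bit is caused by the previous bits, so a readout that sees only the current bit's node states cannot separate the overlapping patterns that a memory-augmented readout can.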
Language: English
Title of host publication: 2017 Conference on Lasers and Electro-Optics Europe & European Quantum Electronics Conference (CLEO/Europe-EQEC)
Place of publication: Piscataway, NJ
Publisher: IEEE
ISBN (electronic): 9781509067367
DOI: 10.1109/CLEOE-EQEC.2017.8086463
Publication status: Published - 30 Oct 2017
Externally published: Yes
Event: The European Conference on Lasers and Electro-Optics, CLEO_Europe 2017 - Munich, Germany
Duration: 25 Jun 2017 - 29 Jun 2017

Conference

Conference: The European Conference on Lasers and Electro-Optics, CLEO_Europe 2017
Country: Germany
City: Munich
Period: 25/06/17 - 29/06/17

Keywords

  • reservoirs
  • optical feedback
  • training
  • transient analysis
  • optical fiber communication
  • photonics
  • optical fibers

Cite this

Argyris, A., Bueno, J., Soriano, M. C., & Fischer, I. (2017). Improving detection in optical communications using all-optical reservoir computing. In 2017 Conference on Lasers and Electro-Optics Europe & European Quantum Electronics Conference (CLEO/Europe-EQEC) Piscataway, NJ: IEEE. https://doi.org/10.1109/CLEOE-EQEC.2017.8086463