Article

Forecasting loss of signal in optical networks with machine learning

Journal

JOURNAL OF OPTICAL COMMUNICATIONS AND NETWORKING
Volume 13, Issue 10, Pages E109-E121

Publisher

Optica Publishing Group
DOI: 10.1364/JOCN.423667


Abstract

Loss of signal (LOS) represents a significant cost for operators of optical networks. By studying large sets of real-world performance-monitoring data collected from six international optical networks, we find that supervised machine learning (ML) can forecast LOS events with good precision one to seven days before they occur, albeit at relatively low recall. Our study covers 12 facility types, including 100G lines and ETH10G clients. We show that precision for a given network improves when training on multiple networks simultaneously rather than on that network alone. Furthermore, we show that a single model can forecast LOS across all facility types and all networks, with fine-tuning for a particular facility or network bringing only modest improvements. Hence our ML models remain effective for optical networks previously unknown to the model, which makes them usable for commercial applications. © 2021 Optical Society of America
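The multi-network training setup the abstract describes can be illustrated with a minimal sketch. This is not the authors' code or data: the features, thresholds, and model (a plain logistic regression standing in for whatever supervised ML the paper uses) are all invented here, and the "networks" are synthetic. The point is only the structure: pool labeled performance-monitoring windows from several networks into one training set, fit a single classifier, and evaluate precision and recall on a held-out network.

```python
# Hedged sketch, NOT the paper's method: pooled training of a binary
# LOS-forecast classifier on synthetic performance-monitoring windows
# from several hypothetical "networks". All feature semantics and
# parameters below are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def make_network(n, shift):
    # Synthetic PM feature windows (imagine e.g. power drift, error
    # counts); label 1 = an LOS event follows within the horizon.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 4))
    w_true = np.array([1.5, -1.0, 0.8, 0.0])      # hidden ground truth
    p = 1 / (1 + np.exp(-(X @ w_true - 2.0)))     # rare-ish positives
    y = (rng.random(n) < p).astype(float)
    return X, y

def fit_logreg(X, y, lr=0.1, steps=500):
    # Gradient-descent logistic regression, a stand-in for any
    # supervised model (e.g. gradient-boosted trees).
    Xb = np.hstack([X, np.ones((len(X), 1))])     # add bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(Xb @ w)))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def precision_recall(w, X, y, thresh=0.5):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    pred = 1 / (1 + np.exp(-(Xb @ w))) > thresh
    tp = np.sum(pred & (y == 1))
    prec = tp / max(pred.sum(), 1)                # avoid div by zero
    rec = tp / max((y == 1).sum(), 1)
    return prec, rec

# Pool training windows from several networks into ONE model,
# mirroring the "train on multiple networks simultaneously" setup.
nets = [make_network(2000, s) for s in (0.0, 0.3, -0.2)]
X_pool = np.vstack([X for X, _ in nets])
y_pool = np.concatenate([y for _, y in nets])
w = fit_logreg(X_pool, y_pool)

# Evaluate on a held-out "network" the model never saw in training.
X_test, y_test = make_network(1000, 0.1)
prec, rec = precision_recall(w, X_test, y_test)
print(f"precision={prec:.2f} recall={rec:.2f}")
```

On this toy data the pooled model typically shows the qualitative pattern the abstract reports: usable precision with lower recall, and transfer to a network absent from training. None of the numbers here correspond to the paper's results.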

