Paper Title
Forecasting Loss of Signal in Optical Networks with Machine Learning
Paper Authors
Paper Abstract
Loss of Signal (LOS) represents a significant cost for operators of optical networks. By studying large sets of real-world Performance Monitoring (PM) data collected from six international optical networks, we find that supervised machine learning (ML) can forecast LOS events with good precision 1-7 days before they occur, albeit at relatively low recall. Our study covers twelve facility types, including 100G lines and ETH10G clients. We show that the precision for a given network improves when training on multiple networks simultaneously, relative to training on that network alone. Furthermore, we show that a single model can forecast LOS across all facility types and all networks, and that fine-tuning for a particular facility or network brings only modest improvements. Hence, our ML models remain effective for optical networks previously unknown to the model, which makes them usable for commercial applications.
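To make the abstract's framing concrete, below is a minimal sketch of LOS forecasting posed as supervised binary classification over windowed PM features, evaluated by precision and recall. The synthetic data, the 7-day window of daily counters, and the choice of a random-forest classifier are all illustrative assumptions; the paper's actual features, models, and datasets are not reproduced here.

```python
# Sketch: LOS forecasting as supervised binary classification over
# windowed Performance Monitoring (PM) features. All data is synthetic
# and all modeling choices are assumptions for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)

# Hypothetical setup: each sample is a 7-day window of daily PM
# counters (e.g., optical power, error counts) flattened into features.
n_samples, n_days, n_counters = 5000, 7, 4
X = rng.normal(size=(n_samples, n_days * n_counters))

# Label = 1 if an LOS event occurs within the forecast horizon after
# the window. LOS events are rare, so the classes are imbalanced.
y = (rng.random(n_samples) < 0.02).astype(int)
# Shift positive samples so the classifier has some signal to learn.
X[y == 1] += 0.8

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# class_weight="balanced" compensates for the rarity of LOS events.
clf = RandomForestClassifier(
    n_estimators=200, class_weight="balanced", random_state=0)
clf.fit(X_train, y_train)

y_pred = clf.predict(X_test)
# Operators care most about precision (few false alarms); recall is
# expected to be lower, consistent with the abstract's findings.
print(f"precision: {precision_score(y_test, y_pred):.2f}")
print(f"recall:    {recall_score(y_test, y_pred):.2f}")
```

In this framing, training on pooled windows from several networks (as the abstract reports) simply means concatenating their samples before fitting, which is one plausible reason a single model can generalize to networks it has never seen.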