*Article* **Self-Attention Mechanism-Based Multi-Channel QoT Estimation in Optical Networks**

**Yuhang Zhou <sup>1</sup> , Xiaoli Huo <sup>2</sup> , Zhiqun Gu <sup>1,</sup>\*, Jiawei Zhang <sup>1</sup> , Yi Ding <sup>2</sup> , Rentao Gu <sup>1</sup> and Yuefeng Ji <sup>1</sup>**

<sup>1</sup> State Key Lab of Information Photonics and Optical Communications, Beijing University of Posts and Telecommunications (BUPT), Beijing 100876, China

<sup>2</sup> China Telecom Research Institute, Beijing 102209, China

**\*** Correspondence: guzhiqun@bupt.edu.cn

**Abstract:** Estimating the quality of transmission (QoT) of lightpaths before their establishment is essential for the efficient planning and operation of optical networks. Due to the nonlinear effect of fibers, deployed lightpaths influence each other's QoT; multi-channel QoT estimation is therefore necessary, as it provides complete QoT information for network optimization. Moreover, different interfering channels have different effects on the channel under test. However, existing artificial-neural-network-based multi-channel QoT estimators (ANN-QoT-E) neglect these differing effects in their input layer, which severely limits their estimation accuracy. In this paper, we propose a self-attention mechanism-based multi-channel QoT estimator (SA-QoT-E) to improve upon the accuracy of the ANN-QoT-E. In the SA-QoT-E, the input features are designed as a sequence of feature vectors of the channels that route the same path, and the self-attention mechanism dynamically assigns weights to the feature vectors of interfering channels according to their effects on the channel under test. Moreover, a hyperparameter search method is used to optimize the SA-QoT-E. Simulation results show that, compared with the ANN-QoT-E, our proposed SA-QoT-E achieves higher estimation accuracy and can be applied directly to network wavelength-expansion scenarios without retraining.
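To make the weighting idea concrete, the following is a minimal sketch (not the authors' implementation) of scaled dot-product self-attention applied to a sequence of per-channel feature vectors: each row of the attention matrix weights the interfering channels' features according to their relevance to the channel under test. All array sizes and the random projection matrices are hypothetical, chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over channel feature vectors.

    X: (n_channels, d_feat) -- one feature vector per co-propagating channel.
    Returns the attended features and the (n_channels, n_channels)
    attention-weight matrix, whose rows sum to 1.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])          # pairwise channel affinities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # row-wise softmax
    return weights @ V, weights

# Hypothetical dimensions: 8 channels on the same path, 5 features each.
n_channels, d_feat, d_model = 8, 5, 16
X = rng.normal(size=(n_channels, d_feat))
Wq = rng.normal(size=(d_feat, d_model))
Wk = rng.normal(size=(d_feat, d_model))
Wv = rng.normal(size=(d_feat, d_model))

out, attn = self_attention(X, Wq, Wk, Wv)
```

Here `attn[i, j]` plays the role of the learned weight that channel `j`'s features receive when estimating the QoT of channel `i`; in a trained estimator the projection matrices would be learned rather than random.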

**Keywords:** quality of transmission (QoT) estimation; nonlinear effect; multi-channel; self-attention mechanism
