Power Grid Monitoring Alarm Events Identification Based on Large Language Model
Qiang Xu1,*, Leyao Cong1, Jianing Wang1, Xingyu Zhu1, Shaojun Cui1, Guoqiang Sun2, Xueheng Shi2
1 State Grid Wuxi Power Supply Company, State Grid Jiangsu Electric Power Co., Ltd., Wuxi, 214061, China
2 School of Electrical and Power Engineering, Hohai University, Nanjing, 211100, China
* Corresponding Author: Qiang Xu. Email:
Energy Engineering https://doi.org/10.32604/ee.2025.073947
Received 29 September 2025; Accepted 12 November 2025; Published online 09 December 2025
Abstract
Power system faults can trigger a massive influx of complex alarm signals to the operation and maintenance center, making it difficult for dispatchers to accurately identify the underlying faults. To address the sample imbalance and low accuracy of traditional power grid monitoring alarm event identification methods, an identification method based on the bidirectional encoder representations from transformers (BERT) large language model is proposed. First, information entropy is employed to filter effective monitoring alarm signals, and the k-means clustering algorithm groups all alarm signals into different event types, forming the initial power grid monitoring alarm event samples. Then, to mitigate sample imbalance among power grid monitoring alarm events, a pre-trained SimBERT model is employed to augment minority-class samples, thereby reducing the imbalance ratio. Finally, the augmented samples are used to fine-tune the BERT model, with a mix-training optimization strategy adopted during fine-tuning to obtain the final power grid monitoring alarm event identification model. Case study results demonstrate that the proposed model achieves higher identification precision for power grid monitoring alarm events than traditional deep learning methods.
Keywords
Power grid monitoring alarm event; BERT; information entropy; SimBERT; mix-training
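To make the pre-processing stage sketched in the abstract concrete, the following minimal Python example illustrates entropy-based filtering of alarm signals followed by k-means grouping into candidate event types. The entropy criterion (occurrence distribution across fault events), the threshold, the TF-IDF feature representation, and the toy data are all illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of the pre-processing stage described in the abstract:
# entropy-based filtering of alarm signals, then k-means grouping.
# Thresholds, features, and the toy data are illustrative assumptions.
import math

from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer


def signal_entropy(occurrences):
    """Shannon entropy (bits) of an alarm signal's occurrence
    distribution across recorded fault events (assumed criterion)."""
    total = sum(occurrences)
    probs = [c / total for c in occurrences if c > 0]
    return -sum(p * math.log2(p) for p in probs)


# Toy alarm log: (signal text, per-event occurrence counts).
alarms = [
    ("breaker 2201 tripped", [3, 1, 4, 2]),
    ("protection relay operated on line L1", [2, 3, 1, 3]),
    ("telemetry heartbeat normal", [50, 0, 0, 0]),  # near-constant, low entropy
    ("busbar differential protection action", [1, 2, 2, 1]),
]

# 1) Keep signals whose entropy exceeds a hypothetical threshold, i.e.,
#    signals whose occurrence varies informatively across fault events.
THRESHOLD = 1.0
effective = [text for text, occ in alarms if signal_entropy(occ) > THRESHOLD]

# 2) Group the effective signals into candidate event types with k-means
#    over TF-IDF features (the cluster count is chosen arbitrarily here).
features = TfidfVectorizer().fit_transform(effective)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

for text, label in zip(effective, labels):
    print(f"event type {label}: {text}")
```

In this sketch the near-constant "heartbeat" signal is discarded as uninformative, and the remaining signals are clustered into initial event-type groups; in the paper these groups form the initial alarm event samples that are later augmented with SimBERT and used to fine-tune BERT.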