TY - GEN
T1 - Transfer Learning for Event-Type Differentiation on Power Systems
AU - Li, Haoran
AU - Ma, Zhihao
AU - Weng, Yang
AU - Farantatos, Evangelos
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Machine Learning (ML) models are increasingly introduced to power systems in domains such as state estimation and event identification. However, training an ML model usually requires large amounts of data. For data-limited grids, we propose a transfer learning framework that transfers knowledge from a source grid with rich Phasor Measurement Unit (PMU) data for the event-type differentiation problem. The goal is challenging due to (1) the different dimensionalities of the source and target measurement spaces, (2) dissimilar data distributions, and (3) redundant PMU information. Thus, we project the source and target measurement spaces into a latent feature space, which reduces and aligns the dimensionality of the input measurements, keeps the source and target data distributions close in the latent space, and enables transferability from the source domain to the target domain. Then, we introduce transfer learning into supervised learning by vectorizing each PMU's measurement window as one training sample, forming the latent space. We theoretically show that our approach minimizes an upper bound on the misclassification rate and experimentally demonstrate high performance on various synthetic datasets.
AB - Machine Learning (ML) models are increasingly introduced to power systems in domains such as state estimation and event identification. However, training an ML model usually requires large amounts of data. For data-limited grids, we propose a transfer learning framework that transfers knowledge from a source grid with rich Phasor Measurement Unit (PMU) data for the event-type differentiation problem. The goal is challenging due to (1) the different dimensionalities of the source and target measurement spaces, (2) dissimilar data distributions, and (3) redundant PMU information. Thus, we project the source and target measurement spaces into a latent feature space, which reduces and aligns the dimensionality of the input measurements, keeps the source and target data distributions close in the latent space, and enables transferability from the source domain to the target domain. Then, we introduce transfer learning into supervised learning by vectorizing each PMU's measurement window as one training sample, forming the latent space. We theoretically show that our approach minimizes an upper bound on the misclassification rate and experimentally demonstrate high performance on various synthetic datasets.
KW - Event-type differentiation
KW - dimensionality reduction
KW - latent feature space
KW - power systems
KW - transfer learning
UR - http://www.scopus.com/inward/record.url?scp=85134205055&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85134205055&partnerID=8YFLogxK
U2 - 10.1109/SGSMA51733.2022.9805850
DO - 10.1109/SGSMA51733.2022.9805850
M3 - Conference contribution
AN - SCOPUS:85134205055
T3 - 2022 International Conference on Smart Grid Synchronized Measurements and Analytics, SGSMA 2022 - Proceedings
BT - 2022 International Conference on Smart Grid Synchronized Measurements and Analytics, SGSMA 2022 - Proceedings
A2 - Nordstrom, Lars
A2 - Holjevac, Ninoslav
A2 - Kuzle, Igor
A2 - Ivankovic, Igor
A2 - Kezunovic, Mladen
A2 - Paolone, Mario
A2 - Mohsenian-Rad, Hamed
A2 - Muscas, Carlo
A2 - Basakarad, Tomislav
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2022 International Conference on Smart Grid Synchronized Measurements and Analytics, SGSMA 2022
Y2 - 24 May 2022 through 26 May 2022
ER -