BERT-PIN: A BERT-based Framework for Recovering Missing Data Segments in Time-series Load Profiles
Yi Hu, Kai Ye, Hyeonjin Kim, Ning Lu
arXiv.org Artificial Intelligence
Inspired by the success of the Transformer model in natural language processing and computer vision, this paper introduces BERT-PIN, a Bidirectional Encoder Representations from Transformers (BERT) powered Profile Inpainting Network. BERT-PIN recovers multiple missing data segments (MDSs) using load and temperature time-series profiles as inputs. To adopt a standard Transformer model structure for profile inpainting, we segment the load and temperature profiles into line segments, treating each segment as a word and the entire profile as a sentence. We incorporate a top-candidates selection process in BERT-PIN, enabling it to produce a sequence of probability distributions, from which users can generate multiple plausible imputed data sets, each reflecting a different confidence level. We develop and evaluate BERT-PIN using a real-world dataset for two applications: multiple-MDS recovery and demand response baseline estimation. Simulation results show that BERT-PIN outperforms existing methods in accuracy while being capable of restoring multiple MDSs within a longer window. BERT-PIN, serving as a pre-trained model, can be fine-tuned for many downstream tasks, such as classification and super-resolution.
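The two ideas the abstract highlights can be illustrated in a few lines: splitting a profile into fixed-length line segments that play the role of "words," and reducing a predicted probability distribution to its top candidates. This is a minimal sketch of those ideas only, not the paper's implementation; the patch length, candidate count, and toy distribution below are assumptions for illustration.

```python
import numpy as np

def profile_to_tokens(profile, patch_len=4):
    """Split a 1-D time-series profile into fixed-length line segments
    ("words"); patch_len=4 is an illustrative choice, not the paper's."""
    n = len(profile) // patch_len * patch_len
    return profile[:n].reshape(-1, patch_len)

def top_k_candidates(probs, k=3):
    """Keep the k most probable candidate tokens and renormalize,
    mimicking a top-candidates selection step."""
    idx = np.argsort(probs)[::-1][:k]
    p = probs[idx]
    return idx, p / p.sum()

# A 24-point daily load profile becomes 6 tokens of 4 points each.
profile = np.arange(24, dtype=float)
tokens = profile_to_tokens(profile)
print(tokens.shape)  # (6, 4)

# A toy output distribution over 5 candidate tokens, reduced to its top 3.
probs = np.array([0.05, 0.40, 0.10, 0.30, 0.15])
idx, p = top_k_candidates(probs, k=3)
print(idx)  # [1 3 4]
```

Sampling from the renormalized top-candidate distribution, rather than always taking the argmax, is what lets a user draw multiple plausible imputations at different confidence levels.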
Oct-26-2023
- Country:
- North America > United States
- California (0.14)
- North Carolina (0.14)
- Genre:
- Research Report > New Finding (0.66)
- Industry:
- Energy
- Power Industry (1.00)
- Renewable (0.68)
- Government > Regional Government
- Technology:
- Information Technology
- Artificial Intelligence
- Machine Learning
- Neural Networks > Deep Learning (0.50)
- Statistical Learning (1.00)
- Natural Language (1.00)
- Vision (1.00)
- Data Science (1.00)