SUSTAINABLE ENERGY GRIDS & NETWORKS, vol.38, 2024 (SCI-Expanded)
Forecasting short-term residential energy consumption is critical in modern decentralized power systems. Deep learning-based prediction methods that can handle the high variability of residential electrical loads have substantially improved forecast accuracy. However, these methods typically require large amounts of sensitive consumption data to be collected centrally to train a forecasting model, which raises privacy and scalability concerns. Moreover, models may become less accurate over time as consumption patterns change. In this work, we propose an energy consumption forecasting framework that exploits adaptive learning, federated learning, and edge computing concepts. A central server aggregates numerous long short-term memory (LSTM) models, trained by users at various locations on their own energy consumption data, into a generalized model, while adaptive learning detects data drift and enhances forecasting at the edge layer. Our findings show that adaptive federated learning outperforms centralized learning while preserving privacy and reducing communication overhead, lowering the forecast error rate by 8% and decreasing the training time by approximately 80%.
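As a minimal sketch of the workflow described above, the snippet below assumes a PyTorch implementation with a small LSTM forecaster, FedAvg-style weight averaging on the server, and a simple error-ratio test as the drift signal. The paper's exact model topology, aggregation rule, and drift detector are not reproduced here; all class and function names are illustrative.

```python
# Minimal sketch, assuming PyTorch, FedAvg-style aggregation, and an
# error-ratio drift test; names and hyperparameters are hypothetical.
import copy
import torch
import torch.nn as nn


class LoadForecaster(nn.Module):
    """Small LSTM that maps a window of past consumption to the next value."""
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                       # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])         # forecast for the next step


def local_update(global_model, loader, epochs=1, lr=1e-3):
    """Client-side round: train a copy of the global model on private data."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model.state_dict()                   # only weights leave the edge device


def fed_avg(client_states):
    """Server-side aggregation: element-wise mean of client weights."""
    avg = copy.deepcopy(client_states[0])
    for key in avg:
        avg[key] = torch.stack([s[key].float() for s in client_states]).mean(dim=0)
    return avg


def drift_detected(recent_mse, baseline_mse, ratio=1.5):
    """Flag data drift when recent forecast error grows well beyond the baseline."""
    return recent_mse > ratio * baseline_mse
```

In each round, every client runs `local_update` on its own consumption data, the server combines the returned weights with `fed_avg`, and clients load the aggregated model; if `drift_detected` fires at the edge, the client performs additional local fine-tuning before the next round. This keeps raw consumption data on the edge device and limits communication to model weights, which is the intuition behind the privacy and overhead gains reported in the abstract.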