Optimizing Day-Ahead Energy Trading with Proximal Policy Optimization and Blockchain
arXiv.org Artificial Intelligence
The increasing penetration of renewable energy sources in day-ahead energy markets introduces challenges in balancing supply and demand, ensuring grid resilience, and maintaining trust in decentralized trading systems. This paper proposes a novel framework that integrates the Proximal Policy Optimization (PPO) algorithm, a state-of-the-art reinforcement learning method, with blockchain technology to optimize automated trading strategies for prosumers in day-ahead energy markets. The framework employs a Reinforcement Learning (RL) agent for multi-objective energy optimization and blockchain for tamper-proof data and transaction management. Simulations using real-world data from the Electric Reliability Council of Texas (ERCOT) demonstrate the effectiveness of our approach. The RL agent balances demand and supply to within 2% of demand and maintains near-optimal supply costs for the majority of operating hours. Moreover, it generates robust battery storage policies capable of handling variability in solar and wind generation. All decisions are recorded on an Algorand-based blockchain, ensuring transparency, auditability, and security, key enablers for trustworthy multi-agent energy trading. Our key contributions are a novel system architecture, the use of curriculum learning to train the RL agent, and policy insights that support real-world deployment.
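For readers unfamiliar with PPO, the core of the method is its clipped surrogate objective, which limits how far each policy update can move from the previous policy. The sketch below shows only that generic loss computation in NumPy; it is not the paper's implementation, and the prosumer-specific state, reward, and ERCOT market environment are omitted.

```python
import numpy as np

def ppo_clip_loss(ratios, advantages, eps=0.2):
    """Generic PPO clipped surrogate loss (to be minimized).

    ratios     -- pi_new(a|s) / pi_old(a|s) for each sampled transition
    advantages -- advantage estimates for the same transitions
    eps        -- clipping parameter (0.2 is the common default)
    """
    unclipped = ratios * advantages
    # Clipping the ratio removes the incentive to move the policy
    # more than eps away from the old policy in a single update.
    clipped = np.clip(ratios, 1.0 - eps, 1.0 + eps) * advantages
    # PPO maximizes the minimum of the two terms; we return the negative
    # so the result can be passed to a minimizer.
    return -np.mean(np.minimum(unclipped, clipped))

# Example: one transition with a large ratio gets clipped at 1 + eps.
loss = ppo_clip_loss(np.array([1.5, 0.5]), np.array([1.0, -1.0]))
```

In a full agent this loss would be combined with a value-function loss and an entropy bonus, and optimized with minibatch gradient steps over trajectories collected from the day-ahead market environment.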
Dec-9-2025