Learn to Bid: Deep Reinforcement Learning with Transformer for Energy Storage Bidding in Energy and Contingency Reserve Markets (Papers Track)

Jinhao Li (Monash University); Changlong Wang (Monash University); Yanru Zhang (University of Electronic Science and Technology of China); Hao Wang (Monash University)

Tags: Power & Energy · Reinforcement Learning


As part of efforts to tackle climate change, grid-scale battery energy storage systems (BESS) play an essential role in enabling reliable and secure power system operation under variable renewable energy (VRE). A BESS can balance time-varying electricity demand and supply in the spot market through energy arbitrage, and in the frequency control ancillary services (FCAS) markets through service enablement or delivery. Effective algorithms are needed for the optimal participation of BESS across multiple markets. Using deep reinforcement learning (DRL), we present a BESS bidding strategy for the joint spot and contingency FCAS markets, leveraging a transformer-based temporal feature extractor to exploit temporal trends in volatile energy prices. We validate our strategy on real-world historical energy prices from the Australian National Electricity Market (NEM) and show that it significantly outperforms benchmark strategies. The simulations also reveal that joint bidding in both the spot and contingency FCAS markets yields a much higher profit than bidding in either market alone. Our work provides a viable use case for BESS, contributing to power system operation under high penetration of renewables.
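To make the approach concrete, here is a minimal, illustrative sketch of the two components the abstract describes: a temporal feature extractor that attends over a window of recent prices (a single self-attention layer standing in for the paper's transformer), and a policy head mapping those features to a spot-market charge/discharge bid and an FCAS capacity bid. All weights are randomly initialised and all names (`self_attention_features`, `bid_from_features`, the 10 MW power rating) are our own assumptions for illustration, not the authors' implementation; in the actual method these parameters would be trained by the DRL algorithm.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_features(prices, d_model=8, seed=0):
    """Embed a window of past prices, apply one self-attention layer,
    and return the feature vector at the final time step."""
    rng = np.random.default_rng(seed)
    T = len(prices)
    # Toy embedding: scaled price value plus sinusoidal positional encoding.
    pos = np.arange(T)[:, None]
    dims = np.arange(d_model)[None, :]
    pe = np.sin(pos / (10000 ** (dims / d_model)))
    x = prices[:, None] * rng.standard_normal((1, d_model)) * 0.1 + pe  # (T, d_model)
    # Single-head self-attention with random (untrained) projections.
    Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(3))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(q @ k.T / np.sqrt(d_model))
    return (attn @ v)[-1]  # features summarising the price window

def bid_from_features(feat, power_mw=10.0, seed=1):
    """Toy policy head: map features to a spot-market bid (negative =
    charge, positive = discharge) and an FCAS capacity bid in MW."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((feat.size, 2)) * 0.1
    raw = np.tanh(feat @ w)                     # bounded actions in (-1, 1)
    spot_bid_mw = raw[0] * power_mw             # spot dispatch target
    fcas_bid_mw = (raw[1] + 1) / 2 * power_mw   # reserve capacity offered
    return spot_bid_mw, fcas_bid_mw

# Hypothetical 5-minute spot prices ($/MWh), including a price spike.
prices = np.array([55.0, 60.2, 48.7, 90.1, 120.5, 75.3, 66.0, 300.0])
feat = self_attention_features(prices)
spot, fcas = bid_from_features(feat)
```

In the paper's setting, a DRL agent would train these parameters against a reward combining arbitrage revenue in the spot market and enablement payments in the contingency FCAS markets, subject to the battery's state-of-charge dynamics.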
