TY - JOUR
T1 - An Applied Deep Reinforcement Learning Approach to Control Active Networked Microgrids in Smart Cities with Multi-Level Participation of Battery Energy Storage System and Electric Vehicles
AU - Sepehrzad, Reza
AU - Godazi Langeroudi, Amir Saman
AU - Khodadadi, Amin
AU - Adinehpour, Sara
AU - Al-Durra, Ahmed
AU - Anvari-Moghaddam, Amjad
PY - 2024/7/15
Y1 - 2024/7/15
N2 - This study proposes an intelligent energy management strategy for islanded networked microgrids (NMGs) in smart cities that accounts for renewable energy source uncertainties and power fluctuations. The active power energy management and frequency control approach is based on an intelligent probabilistic wavelet fuzzy neural network-deep reinforcement learning algorithm (IPWFNN-DRLA). The control strategy is formulated as a deep reinforcement learning problem based on a Markov decision process and solved with the soft actor-critic algorithm. The NMG local controller (NMGLC) provides information such as frequency, active power, power generation data, and the status of the electric vehicles' battery energy storage systems to the NMG central controller (NMGCC). The NMGCC then calculates active power and frequency support using the IPWFNN-DRLA approach and sends the results back to the NMGLC. The proposed model is developed in a continuous problem space with two structures: offline training and decentralized distributed operation. For this purpose, each NMG has a control agent (NMGCA) based on the IPWFNN algorithm, and the NMGCA learning model is formulated with an online back-propagation learning algorithm. The proposed approach demonstrates a computation accuracy exceeding 98%, along with a 7.82% reduction in computational burden and a 61.1% reduction in computation time compared to alternative methods.
AB - This study proposes an intelligent energy management strategy for islanded networked microgrids (NMGs) in smart cities that accounts for renewable energy source uncertainties and power fluctuations. The active power energy management and frequency control approach is based on an intelligent probabilistic wavelet fuzzy neural network-deep reinforcement learning algorithm (IPWFNN-DRLA). The control strategy is formulated as a deep reinforcement learning problem based on a Markov decision process and solved with the soft actor-critic algorithm. The NMG local controller (NMGLC) provides information such as frequency, active power, power generation data, and the status of the electric vehicles' battery energy storage systems to the NMG central controller (NMGCC). The NMGCC then calculates active power and frequency support using the IPWFNN-DRLA approach and sends the results back to the NMGLC. The proposed model is developed in a continuous problem space with two structures: offline training and decentralized distributed operation. For this purpose, each NMG has a control agent (NMGCA) based on the IPWFNN algorithm, and the NMGCA learning model is formulated with an online back-propagation learning algorithm. The proposed approach demonstrates a computation accuracy exceeding 98%, along with a 7.82% reduction in computational burden and a 61.1% reduction in computation time compared to alternative methods.
KW - Networked microgrid
KW - Electric vehicles
KW - Energy management
KW - Deep reinforcement learning algorithm
UR - http://www.scopus.com/inward/record.url?scp=85191004487&partnerID=8YFLogxK
U2 - 10.1016/j.scs.2024.105352
DO - 10.1016/j.scs.2024.105352
M3 - Journal article
SN - 2210-6707
VL - 107
SP - 1
EP - 30
JO - Sustainable Cities and Society
JF - Sustainable Cities and Society
M1 - 105352
ER -