
Real-time automatic control in a smart energy-hub

Reference:

Qiu, D., Dong, Z., Zhang, X., Wang, Y., & Strbac, G. (2022). Safe reinforcement learning for real-time automatic control in a smart energy-hub. Applied Energy, 309, 118403.

https://doi.org/10.1016/j.apenergy.2021.118403

Overview:

Energy hubs provide an adaptable platform for meeting energy needs. Because they can combine several energy carriers, multi-energy systems are widely deployed. However, the volatility of renewable energy sources makes it difficult to monitor them in real time and to evaluate their economic and environmental effects.

This study proposes a model-free deep reinforcement learning method to control a renewable energy hub effectively while respecting its operational constraints. Tested on real-world data, the method outperforms previous approaches in cost reduction, emissions reduction, and computation time. It also generalizes well and handles operational constraints while accounting for storage flexibility and carbon pricing.

Issues in existing methodologies:

  • Limited Control: Conventional models offer little control in the face of renewable variability.
  • Predictability Challenges: Difficulty in accurately predicting renewable outputs.
  • Complex Optimization: Requires comprehensive mathematical models.
  • Scalability Issues: Existing methods struggle with large-scale applications.
  • Inflexible Models: Traditional approaches are rigid and not adaptable.
  • High Computational Demand: Optimization processes are resource-intensive.
  • Carbon Emission Constraints: Limited integration of carbon reduction strategies.
  • Integration Difficulties: Existing methods struggle to combine multiple energy sources effectively.

Methodology:

Energy Hub Model

This study proposes a smart Energy Hub (EH) model that integrates multiple energy sectors and resources to optimize production and consumption. The EH model comprises electric and heat demands, storage systems (thermal energy storage and a hydrogen storage system), renewable energy sources (solar photovoltaic and wind generation), and conversion units (an electric heat pump, a gas boiler, and combined heat and power). The approach prioritizes meeting electric demand with renewable energy and uses hydrogen storage to balance supply and demand in real time. Heat demand is served flexibly through thermal storage and the multiple conversion units. Through electrolysis, the hydrogen storage system converts surplus renewable energy into hydrogen, which can be stored and later converted back into electricity as needed.
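To make the dispatch logic concrete, the following is a minimal Python sketch of the priority rule described above: renewables serve electric demand first, any surplus is routed to hydrogen via electrolysis, and a heat pump covers heat demand. The component names, efficiency values, and heat-pump COP are illustrative assumptions, not parameters from the paper.

```python
# A toy sketch of the energy-hub dispatch priority described above.
# Efficiencies and the heat-pump COP are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class EnergyHubState:
    pv_output: float        # kW, solar PV generation
    wind_output: float      # kW, wind generation
    electric_demand: float  # kW
    heat_demand: float      # kW (thermal)

def dispatch_step(state: EnergyHubState,
                  electrolyser_eff: float = 0.7,
                  fuel_cell_eff: float = 0.6,
                  heat_pump_cop: float = 3.0) -> dict:
    """Serve electric demand from renewables first; store surplus as
    hydrogen via electrolysis; cover any deficit from stored hydrogen
    (fuel cell); meet heat demand with an electric heat pump."""
    renewables = state.pv_output + state.wind_output
    surplus = renewables - state.electric_demand

    h2_charged = max(surplus, 0.0) * electrolyser_eff  # kWh of H2 stored
    deficit = max(-surplus, 0.0)                       # kW still unmet
    h2_needed = deficit / fuel_cell_eff                # H2 drawn to cover it

    heat_pump_load = state.heat_demand / heat_pump_cop # electric input

    return {"h2_charged_kwh": h2_charged,
            "h2_needed_kwh": h2_needed,
            "heat_pump_load_kw": heat_pump_load}

print(dispatch_step(EnergyHubState(pv_output=120.0, wind_output=80.0,
                                   electric_demand=150.0, heat_demand=90.0)))
```

In this simplified sketch, storage limits, the gas boiler, and CHP dispatch are omitted; the paper models these with full mathematical constraints.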

Optimization and Methodology

Mathematical models of energy storage and conversion are developed so that demand-supply balance is guaranteed while operating costs and carbon emissions are minimized. The methodology uses LSTM-SDDPG, a deep reinforcement learning approach that combines a long short-term memory (LSTM) network with a safe deep deterministic policy gradient (SDDPG) agent, to improve real-time energy-management decisions and handle uncertainty in demand and renewable generation.
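The paper's full SDDPG training loop is not reproduced here; the sketch below, assuming PyTorch, only illustrates how an LSTM can feed a DDPG-style actor. The layer sizes, the 24-step observation history, and the tanh action bound are illustrative assumptions; the paper's safety mechanism for enforcing operational constraints is only crudely approximated by bounding the action.

```python
# A minimal sketch of an LSTM-based DDPG actor (assumes PyTorch).
# Layer sizes and the action bound are illustrative assumptions.
import torch
import torch.nn as nn

class LSTMActor(nn.Module):
    def __init__(self, obs_dim: int, action_dim: int, hidden: int = 64):
        super().__init__()
        # The LSTM summarises the recent history of observations
        # (demand, renewable output, prices) into a hidden state.
        self.lstm = nn.LSTM(obs_dim, hidden, batch_first=True)
        self.head = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, action_dim), nn.Tanh(),  # actions in [-1, 1]
        )

    def forward(self, obs_seq: torch.Tensor) -> torch.Tensor:
        # obs_seq: (batch, time, obs_dim)
        _, (h_n, _) = self.lstm(obs_seq)
        return self.head(h_n[-1])  # act on the last hidden state

# Usage: a batch of 8 histories, 24 time steps, 6 features each;
# outputs could be normalised charge/discharge set-points.
actor = LSTMActor(obs_dim=6, action_dim=3)
actions = actor(torch.randn(8, 24, 6))
print(actions.shape)  # torch.Size([8, 3])
```

In the full method, this actor would be trained with a critic network and replay buffer as in standard DDPG, with the safety layer projecting actions back into the feasible operating region.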

[Figure] Overall structure of the proposed SDDPG method (Qiu et al., 2022).


Conclusion:

  • The proposed LSTM-SDDPG approach satisfies operating constraints and effectively reduces energy costs and carbon emissions.
  • Integrating the LSTM module with DDPG improves the handling of uncertainties in demand and renewable generation.
  • Experimental results show that LSTM-SDDPG reduces carbon emissions and renewable energy curtailment more effectively than state-of-the-art techniques.
  • Increased carbon prices encourage low-carbon transitions by shifting energy purchases from natural gas to electricity.

Sakthivel R

I am a first-year M.Sc. (AIML) student at SASTRA University.

I am proficient in a variety of programming languages and frameworks, including Python, and have experience with data, cloud, and database technologies, including SQL, Excel, Pandas, scikit-learn, TensorFlow, Git, and Power BI.
