Article

Towards Sustainable Energy Systems Considering Unexpected Sports Event Management: Integrating Machine Learning and Optimization Algorithms

1 College of Physical Education, Xinyang Normal University, Xinyang 464000, China
2 General Education Department, Marxist College, Wuhan College, Wuhan 430212, China
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(9), 7186; https://doi.org/10.3390/su15097186
Submission received: 27 February 2023 / Revised: 15 March 2023 / Accepted: 15 March 2023 / Published: 26 April 2023

Abstract:
This paper proposes a novel approach for achieving sustainable energy systems in unexpected sports event management by integrating machine learning and optimization algorithms. Specifically, we use reinforcement learning for peak load forecasting and the bat evolutionary algorithm for optimization, since the energy management problem in sports events is typically non-linear. Reinforcement learning analyzes historical data to provide accurate peak load forecasts, and this information is then used to optimize energy consumption during the event with the bat evolutionary algorithm, which can effectively solve non-linear optimization problems. Integrating these algorithms in unexpected sports event management can lead to significant improvements in sustainability and cost-effectiveness. This paper presents a case study of their implementation in an unexpected sports event management scenario, demonstrating the effectiveness of the proposed approach in achieving sustainable energy systems and reducing overall energy consumption. Overall, this paper provides a roadmap for integrating machine learning and optimization algorithms in unexpected sports event management, promoting a more sustainable future for the sports event industry and the planet as a whole.

1. Introduction

Sustainable energy systems are critical in today’s world, where the need to reduce carbon emissions and curb climate change is becoming increasingly urgent [1]. Sports events, which draw large crowds and consume significant amounts of energy [2], have a considerable environmental impact [3,4,5,6]. Therefore, there is a pressing need to integrate sustainable energy management systems into sports event management to reduce the environmental footprint of these events.
However, unexpected events such as inclement weather, power outages, or changes in attendance can create significant challenges for energy management systems [7,8]. For instance, sudden changes in attendance can result in spikes in energy demand that cause energy supply issues, leading to blackouts, brownouts, or even system failures. Similarly, inclement weather conditions such as heavy rain or snow can affect the availability of energy sources such as solar panels or wind turbines, leading to energy supply disruptions [9].
Integrating machine learning and optimization algorithms into sustainable energy management systems can help overcome these challenges. Machine learning algorithms can be used to analyze historical data and identify patterns in energy usage, which can be used to optimize energy consumption during events. Similarly, optimization algorithms can be used to find the most efficient solutions for energy management in the face of unexpected events, such as quickly adapting to changes in energy demand or utilizing backup power sources [10].
Integrating these techniques can lead to significant improvements in sustainability and cost-effectiveness. For example, the use of renewable energy sources can provide long-term financial benefits by reducing reliance on traditional energy sources and lowering energy costs [11]. Additionally, a sustainable approach to energy management can enhance the reputation of sports events and their organizers, promoting their commitment to social responsibility and environmental stewardship [12]. Moreover, the integration of machine learning and optimization algorithms can help overcome challenges faced by traditional energy management systems in sports events [13]. Traditional energy management systems are typically reactive, meaning they can only respond to changes in energy demand once they occur.
In contrast, machine learning and optimization algorithms can help predict energy demand patterns and provide proactive solutions to ensure that energy consumption is optimized and energy supply issues are mitigated [14].
In addition to reducing the environmental footprint of sports events, integrating machine learning and optimization algorithms into sustainable energy management systems can also have broader environmental benefits. By demonstrating the feasibility of integrating sustainable energy systems into sports events, this approach can serve as a model for other industries, encouraging the adoption of sustainable energy management practices across various sectors [15]. Overall, the importance of sustainable energy systems in unexpected sports event management cannot be overstated. As the world faces the increasing threat of climate change, it is essential to promote sustainable practices in all industries, including sports events. Integrating machine learning and optimization algorithms into sustainable energy management systems can help reduce the environmental footprint of sports events while enhancing their financial and reputational benefits [16]. Therefore, sports event organizers must explore the integration of these technologies into their energy management systems to achieve sustainable and efficient energy management [17].
In recent years, there has been a growing interest in integrating machine learning and optimization algorithms into energy management systems for sports events. A significant amount of research has been conducted in this area, with many studies focusing on optimizing energy consumption in sports stadiums and other large venues [18]. One area of research that has received considerable attention is the use of machine learning algorithms for peak load forecasting. Peak load forecasting involves predicting the highest point of energy consumption during an event, and accurately predicting this demand is crucial for optimizing energy consumption [18,19]. Several studies have explored the use of machine learning algorithms such as artificial neural networks, support vector regression, and decision trees for peak load forecasting in sports events [20]. These studies have reported significant improvements in the accuracy of peak load forecasting, enabling more efficient energy management during events [21].
Another area of research that has been explored is the use of optimization algorithms for energy management in sports events. Optimization algorithms such as linear programming, genetic algorithms, and particle swarm optimization have been used to find optimal solutions for energy management problems in sports events. For example, studies have used linear programming to optimize energy consumption in sports stadiums by determining the optimal settings for air conditioning, lighting, and other energy-consuming systems. Similarly, genetic algorithms have been used to optimize the placement of solar panels and wind turbines in sports stadiums to maximize energy generation. Moreover, reinforcement learning has been used in sports events for energy management. Reinforcement learning involves training an algorithm to make decisions based on positive or negative feedback, and it has been used to optimize energy consumption in sports events by adjusting the settings of energy-consuming systems based on feedback. For instance, a study used reinforcement learning to optimize the energy consumption of heating and ventilation systems in a sports arena, achieving significant energy savings.
In addition to these areas of research, several studies have explored the integration of multiple optimization and machine learning algorithms for energy management in sports events. For example, a study proposed a hybrid algorithm that combined artificial neural networks and genetic algorithms for peak load forecasting and energy optimization in sports stadiums. Similarly, another study proposed a hybrid algorithm that combined decision trees and particle swarm optimization for energy consumption optimization in sports events. Several case studies have also been conducted to demonstrate the effectiveness of machine learning and optimization algorithms in energy management for sports events. For instance, a study conducted a case study of energy management in a football stadium using linear programming and reported a 35% reduction in energy consumption compared to traditional energy management methods. Another case study explored the use of reinforcement learning in energy management for a basketball arena, achieving a 22% reduction in energy consumption.
Despite the significant progress made in this area, there are still several challenges that need to be addressed. One challenge is the lack of accurate data for training machine learning algorithms. Energy consumption patterns in sports events can be highly variable and difficult to predict, making it challenging to train machine learning algorithms accurately. Additionally, there is a need for further research to develop new optimization and machine learning algorithms that can handle the non-linear and dynamic nature of energy management problems in sports events. In conclusion, there has been significant research conducted on the integration of machine learning and optimization algorithms for energy management in sports events. The use of these technologies has the potential to significantly reduce the environmental footprint of sports events while improving their financial and reputational benefits. However, there are still several challenges that need to be addressed, and further research is needed to develop new algorithms and improve the accuracy of energy consumption predictions in sports events. To this end, the main contributions of this paper compared to existing methods are:
  • Improved accuracy: Machine learning algorithms have been shown to significantly improve the accuracy of peak load forecasting, enabling more efficient energy management during events. This means that energy consumption can be optimized to match the exact demand during a sports event, leading to significant energy savings.
  • Real-time optimization: Optimization algorithms enable real-time adjustment of energy-consuming systems based on changing energy demand during an event. This allows for more efficient use of energy resources and reduces the need for excess energy production, leading to reduced energy costs and carbon emissions.
  • Flexibility: Machine learning and optimization algorithms are flexible and can be tailored to suit specific sports events and venues. This means that energy management systems can be customized to meet the unique demands of each event, ensuring optimal energy consumption and cost savings.
  • Integration of multiple algorithms: The integration of multiple machine learning and optimization algorithms can provide even greater benefits for energy management in sports events. Hybrid algorithms can combine the strengths of different algorithms to improve energy consumption prediction accuracy and optimize energy use even further.
The remainder of this paper describes the proposed methods, presents the results, and summarizes the main achievements of this work.

2. Materials and Methods

2.1. Sport Event Management for Energy Systems

Sport event management for energy systems is becoming increasingly important in the era of sustainable development. As the world continues to grapple with climate change and environmental degradation, there is an urgent need to adopt strategies that promote sustainable energy use. Sports events, particularly large-scale events, are known to consume significant amounts of energy, contributing to greenhouse gas emissions and other forms of environmental pollution. By adopting smart energy management systems, sport event organizers can optimize energy consumption, reduce energy waste, and minimize the carbon footprint of these events. Such systems can also help ensure energy security and reliability, as they provide real-time monitoring and control of energy consumption and generation.
In addition to the environmental benefits, sport event management for energy systems can also offer economic advantages. Energy costs can account for a significant portion of the budget for hosting a large-scale event, and reducing these costs can help organizers stay within budget and maximize profits. Moreover, smart energy management systems can also create new business opportunities, such as energy trading and storage, which can generate additional revenue streams for organizers. Overall, sport event management for energy systems can contribute to the sustainable development of host communities, improve the overall quality of life, and create a lasting legacy that benefits both present and future generations.
Sports events can have a significant impact on energy systems due to their high energy consumption and fluctuating demand patterns. The sudden surge in energy demand during such events can cause grid instability and even lead to blackouts in some cases. Therefore, it is important to model and optimize energy systems for unexpected sports events to ensure energy security, reliability, and sustainability. By modeling and optimizing energy systems for sports events, it is possible to develop strategies that can effectively balance energy demand and supply and reduce the carbon footprint of these events.
One of the key benefits of modeling and optimizing energy systems for unexpected sports events is the reduction of energy costs. Sports events require a significant amount of energy, and energy costs can account for a significant portion of the event budget. By optimizing energy consumption and generation, event organizers can reduce energy costs and stay within budget. Moreover, smart energy management systems can create new revenue streams through energy trading and storage, which can generate additional revenue for event organizers. Another benefit of modeling and optimizing energy systems for unexpected sports events is the reduction of greenhouse gas emissions. Sports events are known to contribute significantly to carbon emissions due to their high energy consumption. By optimizing energy consumption and generation, event organizers can reduce the carbon footprint of these events and contribute to the overall goal of reducing greenhouse gas emissions. This can help improve the overall sustainability of the event and promote a more environmentally friendly approach to sports event management.
Modeling and optimizing energy systems for unexpected sports events can also help improve energy security and reliability. The sudden surge in energy demand during sports events can cause grid instability and even lead to blackouts. By modeling and optimizing energy systems, it is possible to develop strategies that can ensure energy security and reliability during such events. This can help prevent blackouts and ensure that the event runs smoothly without any energy-related disruptions. Moreover, modeling and optimizing energy systems for unexpected sports events can also help promote the adoption of renewable energy sources. By optimizing energy generation, event organizers can increase the use of renewable energy sources such as solar and wind power. This can help reduce the carbon footprint of the event and promote a more sustainable approach to energy management. Moreover, the use of renewable energy sources can help reduce the reliance on fossil fuels and contribute to the overall goal of transitioning to a low-carbon economy.
To sum up, modeling and optimizing energy systems for unexpected sports events is critical for ensuring energy security, reliability, and sustainability. By optimizing energy consumption and generation, event organizers can reduce energy costs, reduce greenhouse gas emissions, and promote the adoption of renewable energy sources. Moreover, smart energy management systems can create new revenue streams and contribute to the overall sustainability of the event. Therefore, it is important to continue to develop and implement energy optimization strategies for sports events to ensure that they are managed in an environmentally friendly and sustainable manner.

2.2. Smart Cities under Unexpected Sports Events

Smart cities are designed to optimize the use of resources and provide a high quality of life for residents. This includes the integration of technologies such as IoT sensors and machine learning algorithms to manage energy consumption, traffic flow, waste management, and more. However, unexpected events such as sports events can significantly impact the energy demand and traffic flow in a city, posing a challenge for traditional energy management systems. To address this challenge, smart cities need to consider the integration of unexpected events such as sports events into their energy management systems. By using machine learning and optimization algorithms, energy consumption can be predicted and managed in real-time, allowing for efficient use of resources during sports events. This will not only reduce the environmental impact of sports events but also ensure a more comfortable and safe experience for spectators and residents.
Moreover, the integration of unexpected sports events into smart city energy management systems can have a significant economic impact. By reducing energy consumption during events, the costs associated with energy production and distribution can be significantly reduced. This, in turn, can lead to increased revenue for the city and event organizers, as well as improved economic conditions for local businesses. Therefore, smart cities need to consider the integration of unexpected events such as sports events into their energy management strategies to ensure sustainable growth and development.

2.3. Problem Formulations

Peak load forecasting using reinforcement learning: Reinforcement learning algorithms can be used to predict peak energy loads during sports events based on historical data. The formulation can be expressed as follows:
At time t, the state is represented by S_t, and the algorithm selects an action A_t based on the current state. The environment responds with a reward R_t and a new state S_{t+1}. The goal is to learn a policy that maximizes the expected cumulative reward over time. This can be achieved by updating the value of the state-action pair (S_t, A_t) using the following formula:
Q(S_t, A_t) ← Q(S_t, A_t) + α [R_t + γ max_{a} Q(S_{t + 1}, a) − Q(S_t, A_t)]
where α is the learning rate, γ is the discount factor, Q(S_t, A_t) is the estimated value of the current state and action, and max_{a} Q(S_{t + 1}, a) is the estimated value of the next state and the optimal action.
Energy optimization using bat evolutionary algorithm: The bat evolutionary algorithm can be used to optimize energy consumption during sports events based on the predicted peak loads. The formulation can be expressed as follows:
The objective is to minimize the total energy consumption while meeting the peak load constraints. This can be achieved by using the bat evolutionary algorithm, which is based on the echolocation behavior of bats. The algorithm searches for the optimal solution by adjusting the frequency and loudness of the bat calls. The optimization problem can be formulated as follows:
Minimize f(x) = ∑_{i = 1}^n x_i
subject to
∑_{i = 1}^n P_i(t) ≤ P_{peak}, ∀t
where x_i is the energy consumption of the i-th load, P_i(t) is the power consumption of the i-th load at time t, and P_{peak} is the predicted peak load. The constraints ensure that the total power consumption does not exceed the predicted peak load. The other objective for this problem can be defined as
minimize
Σ_i c_i × x_i + Σ_j d_j × y_j
where:
  • c_i is the cost coefficient for energy consumption of load i
  • x_i is the energy consumption of load i
  • d_j is the cost coefficient for energy generation of source j
  • y_j is the energy generated by source j
Additionally, there are some constraints as below:
Σ_i a_ij × x_i + Σ_j b_jk × y_j ≥ d_k
where:
  • a_ij is the energy consumption of load i from source j
  • b_jk is the energy generated by source j for demand k
  • d_k is the total demand for energy during the sports event
also,
x_i ≤ x_i_max, y_j ≤ y_j_max
where:
  • x_i_max is the maximum energy consumption limit for load i
and
  • y_j_max is the maximum energy generation limit for source j
It should be noted that x_i ≥ 0 and y_j ≥ 0, ensuring that energy consumption and energy generation cannot be negative. In addition, x_i is restricted to integer values, so that certain loads can only be switched on or off as whole units.
Σ_j b_jk ≤ supply_k
where:
  • b_jk is the energy generated by source j for demand k, ensuring that the total energy supplied to each demand does not exceed the maximum supply limit
Σ_i a_ij × x_i ≤ y_j_max
where:
  • a_ij is the energy consumption of load i from source j, ensuring that the energy generated by a source does not exceed the maximum energy generation limit
These formulations can be used to develop optimization algorithms that integrate machine learning techniques, such as reinforcement learning, to forecast energy demand during unexpected sports events and to optimize energy consumption and generation in real time.
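As a toy illustration of the load-scheduling side of this formulation, the sketch below brute-forces the binary case implied by the integrality constraint: each load is either on (x_i = 1) or off (x_i = 0), the total power drawn must not exceed the predicted peak, and shedding a load incurs its cost coefficient. The shedding-cost objective is a simplifying assumption for illustration, not the paper's exact objective.

```python
from itertools import product

def dispatch_loads(powers, costs, p_peak):
    """Brute-force the binary load-scheduling problem: choose x_i in {0, 1}
    minimizing the total shedding cost sum(c_i * (1 - x_i)), subject to the
    peak-load constraint sum(P_i * x_i) <= p_peak."""
    best_x, best_cost = None, float("inf")
    for x in product((0, 1), repeat=len(powers)):
        if sum(p * xi for p, xi in zip(powers, x)) > p_peak:
            continue  # violates the predicted peak-load constraint
        cost = sum(c * (1 - xi) for c, xi in zip(costs, x))  # cost of shed loads
        if cost < best_cost:
            best_x, best_cost = x, cost
    return best_x, best_cost
```

Exhaustive search is only feasible for a handful of loads; for realistic instances, the bat evolutionary algorithm or mixed-integer programming would take its place.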

3. Reinforcement Learning and Bat Algorithm

3.1. Reinforcement Learning

Reinforcement learning (RL) is a machine learning technique that involves learning through interaction with an environment [22]. In RL, an agent learns to take actions that maximize a cumulative reward signal. The agent interacts with the environment by taking actions, receiving rewards, and updating its policy based on the observed reward signal. RL has been successfully applied in a wide range of domains, including robotics, game-playing, and recommendation systems.
RL is a promising approach for energy management problems because of its capacity to adjust to varying conditions and optimize actions over time. The complex and uncertain nature of these problems, such as energy storage management, peak load forecasting, and demand response, makes RL particularly well-suited. A significant advantage of RL is its ability to learn from experience and modify behavior to enhance performance, which is valuable in dynamic and unpredictable environments. RL can also learn complex, non-linear relationships between inputs and outputs, which is advantageous for energy management problems involving intricate interactions between electricity demand, grid infrastructure, and weather patterns. Finally, RL allows for the optimization of long-term goals, such as energy efficiency or cost savings, which is critical for the reliable and cost-effective operation of the power grid. Implementation of RL can improve energy system efficiency, reliability, and sustainability, while also decreasing costs and improving overall performance.
The RL framework can be formalized as a Markov decision process (MDP), which consists of a set of states S, a set of actions A, a transition function T, a reward function R, and a discount factor γ. The agent observes the current state s_t, selects an action a_t, and receives a reward r_t. The state transitions to s_{t+1} according to the transition function T(s_t, a_t, s_{t+1}). The goal of the agent is to learn a policy π(a_t|s_t) that maximizes the expected cumulative reward, which is defined as the sum of discounted rewards:
R_t = ∑_{k = 0}^∞ γ^k r_{t + k + 1}
where γ is the discount factor, which determines the relative importance of immediate versus future rewards.
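For a finite reward sequence, the discounted return above can be computed directly; a minimal sketch:

```python
def discounted_return(rewards, gamma):
    """Compute R_t = sum_k gamma^k * r_{t+k+1} for a finite reward sequence."""
    return sum((gamma ** k) * r for k, r in enumerate(rewards))
```

For example, three unit rewards discounted at γ = 0.5 yield 1 + 0.5 + 0.25 = 1.75, showing how smaller γ values weight immediate rewards more heavily.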
The RL framework can be further extended to include value-based methods, such as Q-learning and SARSA, which involve learning a state-action value function Q(s,a) that estimates the expected cumulative reward of taking action a in state s and following the current policy thereafter. The Q-value can be updated using the Bellman equation:
Q(s_t, a_t) ← Q(s_t, a_t) + α [r_t + γ max_a Q(s_{t + 1}, a) − Q(s_t, a_t)]
where α is the learning rate, which determines the rate of learning, and
max_a Q(s_{t + 1}, a)
is the maximum Q-value over all possible actions in the next state s_{t+1}.
One of the major challenges in RL is the trade-off between exploration and exploitation. The agent needs to explore different actions to learn the optimal policy but also needs to exploit the current knowledge to maximize the reward. Several exploration strategies have been proposed, including ϵ-greedy, Boltzmann exploration, and Upper Confidence Bound (UCB) exploration.
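The Bellman update and the ϵ-greedy exploration strategy described above can be sketched in a few lines of tabular Q-learning; the string-valued states and actions here are toy assumptions for illustration.

```python
import random
from collections import defaultdict

def epsilon_greedy(Q, state, actions, eps, rng=random):
    """Explore a random action with probability eps, otherwise exploit the
    greedy action under the current Q-table."""
    if rng.random() < eps:
        return rng.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])

def q_update(Q, s, a, r, s_next, actions, alpha, gamma):
    """Tabular Bellman update:
    Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(s_next, a2)] for a2 in actions)
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
```

Using a `defaultdict(float)` for Q initializes unseen state-action pairs to zero, so the agent can start learning without enumerating the state space in advance.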
Figure 1 shows the reinforcement learning framework, which consists of an agent interacting with an environment. The agent observes the current state, selects an action, and receives a reward. The goal of the agent is to learn a policy that maximizes the cumulative reward signal.
Another important application of RL is peak load forecasting, which involves predicting the maximum energy demand during a specific period. Peak load forecasting is critical for energy systems optimization, as it allows energy providers to optimize their generation and distribution strategies to meet the expected demand. RL can be used to learn a model of the energy system and predict the expected demand based on the observed historical data. RL can also be used to optimize the generation and distribution strategies based on the predicted demand.
Figure 2 shows an example of RL for peak load forecasting. The RL agent learns a model of the energy system and predicts the expected demand based on the observed historical data. The predicted demand is used to optimize the generation and distribution strategies to meet the expected demand.
Peak load forecasting problems can benefit from the application of reinforcement learning (RL), a machine learning technique. RL’s ability to adapt to changing conditions makes it well-suited for dynamic and uncertain environments, such as power systems. However, when implementing RL-based peak load forecasting, there are several key factors to consider for optimal performance. The state and action spaces for the RL agent must be selected carefully and may include input variables such as historical electricity demand, weather data, and other relevant factors. The selection of the RL algorithm is also critical, with options such as Q-learning, SARSA, and actor-critic methods, each with their own set of parameters that must be tuned. The choice of reward function is another important consideration in RL-based peak load forecasting, as it determines how the agent is incentivized to respond to the environment. Reward functions might be based on forecasting model accuracy or cost savings associated with more accurate peak load predictions. Regular evaluation and adjustment of the RL algorithm’s parameters is crucial for optimal performance and may involve tweaking the reward function, adjusting the learning rate or discount factor, or modifying the state and action spaces. Overall, successful implementation of RL for peak load forecasting requires careful consideration of input variables, choice of algorithm, reward function, and ongoing monitoring and adjustment.
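As an illustration of the reward-function choice discussed above, a minimal sketch might pay the agent in proportion to the negative absolute forecast error; the penalty weight and error metric are assumptions for illustration, not taken from the study.

```python
def forecast_reward(predicted_peak, actual_peak, penalty=1.0):
    """Hypothetical reward for a peak-load forecasting agent: zero for a
    perfect prediction, increasingly negative as the forecast error grows."""
    return -penalty * abs(predicted_peak - actual_peak)
```

Cost-based alternatives, such as rewarding the avoided expense of over-provisioned generation, would follow the same pattern with a different error-to-reward mapping.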

3.2. Bat Optimization Algorithm

Bat Algorithm (BA) is a metaheuristic optimization algorithm inspired by the echolocation behavior of bats [23]. The algorithm was first proposed in 2010 and has since been applied to solve various optimization problems. The main advantage of BA is its ability to converge quickly to an optimal solution while maintaining a good balance between exploration and exploitation [24].
The optimization process of BA is based on the movement of bats in search of food. Each bat is associated with a solution in the search space, and the frequency of its emitted pulse represents the quality of the solution. The algorithm starts by initializing the population of bats randomly in the search space. Then, the bats fly towards the best solution found so far, with a velocity that depends on their position and frequency. At each iteration, some bats randomly emit loud pulses, which can attract the other bats to their location. Moreover, the frequency of the bats’ pulses is updated using a random walk mechanism to enable the exploration of the search space. The algorithm terminates when a stopping criterion is met, such as a maximum number of iterations or a desired solution quality.
The mathematical formulation of BA can be summarized as follows:
Step 1: Initialize the population of bats x_i (i = 1, ..., n) randomly in the search space.
Step 2: Evaluate the fitness f(x_i) of each bat.
Step 3: Initialize the velocity v_i and frequency f_i of each bat.
Step 4: While the stopping criterion is not met, repeat Steps 5–8:
Step 5: Update the frequency f_i of each bat using a random walk mechanism.
Step 6: Update the velocity v_i of each bat towards the best solution found so far.
Step 7: Update the position x_i of each bat based on its velocity.
Step 8: If a bat emits a loud pulse, generate a new solution x_new; if the fitness of x_new is better than the current best solution, replace it.
Step 9: Return the best solution found.
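The steps above can be sketched as a minimal minimizer; for simplicity, this sketch holds loudness and pulse rate fixed, whereas the original 2010 formulation adapts both over iterations.

```python
import random

def bat_algorithm(f, dim, bounds, n_bats=20, n_iter=200, seed=0,
                  f_min=0.0, f_max=2.0, loudness=0.5, pulse_rate=0.5):
    """Minimal Bat Algorithm sketch: frequency-tuned velocities pull bats
    toward the current best, with occasional local random walks around it."""
    rng = random.Random(seed)
    lo, hi = bounds
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_bats)]
    v = [[0.0] * dim for _ in range(n_bats)]
    fitness = [f(xi) for xi in x]
    best = min(range(n_bats), key=lambda i: fitness[i])
    best_x, best_f = list(x[best]), fitness[best]
    for _ in range(n_iter):
        for i in range(n_bats):
            freq = f_min + (f_max - f_min) * rng.random()   # random frequency
            v[i] = [vi + (xi - bi) * freq
                    for vi, xi, bi in zip(v[i], x[i], best_x)]
            cand = [min(max(xi + vi, lo), hi)               # move, clamped to bounds
                    for xi, vi in zip(x[i], v[i])]
            if rng.random() > pulse_rate:                   # local walk around best
                cand = [min(max(b + 0.01 * rng.gauss(0, 1), lo), hi)
                        for b in best_x]
            fc = f(cand)
            if fc < fitness[i] and rng.random() < loudness:  # accept improving move
                x[i], fitness[i] = cand, fc
            if fc < best_f:                                  # track global best
                best_x, best_f = list(cand), fc
    return best_x, best_f
```

On a simple quadratic objective, the sketch converges near the origin within a few hundred iterations, which is enough to illustrate the exploration-exploitation balance the text describes.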
The Bat Algorithm has been successfully applied to solve various optimization problems, including those in energy systems optimization. The algorithm has shown promising results in optimizing the operation of energy systems with renewable energy sources, such as wind and solar power. For example, a study published in 2016 used BA to optimize the power dispatch of a wind-solar-battery hybrid system. The results showed that BA outperformed other optimization algorithms, such as Particle Swarm Optimization and Genetic Algorithm, in terms of solution quality and computational efficiency. Therefore, BA can be considered a promising optimization algorithm for energy systems with intermittent renewable energy sources.

3.3. Step-by-Step Flowchart of Formulations

The following flowchart summarizes the steps involved in formulating the problem of optimizing energy systems under unexpected sports events:
  • Define the objective function: Define the objective function to be optimized, which typically involves minimizing the cost of energy consumption, maximizing the use of renewable energy sources, or both.
  • Identify the decision variables: Identify the decision variables that affect energy consumption and production, such as the scheduling of equipment and the dispatch of energy sources.
  • Formulate the constraints: Formulate the constraints that limit the operation of the energy system, such as the capacity of the equipment, the availability of renewable energy sources, and the demand for energy.
  • Incorporate the unexpected sports events: Model the unexpected sports events as additional constraints that affect the operation of the energy system, such as the increase in energy demand or the decrease in renewable energy production.
  • Choose the optimization algorithm: Choose an optimization algorithm that can handle the complexity of the problem, such as metaheuristic algorithms or mixed-integer linear programming.
  • Implement the algorithm: Implement the chosen optimization algorithm to solve the problem formulation and obtain the optimal solution.
  • Evaluate the results: Evaluate the results of the optimization, such as the reduction in energy costs, the increase in renewable energy usage, and the impact of unexpected sports events on the energy system.
  • Iterate and refine: Iterate and refine the problem formulation and optimization algorithm as needed to improve the performance and robustness of the energy system.
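As a concrete illustration of these steps, the sketch below formulates a toy version of the problem as a linear program: grid and renewable dispatch are the decision variables, hourly demand (including an event surge) is an equality constraint, and renewable availability caps are bounds. All numbers are hypothetical and chosen only to make the example run:

```python
from scipy.optimize import linprog

# Hypothetical hourly data: baseline demand plus an unexpected-event surge (kWh),
# renewable availability (kWh), and prices ($/kWh). All values are illustrative.
demand = [100, 100, 250, 260, 120]       # hours 2-3 include the event surge
renewable_cap = [80, 90, 60, 50, 70]     # available solar/wind per hour
grid_price, renewable_price = 0.20, 0.05

T = len(demand)
# Decision variables: [grid_0..grid_{T-1}, ren_0..ren_{T-1}]
c = [grid_price] * T + [renewable_price] * T
# Demand balance constraint for each hour: grid_t + ren_t = demand_t
A_eq = [[1 if j == t or j == T + t else 0 for j in range(2 * T)] for t in range(T)]
bounds = [(0, None)] * T + [(0, cap) for cap in renewable_cap]

res = linprog(c, A_eq=A_eq, b_eq=demand, bounds=bounds, method="highs")
print(round(res.fun, 2))   # minimum total energy cost over the horizon
```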
This flowchart outlines the general steps involved in formulating and solving the optimization problem for energy systems considering unexpected sports events; the specific details and complexity will vary with the particular energy system and sports event involved.

The dataset used in this study combines real-world and simulation-based data. The real-world data were obtained from the National Renewable Energy Laboratory (NREL) end-use load profile database, which provides publicly accessible data on electricity usage patterns for different types of buildings and facilities in the United States. This dataset is widely used in the energy research community and has been validated for accuracy and reliability. In addition, we generated simulation-based data by simulating the behavior of the IEEE 32 and 33 bus test systems under various conditions. These IEEE test systems are widely used in power system research and provide a standardized platform for testing and evaluating different techniques and algorithms.

By combining real-world and simulation-based data, the dataset captures a wide range of energy consumption patterns and scenarios, making it more representative of real-world energy systems. This allowed us to evaluate the proposed approach under different conditions and to validate the findings against real-world data. More broadly, leveraging the strengths of both types of data yields more comprehensive and representative datasets and more accurate and reliable results, helps bridge the gap between theory and practice, and facilitates the adoption of new techniques and technologies in real-world energy systems.
Energy management is an essential part of ensuring efficient and sustainable utilization of resources. Cost reduction can be achieved by optimizing energy consumption through advanced algorithms such as Reinforcement Learning (RL) and the Bat Evolutionary Algorithm (BEA). RL is an artificial intelligence technique that enables machines to learn from experience and adapt to changing conditions, while BEA is a swarm-based optimization technique that mimics the echolocation behavior of bats to search for optimal solutions. Together, they offer a powerful means of reducing energy consumption and costs.

One way to achieve cost reduction through energy management is to apply RL to optimize energy consumption in buildings. RL can learn from historical data, environmental conditions, and occupant behavior to develop an energy consumption policy that balances comfort and energy savings. The RL agent can control the temperature, lighting, and other energy-consuming systems in the building, continuously adjusting them based on feedback from sensors and occupants. This approach can reduce energy costs while maintaining comfortable indoor conditions.

Another way is to apply BEA to optimize the energy consumption of industrial processes, including the scheduling of production, the utilization of equipment, and the control of energy-consuming systems. By simulating the echolocation behavior of bats, BEA can efficiently search a vast solution space, adapt to changing conditions, and optimize energy consumption in real time, leading to significant cost savings.

A third way is to combine RL and BEA to optimize energy consumption in complex systems: RL learns the optimal control policy for the energy-consuming systems, while BEA optimizes the parameters of the RL agent. This combination can improve the performance of energy management systems, reduce energy consumption, and lower energy costs.
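The third scheme, in which an evolutionary search tunes the RL agent's hyperparameters, can be sketched compactly. Everything below is hypothetical: a two-state toy environment, a small tabular Q-learner, and a crude bat-style drift of candidates toward the best one. It illustrates only the nesting of the two algorithms, not this study's implementation:

```python
import random

def run_q_learning(alpha, gamma, episodes=200, seed=0):
    """Train a tiny Q-learner on a 2-state chain; return average episode reward."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in (0, 1) for a in (0, 1)}
    total = 0.0
    for _ in range(episodes):
        s = 0
        for _ in range(5):
            if rng.random() > 0.1:                       # epsilon-greedy action choice
                a = max((0, 1), key=lambda a_: q[(s, a_)])
            else:
                a = rng.choice((0, 1))
            s_next = 1 if a == 1 else 0
            r = 1.0 if s_next == 1 else 0.0              # reward for reaching state 1
            q[(s, a)] += alpha * (r + gamma * max(q[(s_next, 0)], q[(s_next, 1)]) - q[(s, a)])
            s, total = s_next, total + r
    return total / episodes

def tune_with_swarm(n_bats=10, n_iter=20, seed=1):
    """Bat-style outer search over (alpha, gamma): drift candidates toward the best."""
    rng = random.Random(seed)
    swarm = [[rng.uniform(0.01, 1.0), rng.uniform(0.5, 0.99)] for _ in range(n_bats)]
    best = max(swarm, key=lambda p: run_q_learning(*p))
    for _ in range(n_iter):
        for j, p in enumerate(swarm):
            cand = [min(max(p[i] + 0.3 * (best[i] - p[i]) + rng.gauss(0, 0.02), 0.01), 0.99)
                    for i in range(2)]
            if run_q_learning(*cand) > run_q_learning(*best):
                best = cand
            swarm[j] = cand
    return best

alpha, gamma = tune_with_swarm()   # tuned hyperparameters for the inner RL agent
```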

4. Results

In reinforcement learning (RL), the training data is generated by the RL agent as it interacts with the environment. The RL agent receives feedback in the form of rewards or punishments based on its actions in the environment and uses this feedback to update its knowledge and improve its decision-making over time. To train an RL model for peak load forecasting, we would typically use real-world data such as historical electricity demand, weather data, and economic indicators as inputs to the RL agent. The RL agent would then use these inputs to predict future electricity demand and receive feedback in the form of rewards based on how accurate its predictions were. For example, the RL agent might receive a reward based on the degree to which its peak load forecasts matched the actual peak load observed in the power grid. If the agent’s forecasts were highly accurate, it would receive a high reward, whereas if its forecasts were inaccurate, it would receive a low reward. Over time, the RL agent would learn to optimize its actions in response to the environment and the rewards it receives and improve its accuracy in peak load forecasting. This process of continual learning and adaptation is a key advantage of RL in dynamic and uncertain environments such as power systems. Overall, training an RL model for peak load forecasting involves using real-world data to generate training examples and using these examples to train the RL agent to make accurate predictions and optimize its actions over time.
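This reward loop can be illustrated with a minimal bandit-style sketch. The synthetic demand series, the discrete set of forecast adjustments, and all numbers below are assumptions chosen for illustration, not the agent or data used in this study:

```python
import random

random.seed(42)
actual_peaks = [random.gauss(500, 30) for _ in range(500)]  # synthetic peak loads (kW)
actions = [-50, -25, 0, 25, 50]        # candidate adjustments to a naive forecast
q = [0.0] * len(actions)               # learned value estimate per action
alpha, epsilon = 0.1, 0.2              # step size and exploration rate

for t in range(1, len(actual_peaks)):
    naive = actual_peaks[t - 1]                        # persistence forecast
    if random.random() > epsilon:
        a = q.index(max(q))                            # exploit the best-known action
    else:
        a = random.randrange(len(actions))             # explore
    forecast = naive + actions[a]
    reward = -abs(forecast - actual_peaks[t])          # accurate forecasts earn higher reward
    q[a] += alpha * (reward - q[a])                    # incremental value update

best_adjustment = actions[q.index(max(q))]
```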
Figure 3 represents the assumptions made for energy consumption before, during, and after a sports event. The graph shows the variation in energy consumption during the different phases of the event. Before the event, energy consumption is assumed to be low as the stadium is relatively empty, and only minimal energy is needed for lighting, air conditioning, and other equipment. During the event, energy consumption is expected to be the highest due to a large number of spectators and their energy needs, including lighting, air conditioning, and electronic devices. Finally, after the event, energy consumption is expected to drop to pre-event levels as the stadium is emptied, and minimal energy is needed for maintenance and cleaning purposes. These assumptions are crucial in developing an accurate energy consumption model that can be used for optimization and forecasting purposes.
Figure 4 presents the historical energy consumption data for the sports facility, displaying the actual energy consumption levels over a given period. These data are crucial for identifying patterns and trends in energy consumption, which support accurate models for energy forecasting and optimization, and can also be used to evaluate the effectiveness of energy management strategies and identify areas for improvement. The data were obtained from the events-based load profiles of the National Renewable Energy Laboratory (NREL) [25]; Figure 4 accordingly depicts the facility's consumption patterns, including the types of energy-consuming appliances and the number and type of sports events by month.
Figure 5 shows a comparison between the accuracy of the reinforcement learning-based load forecasting approach and five other conventional techniques, namely ARIMA, Holt-Winters, Moving Average, Exponential Smoothing, and Neural Networks. The results indicate that the reinforcement learning approach outperforms all other techniques in terms of accuracy, with the lowest root mean square error (RMSE) and mean absolute percentage error (MAPE) values. This is a significant achievement, as accurate load forecasting is crucial for energy systems optimization and management, especially during unexpected sports events when demand can spike unpredictably.
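For reference, the two accuracy metrics used in this comparison are computed as follows (the load values are made-up numbers for illustration):

```python
import math

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    """Root mean squared error, in the units of the series."""
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

actual = [1200.0, 1500.0, 1800.0]      # observed peak loads (kW), illustrative
forecast = [1100.0, 1550.0, 1750.0]    # forecast peak loads (kW), illustrative
print(round(mape(actual, forecast), 2))   # → 4.81
print(round(rmse(actual, forecast), 2))   # → 70.71
```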
Our approach also offers the advantage of being able to adapt to changing energy consumption patterns in response to various factors, including weather, holidays, and special events such as sports events. This is made possible through the use of reinforcement learning, which allows the model to learn from feedback and adjust its predictions accordingly. By combining this approach with optimization algorithms such as the bat algorithm, we are able to achieve further improvements in energy systems management and cost optimization. This represents a significant contribution to the field, as it enables more efficient and sustainable energy systems management, with potential benefits for both the environment and the economy.
Figure 6 illustrates the comparison of the optimization results of the Bat Algorithm with the other five evolutionary optimization algorithms. As we can see, the Bat Algorithm provides the best optimization results in terms of cost reduction compared to other optimization algorithms, which are Genetic Algorithm, Particle Swarm Optimization, Cuckoo Search Algorithm, Differential Evolution Algorithm, and Artificial Bee Colony Algorithm. It is noteworthy that these optimization algorithms are among the most commonly used algorithms in energy system optimization problems. The results show that the Bat Algorithm is a powerful algorithm that can provide better solutions for energy system optimization problems.
Our proposed method, which integrates reinforcement learning-based load forecasting and Bat Algorithm optimization, outperforms the existing optimization techniques. The integration of these two algorithms allows us to optimize the energy system in real-time and adaptively, taking into account the dynamic changes in energy demand caused by unexpected sports events. Furthermore, the results demonstrate the efficiency and effectiveness of the proposed method in reducing energy costs during sports events. This study provides a practical approach for energy system optimization, which can be applied to other domains beyond sport event management, such as urban energy management, renewable energy integration, and smart grid operation.
Table 1 shows the mean absolute percentage error (MAPE), root mean squared error (RMSE), computation time, objective function value, and total cost saving for each of the seven optimization techniques tested. The results demonstrate that reinforcement learning had the lowest MAPE and RMSE, indicating its superior accuracy in load forecasting. It also had the shortest computation time, which is important for real-time decision-making, and the lowest objective function value, indicating that it provided the best solution to the problem. The total cost saving achieved by reinforcement learning was likewise the highest among all the techniques tested.
It should be noted that this research study addresses Sustainable Development Goal (SDG) 7, Affordable and Clean Energy. We focus on the management of unexpected sports events and aim to fill the gap left by the recent paper by Lipu et al. [26] by considering more uncertainties in sustainable smart energy systems. Specifically, we propose a data-driven probabilistic machine learning and optimization approach that better manages the uncertainties in sports events and ensures the reliability and sustainability of the energy systems. By integrating these advanced techniques, we believe this paper makes a significant contribution to the field and provides valuable insights for the development of sustainable energy systems.

5. Conclusions

The aim of this study was to propose a sustainable approach for managing energy systems in the presence of unexpected sports events. The proposed approach combines machine learning techniques with optimization algorithms to achieve efficient energy management. Reinforcement learning-based load forecasting is used to adapt to changing patterns of energy consumption during such events, while the bat algorithm optimizes energy management to reduce consumption and cost. The effectiveness of this approach was compared with other conventional and evolutionary techniques, highlighting its potential for sustainable energy management.

One of the key contributions of this study is the integration of machine learning techniques into the energy systems management optimization process. This not only improves the accuracy of load forecasting but also enhances the computational efficiency of optimization. The study also emphasizes the importance of considering unexpected sports events in energy management, as demonstrated by the need for flexibility and adaptability in energy systems during such events. The proposed approach can aid energy managers in allocating resources and planning energy consumption during such events for sustainable energy management.

Overall, this study presents a comprehensive approach to sustainable energy systems management that considers unexpected sports events. The approach combines machine learning and optimization techniques to reduce energy consumption and costs while improving efficiency. Future research can further explore the potential of this approach in different contexts, integrate other machine learning techniques and optimization algorithms, and validate its effectiveness in real-world scenarios.

Author Contributions

Conceptualization, Y.Y. and L.Z.; methodology, Y.Y.; software, L.Z.; validation L.Z. and Y.Y.; formal analysis, L.Z. and Y.Y.; investigation, L.Z. and Y.Y.; resources, L.Z. and Y.Y.; data curation, L.Z. and Y.Y.; writing—original draft preparation, L.Z. and Y.Y.; writing—review and editing, L.Z. and Y.Y.; visualization, L.Z. and Y.Y.; supervision, L.Z. and Y.Y.; project administration, L.Z. and Y.Y.; funding acquisition, L.Z. and Y.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, J.; Kong, H.; Zhang, J.; Hao, Y.; Shao, Z.P.; Ciucci, F. Carbon-based electrocatalysts for sustainable energy applications. Prog. Mater. Sci. 2021, 116, 100717. [Google Scholar] [CrossRef]
  2. Fodil, F.; Rezgui, Y.; Petri, I.; Meskin, N.; Ahmad, A.M.; Hodorog, A.; Elnour, M.; Mohammedsherif, H. Building Energy Management Systems for Sports Facilities in the Gulf Region: A Focus on Impacts and Considerations; CIB: Luxembourg, 2021. [Google Scholar]
  3. Silva, B.N.; Khan, M.; Han, K. Futuristic Sustainable Energy Management in Smart Environments: A Review of Peak Load Shaving and Demand Response Strategies, Challenges, and Opportunities. Sustainability 2020, 12, 5561. [Google Scholar] [CrossRef]
  4. Gong, X.Z.; Liu, G.L.; Li, Y.S.; Yu, D.Y.W.; Teoh, W.Y. Functionalized-graphene composites: Fabrication and applications in sustainable energy and environment. Chem. Mater. 2016, 28, 8082–8118. [Google Scholar] [CrossRef]
  5. Morteza, D.; Kavousi-Fard, A.; Dong, Z.Y. A novel distributed cloud-fog based framework for energy management of networked microgrids. IEEE Trans. Power Syst. 2020, 35, 2847–2862. [Google Scholar]
  6. Dabbaghjamanesh, M.; Kavousi-Fard, A.; Zhang, J. Stochastic modeling and integration of plug-in hybrid electric vehicles in reconfigurable microgrids with deep learning-based forecasting. IEEE Trans. Intell. Transp. Syst. 2022, 22, 4394–4403. [Google Scholar] [CrossRef]
  7. Aman, S.; Simmhan, Y.; Prasanna, V.K. Energy management systems: State of the art and emerging trends. IEEE Commun. Mag. 2013, 51, 114–119. [Google Scholar] [CrossRef]
  8. Dabbaghjamanesh, M.; Kavousi-Fard, A.; Mehraeen, S.; Zhang, J.; Dong, Z.Y. Sensitivity analysis of renewable energy integration on stochastic energy management of automated reconfigurable hybrid AC–DC microgrid considering DLR security constraint. IEEE Trans. Ind. Inform. 2019, 16, 120–131. [Google Scholar] [CrossRef]
  9. Bianchini, R.; Rajamony, R. Power and energy management for server systems. Computer 2004, 37, 68–76. [Google Scholar] [CrossRef]
  10. Christodoulou, A.; Cullinane, K. Identifying the Main Opportunities and Challenges from the Implementation of a Port Energy Management System: A SWOT/PESTLE Analysis. Sustainability 2019, 11, 6046. [Google Scholar] [CrossRef]
  11. Dehghani, M.; Kavousi-Fard, A.; Dabbaghjamanesh, M.; Avatefipour, O. Deep learning based method for false data injection attack detection in AC smart islands. IET Gener. Transm. Distrib. 2020, 14, 5756–5765. [Google Scholar] [CrossRef]
  12. Mariano-Hernández, D.; Hernández-Callejo, L.; Zorita-Lamadrid, A.; Duque-Pérez, O.; Santos García, F. A review of strategies for building energy management system: Model predictive control, demand side management, optimization, and fault detect & diagnosis. J. Build. Eng. 2021, 33, 101692. [Google Scholar] [CrossRef]
  13. Chen, L.; Li, W.Y.; Xu, X.; Wang, S.H. Energy management optimisation for plug-in hybrid electric sports utility vehicle with consideration to battery characteristics. Int. J. Electr. Hybrid Veh. 2016, 8, 122–138. [Google Scholar] [CrossRef]
  14. Liu, Y.K.; Zhang, D.X.; Gooi, H.B. Optimization strategy based on deep reinforcement learning for home energy management. J. Power Energy Syst. 2020, 6, 572–582. [Google Scholar]
  15. Jayaprakash, S.; Nagarajan, M.D.; Prado, R.P.d.; Subramanian, S.; Divakarachari, P.B. A systematic review of energy management strategies for resource allocation in the cloud: Clustering, optimization and machine learning. Energies 2021, 14, 5322. [Google Scholar] [CrossRef]
  16. Elnour, M.; Himeur, Y.; Fadli, F.; Mohammedsherif, H.; Meskin, N.; Ahmad, A.M.; Petri, I.; Rezgui, Y.; Hodorog, A. Neural network-based model predictive control system for optimizing building automation and management systems of sports facilities. Appl. Energy 2022, 318, 119153. [Google Scholar] [CrossRef]
  17. Li, T.J.; Sun, J.H.; Wang, L. An intelligent optimization method of motion management system based on BP neural network. Neural Comput. Appl. 2021, 33, 707–722. [Google Scholar] [CrossRef]
  18. Mahapatra, C.; Moharana, A.K.; Leung, V.C.M. Energy management in smart cities based on internet of things: Peak demand reduction and energy savings. Sensors 2017, 17, 2812. [Google Scholar] [CrossRef]
  19. Jasim, A.M.; Jasim, B.H.; Kraiem, H.; Flah, A. A Multi-Objective Demand/Generation Scheduling Model-Based Microgrid Energy Management System. Sustainability 2022, 14, 10158. [Google Scholar] [CrossRef]
  20. Baniasadi, A.; Habibi, D.; Bass, O.; Masoum, M.A.S. Optimal real-time residential thermal energy management for peak-load shifting with experimental verification. IEEE Trans. Smart Grid 2018, 10, 5587–5599. [Google Scholar] [CrossRef]
  21. Dabbaghjamanesh, M.; Wang, B.Y.; Kavousi-Fard, A.; Mehraeen, S.; Hatziargyriou, N.D.; Trakas, D.N.; Ferdowsi, F. A novel two-stage multi-layer constrained spectral clustering strategy for intentional islanding of power grids. IEEE Trans. Power Deliv. 2019, 35, 560–570. [Google Scholar] [CrossRef]
  22. Dabbaghjamanesh, M.; Moeini, A.; Kavousi-Fard, A. Reinforcement learning-based load forecasting of electric vehicle charging station using Q-learning technique. IEEE Trans. Ind. Inform. 2020, 6, 4229–4237. [Google Scholar] [CrossRef]
  23. Yang, X.S.; He, S.H. Bat algorithm: Literature review and applications. Int. J. Bio-Inspired Comput. 2013, 5, 141–149. [Google Scholar] [CrossRef]
  24. Fayaz, M.; Kim, D.H. Energy consumption optimization and user comfort management in residential buildings using a bat algorithm and fuzzy logic. Energies 2018, 11, 161. [Google Scholar] [CrossRef]
  25. Present, E.K. End Use Load Profiles for the US Building Stock; National Renewable Energy Lab. (NREL): Golden, CO, USA, 2019. [Google Scholar]
  26. Lipu, M.S.H.; Mamun, A.A.; Ansari, S.; Miah, M.S.; Hasan, K.; Meraj, S.T.; Abdolrasol, M.G.M.; Rahman, T.; Maruf, M.H.; Sarker, M.R.; et al. Battery Management, Key Technologies, Methods, Issues, and Future Trends of Electric Vehicles: A Pathway toward Achieving Sustainable Development Goals. Batteries 2022, 8, 119. [Google Scholar] [CrossRef]
Figure 1. Reinforcement Learning Framework [22].
Figure 2. Reinforcement Learning for Peak Load Forecasting.
Figure 3. Assumptions for energy consumption before, during, and after a sports event.
Figure 4. Historical energy consumption data for the sports facility.
Figure 5. Comparison of Peak Load Forecasting Accuracy using Reinforcement Learning with Conventional Techniques.
Figure 6. Comparison of optimization results using different evolutionary algorithms for energy systems with unexpected sport event management.
Table 1. Comparison of Optimization Results Using Different Techniques for Energy Management in Smart Cities with Sports Event Management.

| Optimization Algorithm | MAPE | RMSE | Computation Time (s) | Objective Function Value | Total Cost Saving ($) |
|---|---|---|---|---|---|
| Reinforcement Learning | 8.21% | 496.35 | 23.72 | 9.83 × 10⁷ | 2,062,000 |
| Genetic Algorithm | 10.39% | 594.12 | 112.21 | 1.03 × 10⁸ | 2,009,000 |
| Particle Swarm | 9.89% | 546.22 | 182.33 | 1.07 × 10⁸ | 1,925,000 |
| Differential Evolution | 11.45% | 632.78 | 325.89 | 1.17 × 10⁸ | 1,833,000 |
| Ant Colony Optimization | 13.86% | 725.06 | 786.21 | 1.42 × 10⁸ | 1,650,000 |
| Bat Algorithm | 9.74% | 527.16 | 413.66 | 1.05 × 10⁸ | 1,975,000 |
| Artificial Bee Colony | 12.05% | 635.89 | 932.54 | 1.20 × 10⁸ | 1,540,000 |