Article

Optimization of Electric Vehicles Charging Scheduling Based on Deep Reinforcement Learning: A Decentralized Approach

1 Higher School of Communication of Tunis (Sup’Com), University of Carthage, 2083 Ariana, Tunisia
2 Chair of Distributed Information Systems, University of Passau, Innstraße 41, 94032 Passau, Germany
* Author to whom correspondence should be addressed.
Energies 2023, 16(24), 8102; https://doi.org/10.3390/en16248102
Submission received: 10 November 2023 / Revised: 8 December 2023 / Accepted: 11 December 2023 / Published: 16 December 2023
(This article belongs to the Special Issue Recent Advancement in Electric Vehicles)

Abstract

The worldwide adoption of Electric Vehicles (EVs) has brought promising advances toward a sustainable transportation system. However, effective charging scheduling of EVs is not a trivial task, due to the increased load demand at Charging Stations (CSs) and the fluctuation of electricity prices. Moreover, long waiting times and the inability to charge the battery to the desired State of Charge (SOC) raise further concern among EV drivers. To alleviate users' range anxiety, we propose a Deep Reinforcement Learning (DRL) approach that provides the optimal charging time slots for an EV based on Photovoltaic power prices, the current EV SOC, the charging connector type, and the history of load demand profiles collected at different locations. Our approach maximizes the EV's profit while giving EV drivers the freedom to select their preferred CS and the best charging time (i.e., morning, afternoon, evening, or night). The results analysis demonstrates the effectiveness of the DRL model, reducing the EV's charging costs by up to 60% and providing a full charging experience with a waiting time of at most 30 min.
Keywords: smart EV charging; day-ahead planning; deep Q-Network; data-driven approach; waiting time; cost minimization; real dataset
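The abstract describes a deep Q-Network that selects day-ahead charging slots from prices, SOC, and demand data. The sketch below illustrates the underlying idea on a toy scale, replacing the deep network with a tabular Q-function; the four time slots, the prices, the SOC discretization, and the missed-target penalty are all invented for illustration and are not taken from the paper.

```python
import random

# Hypothetical day-ahead prices for four coarse time slots
# (morning, afternoon, evening, night); values invented for illustration.
PRICES = {0: 0.30, 1: 0.10, 2: 0.35, 3: 0.15}   # EUR/kWh, assumed
N_SLOTS = 4
SOC_MAX = 2          # SOC discretized into coarse buckets (0 = empty)
TARGET_SOC = 2       # desired final bucket
CHARGE, WAIT = 0, 1

def step(slot, soc, action):
    """One transition: charging raises SOC by one bucket at the slot price;
    a penalty applies if the target SOC is missed by the last slot."""
    reward = 0.0
    if action == CHARGE and soc < SOC_MAX:
        soc += 1
        reward -= PRICES[slot]
    done = slot == N_SLOTS - 1
    if done and soc < TARGET_SOC:
        reward -= 1.0                      # missed-target penalty
    return slot + 1, soc, reward, done

def train(episodes=5000, alpha=0.1, gamma=0.95, eps=0.2):
    """Tabular Q-learning, standing in for the paper's deep Q-network."""
    q = {}
    for _ in range(episodes):
        slot, soc, done = 0, 0, False
        while not done:
            if random.random() < eps:      # epsilon-greedy exploration
                a = random.choice((CHARGE, WAIT))
            else:
                a = max((CHARGE, WAIT),
                        key=lambda x: q.get((slot, soc, x), 0.0))
            nxt, nsoc, r, done = step(slot, soc, a)
            best_next = 0.0 if done else max(
                q.get((nxt, nsoc, b), 0.0) for b in (CHARGE, WAIT))
            old = q.get((slot, soc, a), 0.0)
            q[(slot, soc, a)] = old + alpha * (r + gamma * best_next - old)
            slot, soc = nxt, nsoc
    return q

def plan(q):
    """Greedy rollout of the learned policy over one day."""
    slot, soc, actions, done = 0, 0, [], False
    while not done:
        a = max((CHARGE, WAIT), key=lambda x: q.get((slot, soc, x), 0.0))
        actions.append(a)
        slot, soc, _, done = step(slot, soc, a)
    return actions, soc
```

With these invented prices the agent learns to charge in the two cheapest slots (afternoon and night), mirroring the paper's goal of shifting demand toward low-price periods while still reaching the target SOC.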

Share and Cite

MDPI and ACS Style

Azzouz, I.; Fekih Hassen, W. Optimization of Electric Vehicles Charging Scheduling Based on Deep Reinforcement Learning: A Decentralized Approach. Energies 2023, 16, 8102. https://doi.org/10.3390/en16248102


