Journal
IEEE TRANSACTIONS ON SMART GRID
Volume 6, Issue 4, Pages 1795-1805
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TSG.2015.2393059
Keywords
Demand-side management; electric vehicles (EVs); reinforcement learning (RL); stochastic programming (SP)
Funding
- DistriNet Research Group of the Department of Computer Science, Catholic University of Leuven
- Vlaamse Instelling Voor Technologisch Onderzoek, Flemish Institute for Technological Research
- Electrical Energy and Computing Architectures Research Group of the Department of Electrical Engineering, Catholic University of Leuven
- Department of Electrical Engineering and Computer Science, University of Liege
- Institute for the Promotion of Innovation by Science and Technology in Flanders
This paper addresses the problem of defining a day-ahead consumption plan for charging a fleet of electric vehicles (EVs), and of following this plan during operation. A key challenge is that the charging flexibility of the EVs is not known beforehand, since it depends on numerous details about each EV (e.g., plug-in times, power limitations, battery size, power curve, etc.). To cope with this challenge, EV charging is controlled during operation by a heuristic scheme, and the resulting charging behavior of the EV fleet is learned using batch-mode reinforcement learning. Based on this learned behavior, a cost-effective day-ahead consumption plan can be defined. In simulation experiments, our approach is benchmarked against a multistage stochastic programming solution, which uses an exact model of each EV's charging flexibility. Results show that our approach finds a day-ahead consumption plan of comparable quality to the benchmark solution, without requiring an exact day-ahead model of each EV's charging flexibility.
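To make the idea concrete, the following is a minimal, self-contained sketch of batch-mode reinforcement learning (fitted Q-iteration with a tabular approximator standing in for a real regressor) applied to a toy single-EV charging problem. The state encoding, price profile, penalty weight, and deterministic dynamics are illustrative assumptions, not the paper's actual fleet model; in the paper, the batch of transitions would instead come from observed fleet behavior under the heuristic controller.

```python
from collections import defaultdict

HORIZON = 24   # hours in the day-ahead horizon (assumed)
BATTERY = 6    # max energy units an EV may need (assumed)
# Illustrative price profile: expensive during the day, cheap at night.
PRICE = [0.2 if 8 <= h < 20 else 0.1 for h in range(HORIZON)]

def build_batch():
    """Enumerate transitions of a toy deterministic charging simulator.
    State = (hour, energy still needed); action = charge 0 or 1 unit."""
    batch = []
    for h in range(HORIZON):
        for need in range(BATTERY + 1):
            for a in (0, 1):
                charged = min(a, need)
                nxt = need - charged
                r = -PRICE[h] * charged          # pay the hourly price for energy
                done = (h == HORIZON - 1)
                if done:
                    r -= 10.0 * nxt              # penalty for unmet demand at departure
                batch.append(((h, need), a, r, (h + 1, nxt), done))
    return batch

def fitted_q_iteration(batch, n_actions=2, n_iter=HORIZON):
    """Batch-mode RL: repeatedly regress Q-targets on a fixed batch of
    transitions. Averaging per (state, action) plays the role of the
    function approximator used in practice."""
    Q = {}
    for _ in range(n_iter):
        targets = defaultdict(list)
        for s, a, r, s2, done in batch:
            t = r if done else r + max(Q.get((s2, b), 0.0) for b in range(n_actions))
            targets[(s, a)].append(t)
        Q = {k: sum(v) / len(v) for k, v in targets.items()}
    return Q

def day_ahead_plan(Q, need, n_actions=2):
    """Roll the greedy policy forward to obtain a planned hourly consumption profile."""
    plan = []
    for h in range(HORIZON):
        a = max(range(n_actions), key=lambda b: Q.get(((h, need), b), 0.0))
        charged = min(a, need)
        plan.append(charged)
        need -= charged
    return plan
```

With this toy setup, `day_ahead_plan(fitted_q_iteration(build_batch()), need=4)` yields a 24-hour profile that delivers all required energy while placing the charging in cheap hours, which is the kind of cost-effective day-ahead plan the paper derives from the learned fleet behavior.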