Online Data-Driven Energy Management of a Hybrid Electric Vehicle Using Model-Based Q-Learning
HEEYUN LEE1, CHANGBEOM KANG, YEONG-IL PARK, NAMWOOK KIM#, SUK WON CHA#
Abstract: The energy management strategy of a hybrid electric vehicle directly determines the fuel economy of the vehicle. As a supervisory control strategy that divides the required power between the vehicle's power sources, the engine and the battery, energy management has been studied extensively using rule-based and optimization-based approaches. Recently, studies using various machine learning techniques have also been conducted. In this paper, a novel control framework implementing model-based Q-learning is developed for the optimal control problem of hybrid electric vehicles. As an online energy management strategy, the proposed approach learns the characteristics of the current driving environment and adaptively changes the control policy through learning. In particular, the proposed algorithm separates the internal powertrain environment from the external driving environment so that each can be learned within the reinforcement learning framework, which results in a simpler and more intuitive control strategy that can be explained using the vehicle state approximation model. The proposed algorithm is tested and verified through simulations, and the results show near-optimal performance. The simulation results are compared with conventional rule-based strategies and with optimal control solutions obtained from dynamic programming.
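To make the Q-learning framing of energy management concrete, the sketch below shows a minimal tabular Q-learning loop for a toy power-split decision. This is an illustrative assumption, not the paper's model-based algorithm: the discretization, the toy transition/fuel model in `step`, and all reward terms are hypothetical placeholders introduced only to show the standard temporal-difference update that the paper's method builds on.

```python
import numpy as np

# Illustrative sketch only: a plain tabular Q-learning update for a toy
# engine/battery power-split problem. The environment model here is a
# hypothetical placeholder, not the paper's powertrain model.

rng = np.random.default_rng(0)

n_soc_bins = 5        # discretized battery state of charge (toy)
n_demand_bins = 4     # discretized driver power demand (toy)
n_actions = 3         # engine power-split levels: low / medium / high

Q = np.zeros((n_soc_bins, n_demand_bins, n_actions))
alpha, gamma, eps = 0.1, 0.95, 0.1

def step(soc, demand, action):
    """Hypothetical dynamics: more engine use burns fuel but preserves SOC."""
    fuel_cost = action                                # engine-heavy -> more fuel
    soc_next = int(np.clip(soc + (action - 1), 0, n_soc_bins - 1))
    soc_penalty = 2 if soc_next == 0 else 0           # penalize battery depletion
    demand_next = int(rng.integers(n_demand_bins))    # random external demand
    return soc_next, demand_next, -(fuel_cost + soc_penalty)

soc, demand = 2, 1
for _ in range(5000):
    if rng.random() < eps:
        a = int(rng.integers(n_actions))      # explore
    else:
        a = int(np.argmax(Q[soc, demand]))    # exploit current estimate
    soc2, demand2, r = step(soc, demand, a)
    # Standard Q-learning temporal-difference update
    td_target = r + gamma * Q[soc2, demand2].max()
    Q[soc, demand, a] += alpha * (td_target - Q[soc, demand, a])
    soc, demand = soc2, demand2
```

In a model-based variant such as the one the paper proposes, the learned internal powertrain model and external driving-environment model would replace the hand-written `step` function above.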