Researchers at the University of California, Riverside have demonstrated an energy management system (EMS) that learns from historical driving cycles how to combine the power streams from an electric motor and an internal-combustion engine in the most energy-efficient way.
In “Data-Driven Reinforcement Learning-Based Real-Time Energy Management System for Plug-In Hybrid Electric Vehicles,” published in Transportation Research Record, Xuewei Qi, Guoyuan Wu and colleagues explain how their EMS can improve the efficiency of PHEVs by almost 12% compared to the standard binary mode control strategy.
Most PHEVs start in all-electric mode, running on electricity until their battery pack is depleted and then switching to hybrid mode. This binary mode EMS strategy is easy to apply, but isn’t the most efficient way to combine the two power sources.
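The binary mode strategy amounts to a simple state-of-charge threshold. The sketch below is illustrative only; the threshold and power figures are hypothetical, not taken from the paper.

```python
# Illustrative binary (charge-depleting / charge-sustaining) control.
# The SOC threshold and power values are hypothetical assumptions.

SOC_THRESHOLD = 0.25  # assumed minimum state of charge before switching modes

def binary_mode_power_split(soc, power_demand_kw):
    """Return (battery_kw, engine_kw) under a simple binary strategy."""
    if soc > SOC_THRESHOLD:
        # All-electric (charge-depleting) mode: battery supplies everything.
        return power_demand_kw, 0.0
    # Hybrid (charge-sustaining) mode: engine supplies the demand.
    return 0.0, power_demand_kw

print(binary_mode_power_split(0.80, 30.0))  # battery covers the full demand
print(binary_mode_power_split(0.20, 30.0))  # engine covers the full demand
```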
In lab tests, blended discharge strategies, in which power from the battery is used throughout the trip, have proven to be more efficient, but haven’t been practical for real-world applications. “Blended discharge strategies have the ability to be extremely energy-efficient, but those proposed previously require upfront knowledge about the nature of the trip, road conditions and traffic information, which in reality is almost impossible to provide,” said Xuewei Qi.
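A common way to express a blended strategy is to deplete the battery gradually along a target trajectory, which makes the dependence on upfront trip knowledge explicit: the split cannot be computed without knowing the total trip length in advance. The function names, SOC window, and power shares below are hypothetical, chosen only to illustrate that dependence.

```python
# Illustrative blended-discharge sketch. Note that it needs the total trip
# length up front -- exactly the information that is hard to provide in
# practice. All numbers here are hypothetical assumptions.

SOC_START, SOC_MIN = 0.90, 0.25  # assumed usable state-of-charge window

def blended_power_split(distance_km, trip_length_km, soc, power_demand_kw):
    """Split demand so SOC reaches SOC_MIN right at the end of the trip."""
    # Linear SOC target along the route; requires trip_length_km in advance.
    soc_target = SOC_START - (SOC_START - SOC_MIN) * distance_km / trip_length_km
    # Lean on the battery when ahead of the target, on the engine when behind.
    battery_share = 1.0 if soc > soc_target else 0.3  # hypothetical shares
    battery_kw = battery_share * power_demand_kw
    return battery_kw, power_demand_kw - battery_kw

# Halfway through a 20 km trip with SOC above the linear target:
print(blended_power_split(10.0, 20.0, 0.60, 30.0))  # battery supplies all 30 kW
```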
The new EMS is based on a machine learning technique called reinforcement learning. Unlike earlier blended strategies, it does not require trip-related information in advance; instead, it gathers the data it needs in real time using onboard sensors and communications devices.
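To give a sense of how a reinforcement-learning energy manager operates, here is a minimal tabular Q-learning sketch. The state, action, and reward definitions are simplified assumptions for illustration; the paper's actual formulation is not reproduced here.

```python
# Minimal tabular Q-learning sketch of an RL energy manager.
# State = (SOC bucket, power-demand bucket); action = engine share of demand;
# reward = negative energy cost of the step. All hypothetical assumptions.
import random

ACTIONS = [0.0, 0.5, 1.0]          # fraction of demand taken by the engine
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1  # learning rate, discount, exploration rate
Q = {}                             # Q[state] -> list of values, one per action

def q_values(state):
    return Q.setdefault(state, [0.0] * len(ACTIONS))

def choose_action(state):
    """Epsilon-greedy choice over the engine-share actions."""
    if random.random() < EPS:
        return random.randrange(len(ACTIONS))  # explore
    qs = q_values(state)
    return qs.index(max(qs))                   # exploit

def update(state, action, reward, next_state):
    """One Q-learning step after observing the step's energy cost."""
    qs = q_values(state)
    best_next = max(q_values(next_state))
    qs[action] += ALPHA * (reward + GAMMA * best_next - qs[action])
```

Because the table is updated after every observed transition, the controller's estimates keep improving as more driving data accumulates, which matches the "gets smarter the more it's used" behavior described below.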
In comparison tests on a 20-mile commute in Southern California, the UCR team's EMS outperformed binary mode systems, with average fuel savings of 11.9%. The system gets smarter the more it is used, and it is neither model- nor driver-specific: it can be applied to any PHEV driven by any individual.
“In our reinforcement learning system, the vehicle learns everything it needs to be energy-efficient based on historical data,” said Xuewei Qi. “As more data are gathered and evaluated, the system becomes better at making decisions.”
“Our current findings have shown how individual vehicles can learn from their historical driving behavior to operate in an energy-efficient manner,” said Xuewei Qi. “The next step is to extend the proposed model to a cloud-based vehicle network where vehicles learn not only from their own history but also from each other.”