Battery Degradation Modeling for Vehicle Applications

Date

2014-09-19

Authors

Finley, Thomas Dylan

Publisher

University of Waterloo

Abstract

Fuel efficiency has become a fundamental concern of the automotive industry and a key factor in its impact on the global environment. This is a direct result of the Corporate Average Fuel Economy (CAFE) standards, which impose a 70% improvement in fuel efficiency on all light-duty line-ups between 2014 and 2025. To achieve such an improvement, automotive manufacturers will need to electrify their powertrains. Lithium-ion battery technology has emerged as a leading component of electrification with the development of hybrid, plug-in hybrid, and battery electric vehicles. The design and sizing of these battery packs must therefore be accurate, and correct sizing must account for the lifetime of the battery. In plug-in hybrid and battery electric vehicles, the battery pack is directly responsible for the all-electric range of the vehicle; as the battery ages, this range decreases. Convention has been to size the battery to account for a 20% loss in electric range; however, the degradation rate varies from vehicle to vehicle depending on driver behaviour. The convention can therefore lead to severely oversized battery packs, which decrease operational efficiency and increase vehicle mass and greenhouse gas emissions. It is consequently important to consider realistic driver behaviour when sizing the battery pack.

The A123 AMP20 pouch cell was selected for the degradation analysis, and a semi-empirical single particle battery degradation model was developed for it in MATLAB Simulink. Half-cell coin cells (2 mAh) were built from the AMP20 materials and cycled at C/50 to obtain a close approximation of the electrode open-circuit potentials at various states of lithiation; these potentials were used in the single particle model. Additionally, rate capability tests and degradation cycling were conducted on the AMP20 to fit the single particle model parameters. The LFP particle resistance was empirically fitted and depends on the state of lithiation and on whether the battery is charging or discharging.

A sensitivity analysis of the Tafel equation determined that the parasitic current density is a function of the negative electrode potential, the solid electrolyte interface (SEI) film resistance, and the negative electrode current density. The operational state-of-charge (SOC), the depth-of-discharge (DOD), the history of the battery, and the battery current are the vehicle parameters that impact the parasitic current density. For low-current operation, a change in the SOC yields the largest change in parasitic current density. For high-current operation of a fresh battery, a change in the SEI resistance yields the largest change, while for an aged battery a change in battery current does. The SEI resistance itself did not prove to be a significant factor affecting battery degradation; rather, a long charging time, a high operational SOC, a large DOD, and an aggressive current demand were identified as the primary factors that increase it.
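
As an illustration of how the measured half-cell potentials enter the model, the sketch below interpolates electrode open-circuit potential against state of lithiation and combines the two electrodes into a cell open-circuit voltage. This is a minimal Python sketch with invented placeholder data; the thesis model itself was built in MATLAB Simulink from the measured C/50 curves.

```python
import numpy as np

# Placeholder C/50 half-cell data (illustrative values only): open-circuit
# potential in V vs. Li/Li+ as a function of state of lithiation.
x_neg = np.array([0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0])          # graphite lithiation
u_neg = np.array([1.20, 0.30, 0.16, 0.13, 0.10, 0.085, 0.08])  # steep at low lithiation
y_pos = np.array([0.0, 0.1, 0.5, 0.9, 1.0])                    # LFP lithiation
u_pos = np.array([3.60, 3.45, 3.42, 3.40, 2.80])               # flat LFP plateau

def ocp(stoich, x_table, u_table):
    """Interpolate the open-circuit potential at a given state of lithiation."""
    return np.interp(stoich, x_table, u_table)

def cell_ocv(x, y):
    """Cell open-circuit voltage = positive electrode OCP minus negative OCP."""
    return ocp(y, y_pos, u_pos) - ocp(x, x_neg, u_neg)

print(f"OCV at x = 0.5, y = 0.5: {cell_ocv(0.5, 0.5):.3f} V")  # ~3.29 V
```

Because the LFP plateau is nearly flat, most of the SOC dependence of the cell voltage comes from the graphite curve.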
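
The Tafel sensitivity analysis above centres on a side-reaction current density of the general form j_s = -i_0,s exp(-αF η_s / RT), where the overpotential η_s couples the negative electrode potential, the SEI film resistance, and the electrode current density. The sketch below uses assumed parameter values, not the thesis's fitted ones, to run a crude one-at-a-time sensitivity of j_s to those three inputs.

```python
import math

F = 96485.0       # Faraday constant, C/mol
R = 8.314         # universal gas constant, J/(mol K)
T = 298.15        # temperature, K
ALPHA = 0.5       # cathodic transfer coefficient (assumed)
I0_SIDE = 1.5e-6  # side-reaction exchange current density, A/m^2 (assumed)
U_SIDE = 0.4      # side-reaction equilibrium potential, V (assumed)

def parasitic_current_density(phi_neg, r_sei, j_neg):
    """Tafel-type side-reaction current density at the negative electrode.

    phi_neg : negative electrode potential, V (low at high SOC for graphite)
    r_sei   : SEI film resistance, ohm m^2 (grows with battery history/age)
    j_neg   : negative electrode current density, A/m^2 (negative on charge)
    """
    eta = phi_neg - U_SIDE - j_neg * r_sei  # side-reaction overpotential
    return -I0_SIDE * math.exp(-ALPHA * F * eta / (R * T))

# One-at-a-time sensitivity around a nominal charging operating point.
base = [0.10, 1e-3, -10.0]  # phi_neg [V], r_sei [ohm m^2], j_neg [A/m^2]
j_base = parasitic_current_density(*base)
for i, name in enumerate(["phi_neg", "r_sei", "j_neg"]):
    bumped = list(base)
    bumped[i] *= 1.10  # scale this parameter by 10%
    delta = parasitic_current_density(*bumped) - j_base
    print(f"10% change in {name}: delta j_s = {delta:.3e} A/m^2")
```

In this form, a lower negative electrode potential (i.e., a higher SOC for graphite) drives η_s more negative and grows |j_s| exponentially, consistent with the finding that a high operational SOC increases degradation.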
Simulations with the single particle model were then conducted to assess the degradation rates of common Environmental Protection Agency (EPA) drive cycles. The results showed that the degradation rate depends significantly on the duty cycle: the UDDS cycle degraded the battery at a dramatically faster rate than the US06 and HWFET cycles at every initial SOC tested. The rates, in (µAh Li+)(Ah processed)^-1, were:

Initial SOC    UDDS     US06    HWFET
80%            17.41    3.08    4.64
50%             7.04    1.79    2.14
20%             1.85    0.46    0.54

It was concluded that the operational SOC, the charging time, and the current demand are the primary factors that affect the degradation rate of a duty cycle. Further simulations added a 1C charge after each duty cycle to return the operational SOC to its initial value and thereby account for the duty cycle DOD. Accounting for the duty cycle DOD increased degradation by 47% to 86%, providing evidence that the DOD is an important factor in degradation.

An analysis of battery degradation under realistic driving behaviour was conducted using four sets of real-world driving data from Nissan Leaf drivers, taken from CrossChasm Technologies Inc.'s real-world driving database. Based on the charging time, mean operational SOC, mean DOD, and current demand, it was hypothesized that Driver 1, Driver 3, Driver 2, and Driver 4 would show the highest to lowest degradation over an eight-year period. The simulations of the drivers' duty cycles agreed with the hypothesis, producing 5.51%, 5.17%, 4.16%, and 0.75% capacity fade, respectively. The conclusions drawn from the sensitivity analysis and the EPA study therefore apply to real-world data.

The key finding of this work is that battery degradation depends on the duty cycle. Specifically, the charging time, the operational SOC, the DOD, and the current demand all affect battery degradation, and the degradation rate is ultimately unique to each driver, depending on those factors.
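
As a rough way to relate the quoted rates to capacity fade, the conversion below assumes, as a simplification not taken from the thesis, that each µAh of lithium lost maps one-to-one onto lost capacity and that the rate stays constant with throughput; the nominal 20 Ah capacity matches the AMP20, but the throughput figure is purely illustrative.

```python
def capacity_fade_percent(rate_uah_per_ah, ah_processed, capacity_ah=20.0):
    """Percent capacity fade implied by a lithium-loss degradation rate.

    rate_uah_per_ah : degradation rate, (uAh Li+) per (Ah processed)
    ah_processed    : cumulative charge throughput, Ah
    capacity_ah     : nominal cell capacity, Ah (the AMP20 is nominally 20 Ah)
    """
    lost_ah = rate_uah_per_ah * 1e-6 * ah_processed  # uAh -> Ah of lithium lost
    return 100.0 * lost_ah / capacity_ah

# UDDS at 80% initial SOC, using the rate quoted above.
print(capacity_fade_percent(17.41, 50_000))  # ~4.35% after 50,000 Ah processed
```

Under these assumptions, the UDDS rate at 80% initial SOC would cost roughly 4.4% of capacity per 50,000 Ah processed, nearly six times the US06 figure at the same SOC.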

Keywords

Lithium Ion Battery, Degradation, Modeling, Vehicles, LiFePO4, Battery, Drive Cycle, Duty Cycle
