In a paper entitled “Understanding the limits of rapid charging using instrumented commercial 18650 high-energy Li-ion cells,” published in the journal Electrochimica Acta, the researchers explain that their technique measures internal temperature and electrode potentials in situ during a battery’s normal operation, without affecting its performance. They claim the new technology will enable advances in battery materials science, more flexible charging rates, and improved thermal and electrical engineering of new battery materials and systems.
When a Li-ion battery overheats, it risks severe damage or even catastrophic failure. The electrolyte can break down to form gases that are flammable and cause significant pressure build-up. Overcharging can also drive excessive lithium plating on the anode, forming metallic dendrites that can pierce the separator and cause an internal short circuit, with catastrophic failure following.
To prevent such failures, manufacturers stipulate a maximum charging rate based on estimates of the maximum allowable temperature and electrode potentials. Until now, measuring a battery’s internal temperature has proved impractical without significantly affecting its performance, so manufacturers have had to rely on limited external instrumentation. Because those external readings are imprecise, manufacturers assign very conservative limits on maximum charging speed.
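The conservative limiting described above can be sketched as a simple rate governor that tapers charge current as the estimated cell state approaches a temperature or anode-potential limit. The function name, thresholds, and taper below are illustrative assumptions for the sake of the sketch, not values or logic from the paper:

```python
def allowed_charge_rate_c(temp_c: float, anode_potential_v: float,
                          max_temp_c: float = 45.0,
                          min_anode_potential_v: float = 0.05,
                          max_rate_c: float = 1.0) -> float:
    """Return a permitted charge rate (in C) for an estimated cell state.

    Hypothetical sketch: lithium plating becomes likely as the anode
    potential (vs. Li/Li+) approaches 0 V, and electrolyte breakdown
    accelerates at high temperature, so the allowed rate is tapered
    toward zero as either estimate nears its limit. All thresholds
    are illustrative, not from the Warwick study.
    """
    if temp_c >= max_temp_c or anode_potential_v <= min_anode_potential_v:
        return 0.0  # hard stop at either safety limit
    # Linear tapers: each factor shrinks as its margin to the limit shrinks.
    temp_margin = (max_temp_c - temp_c) / max_temp_c
    potential_margin = min(1.0, (anode_potential_v - min_anode_potential_v) / 0.1)
    return max_rate_c * min(1.0, temp_margin) * potential_margin
```

With only imprecise external readings, the estimates fed to such a governor carry wide error bars, which is why the limits end up so conservative; direct internal sensing would let the same logic run with tighter margins.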
However, the new test developed by the Warwick researchers “allows direct, highly precise internal temperature and per-electrode status monitoring.” This is achieved by using in-situ battery sensing that “employs miniature reference electrodes and Fiber Bragg Gratings (FBG) threaded through a bespoke strain protection layer. An outer skin of fluorinated ethylene propylene (FEP)…