Battery cooling

Battery thermal runaway CFD simulation with Simcenter STAR-CCM+ illustrating gas diffusion details from cylindrical cells
(Image courtesy of Siemens Simcenter)

Cool charging

Peter Donaldson examines the complexities of fluid-based thermal management in high-energy battery systems

As batteries pack more energy into a given mass and volume while charging speeds increase, engineers face growing challenges in managing heat dissipation efficiently and safely. In wrestling with this problem, they are reshaping thermal management strategies while navigating the complex considerations involved in selecting cooling media and distribution systems, and balancing the thermal, electrical and mechanical properties of materials.

Modern batteries generate significantly higher heat loads than did their predecessors of just a few years ago. The shift toward larger cells has reduced their surface-area-to-volume ratios, which makes heat rejection during fast charging more difficult.

To address this problem, some manufacturers are adopting immersion cooling techniques using dielectric fluids. These fluids enable direct contact with cell surfaces, allowing heat to be removed more efficiently while minimising temperature gradients within cells and across the pack.

Another innovative solution now entering production is the use of flexible cooling ribbons that conform to cell geometries, eliminating gap fillers, adhesives and thermal interface materials. Light and slim, they increase energy density and reduce the amount of coolant needed, while enlarging the area in contact with cylindrical cells thanks to a larger wrapping angle. This improves thermal management in fast charging and in extreme ambient temperatures.

The constant push for higher efficiency and reduced weight has driven innovation in thermal interface materials, including advanced adhesives and gap fillers. These materials play a vital role in bonding cells and conducting heat away from critical areas. New cell configurations and substrate materials require continuous adaptation of these solutions, leading to continuous improvement and innovation.

Another major trend is the integration of cooling systems directly into battery packs, moving away from external cooling plates. This shift has spurred the development of specialised dielectric and low-conductivity coolants that can safely interact with electrical components.

Faster charging demands systems that can handle high peak thermal loads during rapid charging while avoiding excessive cooling in normal operating conditions. This has given rise to advanced cooling architectures, including multi-functional heat exchangers and predictive control algorithms.

Safety is a perennial priority, with stricter regulations pushing for better thermal runaway mitigation. New simulation tools are being employed to model heat propagation and optimise cooling strategies, particularly for lithium iron phosphate, future chemistries and solid-state designs, which exhibit varied heat generation profiles.

 
Betamate TC thermal conductive adhesives are used for cell-to-cell bonding and other applications for battery pack assembly
(Image courtesy of DuPont)

Fluid choices

The choice of cooling fluid is a critical decision that balances thermal performance, system complexity, cost and safety. Several options are available, each with distinct advantages and limitations.

Water–ethylene glycol (WEG) solutions have become the industry standard for many applications, and are familiar to engineers from internal combustion applications. They offer excellent heat transfer properties, are cost-effective and work well in closed-loop systems. Weakly conductive or dielectric WEG variants reduce electrical risk in case of leaks, making them a safer option for battery packs.

Immersion cooling using dielectric oils provides superior thermal performance by enabling direct contact with battery cells. This approach can simplify pack design by eliminating the need for separate cooling plates or channels. While the higher cost and integration challenges have so far limited its adoption to high-performance or specialised applications, this could soon change.

Two-phase cooling systems using refrigerants offer rapid heat extraction, making them ideal for fast charging. However, they are more complex, requiring additional components like condensers and evaporators. Additionally, flammability concerns and regulatory restrictions on certain refrigerants further complicate their use.

Cold plates must be compatible with a wide range of fluids, from water–glycol mixtures to dielectric fluids, while ensuring corrosion resistance and pressure containment
(Image courtesy of Baknor)

Calculating heat flux

Determining the maximum heat flux a cooling system must handle begins with understanding cell behaviour under extreme conditions. Fast-charging events typically represent the most thermally demanding scenarios, requiring systems to dissipate substantial heat loads while maintaining safe cell operating temperatures. Engineers employ multi-scale modelling approaches, starting with electrochemical analysis of individual cells and scaling up to full pack simulations.

Modern thermal analysis combines several critical methodologies. Electrochemical-thermal models quantify heat generation from multiple sources including Joule heating and entropy changes during charge/discharge cycles. These models account for how internal resistance varies with both state of charge and temperature – a crucial factor because resistance decreases as the battery heats up (while electrical resistance in busbars and connectors increases with temperature), but the temperature must remain below the thresholds that accelerate degradation. Conjugate heat transfer analysis is used in the modelling of the complex interaction between solid components and cooling fluids, predicting interface heat fluxes with high precision.
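The heat-generation term in these electrochemical-thermal models is commonly approximated with the simplified Bernardi relation: irreversible Joule heating (I²R) plus reversible entropic heat (I·T·dU/dT). A minimal sketch in Python; every parameter value here is illustrative rather than taken from any specific cell:

```python
def cell_heat_generation(current_a, resistance_ohm, temp_k, entropic_coeff_v_per_k):
    """Estimate instantaneous cell heat generation (W) from the simplified
    Bernardi relation: Joule heating plus reversible entropic heat."""
    joule = current_a ** 2 * resistance_ohm                  # I^2 * R, always positive
    entropic = current_a * temp_k * entropic_coeff_v_per_k   # sign varies with chemistry and SoC
    return joule + entropic

# Illustrative 3C fast charge of a hypothetical 5 Ah cylindrical cell:
q = cell_heat_generation(current_a=15.0, resistance_ohm=0.02,
                         temp_k=308.0, entropic_coeff_v_per_k=-0.1e-3)
```

The entropic coefficient can be negative or positive depending on chemistry and state of charge, which is why the reversible term can either add to or partially offset the Joule heating that dominates during fast charging.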

Validation remains essential, with calorimeter testing and infrared thermography used to verify simulation results within 5% error margins. The most advanced workflows now integrate 1D system models of pumps, radiators and HVAC loops with detailed 3D computational fluid dynamics models of systems where the detailed geometry is of specific interest. This combined approach captures not just cell-level heating but also secondary effects such as busbar temperature rise and transient thermal inertia. The computational demands are substantial, driving adoption of GPU-accelerated solvers that provide order-of-magnitude speed improvements over traditional CPU-based systems.

Miba employs a U-flow configuration and parallel routing of individual Flexcooler ribbons to each thermal interface to maintain stable temperatures even among cells in high C-rate charging
(Image courtesy of Miba)

Uniformity and efficiency

Maintaining tight temperature uniformity across cells has emerged as one of the most demanding aspects of battery thermal management. Modern systems target ±2–3 °C differentials, a significant tightening from the ±10 °C tolerances common just a few years ago. This challenge is compounded by fundamental material characteristics – most cell designs exhibit anisotropic thermal conductivity, with cylindrical cells conducting heat more than an order of magnitude more effectively along their axis than radially.

These material properties create significant and complex thermal gradients if not properly managed. Conventional bottom-cooling approaches can inadvertently worsen temperature differentials in cylindrical cell stacks by effectively cooling only one end while heat accumulates at the opposite pole. This has driven development of more sophisticated cooling strategies that account for directional heat transfer characteristics. Consequently, immersion cooling has gained attention for its ability to provide omnidirectional heat extraction, while advanced cold plate designs incorporate microchannels and tailored flow distribution to maintain uniform interface temperatures.
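The anisotropy interacts with geometry: a bottom-cooled cylindrical cell forces heat down a long axial path through a small cross-section, whereas side or immersion cooling uses a short radial path across a much larger can surface. A rough 1-D Fourier-conduction comparison makes the point – all dimensions and conductivities below are illustrative 21700-class assumptions, not measured data:

```python
def conduction_delta_t(q_w, k_w_mk, length_m, area_m2):
    """1-D Fourier conduction temperature drop: dT = q * L / (k * A)."""
    return q_w * length_m / (k_w_mk * area_m2)

# 4 W removed from one cell (illustrative values):
# bottom cooling -> axial path: full cell height, small circular cross-section
axial = conduction_delta_t(4.0, 30.0, 0.070, 3.46e-4)
# side cooling -> radial path: one radius, large cylindrical wall area
radial = conduction_delta_t(4.0, 1.0, 0.0105, 4.62e-3)
```

Despite the far higher axial conductivity, the long path and small cross-section give bottom cooling the larger lumped temperature drop in this sketch, which is consistent with the gradient problem described above.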

Flow path optimisation plays a critical role in achieving temperature uniformity. U-flow configurations with parallel cooling channels, combined with precisely calibrated hydraulic restrictors, help balance mass flow distribution across complex pack geometries. Counter-flow designs minimise outlet temperature rise, while conformable cooling ribbons maintain consistent thermal contact across cell surfaces. These mechanical solutions are increasingly paired with intelligent control systems incorporating real-time thermal sensors and adaptive algorithms that dynamically adjust cooling parameters in response to actual operating conditions.

The energy overhead of thermal management systems represents a significant design consideration, with typical consumption ranging from 0.5% to 10% of the total battery energy depending on system type and operating conditions. Liquid-cooled systems generally achieve the lowest parasitic losses, with well-designed implementations consuming less than 0.5% of energy throughput for pumping power alone. Air-cooled systems, while simpler in concept, typically require several percent of the total energy owing to air’s low heat capacity and the high volumetric flow rates needed for effective cooling.
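The sub-0.5% figure for pumping losses can be sanity-checked with the basic hydraulic power relation P = ΔP·Q/η. The loop pressure drop, flow rate, pump efficiency and average traction draw below are all illustrative assumptions:

```python
def pump_power_w(delta_p_pa, flow_m3_s, efficiency):
    """Electrical power needed to drive a liquid cooling loop:
    hydraulic power (pressure drop x volumetric flow) over pump efficiency."""
    return delta_p_pa * flow_m3_s / efficiency

flow = 10 / 1000 / 60                 # 10 L/min expressed in m^3/s
p = pump_power_w(30e3, flow, 0.5)     # 30 kPa loop, 50%-efficient pump -> 10 W
fraction = p / 20e3                   # share of an assumed 20 kW average traction draw
```

Even with these conservative assumptions the pumping share is well under 0.5% of throughput, which is why liquid systems dominate where parasitic losses matter.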

Several factors influence overall energy consumption. Pressure drop in liquid systems dominates pumping power requirements, driving innovation in low-loss channel geometries and hydraulic optimisation. Fluid selection also plays a role, with low-viscosity dielectric fluids gaining favour for their ability to reduce pumping losses while maintaining excellent thermal performance. At the system level, variable-speed pumps and smart control strategies that match cooling capacity to immediate thermal loads can reduce energy consumption by 15–25% compared with fixed-flow designs.

Extreme conditions present additional challenges. Fast charging in hot climates may temporarily increase cooling energy demands beyond 10% of pack capacity when active refrigeration is required. Cold weather operation introduces complementary challenges, with battery heating requirements sometimes exceeding cooling loads. These scenarios have spurred development of integrated thermal management systems that share resources between battery cooling and vehicle HVAC systems, as well as increased use of passive thermal storage elements to smooth peak demands.

The field continues to evolve rapidly in response to three competing imperatives: handling increasing heat fluxes from 3C+ charging, minimising parasitic losses to meet efficiency targets, and maintaining safety margins throughout cell lifespan. Recent innovations include novel dielectric fluid formulations that optimise the balance between thermal performance and electrical safety, along with additive-manufactured cooling structures featuring optimised flow paths impossible to produce with conventional methods.

Digital twin approaches are gaining traction for their ability to support real-time thermal optimisation, while advanced control algorithms increasingly incorporate predictive capabilities that anticipate thermal transients rather than simply reacting to them.

To minimise energy consumption through pumping losses, simulation in cold plate design is used to optimise flow velocity, pressure and flow dynamics for efficient, effective cooling
(Image courtesy of Baknor)

Extreme cases

Designing thermal management systems that perform reliably across temperature extremes ranging from −30 to +50 °C requires addressing fundamentally different challenges at each end of the spectrum. In cold climates, the primary obstacle involves maintaining cell temperatures above critical thresholds that enable charging and discharge. Modern systems employ several strategies to combat sub-zero conditions, beginning with preconditioning protocols that heat batteries before charging commences. Lack of adequate pre-heating can double charging times, so optimising the process is crucial.

The thermal path from heat source to battery cells presents multiple barriers in conventional systems. Traditional water–glycol cooling architectures require heat to traverse several interfaces: throughout the body of the cell, across thermal interface materials and through the cooling plate.

This tortuous path reduces efficiency and slows response times. Dielectric fluids in immersion cooling systems offer distinct advantages here, providing direct omnidirectional heat transfer to cell surfaces while requiring less energy to raise fluid temperatures compared with water–glycol mixtures owing to their lower specific heat capacity.

At the opposite extreme, high ambient temperatures demand robust cooling capacity to handle peak heat loads in 50 °C-plus environments. System designers employ various techniques including two-phase cooling with refrigerants, radiative heat shields to reduce ambient heat ingress and advanced control algorithms that dynamically adjust cooling parameters. Material selection becomes critical at both extremes – polymers for seals and gaskets must maintain mechanical properties across the full temperature range, while metals in cold plates require careful thermal stress analysis to prevent fatigue failure during repeated cycling.

Kautex’s cell-to-pack demonstrator shows a structural cell holder, integrated emergency venting, two-phase immersion cooling and flexible foil cooler options for its Pentatonic battery packs
(Image courtesy of the author)

Runaway mitigation strategies

As battery packs grow in terms of both size and energy density, preventing thermal runaway propagation has become a paramount safety concern. Multiple protection strategies have emerged, each with distinct mechanisms and implementation challenges. Immersion cooling has demonstrated particular effectiveness in testing scenarios, where dielectric fluids have successfully contained thermal events originating from deliberately punctured cells. The fluid’s ability to rapidly absorb and distribute heat, combined with proper venting system design, can prevent cascade failures even when multiple cells are driven into thermal runaway simultaneously.

Material science plays an equally critical role in runaway prevention. Phase-change materials (PCMs) between cells can absorb large amounts of heat via latent heat storage, while specialised thermal barriers using aerogels or mica sheets provide insulation against radiant and conductive heat transfer. Some systems leverage the properties of water–glycol mixtures themselves – when exposed to extreme heat, the water component vaporises, absorbing significant thermal energy through phase change.

Venting system design represents another crucial consideration. Effective thermal runaway mitigation requires careful routing of vent gases away from unaffected cells, often through dedicated manifolds or channels. This becomes particularly challenging in immersion systems, where particular care must be taken to ensure vent paths avoid displacing large volumes of dielectric fluid near the cells. Modern designs incorporate pressure relief valves and burst discs calibrated to specific failure modes, with geometries optimised to maintain fluid contact with cells even during gas expulsion events.

Fail-safe design

Robust thermal management systems incorporate multiple layers of protection to maintain safety when primary cooling functions are compromised. At the most basic level, comprehensive sensor networks monitor temperatures throughout the pack, fluid flow rates and pump performance. These feed into battery management systems (BMSs) capable of detecting incipient failures through subtle changes in system parameters. When anomalies are detected, the BMS can initiate graduated responses ranging from power derating to complete system shutdown, depending on severity.

The thermal mass of battery systems themselves provides inherent short-term protection. Even with failed cooling, most packs can absorb several minutes of heat generation before reaching critical temperatures. This buffer period allows time for protective measures to engage, although the duration varies significantly with pack design and operating conditions. Some implementations augment this inherent capacity with passive thermal buffers – PCMs or heat sinks that extend the window for safe shutdown or limited ‘limp home’ operation.
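The length of that buffer period can be bounded with the same lumped heat-capacity reasoning: time-to-threshold is the absorbable energy m·c·ΔT divided by the heat generation rate. All numbers below are hypothetical and serve only to illustrate the order of magnitude:

```python
def buffer_time_s(pack_mass_kg, c_j_per_kgk, headroom_k, heat_gen_w):
    """Time for a pack with failed cooling to absorb its own heat before
    reaching a critical temperature, assuming uniform lumped thermal mass."""
    return pack_mass_kg * c_j_per_kgk * headroom_k / heat_gen_w

# Assumed 400 kg pack, 900 J/(kg K), 15 K of headroom, 10 kW heat generation:
t = buffer_time_s(400, 900, 15, 10e3)   # seconds
```

At an assumed 10 kW of sustained heat generation this gives roughly nine minutes of grace – consistent with the "several minutes" window cited above, and with why the duration shrinks sharply under fast-charge heat loads.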

Redundancy approaches differ substantially by application. Automotive systems typically prioritise cost and weight savings over full redundancy, instead relying on robust single-channel designs proven to meet safety standards even with complete cooling failure. In contrast, aerospace applications such as eVTOL aircraft often incorporate fully redundant cooling circuits and pumps, reflecting the catastrophic consequences of total system failure in flight. Across all implementations, the trend toward digital twin technology enables more sophisticated failure prediction, with real-time system models comparing actual performance against expected parameters to identify degradation before it causes a failure.

Valeo’s modular thermal management approach encompasses refrigerant plate, coolant plate and immersion cooling systems, all exploiting heat pumps
(Image courtesy of Valeo)

Regulation and simulation

Battery developers face increasingly stringent global safety standards including UN ECE R100, ISO 6469, and China’s forthcoming GB38031-2025 rules. These standards mandate rigorous thermal runaway mitigation, vibration resistance and electrical safety protocols that fundamentally influence cooling system design. Compliance verification now requires extensive simulation and physical testing, adding approximately 10–15% to development timelines because engineers must demonstrate system resilience under worst-case scenarios.

Advanced simulation tools have become indispensable for navigating these requirements. Engineers employ multi-physics modelling to replicate thermal shock tests, mechanical abuse conditions and propagation scenarios specified in the standards. The most sophisticated workflows combine computational fluid dynamics for thermal analysis with structural simulations to evaluate mechanical integrity. These virtual verification frameworks allow designers to explore multiple protection strategies before committing to physical prototypes, significantly reducing compliance certification costs.

The regulatory push has particularly impacted thermal runaway prevention methodologies. Where early standards focused primarily on electrical safety, newer iterations such as GB38031-2025 demand complete prevention of fires or explosions – a requirement that favours immersion cooling solutions with their proven ability to contain thermal events. This has accelerated development of dielectric fluids that combine excellent heat transfer with intrinsic electrical insulation properties, creating systems that address both performance and safety requirements simultaneously.

Condensation considerations

While modern battery systems predominantly use liquid cooling, condensation remains a critical design consideration – particularly for applications where air exposure cannot be completely avoided. In humid environments, uncontrolled moisture accumulation can lead to electrical shorts, corrosion and reduced thermal performance. Traditional air-cooled systems faced these challenges acutely, requiring active dehumidification strategies that often compromised energy efficiency.

Contemporary liquid-cooled systems address moisture intrusion through multiple barriers. Sealed enclosures with IP67 or higher ratings prevent ambient air ingress, while specialised dielectric fluids exhibit hydrophobic properties that prevent water absorption to maintain the fluid’s dielectric properties. These fluids thermodynamically destabilise water mixtures, causing moisture to separate and collect in designated drainage points rather than dispersing through the cooling circuit. Material selection plays an equally important role, with anodised aluminium surfaces and hydrophobic coatings reducing condensation adhesion even in hybrid cooling systems that incorporate some air exposure.

For applications where complete sealing isn’t feasible, engineers have developed predictive strategies using real-time humidity sensors coupled with adaptive fan control algorithms. These systems maintain surface temperatures slightly above dew point through precise modulation of cooling intensity, preventing condensation formation while minimising energy expenditure. The approach demonstrates how modern thermal management increasingly relies on integrated hardware and software solutions to address environmental challenges.
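The dew-point threshold such a controller must respect is commonly computed with the Magnus approximation from ambient temperature and relative humidity. A minimal sketch – the 1 K margin is an illustrative assumption, not a cited design rule:

```python
import math

def dew_point_c(temp_c, rh_percent):
    """Magnus approximation of the dew point. Cooled surfaces held above
    this temperature will not accumulate condensation."""
    b, c = 17.62, 243.12  # Magnus coefficients for water over -45..60 C
    gamma = math.log(rh_percent / 100.0) + b * temp_c / (c + temp_c)
    return c * gamma / (b - gamma)

td = dew_point_c(25.0, 60.0)    # ~16.7 C at 25 C / 60% RH
surface_setpoint = td + 1.0     # hold surfaces 1 K above dew point (assumed margin)
```

In practice the humidity sensor feed updates `td` continuously, and the cooling intensity is modulated so the coldest monitored surface never drops below `surface_setpoint`.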

Weight optimisation

Cooling system mass represents a significant penalty in battery pack design, typically adding 0.5–5 kg per kWh of capacity depending on architecture and materials. This weight directly impacts vehicle range and performance, driving intensive efforts to minimise cooling system mass without compromising thermal performance. Consequently, several innovative approaches have emerged.

Material science breakthroughs have enabled thinner, stronger cooling components. High-conductivity, high-strength aluminium alloys now allow cold plate thickness reductions of up to 30% compared with that of conventional designs, while maintaining structural integrity and heat transfer capability. Some systems eliminate traditional thermal interface materials entirely, instead using direct cooling surfaces that conform to cell geometries – an approach that both saves weight and lowers thermal resistance.

System architecture plays an equally important role in weight reduction. Microchannel designs optimise fluid flow paths to minimise required coolant volumes, while integrated structural cooling components serve dual purposes as both heat exchangers and load-bearing elements. The most advanced implementations achieve cell spacing as tight as 1 mm through precision cooling channel routing, maximising energy density without creating thermal bottlenecks. These compact designs demonstrate how holistic engineering can reconcile the seemingly competing objectives of thermal performance and mass efficiency.

Pressure drop optimisation

Hydraulic efficiency represents a critical but often overlooked aspect of cooling system performance. Excessive pressure drop increases pump energy consumption and can lead to uneven cooling distribution across large battery packs. Modern systems employ several strategies to minimise flow resistance while maintaining thermal transfer effectiveness.

Channel geometry optimisation stands as the primary tool for pressure drop reduction. Computational fluid dynamics enables designers to develop smooth, curved flow paths and exploit turbulence to enhance heat transfer and minimise pressure losses – turbulent boundary layers are less likely to separate in adverse pressure gradients. Parallel channel architectures distribute flow more evenly than traditional serial designs, reducing local velocity peaks that drive pressure losses. Some implementations achieve 20–30% reductions in hydraulic resistance through topology-optimised channel layouts that balance thermal and fluid dynamic requirements.
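The pressure drop those layouts are minimising is, for a single channel, the familiar Darcy–Weisbach relation. A sketch with a laminar friction factor and a Blasius correlation for the turbulent case; the fluid properties and channel dimensions are illustrative stand-ins for a water–glycol cold-plate channel:

```python
def channel_pressure_drop_pa(rho, mu, diameter_m, length_m, velocity_m_s):
    """Darcy-Weisbach pressure drop for a circular channel.
    Uses f = 64/Re in laminar flow and the Blasius correlation above
    Re ~ 2300 (a common smooth-pipe approximation)."""
    re = rho * velocity_m_s * diameter_m / mu
    f = 64.0 / re if re < 2300 else 0.316 * re ** -0.25
    return f * (length_m / diameter_m) * rho * velocity_m_s ** 2 / 2.0

# Assumed glycol mix: rho 1060 kg/m^3, mu 3 mPa s; 3 mm channel, 1 m long, 0.5 m/s:
dp = channel_pressure_drop_pa(1060, 3e-3, 0.003, 1.0, 0.5)
```

Because the laminar drop scales linearly with velocity while turbulent drop scales roughly with velocity to the 1.75 power, parallel architectures that halve per-channel velocity pay off disproportionately in pumping power.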

Fluid selection complements these geometric improvements. Low-viscosity dielectric oils can reduce pumping power by 10–15% compared with conventional water–glycol mixtures while maintaining comparable heat transfer characteristics. When paired with variable-speed pumps that precisely match flow to thermal demand, these fluids enable substantial energy savings across the operating envelope.

Integrated cooling

The trend toward integrating battery cooling with vehicle HVAC systems represents one of the most significant efficiency opportunities in modern EV design. Combined systems can reduce total thermal management energy consumption by 10–15% while decreasing component count and weight.

Shared refrigerant loops employ the HVAC system’s existing chiller capacity to cool batteries during high-load conditions. This approach eliminates redundant components while providing access to the HVAC system’s superior temperature control capabilities. Reversible heat pump configurations take integration further, allowing waste heat from one system to benefit another – such as using battery heat to warm the cabin in cold conditions. These configurations require sophisticated control algorithms to dynamically balance competing thermal demands, but can significantly improve overall vehicle efficiency.

The integration challenge extends beyond just fluid circuits. Advanced thermal storage systems using PCMs can temporarily absorb heat peaks, smoothing demands on shared cooling resources. Predictive thermal management algorithms anticipate driving and charging patterns to precondition systems optimally. Together, these technologies demonstrate how breaking down traditional thermal silos can yield substantial performance benefits across the entire vehicle system.

Maintaining energy density

Every gram and cubic centimetre dedicated to cooling represents lost potential for energy storage, making thermal system compactness crucial for maximising battery pack energy density. Modern cooling approaches typically reduce gravimetric energy density (Wh/kg) by 5–10% and volumetric density (Wh/L) by 6–10%, although advanced designs can mitigate these impacts significantly.
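The gravimetric penalty is straightforward to express: it is simply the extra cooling mass diluting a fixed energy content. The pack energy and masses below are hypothetical round numbers chosen to land inside the quoted 5–10% band:

```python
def gravimetric_penalty(pack_energy_wh, base_mass_kg, cooling_mass_kg):
    """Wh/kg before and after adding cooling hardware, plus the
    fractional energy-density loss the added mass causes."""
    before = pack_energy_wh / base_mass_kg
    after = pack_energy_wh / (base_mass_kg + cooling_mass_kg)
    return before, after, 1 - after / before

# Assumed 100 kWh of cells massing 400 kg, plus 30 kg of cooling hardware:
before, after, loss = gravimatric = gravimetric_penalty(100e3, 400, 30)
```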

The most effective strategies focus on eliminating non-functional mass and volume. Conformal cooling systems that integrate directly with cell geometries avoid the empty space inherent in traditional cooling plate arrangements. Some designs incorporate cooling channels into cell casings themselves, achieving thermal transfer without any separate cooling component volume. Material selection plays an equally important role, with composite materials and thin-wall metals reducing structural mass while maintaining performance.

Developing immersion cooling approaches may fundamentally change the energy density equation. By eliminating inter-cell spacing materials and leveraging the coolant itself as both thermal transfer medium and electrical insulator, these systems can achieve packaging factors approaching 90% – a significant improvement over conventional designs.

Tough duty

Off-highway EVs and industrial applications present unique thermal management challenges that go beyond typical automotive requirements. The combination of dust, water, vibration and extreme temperature swings demands robust cooling solutions with enhanced protection features. Modern systems address these challenges through multiple layers of protection, beginning with sealed enclosures rated to IP67 or higher standards that prevent particulate and moisture ingress.

The most advanced implementations employ specialised hydrophobic coatings on critical components to repel water and prevent corrosion, while high-efficiency particulate filters maintain cooling performance in dusty conditions. Some designs incorporate self-cleaning mechanisms using periodic reverse airflow pulses to clear accumulated debris from heat exchangers. Again, material selection plays a crucial role, with corrosion-resistant aluminium alloys and polymer composites replacing conventional materials in harsh environments.

It’s just a phase

PCMs continue to attract research interest for their potential to smooth thermal transients and reduce peak cooling loads. Current exploration focuses on two primary categories: paraffin-based organic PCMs offering high latent heat capacity (150 J/g), and advanced bio-derived materials.

However, the limited temperature operating range of PCMs presents a fundamental challenge. These materials provide their full thermal benefit only during phase transition, becoming much less effective once fully melted or vaporised. This narrow window of operation requires precise system tuning to match the expected thermal load profile. Weight efficiency remains another concern because the mass of PCM required to meaningfully impact battery temperatures often offsets the energy density gains.

Two-phase cooling systems using specialised refrigerants show more immediate potential for high-heat-flux applications. Modern low-global-warming-potential fluids like R1234yf can handle heat fluxes of 5–10 W/cm² while maintaining stable performance across wide temperature ranges. However, these systems introduce their own complexities, including potential material compatibility issues and the need for robust containment systems to handle pressure variations. Furthermore, maintaining temperature uniformity in two-phase systems is challenging because once the fluid has fully vaporised it can no longer absorb heat through phase change.

Two directions

Vehicle-to-grid and vehicle-to-load capabilities are creating new thermal management challenges as batteries cycle more frequently between charging and discharging states. While most residential bidirectional systems operate at relatively modest 7–11 kW levels, the cumulative effect of repeated thermal cycling can accelerate battery degradation if not properly managed.

The key challenge lies in maintaining tight temperature control during these bidirectional power flows, particularly in extreme ambient conditions. Effective thermal management can extend the operational envelope for vehicle-to-everything applications, allowing battery systems to participate in grid services even during cold weather when the economic value may be highest. Some implementations now incorporate predictive preconditioning algorithms that warm batteries to optimal temperatures in anticipation of scheduled discharge events.

Thermal system durability also becomes more critical in bidirectional applications. Components such as pumps and valves must withstand significantly increased duty cycles compared with conventional single-direction charging scenarios. This has driven development of more robust cooling system components with enhanced wear resistance and maintenance-free operation targets.

Cool intelligence

Artificial intelligence (AI) and machine learning are transforming battery thermal management from reactive to predictive systems. Modern implementations use AI in several key ways: forecasting thermal loads based on driving patterns and charging history, optimising coolant flow in real time to minimise energy consumption and detecting early signs of thermal anomalies.

The most advanced systems combine reduced-order physical models with machine learning trained on extensive operational data. This hybrid approach maintains the interpretability of physics-based models while exploiting AI’s pattern recognition capabilities for subtle thermal behaviours. Some implementations have demonstrated 15–20% reductions in cooling energy use.

Emerging ‘smart virtual sensing’ techniques show particular promise, using AI to infer internal battery temperatures from easily measurable external parameters. This could eventually reduce or eliminate the need for extensive internal sensor arrays while maintaining or improving thermal monitoring accuracy.
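At its simplest, a virtual core-temperature sensor is a thermal model inverted around measurable quantities. The sketch below uses the most basic static form – a two-node resistance model where the core runs hotter than the can by Q·R_thermal; real implementations use dynamic, trained or Kalman-filtered models, and the resistance value here is an illustrative assumption:

```python
def estimate_core_temp(t_surface_c, heat_gen_w, r_core_surface_k_per_w=0.5):
    """Static two-node estimate of internal cell temperature from a surface
    measurement and an estimated heat generation rate. The thermal
    resistance between core and can is an assumed illustrative value."""
    return t_surface_c + heat_gen_w * r_core_surface_k_per_w

# Can measured at 35 C while the cell generates an estimated 4 W:
tc = estimate_core_temp(35.0, 4.0)
```

Even this crude form shows why surface measurements alone understate thermal stress during fast charging: the core-to-surface offset grows in direct proportion to heat generation.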

Looking ahead

The coming half-decade will likely see several significant advances in battery thermal management reach commercial maturity. Immersion cooling appears poised for mainstream adoption, with its ability to enable faster charging, improve safety and increase energy density through tighter cell packing. Early implementations suggest potential for 40% faster charging rates and up to 22% longer battery life. The competition between immersion and flexible ribbon cooling will be worth watching.

Solid-state battery commercialisation will drive corresponding innovations in thermal system design. While solid-state cells typically operate at higher temperatures than conventional lithium-ion cells, maintaining inter- and intra-cell temperature limits is still key, and warm-up is important. Their thermal management often demands tighter tolerances, favouring compact cooling systems capable of maintaining temperatures within narrow windows. Some designs may reduce cooling capacity requirements by 30% while maintaining safety margins.

Sustainable materials will play an increasingly prominent role, with bio-based PCMs and recyclable coolant formulations helping meet stringent environmental regulations. The European Union’s evolving sustainability mandates in particular are driving this trend, pushing developers to consider full lifecycle impacts in thermal system design.

Integration will remain a central theme, both at the component level through structural cooling solutions and at the system level through tighter coupling with vehicle HVAC systems. These developments collectively point toward a future where thermal management becomes less of a stand-alone system and more of an intrinsic, optimised property of the pack.

Acknowledgements

The author would like to thank the following for their help with this feature: Michael Ruscigno, co-founder at Baknor; Mark Payne, senior manager for research, and Robert Timmis, expert in thermal management modelling at Castrol (BP); Sergio Grunder, scientist at DuPont; Boris Meisterjahn, senior manager, sales, battery systems, and Gero Mimberg, manager, thermal systems at Kautex Textron; Florian Wiedrich, head of sales, and Franz Poehn, team lead r&d, testing, simulation at Miba; Mike Tomlin, principal engineer, attributes, Scott Porteous, principal engineer, attributes, Temoc Rodriguez, GTE, electrification, and Nicholas Higginson, principal engineer, design & development at Ricardo; Gaetan Bouzard, Simcenter industry lead, battery & heavy equipment at Siemens Digital Industries Software; Francois Bordes, power thermal cluster director at Valeo.

Some suppliers of battery cooling 

Baknor

Boyd Corp       

Castrol 

Dober 

DuPont

Grayson Thermal Systems

Kautex Textron

Miba

Ricardo

Siemens Digital Industries Software

TLX Technologies

Valeo

Voss Automotive

Webasto          
