Optimization of pin-fin arrangement in traction inverter cooling systems: A framework based on CFD simulations, deep neural networks and evolutionary algorithms
Luca Donetti; Gaetano Patti; Stefano Mauro; Gaetano Sequenzia
2025-01-01
Abstract
Efficient thermal management is essential for the reliability and performance of traction inverters. However, direct optimization via Computational Fluid Dynamics (CFD) is often impractical due to the high dimensionality of the design space and the high computational cost of each simulation. To overcome this limitation, a surrogate-based optimization framework is developed to enhance the thermal and hydraulic performance of an automotive traction inverter cooling system. The methodology integrates CFD, deep neural networks (DNNs), and a multi-objective evolutionary algorithm. A simplified representation of the ACEPACK™ DRIVE power module is employed to generate an extensive dataset through automated, GPU-accelerated CFD simulations, making data generation computationally feasible while avoiding the prohibitive cost of direct optimization. A DNN surrogate model is trained to accurately predict pressure drop and heated-wall temperature, achieving mean relative errors below 3% and 1%, respectively. This surrogate model then guides a Non-Dominated Sorting Genetic Algorithm III in the optimization of key geometric parameters, including pin-fin diameter, spacing, height, and wall clearance, as well as a physical parameter, the surface roughness of the pin-fins. CFD-based validation of the Pareto-optimal designs, performed on the full inverter geometry, indicates reductions of up to 25% in pressure drop and approximately 2% in junction temperature. These results suggest that the proposed methodology is robust and generalizable, showing good potential for further application in data-driven thermal design optimization.
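As a minimal illustration of the surrogate-guided selection step described in the abstract, the sketch below evaluates candidate pin-fin designs with a surrogate and extracts the non-dominated (Pareto) set for pressure drop and wall temperature. The surrogate function, parameter ranges, and objective trends here are hypothetical stand-ins, not the paper's trained DNN or its NSGA-III configuration.

```python
import random

def surrogate(d, s, h):
    """Hypothetical stand-in for the DNN surrogate: maps pin-fin
    diameter d, spacing s, and height h (all in mm) to a
    (pressure drop, wall temperature) pair. Illustrative only."""
    dp = 1.0 / (s * h) + 0.5 * d   # denser, taller arrays raise pressure drop
    tw = 1.0 / (d * h) + 0.2 * s   # more fin surface area lowers wall temperature
    return dp, tw

def dominates(a, b):
    """True if point a dominates point b: no worse in both
    objectives and strictly better in at least one (minimization)."""
    return a[0] <= b[0] and a[1] <= b[1] and (a[0] < b[0] or a[1] < b[1])

def pareto_front(objs):
    """Indices of the non-dominated points among a list of
    (pressure drop, wall temperature) pairs."""
    return [i for i, a in enumerate(objs)
            if not any(dominates(b, a)
                       for j, b in enumerate(objs) if j != i)]

# Sample candidate designs from hypothetical parameter ranges.
random.seed(0)
designs = [(random.uniform(1.0, 4.0),   # diameter [mm]
            random.uniform(2.0, 6.0),   # spacing  [mm]
            random.uniform(3.0, 8.0))   # height   [mm]
           for _ in range(200)]
objs = [surrogate(*x) for x in designs]
front = pareto_front(objs)
```

In the paper's workflow, this brute-force dominance check is replaced by NSGA-III, which evolves the candidate population over generations and uses reference directions to keep the Pareto front well spread; the surrogate makes each of those many evaluations cheap compared to a full CFD run.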


