Stability–plasticity inspired knowledge distillation expert system with Lipschitz-regularized neuro-modulation for silicon-carbide power modules health monitoring in next-generation electric vehicles
Rundo F.; Spata M. O.
2026-01-01
Abstract
Edge-side Prognostics and Health Management (PHM) for Electric Vehicles (EVs) demands Remaining Useful Life (RUL) predictors for vehicle sub-systems that run on automotive-grade Microcontroller Units (MCUs) under tight memory, latency, and energy budgets. Classical Knowledge Distillation (KD) often degrades long-horizon accuracy and fails on the complex forecasting-to-decision pipeline for Silicon-Carbide (SiC) traction-inverter power modules. We propose Neuro-Modulated Knowledge Distillation (NM-KD), which distills a Multi-Scale Temporal Fusion Transformer (TFT-MS) into a lightweight model deployable on MCUs. NM-KD adapts loss weights via learned gates, bounds student sensitivity with a Lipschitz regularizer, and leverages Stochastic Weight Averaging (SWA) modulation. The TFT-MS teacher is benchmarked against several architectures spanning six design families, while the student selection is validated against five MCU-deployable alternatives. On SiC power devices, the student matches short-horizon performance and outperforms the teacher at long horizon (50-step accuracy 95.13% vs. 94.32%). On-target deployment reduces non-volatile memory by 83.12%, latency by 81.73%, and energy by 60.57%. Cross-batch generalization on 21 modules from two independent manufacturing lots confirms robustness to process variability. A lightweight temporal remapping converts accelerated-cycle predictions into field-relevant RUL estimates. NM-KD enables fast, low-power, accurate PHM on automotive Electronic Control Units (ECUs).
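To make the loss construction described above concrete, the following is a minimal sketch of how a gated distillation objective with a Lipschitz penalty could be assembled. The function names (`nm_kd_loss`, `lipschitz_penalty`), the scalar sigmoid gate, the MSE terms, and the finite-difference sensitivity surrogate are all illustrative assumptions, not the authors' actual implementation:

```python
import math


def sigmoid(x):
    """Logistic gate mapping a learned logit to a weight in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))


def lipschitz_penalty(student_fn, x, eps=1e-3, bound=1.0):
    """Finite-difference surrogate that penalizes the student's local
    input sensitivity when it exceeds a target Lipschitz bound."""
    slope = abs(student_fn(x + eps) - student_fn(x)) / eps
    return max(0.0, slope - bound) ** 2


def nm_kd_loss(student_pred, teacher_pred, target, gate_logit,
               lip_pen=0.0, lam=0.1):
    """Gated KD objective: a learned gate trades off the ground-truth
    (task) term against the teacher-matching (distillation) term,
    plus a weighted Lipschitz regularizer."""
    n = len(target)
    task = sum((s - y) ** 2 for s, y in zip(student_pred, target)) / n
    distill = sum((s - t) ** 2 for s, t in zip(student_pred, teacher_pred)) / n
    g = sigmoid(gate_logit)  # adaptive loss weight, learned in training
    return g * task + (1.0 - g) * distill + lam * lip_pen
```

For example, with a zero gate logit the gate sits at 0.5, so a student that matches the teacher exactly but misses the target by 2 incurs half the task MSE; a student function with slope 2 incurs a unit Lipschitz penalty under the default bound of 1.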


