
Metaheuristics in automated machine learning: Strategies for optimization

Francesco Zito;Claudia Cavallaro;Vincenzo Cutello;Mario F. Pavone
2025-01-01

Abstract

This work explores the application of Automated Machine Learning techniques, focusing on the optimization of Artificial Neural Networks through hyperparameter tuning. Artificial Neural Networks are widely used across many fields; however, building and optimizing them presents significant challenges. With effective hyperparameter tuning, shallow neural networks can become competitive with their deeper counterparts, which in turn makes them better suited to low-power applications. We highlight the importance of Hyperparameter Optimization in enhancing neural network performance and examine various metaheuristic algorithms, in particular their effectiveness in improving model performance across diverse applications. Despite significant advances in this area, a comprehensive comparison of these algorithms across different deep learning architectures is still lacking. This work aims to fill that gap by systematically evaluating the performance of metaheuristic algorithms in optimizing hyperparameters and by discussing advanced techniques, such as parallel computing, for adapting metaheuristic algorithms to hyperparameter optimization with high-dimensional search spaces.
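The kind of metaheuristic hyperparameter tuning the abstract describes can be illustrated with a minimal sketch. The example below is not taken from the paper: it uses a simple (1+1) evolution strategy, and the `validation_loss` function is a hypothetical stand-in for training a shallow network and measuring its validation error.

```python
import random

def validation_loss(lr, hidden_units):
    """Hypothetical objective: in practice this would train a shallow
    network with the given hyperparameters and return validation loss.
    Here, a toy surface with its optimum near lr=0.01, hidden_units=64."""
    return (lr - 0.01) ** 2 * 1e4 + ((hidden_units - 64) / 64) ** 2

def one_plus_one_es(iterations=200, seed=0):
    """Minimal (1+1) evolution strategy: mutate the current
    hyperparameter vector and keep the mutant only if it improves."""
    rng = random.Random(seed)
    lr, hidden = 0.1, 16                      # initial hyperparameters
    best = validation_loss(lr, hidden)
    for _ in range(iterations):
        # Gaussian mutation for the continuous parameter,
        # discrete steps for the integer-valued one.
        cand_lr = max(1e-5, lr + rng.gauss(0, 0.02))
        cand_hidden = max(1, hidden + rng.choice([-8, -4, 4, 8]))
        cand = validation_loss(cand_lr, cand_hidden)
        if cand < best:                       # greedy acceptance
            lr, hidden, best = cand_lr, cand_hidden, cand
    return lr, hidden, best
```

Population-based metaheuristics (genetic algorithms, particle swarm, and similar methods surveyed in the paper) follow the same evaluate-mutate-select loop, but maintain many candidate hyperparameter vectors at once, which is what makes the parallel-computing techniques mentioned above applicable.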
Keywords: Artificial neural network; Automated machine learning; Deep learning; Hyperparameter optimization; Metaheuristics
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11769/674399