Equivalent number of degrees of freedom for neural networks
INGRASSIA, Salvatore;
2007-01-01
Abstract
The notion of an equivalent number of degrees of freedom (e.d.f.) for neural network modeling from small datasets was introduced in Ingrassia and Morlini (2005). The e.d.f. is much smaller than the total number of parameters and does not depend on the number of input variables. We generalize our previous results and discuss the use of the e.d.f. in the general framework of multivariate nonparametric model selection. Through numerical simulations, we also investigate the behavior of model selection criteria such as AIC, GCV, and BIC/SBC when the e.d.f. is used instead of the total number of adaptive parameters in the model.
File | Type | License | Access | Size | Format
---|---|---|---|---|---
Ingrassia_Morlini (2007)_StudiesInClassification.pdf | Published version (PDF) | Not specified | Open access | 292.72 kB | Adobe PDF
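The criteria named in the abstract can be sketched as follows. This is a minimal illustration under a Gaussian-noise assumption of how AIC, GCV, and BIC/SBC change when an e.d.f. is plugged in for the degrees of freedom instead of the total parameter count; the network dimensions, RSS, and e.d.f. value here are hypothetical placeholders, not the formula or results from the paper.

```python
import math

def aic(rss, n, dof):
    # Gaussian-likelihood AIC: n * log(RSS/n) + 2 * dof
    return n * math.log(rss / n) + 2 * dof

def bic(rss, n, dof):
    # BIC/SBC: n * log(RSS/n) + dof * log(n)
    return n * math.log(rss / n) + dof * math.log(n)

def gcv(rss, n, dof):
    # Generalized cross-validation; only meaningful for dof < n
    return (rss / n) / (1.0 - dof / n) ** 2

n = 50        # small dataset, as in the paper's setting
rss = 12.5    # hypothetical residual sum of squares
# Total weights of a hypothetical 10-input, 5-hidden-unit, 1-output MLP:
# (10 + 1) * 5 + (5 + 1) * 1 = 61
p_total = 61
edf = 7       # hypothetical e.d.f., much smaller than p_total

for label, dof in [("total parameters", p_total), ("e.d.f.", edf)]:
    print(label, round(aic(rss, n, dof), 2), round(bic(rss, n, dof), 2))
```

With dof = p_total > n the penalty terms dominate (and GCV's denominator becomes degenerate), whereas the much smaller e.d.f. yields usable criterion values on small samples — the comparison the simulations in the paper investigate.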
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.