
Multilayer Perceptrons to Approximate Quaternion Valued Functions

ARENA, Paolo Pietro; MUSCATO, Giovanni
1997-01-01

Abstract

In this paper a new type of multilayer feedforward neural network is introduced. Such a structure, called the hypercomplex multilayer perceptron (HMLP), is developed in quaternion algebra and allows quaternionic input and output signals to be dealt with, requiring a lower number of neurons than the real MLP and thus providing a reduced computational complexity. The structure introduced represents a generalization of the multilayer perceptron in the complex space (CMLP) reported in the literature. The fundamental result reported in the paper is a new density theorem which makes HMLPs universal interpolators of quaternion-valued continuous functions. Moreover, the proof of the density theorem can be restricted in order to formulate a density theorem in the complex space. Due to the identity between the quaternion space and the four-dimensional real space, such a structure is also useful to approximate multidimensional real-valued functions with a lower number of real parameters, decreasing the probability of being trapped in local minima during the learning phase. A numerical example is also reported in order to show the efficiency of the proposed structure.
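The full text is not available in this record, so the sketch below is only a rough illustration of the kind of network the abstract describes: a one-hidden-layer perceptron whose weights, biases, inputs and outputs are quaternions, with weights applied through the Hamilton product. The function names (qmult, split_sigmoid, hmlp_forward), the split (componentwise) sigmoid activation and the linear output layer are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def qmult(p, q):
    # Hamilton product of two quaternions given as arrays (w, x, y, z).
    p0, p1, p2, p3 = p
    q0, q1, q2, q3 = q
    return np.array([
        p0*q0 - p1*q1 - p2*q2 - p3*q3,
        p0*q1 + p1*q0 + p2*q3 - p3*q2,
        p0*q2 - p1*q3 + p2*q0 + p3*q1,
        p0*q3 + p1*q2 - p2*q1 + p3*q0,
    ])

def split_sigmoid(q):
    # "Split" activation: the real sigmoid applied to each quaternion component
    # (an assumed choice, common in hypercomplex-MLP formulations).
    return 1.0 / (1.0 + np.exp(-q))

def hmlp_forward(x, W1, b1, W2, b2):
    """Hypothetical HMLP forward pass with one hidden layer.
    x : list of input quaternions, each shape (4,)
    W1: hidden weights,  shape (n_hidden, n_in, 4)
    b1: hidden biases,   shape (n_hidden, 4)
    W2: output weights,  shape (n_out, n_hidden, 4)
    b2: output biases,   shape (n_out, 4)
    """
    hidden = []
    for j in range(W1.shape[0]):
        s = b1[j].copy()
        for i, xi in enumerate(x):
            s += qmult(W1[j, i], xi)      # quaternion weight times quaternion input
        hidden.append(split_sigmoid(s))
    outputs = []
    for k in range(W2.shape[0]):
        s = b2[k].copy()
        for j, hj in enumerate(hidden):
            s += qmult(W2[k, j], hj)
        outputs.append(s)                 # assumed linear output layer
    return outputs
```

Each quaternion weight carries four real parameters, but a single Hamilton product already mixes all four components of its input, which is the intuition behind the abstract's claim that fewer neurons (and fewer real parameters) can suffice compared with a real-valued MLP on the equivalent four-dimensional real problem.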
1997
neural networks; quaternion; function approximation; density theorem
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11769/12017
Citations
  • PMC: ND
  • Scopus: 126
  • Web of Science (ISI): 108