
Overparametrized Deep Encoder-Decoder Schemes for Inputs and Outputs Defined over Graphs

Martino, L.
2021-01-01

Abstract

There is a growing interest in the joint application of graph signal processing and neural networks (NNs) for learning problems involving complex, non-linear and/or non-Euclidean datasets. This paper proposes an overparametrized graph-aware NN architecture able to represent a non-linear mapping between two graph signals, each defined on a different graph. The considered architecture is based on two NNs and a common latent space. Specifically, we consider an overparametrized graph-aware NN encoder that maps the input graph signal to a latent space, followed by an overparametrized graph-aware NN decoder that transforms the latent representation into the output graph signal. The parameters of the two NNs are jointly tuned by applying the back-propagation algorithm with an early stopping procedure to prevent overfitting. The overall architecture can be interpreted as an overparametrized graph-aware encoder/decoder NN operating over two different graphs. A key element of the encoder (decoder) scheme is the use of a nested collection of parametric graph-aware down-sampling (up-sampling) operators, whose design is studied in detail. We show by numerical simulations that the proposed scheme outperforms benchmark NN architectures previously introduced in the literature.
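The architecture described above can be illustrated with a minimal sketch: a graph-aware encoder that filters the input signal with a polynomial of the input graph's shift operator and down-samples it to a latent code, followed by a decoder that up-samples the code onto the output graph and filters it with a polynomial of the output graph's shift operator. This is only a toy illustration, not the paper's method: the filter orders, the fixed selection-based sampling matrices `D` and `U`, and all variable names are assumptions (the paper's sampling operators are parametric and learned), and the random graphs stand in for real data.

```python
import numpy as np

def graph_filter(S, x, theta):
    """Polynomial graph filter: y = sum_k theta[k] * S^k @ x."""
    y = np.zeros_like(x)
    Skx = x.copy()
    for k, t in enumerate(theta):
        if k > 0:
            Skx = S @ Skx  # next power of the shift operator applied to x
        y = y + t * Skx
    return y

rng = np.random.default_rng(0)
N_in, N_out, N_lat = 8, 6, 3  # input-graph, output-graph, and latent sizes (assumed)

# Random symmetric shift operators for the two (different) graphs
A_in = rng.random((N_in, N_in));  S_in = (A_in + A_in.T) / 2
A_out = rng.random((N_out, N_out)); S_out = (A_out + A_out.T) / 2

x = rng.standard_normal(N_in)       # input graph signal
theta_enc = rng.standard_normal(3)  # encoder filter taps (order-2 polynomial, assumed)
theta_dec = rng.standard_normal(3)  # decoder filter taps

# Encoder: filter on the input graph, then down-sample to the latent nodes
D = np.eye(N_in)[:N_lat]            # fixed node-selection down-sampling (N_lat x N_in)
z = np.maximum(0.0, D @ graph_filter(S_in, x, theta_enc))  # ReLU latent code

# Decoder: up-sample onto the output graph, then filter on the output graph
U = np.eye(N_out)[:, :N_lat]        # zero-padding up-sampling (N_out x N_lat)
y_hat = graph_filter(S_out, U @ z, theta_dec)              # output graph signal

print(z.shape, y_hat.shape)
```

In the paper the parameters `theta_enc`, `theta_dec`, and the sampling operators would be trained jointly by back-propagation with early stopping; here they are random placeholders to show the signal flow between the two graphs.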
978-9-0827-9705-3
Graph Neural Networks
Graph Autoencoders
Non-Euclidean Data
Geometric Deep Learning
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11769/538018
Citations
  • Scopus: 2
  • Web of Science (ISI): 2