Centrality-aware gossiping for distributed learning in wireless sensor networks

Galluccio L.; Morabito G.
2022-01-01

Abstract

In recent years, federated learning and other distributed machine learning solutions have emerged to enable the collaborative synthesis of machine learning models in wireless sensor networks. However, these approaches require the exchange of large volumes of data to reach convergence, which is costly in terms of communication and computing resources. Gossiping, which naturally fits the multihop communication paradigm of most wireless sensor network application scenarios, can significantly reduce the amount of data exchanged. In this paper, we investigate how exploiting topological information about the wireless sensor network can accelerate convergence and improve efficiency in terms of network resource consumption. More specifically, we introduce a gossiping learning protocol that exploits centrality information to reduce the number of communication rounds needed and, thus, the amount of data exchanged. We performed a large number of experiments with different centrality measures and observed that exploiting centrality information yields faster convergence than when such information is not used.
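
The record does not include the paper's protocol, but a minimal sketch of centrality-biased gossip averaging may help illustrate the idea summarized in the abstract. Everything below is an illustrative assumption rather than the authors' method: degree centrality computed with networkx, neighbour sampling proportional to centrality, pairwise model averaging, and scalar values standing in for full model parameters.

```python
# Minimal sketch of centrality-biased gossip averaging (illustrative only).
# Assumptions not taken from the paper: degree centrality as the measure,
# neighbour sampling proportional to centrality, pairwise model averaging.
import random
import networkx as nx

def gossip_round(graph, models, centrality):
    """One gossip round: each node contacts one neighbour, chosen with
    probability proportional to the neighbour's centrality score, and the
    pair replaces their local models with the pairwise average."""
    for node in graph.nodes:
        neighbours = list(graph.neighbors(node))
        if not neighbours:
            continue
        weights = [centrality[n] for n in neighbours]
        peer = random.choices(neighbours, weights=weights, k=1)[0]
        avg = 0.5 * (models[node] + models[peer])
        models[node] = avg
        models[peer] = avg

if __name__ == "__main__":
    # Random geometric graph as a stand-in for a multihop WSN topology.
    g = nx.random_geometric_graph(50, radius=0.25, seed=1)
    centrality = nx.degree_centrality(g)  # could swap in betweenness, closeness, ...
    # Scalar "models" stand in for full parameter vectors in this toy example.
    models = {n: random.gauss(0.0, 1.0) for n in g.nodes}
    for _ in range(30):
        gossip_round(g, models, centrality)
    spread = max(models.values()) - min(models.values())
    print(f"model spread after 30 rounds: {spread:.4f}")  # smaller = closer to consensus
```

The intuition behind biasing exchanges toward more central neighbours is that updates propagate along well-connected paths, which is consistent with the abstract's claim that centrality information reduces the number of communication rounds needed; the specific measure and mixing rule used in the paper may differ.
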
2022
978-3-903176-48-5
Keywords: centrality; distributed training; federated learning; gossiping; wireless sensor networks
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11769/542106
Citations
  • Scopus: 2