Designing a Multi-Layer Edge-computing Platform for Energy-efficient and Delay-aware Offloading in Vehicular Networks

Fabio Busacca;Christian Grasso;Sergio Palazzo;Giovanni Schembra
2021-01-01

Abstract

Vehicular networks are expected to support many time-critical services requiring huge amounts of computation resources with very low delay. However, such requirements may not be fully met by vehicle on-board devices due to their limited processing and storage capabilities. The solution provided by 5G is the application of the Multi-Access Edge Computing (MEC) paradigm, which represents a low-latency alternative to remote clouds. Accordingly, we envision a multi-layer job-offloading scheme based on three levels, i.e., the Vehicular Domain, the MEC Domain, and the Backhaul Network Domain. In such a view, jobs can be offloaded from the Vehicular Domain to the MEC Domain, and even further offloaded between MEC Servers for load-balancing purposes. We also propose a framework based on a Markov Decision Process (MDP) to model the interactions among stakeholders working at the three different layers. Such an MDP model allows a Reinforcement Learning (RL) algorithm to make optimal decisions on both the number of jobs to offload between MEC Servers and the amount of computing power to allocate to each job. An extensive numerical analysis is presented to demonstrate the effectiveness of our algorithm in comparison with static policies that do not apply RL.
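The abstract's decision loop (an RL agent choosing how many jobs a MEC Server offloads to a neighbor) can be illustrated, under heavy simplifying assumptions, as a tabular Q-learning sketch. The state space (discretized queue occupancy), action space (jobs to offload per epoch), toy arrival process, and reward shape below are all illustrative assumptions, not the paper's actual model.

```python
import random

MAX_QUEUE = 10          # discretized local queue occupancy levels
MAX_OFFLOAD = 3         # at most 3 jobs offloaded per decision epoch
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

# Q-table indexed by (queue_level, jobs_to_offload)
Q = [[0.0] * (MAX_OFFLOAD + 1) for _ in range(MAX_QUEUE + 1)]

def choose_action(state):
    """Epsilon-greedy selection of the number of jobs to offload."""
    if random.random() < EPS:
        return random.randint(0, MAX_OFFLOAD)
    row = Q[state]
    return row.index(max(row))

def step(state, action):
    """Toy environment: new jobs arrive, offloaded jobs leave the queue.
    Reward penalizes queue delay plus a per-job offloading cost."""
    arrivals = random.randint(0, 2)
    next_state = min(MAX_QUEUE, max(0, state - action) + arrivals)
    reward = -next_state - 0.5 * action   # delay cost + offload cost
    return next_state, reward

def train(episodes=200, horizon=50):
    for _ in range(episodes):
        s = random.randint(0, MAX_QUEUE)
        for _ in range(horizon):
            a = choose_action(s)
            s2, r = step(s, a)
            # Standard Q-learning update
            Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
            s = s2

train()
```

After training, the learned policy tends to offload more jobs when the local queue is heavily loaded, which is the qualitative behavior the paper's comparison against static (non-RL) policies targets.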
2021
5G; Edge Computing; Markov Models; Reinforcement Learning; Vehicular Networks
Files in this record:
File: Designing a multi-layer edge-computing platform.pdf
Type: Publisher's version (PDF)
Format: Adobe PDF
Size: 4.44 MB
Access: restricted (archive managers only)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11769/521844
Citations
  • PMC: n/a
  • Scopus: 14
  • Web of Science (ISI): 12