
Dynamic resource allocation for wireless edge machine learning with latency and accuracy guarantees

In this paper, we address the problem of dynamic allocation of communication and computation resources for Edge Machine Learning (EML) exploiting Multi-Access Edge Computing (MEC). In particular, we consider an IoT scenario in which sensor devices collect data from the environment and upload them to an edge server that runs a learning algorithm based on Stochastic Gradient Descent (SGD). The aim is to explore the optimal trade-off among the overall system energy consumption (of both the IoT devices and the edge server), the overall service latency, and the learning accuracy.
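As a minimal sketch of the SGD-based learning step mentioned above (illustrative only, not the paper's actual algorithm or resource-allocation scheme): the edge server could fit a simple linear model y ≈ w·x + b from samples uploaded by the IoT devices, taking one gradient step per sample. All names (`sgd_fit`, `lr`, `epochs`) and the 1-D model are hypothetical choices for this example.

```python
import random

def sgd_fit(samples, lr=0.01, epochs=100, seed=0):
    """Fit y = w*x + b by stochastic gradient descent on squared error.

    Illustrative sketch of an SGD loop an edge server might run; the
    paper's actual learning task and update rule are not specified here.
    """
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(samples)              # stochastic: random sample order
        for x, y in samples:              # one gradient step per sample
            err = (w * x + b) - y         # prediction error
            w -= lr * 2 * err * x         # d/dw of err**2
            b -= lr * 2 * err             # d/db of err**2
    return w, b

# Synthetic noise-free data from y = 2x + 1, standing in for sensor uploads
data = [(x / 10, 2 * (x / 10) + 1) for x in range(20)]
w, b = sgd_fit(data)
```

In the paper's setting, the cost of running such updates (server energy and computation time) and the cost of uploading the samples (device energy and transmission latency) are the quantities being traded off against the accuracy the learner attains.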

© Università degli Studi di Roma "La Sapienza" - Piazzale Aldo Moro 5, 00185 Roma