15 Sep 2021 |
Research article |
Information and Communications Technologies
Increasing Edge Computing Elasticity
An ever-increasing number of applications relying on data collected at the edge are emerging: the Internet of Things, smart homes, virtual and augmented reality, vehicular networks, and more. The volume of data generated has become too large to entrust its processing solely to cloud data centers, both because of the associated costs and because response time (latency) must decrease to accommodate these new applications.
One solution is to use edge computing to process data as close to the point of service delivery as possible.
The Challenges of Edge Computing
Edge computing involves providing storage and processing resources on devices such as sensors and in micro data centers, small facilities operated by different service providers and strategically distributed to cover large areas. This environment presents particular challenges, including lower reliability caused by the large number of heterogeneous centers that make up this type of network. Proper management of all these components is required to ensure quality of service and a good user experience.
Energy must also be considered: micro data centers tend to be less energy efficient than their cloud counterparts, since they do not benefit from economies of scale.
Elasticity is a concept applied in cloud environments, where resources are automatically allocated based on service demand, ensuring a real-time balance between resource supply and demand. With this concept, the customer only pays for the actual resource use of their application.
But applying the concept of elasticity to edge computing is more complicated. Software frameworks and algorithms are needed to deploy and reconfigure applications at runtime as their load and the underlying infrastructure conditions change. The heterogeneity of resources, their availability, and their energy consumption must all be taken into account in the process.
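The elasticity loop described above can be illustrated with a minimal threshold-based controller. This is a hedged sketch, not the method of any specific framework; the function name, thresholds, and replica bounds are all hypothetical.

```python
# Minimal threshold-based elasticity controller (illustrative sketch).
# All names and thresholds are hypothetical, not from a real framework.

def scale_decision(utilization: float, replicas: int,
                   upper: float = 0.8, lower: float = 0.3,
                   min_replicas: int = 1, max_replicas: int = 10) -> int:
    """Return the new replica count given current average utilization."""
    if utilization > upper and replicas < max_replicas:
        return replicas + 1   # scale out under heavy load
    if utilization < lower and replicas > min_replicas:
        return replicas - 1   # scale in to save cost and energy
    return replicas           # stay within the target band
```

In an edge setting, a controller like this would additionally have to weigh where to place each new replica, since sites differ in capacity, reliability, and energy source.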
Energy Efficiency and Sustainability
Currently, many models can estimate the amount of energy consumed by cloud applications, but similar information is scarce for edge computing. Models that better simulate application performance based on their location along the edge-cloud continuum must be developed to assess the energy implications of the decisions that are made. Depending on latency requirements, application requests could be routed according to the availability of edge resources and their energy source, to favour the use of renewable energy.
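The routing idea sketched above, preferring renewable-powered edge sites while respecting a latency bound, could look something like the following. The site attributes and function names are assumptions for illustration only.

```python
# Illustrative sketch: route a request to an edge site that meets the
# latency bound, preferring sites currently powered by renewables.
# Site fields and selection policy are hypothetical.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Site:
    name: str
    latency_ms: float   # expected round-trip latency to this site
    renewable: bool     # currently running on renewable energy
    has_capacity: bool  # free resources to serve the request

def choose_site(sites: List[Site], latency_bound_ms: float) -> Optional[Site]:
    eligible = [s for s in sites
                if s.has_capacity and s.latency_ms <= latency_bound_ms]
    if not eligible:
        return None  # would fall back to the cloud, outside this sketch
    # Prefer renewable-powered sites, then the lowest latency among them.
    return min(eligible, key=lambda s: (not s.renewable, s.latency_ms))
```

A tighter latency bound shrinks the eligible set, so the policy naturally trades energy preference against responsiveness, which is exactly the tension the research highlights.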
Marcos Dias De Assunção
Marcos Dias De Assunção is an associate professor in the Department of Software Engineering and Information Technology. His research interests include resource management algorithms, distributed processing of data streams and edge computing.