Abstract
The amount of data generated by sensors, actuators and other devices in the Internet of Things (IoT) has substantially increased in the last few years. IoT data are currently processed in the cloud, mostly through computing resources located in distant data centers. Therefore, network bandwidth and communication latency become serious bottlenecks. This article advocates edge computing for emerging IoT applications that leverage sensor streams to augment interactive applications. First, we classify and survey current edge computing architectures and platforms, then describe key IoT application scenarios that benefit from edge computing. Second, we carry out an experimental evaluation of edge computing and its enabling technologies in a selected use case represented by mobile gaming. To this end, we consider a resource-intensive 3D application as a paradigmatic example and evaluate the response delay in different deployment scenarios. Our experimental results show that edge computing is necessary to meet the latency requirements of applications involving virtual and augmented reality. We conclude by discussing what can be achieved with current edge computing platforms and how emerging technologies will impact the deployment of future IoT applications.
1. Introduction
In the era of Industry 4.0, the fusion of digital technologies with traditional manufacturing processes has given rise to the concept of smart manufacturing. This case study examines large-scale smart manufacturing, focusing on the production of precision components, and applies edge computing to address the challenges inherent in this industrial landscape.
Edge computing is a recent paradigm shift that has gained significant traction due to the proliferation of the Internet of Things (IoT) and the need for immediate data processing and analysis. Rather than a single technology, edge computing is best understood as a topology: an organizational framework for placing and connecting computing resources. Unlike traditional centralized data centers, this paradigm situates computing resources at the network edge, in close proximity to the devices generating data. This decentralized approach offers tangible benefits, particularly in reducing latency and improving performance, which are crucial for time-sensitive applications such as autonomous vehicles and industrial automation.
1.1. Types of Edge Computing
Various types of edge computing exist, each tailored to specific needs:
Fog Computing: Positioned between edge devices and the cloud, it facilitates data processing near the source.
Mobile Edge Computing: Involves situating computing resources on mobile network infrastructure, such as base stations or access points.
Cloudlet Computing: Entails locating computing resources at the edge of the cloud, enabling processing closer to the device and minimizing latency.
The advantages of edge computing are manifold, encompassing reduced latency, heightened performance, and increased reliability. However, challenges such as security and the effective management of distributed computing resources are inherent to this paradigm. Organizations considering the adoption of edge computing technology must meticulously assess their requirements. With its capacity to facilitate real-time processing and data analysis, edge computing technology is poised to play a pivotal role in the future of computing.
In the context of substantial growth in data generated by mobile and IoT devices, including sensors, smartphones, and wearables, there is a pressing need to address the limitations posed by their computational and energy resources. Current solutions involve offloading processing and storage from these resource-constrained devices to the cloud. While cloud computing offers on-demand and scalable resources, the inherent communication latency between end-users and the cloud, coupled with increased data exchange, poses challenges.
To address these issues, the concept of edge or fog computing has been proposed. In this paradigm, computing resources are made available at the edge of the network, close to or co-located with end devices. This proximity reduces communication latency and allows for the processing of network-intensive data one hop away from end devices, alleviating bandwidth demands on network links to distant data centers.
Edge computing platforms support mobility of devices and geographically distributed applications, making them particularly beneficial for IoT deployments. Applications include content delivery to vehicles, real-time analytics of data collected by mobile devices, and environmental monitoring through geographically distributed wireless sensor networks.
This article contributes to the field in two main ways. First, it classifies and surveys current edge computing architectures and platforms, providing insights into the key enabling technologies and their application scenarios. Second, it presents an experimental evaluation of edge computing in the context of mobile gaming, a use case that relies on low latency and reliable communications.
2. Edge Computing Classes and Architectures
Various architectures have been proposed to implement edge computing platforms. Upon reviewing these architectures, it becomes apparent that the edge of the network lacks a clear definition, and the nodes expected to participate at the edge can vary. Furthermore, there is a significant disparity in the terminology used to describe the edge, with the same term being employed to define different architectures and functionalities. Therefore, our approach involves classifying the proposed architectures into three categories, based on common deployment features. It’s essential to note that, in practice, features from one category can be used in combination with others.
2.1. Resource-Rich Servers Deployed at the Edge
One approach to implementing an edge computing platform is deploying resource-rich servers in the network to which end-users connect. Satyanarayanan et al. introduce the concept of virtual machine (VM)-based cloudlets deployed on Wi-Fi access points, located one hop away from end-devices. Described as a “data center in a box,” a cloudlet offers a cluster of multicore computing capacity, storage, and wireless LAN connectivity toward the edge. Ha et al. propose a multi-tiered system using cloudlets to provide cognitive assistance for users: video and sensor data collected from users through Google Glass are processed on the cloudlet to offer real-time assistance. Simoens et al. present a scalable three-tier system using cloudlets for analytics and automated tagging of crowd-sourced video from user devices. Subsequent research has explored integrating cloudlets with femtocells, LTE base stations, and even cars.
Greenberg et al. design micro data centers consisting of thousands of servers, capable of hosting interactive applications for end-users. While initially used for deploying CDNs and email applications, these data centers can be repurposed to host cloudlets. Similar to cloudlets, Wang et al. propose deploying a small set of servers on Wi-Fi access points or a base station in the radio access network, referring to this deployment as a micro cloud.
In the telecommunications ecosystem, multi-access edge computing (MEC) follows a similar approach by deploying resource-rich servers at the edge. In MEC, cloud computing resources, storage, and IT (Information Technology) services are deployed in the radio access component of mobile networks. The platform consists of MEC servers integrated into base stations or radio network controllers, with applications running on these servers through VMs. This architecture benefits from exposing real-time radio link information to applications deployed at the edge. MEC’s scope has expanded beyond mobile-edge computing to include various access technologies. A multi-access edge computing platform can serve as a gateway in indoor environments, providing services such as augmented reality, building management, and social network applications.
2.2. Heterogeneous Edge Nodes
In contrast to the first category, edge computing platforms can leverage a diverse set of computing resources. Bonomi et al. propose a fog platform characterized by a highly virtualized system of heterogeneous nodes, ranging from resource-rich servers to more constrained edge routers, access points, set-top boxes, and even end-devices such as smartphones and connected vehicles. The solution acknowledges the heterogeneity of wireless connectivity in end-devices, supporting different wireless access technologies. Similar proposals utilize edge devices, routers, and on-demand dedicated compute instances for processing data in a fog platform. Chiang and Zhang describe a system leveraging computing resources on end-devices (smartphones, Google Glass, home storage devices) and the cloud for real-time data stream mining. Nishio et al. define fog computing as a cooperation-based mobile cloud, wherein heterogeneous mobile devices opportunistically share resources to deliver services and applications. The architecture consists of a local cloud formed by mobile devices in a neighboring area, with nodes sharing resources within this local cloud. Elkhatib et al. propose the use of small, low-power computers such as Raspberry Pis to host fog services. Raspberry Pis can be clustered together as independent, portable mini-clouds, deployable in various environments.
2.3. Edge-Cloud Federation
An alternative approach to realizing edge platforms involves the federation of computing resources at the edge and centralized data centers. Chang et al. introduce this concept as an edge cloud, where edge apps deliver services both at the edge and in distant cloud centers. This architecture utilizes edge apps to deploy indoor 3D localization and video monitoring applications. Farris et al. propose the federation of private and public clouds to enable integrated IoT applications. The edge node dynamically orchestrates the federation to maximize the number of executed tasks. Federation of clouds is also a central aspect of the FUSION architecture proposed by Griffin et al., where services are deployed on a cloud infrastructure distributed throughout the Internet. Application developers can deploy services in geographically distributed execution zones, located on IP routers, access points, base stations in the radio access network, and so on. Elias et al. leverage an edge cloud, mirroring public cloud services, along with a federated cloud to perform image classification with low time and bandwidth requirements. The federated architecture and mirroring of the public cloud enable the use of existing open-source repositories for machine learning and image classification at the edge.
3. Enabling Technologies
The previously discussed edge computing platforms rely on several key enabling technologies, which are instrumental in the evolution of current mobile networks toward the fifth generation (5G). 5G introduces various technologies to address challenges such as low latency, reliable communications, radio spectrum scarcity, energy-efficient operations, and the increasing volume of data from diverse devices. Additionally, 5G networks aim to support programmable and flexible service deployments through technologies like Network Function Virtualization (NFV) and Software-Defined Networking (SDN) [34]. These technologies are expected to play a crucial role in the development of edge computing platforms.
3.1. Virtualization
Virtualization allows cloud computing providers to run multiple independent software instances on a single physical server. In cloud computing environments, Virtual Machines (VMs) are the predominant means of deploying virtualized instances. VMs run on a hypervisor, an abstraction layer between VMs and physical hardware, facilitating their use of underlying CPU, storage, and networking resources. However, hypervisor-based virtualization incurs some overhead. An alternative is container-based virtualization, where virtualized instances share the resources of the underlying host OS without the need for a separate OS for each instance. This approach reduces instance start times and improves performance. Moreover, both VMs and containers can be migrated between hosts, a crucial capability for edge computing platforms.
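As a minimal sketch of container-based virtualization at the edge, the snippet below uses the official Docker SDK for Python to launch, inspect, and tear down a lightweight, resource-capped container on a local host. It assumes Docker and the docker Python package are installed; the image, container name, port mapping, and memory limit are illustrative placeholders rather than values prescribed by any particular edge platform.

```python
# Minimal sketch: container-based virtualization on an edge node.
# Assumes Docker is running locally and the "docker" Python SDK is installed.
# Image name, container name, and resource limits are illustrative placeholders.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Start a lightweight, resource-capped service close to the data source.
container = client.containers.run(
    "nginx:alpine",          # placeholder image; any service image would do
    detach=True,             # return immediately instead of blocking
    name="edge-service",     # hypothetical instance name
    mem_limit="256m",        # cap memory, since edge nodes are resource-constrained
    ports={"80/tcp": 8080},  # expose the service to nearby end-devices
)
print(container.short_id, container.status)

# Fast start-up, shutdown, and removal are what make containers attractive
# for instantiating and moving virtualized instances across edge nodes.
container.stop()
container.remove()
```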
3.2. Network Function Virtualization (NFV) and Software-Defined Networking (SDN)
NFV involves implementing network functions as software modules that can run on general-purpose hardware. It decouples software from underlying hardware, allowing different network functions and services to run on general-purpose nodes. SDN complements NFV by separating the management or control plane from the data plane, enabling more flexible network management. SDN utilizes a logically centralized controller for policy and forwarding decisions. Together, NFV and SDN enable flexible and programmable deployment of software-based modules, simplifying network configuration and management. These technologies are crucial for network operators to quickly deploy new software functions with limited costs. For edge computing platforms, NFV facilitates automated deployment of virtual resources to meet traffic increases, while SDN enables automated orchestration of virtualized instances and flexible policy control.
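To make the control-plane/data-plane split concrete, the toy sketch below models a logically centralized controller that pushes forwarding rules into simple data-plane switches. It is a conceptual illustration in plain Python, not an OpenFlow or NFV implementation; the switch names, addresses, and two-switch topology are invented for the example.

```python
# Toy model of the SDN split: a centralized control plane computes policy,
# data-plane switches only apply the rules installed in their flow tables.

class Switch:
    """Data plane: forwards traffic purely from installed rules."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # destination -> output port, installed by the controller

    def install_rule(self, dst, port):
        self.flow_table[dst] = port

    def forward(self, dst):
        port = self.flow_table.get(dst)
        if port is None:
            return f"{self.name}: table miss for {dst}, punt to controller"
        return f"{self.name}: {dst} -> port {port}"


class Controller:
    """Control plane: holds the network-wide policy and pushes it to switches."""
    def __init__(self, policy):
        self.policy = policy  # destination -> (switch name, output port)

    def push_policy(self, switches):
        for dst, (sw_name, port) in self.policy.items():
            switches[sw_name].install_rule(dst, port)


switches = {"edge-sw1": Switch("edge-sw1"), "edge-sw2": Switch("edge-sw2")}
controller = Controller({"10.0.0.5": ("edge-sw1", 2), "10.0.0.9": ("edge-sw2", 3)})
controller.push_policy(switches)  # policy decided centrally, applied at the edge

print(switches["edge-sw1"].forward("10.0.0.5"))  # hits an installed rule
print(switches["edge-sw2"].forward("10.0.0.7"))  # table miss -> controller involvement
```

In a real deployment, the controller role would be filled by an SDN controller such as ONOS, OpenDaylight, or Ryu, with rules installed through a southbound protocol such as OpenFlow, while NFV would package the network functions themselves as virtualized instances running on general-purpose edge hardware.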
3.3. Computation Offloading
Computation and storage offloading from resource-constrained mobile devices to the cloud is a widespread practice. Offloading involves sending processing-heavy tasks to the cloud, which then returns the results to the devices. While this practice traditionally relies on centralized cloud resources, offloading can equally target edge computing platforms. End-devices access the offloaded resources as thin clients or through web browsers. The main benefit is extended battery life for resource-constrained devices, and energy consumption at the end-device is further reduced when offloading to the nearby edge rather than to a distant cloud. Computation offloading supports various applications on resource-constrained devices, including mobile gaming, mobile learning, natural language processing, and mobile healthcare.
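A common way to reason about when offloading pays off is to compare estimated response times for local, edge, and cloud execution. The sketch below implements such a back-of-the-envelope model; all parameters (task size, input volume, CPU speeds, uplink rate, round-trip times) are assumed, purely illustrative values and are not taken from the measurements discussed in this article.

```python
# Back-of-the-envelope offloading decision: compare estimated response times
# for local, edge, and cloud execution. All numbers are hypothetical.

def response_time(cycles, data_bits, cpu_hz, uplink_bps=None, rtt_s=0.0):
    """Estimated response time; local execution if uplink_bps is None."""
    if uplink_bps is None:
        return cycles / cpu_hz                                   # compute on the device
    return rtt_s + data_bits / uplink_bps + cycles / cpu_hz      # ship input, compute remotely

CYCLES, DATA = 2e9, 8e6   # 2 Gcycles of work, 1 MB (8 Mbit) of input data

t_local = response_time(CYCLES, DATA, cpu_hz=1e9)                                  # 1 GHz device CPU
t_edge  = response_time(CYCLES, DATA, cpu_hz=8e9, uplink_bps=50e6, rtt_s=0.005)    # one-hop edge server
t_cloud = response_time(CYCLES, DATA, cpu_hz=8e9, uplink_bps=50e6, rtt_s=0.080)    # distant data center

best = min([("local", t_local), ("edge", t_edge), ("cloud", t_cloud)], key=lambda x: x[1])
print(f"local={t_local:.2f}s  edge={t_edge:.2f}s  cloud={t_cloud:.2f}s  ->  run on {best[0]}")
```

In this simplified model the edge deployment wins because its round-trip time is far lower than the cloud's while providing the same remote compute capacity; a real offloading decision would also account for device energy, server load, and link variability.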
4. Edge Computing for IoT Applications
The Internet of Things (IoT), characterized by resource-constrained devices connected to the Internet, is well-suited for edge computing platforms. Key features of the IoT, such as low-latency communication, bandwidth-intensive data generation, and geographical distribution, align with the capabilities of edge computing. The following use cases exemplify the benefits of edge computing for IoT applications:
4.1. Low-Latency Communication
Low-latency communication is crucial for IoT applications like connected vehicles, mobile gaming, remote health monitoring, warehouse logistics, and industrial control systems. These scenarios demand real-time actions or responses based on data generated by end-devices, emphasizing the importance of edge computing for achieving low-latency communication.
4.2. Bandwidth-Intensive Data Generation
IoT deployments generate an increasing amount of bandwidth-intensive data, including video from surveillance cameras, police patrol cars, and user devices. Edge computing platforms, by placing computational resources one hop away from high-bandwidth data sources, reduce the need to send large volumes of data to distant cloud data centers. This is particularly relevant for applications like public safety, where videos and sensor data from hazardous locations can be processed locally to provide real-time information to responders.
4.3. Geographical Distribution
Many IoT applications rely on sensor networks with geographical distribution. Edge computing platforms support the processing of data locally, reducing the need to send data to distant cloud data centers. For example, collision avoidance systems at the edge of vehicular networks benefit from local data processing, achieving low-latency communications by processing sensor data locally.
4.4. Device Mobility
Device mobility in IoT applications necessitates low-latency processing of device data. Edge computing platforms facilitate the migration of virtualized resources based on the mobility of end-devices, enabling the processing of data locally. This ensures a satisfactory quality of experience for applications where rapid actions are expected.
The integration of digital data with physical environments in real-time wireless communications represents the next step in IoT evolution. Applications involving the integration of sensor-generated inputs and artificially created 3D scenarios, such as virtual and augmented reality, pose stringent requirements on latency and reliability. Edge computing, by providing cloud-like features near users, addresses these requirements and is particularly relevant for interactive applications with low tolerance for delays. The subsequent focus is on exploring the capabilities of edge computing in the context of a specific use case—mobile gaming.
5. Conclusion
In conclusion, the advent of edge computing represents a paradigm shift that significantly impacts various domains, particularly in the context of the Internet of Things (IoT) and mobile networks transitioning to 5G. This case study has delved into the critical enabling technologies that underpin edge computing platforms, emphasizing their role in the evolution of mobile networks and the facilitation of real-time, low-latency applications.
5.1. Key Enabling Technologies
The case study highlighted the importance of virtualization, Network Function Virtualization (NFV), and Software-Defined Networking (SDN) as foundational technologies for edge computing platforms. Virtualization, whether through Virtual Machines (VMs) or container-based approaches, enables efficient resource utilization and migration capabilities. NFV and SDN complement each other, offering flexibility and programmability in deploying software-based modules, simplifying network configuration and management.
5.2. Computation Offloading and Its Implications
The practice of computation offloading, traditionally directed to centralized cloud resources, gains new dimensions with the integration of edge computing. The ability to offload computation to the edge, in addition to or instead of the cloud, brings advantages such as extended battery life for devices and reduced energy consumption. This, in turn, opens avenues for diverse applications, including mobile gaming, mobile learning, natural language processing, and mobile healthcare.
5.3. Edge Computing for IoT Applications
The IoT, characterized by diverse, resource-constrained devices, finds a natural ally in edge computing. The case study explored how edge computing addresses specific needs of IoT applications, ranging from low-latency communication in scenarios like connected vehicles to handling bandwidth-intensive data generation in applications such as surveillance and public safety. The geographical distribution of IoT applications benefits from local data processing at the edge, and the mobility of devices is seamlessly supported through edge computing platforms.
5.4. Future Directions and Considerations
Looking ahead, the case study acknowledged the continuous evolution of edge computing and its potential integration with emerging technologies, such as 5G. The role of edge computing in optimizing industrial processes, as discussed in the detailed case study, serves as a testament to its transformative impact. Continuous monitoring, optimization, and the integration of advanced technologies, including 5G connectivity, were identified as crucial considerations for the future development of edge computing in industrial automation.
5.5. Closing Thoughts
In essence, the case study underscored that edge computing is not merely a technological advancement but a strategic approach reshaping the landscape of real-time data processing. Its deployment in industrial automation, as outlined in the earlier sections, showcased tangible benefits such as reduced latency, enhanced system performance, and improved security. The ability of edge computing to bring computational resources closer to the data source is instrumental in meeting the demands of time-sensitive applications, making it a cornerstone in the era of Industry 4.0.
As industries, IoT applications, and mobile networks continue to evolve, edge computing stands poised as a catalyst for innovation, efficiency, and responsiveness. Its transformative impact extends beyond the realm of industrial automation, promising to redefine the way we approach data processing, decision-making, and connectivity in a wide array of domains. The journey of edge computing is ongoing, and its future promises a wealth of possibilities in the dynamic landscape of modern technology.
Mohammad Hanzla
Associate Consultant