51
Roig PJ, Alcaraz S, Gilly K, Bernad C, Juiz C. Modeling of a Generic Edge Computing Application Design. Sensors 2021; 21:7276. [PMID: 34770582] [PMCID: PMC8587040] [DOI: 10.3390/s21217276] [Citations in RCA: 2] [Received: 09/30/2021] [Revised: 10/25/2021] [Accepted: 10/27/2021]
Abstract
Edge computing applications leverage advances in edge computing together with the latest trends in convolutional neural networks to achieve the ultra-low-latency, high-speed-processing, low-power-consumption scenarios necessary for deploying real-time Internet of Things applications efficiently. As the importance of such scenarios grows by the day, we propose two different kinds of models: an algebraic model, built with the process algebra ACP, and a coding model, built with the modeling language Promela. Both approaches have been used to model an edge infrastructure with a cloud backup, further extended with additional fog nodes, and all models have been duly verified with the appropriate techniques. Specifically, a generic edge computing design has been specified algebraically with ACP and verified algebraically, and it has also been specified in Promela code and verified with the model checker Spin.
Affiliation(s)
- Pedro Juan Roig
- Computer Engineering Department, Miguel Hernández University, 03202 Elche, Spain
- Correspondence: (P.J.R.); (K.G.); Tel.: +34-966658388 (P.J.R.); +34-966658565 (K.G.)
- Salvador Alcaraz
- Computer Engineering Department, Miguel Hernández University, 03202 Elche, Spain
- Katja Gilly
- Computer Engineering Department, Miguel Hernández University, 03202 Elche, Spain
- Cristina Bernad
- Computer Engineering Department, Miguel Hernández University, 03202 Elche, Spain
- Carlos Juiz
- Mathematics and Computer Science Department, University of the Balearic Islands, 07022 Palma de Mallorca, Spain
52
Boost Precision Agriculture with Unmanned Aerial Vehicle Remote Sensing and Edge Intelligence: A Survey. Remote Sensing 2021. [DOI: 10.3390/rs13214387] [Citations in RCA: 14]
Abstract
In recent years, unmanned aerial vehicles (UAVs) have emerged as a popular and cost-effective technology for capturing high spatial and temporal resolution remote sensing (RS) images for a wide range of precision agriculture applications, which can help reduce costs and environmental impacts by providing detailed agricultural information to optimize field practices. Furthermore, deep learning (DL) has been successfully applied as an intelligent tool in agricultural applications such as weed detection and crop pest and disease detection. However, most DL-based methods place high demands on computation, memory and network resources. Cloud computing can increase processing efficiency with high scalability and low cost, but it results in high latency and great pressure on network bandwidth. Edge intelligence, although still in its early stages, provides a promising solution for artificial intelligence (AI) applications on intelligent edge devices close to data sources at the edge of the network. These devices (e.g., UAVs and Internet of Things gateways) have built-in processors enabling onboard analytics or AI. Therefore, in this paper, a comprehensive survey of the latest developments in precision agriculture with UAV RS and edge intelligence is conducted for the first time.
The major insights are as follows: (a) in terms of UAV systems, small or light, fixed-wing or industrial rotor-wing UAVs are widely used in precision agriculture; (b) sensors on UAVs can provide multi-source datasets, but there are only a few public UAV datasets for intelligent precision agriculture, mainly from RGB sensors and a few from multispectral and hyperspectral sensors; (c) DL-based UAV RS methods can be categorized into classification, object detection and segmentation tasks, with convolutional neural networks and recurrent neural networks being the most commonly used architectures; (d) cloud computing is a common solution for UAV RS data processing, while edge computing brings the computation close to data sources; (e) edge intelligence is the convergence of artificial intelligence and edge computing, in which model compression, especially parameter pruning and quantization, is currently the most important and widely used technique, and typical edge resources include central processing units, graphics processing units and field-programmable gate arrays.
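The parameter pruning and quantization mentioned in point (e) can be illustrated with a short sketch. The magnitude-threshold pruning rule and symmetric int8 scheme below are generic illustrations, not the exact methods of any surveyed paper:

```python
import numpy as np

def prune_by_magnitude(weights, sparsity=0.5):
    """Zero out the smallest-magnitude weights (parameter pruning)."""
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    threshold = np.sort(np.abs(weights), axis=None)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def quantize_int8(weights):
    """Uniform symmetric quantization of float32 weights to int8."""
    scale = np.max(np.abs(weights)) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

w = np.random.randn(4, 4).astype(np.float32)
w_pruned = prune_by_magnitude(w, sparsity=0.5)      # half the weights zeroed
q, scale = quantize_int8(w_pruned)                  # 4x smaller storage
w_restored = q.astype(np.float32) * scale           # dequantized approximation
```

Together the two steps shrink both the parameter count and the bit width, which is what makes DL models deployable on resource-constrained edge devices.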
53
Shah SK, Tariq Z, Lee J, Lee Y. Event-Driven Deep Learning for Edge Intelligence (EDL-EI). Sensors 2021; 21:6023. [PMID: 34577228] [PMCID: PMC8468758] [DOI: 10.3390/s21186023] [Citations in RCA: 2] [Received: 07/12/2021] [Revised: 08/26/2021] [Accepted: 08/27/2021]
Abstract
Edge intelligence (EI) has received a lot of interest because it can reduce latency, increase efficiency, and preserve privacy. More significantly, as the Internet of Things (IoT) has proliferated, billions of portable and embedded devices have been interconnected, producing enormous volumes of data on edge networks. Thus, there is an immediate need to push AI (artificial intelligence) breakthroughs within edge networks to achieve the full promise of edge data analytics. EI solutions have supported digital technology workloads and applications from the infrastructure level to edge networks; however, many challenges remain with the heterogeneity of computational capabilities and the spread of information sources. We propose a novel event-driven deep-learning framework, called EDL-EI (event-driven deep learning for edge intelligence), via the design of a novel event model that defines events using correlation analysis with multiple sensors in real-world settings, incorporating multi-sensor fusion techniques, a method for transforming sensor streams into images, and lightweight 2-dimensional convolutional neural network (CNN) models. To demonstrate the feasibility of the EDL-EI framework, we present an IoT-based prototype system developed with multiple sensors and edge devices. To verify the proposed framework, we present a case study of air-quality scenarios based on benchmark data provided by the US Environmental Protection Agency for the most polluted cities in South Korea and China. We obtained outstanding predictive accuracy (97.65% and 97.19%) from two deep-learning models on the cities' air-quality patterns. Furthermore, air-quality changes from 2019 to 2020 were analyzed to check the effects of the COVID-19 pandemic lockdown.
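The sensor-stream-to-image transformation that feeds the 2D CNN can be sketched as follows; the window length, min-max normalization and row-per-sensor layout are illustrative assumptions, not the exact EDL-EI encoding:

```python
import numpy as np

def streams_to_image(streams, window=32):
    """Stack a fixed-length window from each sensor stream into a 2D
    'image' (rows = sensors, columns = time), min-max scaled to [0, 1]."""
    rows = []
    for s in streams:
        w = np.asarray(s[-window:], dtype=np.float32)   # latest window
        lo, hi = w.min(), w.max()
        rows.append((w - lo) / (hi - lo) if hi > lo else np.zeros_like(w))
    return np.stack(rows)          # shape: (n_sensors, window)

# three hypothetical sensors (e.g., PM2.5, temperature, humidity)
img = streams_to_image([np.sin(np.linspace(0, 6, 64)),
                        np.cos(np.linspace(0, 6, 64)),
                        np.linspace(10, 30, 64)], window=32)
```

Encoding time series this way lets a lightweight 2D CNN pick up cross-sensor correlations as spatial patterns in the synthetic image.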
Affiliation(s)
- Sayed Khushal Shah
- Department of Computer Science and Engineering, University of North Texas, Denton, TX 76207, USA
- Zeenat Tariq
- Department of Computer Science and Engineering, University of North Texas, Denton, TX 76207, USA
- Jeehwan Lee
- College of Architecture, Myongji University, Seoul 03674, Korea
- Yugyung Lee
- Department of Computer Science and Electrical Engineering, University of Missouri, Kansas City, MO 64110, USA
54
Moreno-Rodenas AM, Duinmeijer A, Clemens FHLR. Deep-learning based monitoring of FOG layer dynamics in wastewater pumping stations. Water Research 2021; 202:117482. [PMID: 34365321] [DOI: 10.1016/j.watres.2021.117482] [Citations in RCA: 1] [Received: 04/29/2021] [Revised: 07/22/2021] [Accepted: 07/26/2021]
Abstract
Accumulation of fat, oil and grease (FOG) in the sumps of wastewater pumping stations is a common failure cause for these facilities. Floating solids are often not transported by the pump suction inlets, and the individual solids can accumulate into stiff, thick FOG layers. The lack of data on the dynamics of FOG layer formation still hampers the design of effective mitigation measures. In this article, we present a low-cost camera-based automated system for observing FOG layer dynamics in wastewater pumping stations at high frequency (minutes) over extended time windows (months). Optical imagery is processed through a deep-learning computer vision routine that describes FOG layer dynamics (e.g. accumulation rate and changes in shape) and various hydraulic processes in the pump sump (e.g. the water level, surface flow velocity fields, vorticity, or circulation). Furthermore, the system can perform in-camera image processing, thus allowing the transfer of compressed, processed datasets when deployed in remote locations (edge AI computing), which could be of great utility for the hydro-ecological monitoring community. In this study, the applied technology is illustrated with a dataset (six months, two-minute frequency) collected at a wastewater pumping station in the municipality of Rotterdam, The Netherlands. This monitoring system represents a source of information for the management of (waste)water pumping stations (e.g. detection of free-surface vortices and scheduling of sump cleaning operations) and facilitates the collection of standardized high-frequency FOG layer dynamics data for a detailed description of FOG build-up and transport processes.
Affiliation(s)
- Alex Duinmeijer
- Engineering's Consultancy of the Municipality of Rotterdam, Rotterdam, The Netherlands
- Francois H L R Clemens
- Department of Hydraulic Engineering, Deltares, Delft 2600 MH, The Netherlands; Norwegian University of Science & Technology, Faculty of Engineering, Department of Civil & Environmental Engineering, Trondheim, Norway
55
Clustering Algorithms on Low-Power and High-Performance Devices for Edge Computing Environments. Sensors 2021; 21:5395. [PMID: 34450837] [PMCID: PMC8397962] [DOI: 10.3390/s21165395] [Citations in RCA: 5] [Received: 06/09/2021] [Revised: 07/28/2021] [Accepted: 08/07/2021]
Abstract
The synergy between Artificial Intelligence and the Edge Computing paradigm promises to transfer decision-making processes to the periphery of sensor networks without the involvement of central data servers. For this reason, we have recently witnessed an impetuous development of devices that integrate sensors and computing resources in a single board to process data directly at the collection site. Because of the particular context where they are used, the main feature of these boards is their reduced energy consumption, even though their absolute computing power is not comparable to that of modern high-end CPUs. Among the most popular Artificial Intelligence techniques, clustering algorithms are practical tools for discovering correlations or affinities within data collected in large datasets, but a parallel implementation is an essential requirement because of their high computational cost. Therefore, in the present work, we investigate how to implement clustering algorithms on parallel, low-energy devices for edge computing environments. In particular, we present experiments on two devices with different features: the quad-core UDOO X86 Advanced+ board and the GPU-based NVIDIA Jetson Nano board, evaluating them from the points of view of performance and energy consumption. The experiments show that they realize a more favorable trade-off between these two requirements than other high-end computing devices.
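A minimal k-means sketch of the kind of clustering workload being evaluated; the paper's implementations target parallel hardware (multi-core CPU and GPU), so this NumPy version only illustrates the algorithm, with the vectorized distance step standing in for the part those devices accelerate. The deterministic initialization is a simplification for the sketch:

```python
import numpy as np

def kmeans(points, k, iters=10):
    """Lloyd's k-means. The all-pairs distance computation is the
    hotspot that parallel/GPU implementations accelerate."""
    # simple deterministic init: pick k points spread across the array
    centers = points[:: max(1, len(points) // k)][:k].copy()
    for _ in range(iters):
        # squared distances of every point to every center: shape (n, k)
        d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels

# two well-separated synthetic clusters
pts = np.vstack([np.random.default_rng(1).normal(0, 0.1, (50, 2)),
                 np.random.default_rng(2).normal(5, 0.1, (50, 2))])
centers, labels = kmeans(pts, k=2)
```

On a board like the Jetson Nano, the `d2` computation would be offloaded to the GPU, which is where the performance-per-watt trade-off studied in the paper arises.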
56
Energy-Efficient Non-Von Neumann Computing Architecture Supporting Multiple Computing Paradigms for Logic and Binarized Neural Networks. Journal of Low Power Electronics and Applications 2021. [DOI: 10.3390/jlpea11030029] [Citations in RCA: 0]
Abstract
Different in-memory computing paradigms enabled by emerging non-volatile memory technologies are promising solutions for the development of ultra-low-power hardware for edge computing. Among these, SIMPLY, a smart logic-in-memory architecture, provides high reconfigurability and enables the in-memory computation of both logic operations and binarized neural network (BNN) inference. However, operation-specific hardware accelerators can deliver better performance for a particular task, such as the analog computation of the multiply-and-accumulate operation for BNN inference, but lack reconfigurability. A solution providing the flexibility of SIMPLY while also achieving the high performance of BNN-specific analog hardware accelerators is still missing. In this work, we propose a novel in-memory architecture based on 1T1R crossbar arrays, which enables the coexistence, on the same crossbar array, of the SIMPLY computing paradigm and the analog acceleration of the multiply-and-accumulate operation for BNN inference. We also highlight the main design tradeoffs and opportunities enabled by different emerging non-volatile memory technologies. Finally, using a physics-based Resistive Random Access Memory (RRAM) compact model calibrated on data from the literature, we show that the proposed architecture improves the energy-delay product by more than 10³ times when performing a BNN inference task with respect to a SIMPLY implementation.
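For binarized ({-1, +1}) values, the multiply-and-accumulate operation that the crossbar computes in the analog domain reduces to counting agreements (an XNOR-popcount). A pure-software sketch of that equivalence, standing in for the in-memory analog computation:

```python
def binarize(xs):
    """Map real-valued inputs to the {-1, +1} codes used by BNNs."""
    return [1 if x >= 0 else -1 for x in xs]

def mac_reference(a, b):
    """Plain multiply-and-accumulate (dot product) on {-1, +1} vectors."""
    return sum(x * y for x, y in zip(a, b))

def mac_xnor_popcount(a, b):
    """Same result via logic only: XNOR counts agreements, and
    dot = agreements - disagreements = 2 * agreements - n."""
    n = len(a)
    agreements = sum(1 for x, y in zip(a, b) if (x > 0) == (y > 0))
    return 2 * agreements - n

w = binarize([0.3, -1.2, 0.8, -0.1])
x = binarize([1.0, 0.5, -0.7, -0.2])
assert mac_reference(w, x) == mac_xnor_popcount(w, x)
```

This reduction is what makes BNN inference so cheap to accelerate: the dominant operation needs no multipliers at all, only bitwise logic and a counter (or, in the analog crossbar case, summed currents).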
57
Zhou H, Zhang W, Wang C, Ma X, Yu H. BBNet: A Novel Convolutional Neural Network Structure in Edge-Cloud Collaborative Inference. Sensors 2021; 21:4494. [PMID: 34209400] [PMCID: PMC8272083] [DOI: 10.3390/s21134494] [Citations in RCA: 2] [Received: 06/03/2021] [Revised: 06/24/2021] [Accepted: 06/26/2021]
Abstract
Edge-cloud collaborative inference can significantly reduce the delay of a deep neural network (DNN) by dividing the network between mobile edge and cloud. However, the in-layer data size of a DNN is usually larger than that of the original input, so the communication time needed to send intermediate data to the cloud also increases end-to-end latency. To cope with these challenges, this paper proposes a novel convolutional neural network structure, BBNet, that accelerates collaborative inference at two levels: (1) channel pruning, which reduces the number of calculations and parameters of the original network; and (2) compression of the feature map at the split point, which further reduces the size of the transmitted data. In addition, this paper implements the BBNet structure on an NVIDIA Jetson Nano and a server. Compared with the original network, BBNet achieves compression rates of up to 5.67× for FLOPs and 11.57× for parameters, respectively. In the best case, the feature compression layer reaches a bit-compression rate of 512×. BBNet's latency advantage is more pronounced when network conditions are poor than when bandwidth is plentiful. For example, when the upload bandwidth is only 20 kb/s, BBNet improves end-to-end latency by 38.89× compared with the cloud-only approach.
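The bandwidth sensitivity described above can be made concrete with a back-of-the-envelope latency model. All constants below (compute times, payload size) are hypothetical placeholders, not measurements from the paper; only the 512× compression ratio and the 20 kb/s uplink come from the abstract:

```python
def end_to_end_latency(edge_ms, cloud_ms, payload_bits, bandwidth_bps):
    """Edge compute + uplink transfer + cloud compute, in milliseconds."""
    transfer_ms = payload_bits / bandwidth_bps * 1000.0
    return edge_ms + transfer_ms + cloud_ms

# hypothetical workload: the raw input is 1 Mbit; the split point's
# compressed feature map is 512x smaller, at the cost of some edge compute
raw_bits = 1_000_000
cloud_only = end_to_end_latency(0.0, 50.0, raw_bits, 20_000)         # 20 kb/s uplink
split      = end_to_end_latency(30.0, 20.0, raw_bits / 512, 20_000)
speedup = cloud_only / split     # transfer time dominates at low bandwidth
```

At 20 kb/s the transfer term dwarfs everything else, which is why compressing the split-point feature map yields such a large end-to-end improvement over sending the raw input to the cloud.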
Affiliation(s)
- Hongbo Zhou
- College of Engineering, Huaqiao University, Quanzhou 362021, China
- Fujian Provincial Academic Engineering Research Centre in Industrial Intellectual Techniques and Systems, Quanzhou 362021, China
- Weiwei Zhang
- College of Engineering, Huaqiao University, Quanzhou 362021, China
- Fujian Provincial Academic Engineering Research Centre in Industrial Intellectual Techniques and Systems, Quanzhou 362021, China
- Correspondence:
- Chengwei Wang
- College of Engineering, Huaqiao University, Quanzhou 362021, China
- Xin Ma
- College of Engineering, Huaqiao University, Quanzhou 362021, China
- Fujian Provincial Academic Engineering Research Centre in Industrial Intellectual Techniques and Systems, Quanzhou 362021, China
- Haoran Yu
- College of Engineering, Huaqiao University, Quanzhou 362021, China
- Fujian Provincial Academic Engineering Research Centre in Industrial Intellectual Techniques and Systems, Quanzhou 362021, China
58
Zhang T, Li Y, Philip Chen C. Edge computing and its role in Industrial Internet: Methodologies, applications, and future directions. Information Sciences 2021. [DOI: 10.1016/j.ins.2020.12.021] [Citations in RCA: 8]
59
Abstract
In the post-cloud era, edge computing is a new computing paradigm in which data are processed at the edge of the network, close to the end user and in real time, with cloud tasks offloaded intelligently. Meanwhile, the decentralization, tamper-resistance and anonymity of blockchain technology can provide a new trusted computing environment for edge computing. However, edge computing raises considerable concerns about security, privacy, fault tolerance and so on. For example, identity authentication and access control rely on third parties, heterogeneous devices and different vendors in the IoT, leading to security and privacy risks. How to combine the advantages of the two technologies has become a highlight of academic research, especially for secure resource management. Comprehensive security and privacy involve all aspects of platform, data, application and access control. In this paper, the architecture and behavior of an Access Management System (AMS) in a proof-of-concept (PoC) prototype are modeled with a Colored Petri Net (CPN). The two domains of blockchain and edge computing are organically connected by interfaces and interactions. Simulation of operations, activities and role associations demonstrates the feasibility and effectiveness of the AMS. Instances of platform business access control, data access control, database services and IoT hub services run on Advantech WISE-PaaS through User Account and Authentication (UAA). Finally, fine-grained and distributed access control can be realized with the help of blockchain attributes: smart contracts are used to register, broadcast and revoke access authorization, as well as to create specific transactions that define access control policies.
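The register/revoke access authorization pattern can be sketched as a toy, hash-chained ledger; the attribute names and the in-memory list are illustrative assumptions, and a real deployment would use smart contracts on an actual blockchain rather than this sketch:

```python
import hashlib
import json

class AccessLedger:
    """Append-only record of grant/revoke events, hash-chained like a
    minimal blockchain so past authorizations are tamper-evident."""
    def __init__(self):
        self.chain = []

    def _append(self, event):
        prev = self.chain[-1]["hash"] if self.chain else "0" * 64
        body = json.dumps(event, sort_keys=True) + prev
        self.chain.append({"event": event,
                           "hash": hashlib.sha256(body.encode()).hexdigest()})

    def grant(self, subject, resource, action):
        self._append({"op": "grant", "sub": subject, "res": resource, "act": action})

    def revoke(self, subject, resource, action):
        self._append({"op": "revoke", "sub": subject, "res": resource, "act": action})

    def allowed(self, subject, resource, action):
        """Replay the chain; the latest matching event wins."""
        verdict = False
        for rec in self.chain:
            e = rec["event"]
            if (e["sub"], e["res"], e["act"]) == (subject, resource, action):
                verdict = e["op"] == "grant"
        return verdict

ledger = AccessLedger()
ledger.grant("edge-node-1", "iot-hub", "read")
ledger.revoke("edge-node-1", "iot-hub", "read")
```

The hash chaining is what gives the access policy its tamper-evidence: altering any past grant would invalidate every later hash.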
60
Lovén L, Lähderanta T, Ruha L, Peltonen E, Launonen I, Sillanpää MJ, Riekki J, Pirttikangas S. EDISON: An Edge-Native Method and Architecture for Distributed Interpolation. Sensors 2021; 21:2279. [PMID: 33805187] [PMCID: PMC8037329] [DOI: 10.3390/s21072279] [Citations in RCA: 2] [Received: 02/28/2021] [Revised: 03/18/2021] [Accepted: 03/22/2021]
Abstract
Spatio-temporal interpolation provides estimates of observations in unobserved locations and time slots. In smart cities, interpolation helps to provide a fine-grained contextual and situational understanding of the urban environment, in terms of both short-term (e.g., weather, air quality, traffic) and long-term (e.g., crime, demographics) spatio-temporal phenomena. Various initiatives improve spatio-temporal interpolation results by including additional data sources such as vehicle-fitted sensors, mobile phones, or the micro weather stations of, for example, smart homes. However, the underlying computing paradigm in such initiatives is predominantly centralized, with all data collected and analyzed in the cloud. This solution does not scale: as the spatial and temporal density of sensor data grows, the required transmission bandwidth and computational capacity become unfeasible. To address the scaling problem, we propose EDISON: algorithms for distributed learning and inference, and an edge-native architecture for distributing spatio-temporal interpolation models, their computations, and the observed data vertically and horizontally between device, edge and cloud layers. We demonstrate EDISON functionality in a controlled, simulated spatio-temporal setup with 1M artificial data points. While the main motivation of EDISON is the distribution of the heavy computations, the results show that EDISON also improves over alternative approaches, reaching at best a 10% smaller RMSE than a global interpolation and a 6% smaller RMSE than a baseline distributed approach.
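A toy version of edge-distributed interpolation: each edge node computes partial sums over its local observations and the cloud merges them into one estimate. Inverse-distance weighting stands in for EDISON's actual interpolation model, and the two-node partitioning is an illustrative assumption:

```python
def idw_partial(observations, query, power=2.0):
    """One edge node's partial sums for inverse-distance-weighted
    interpolation over its local observations [((x, y), value), ...]."""
    num = den = 0.0
    for (x, y), value in observations:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return value, float("inf")      # exact hit dominates
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num, den

def merge(partials):
    """Cloud-side merge of per-edge partial sums into one estimate."""
    for num, den in partials:
        if den == float("inf"):
            return num
    return sum(p[0] for p in partials) / sum(p[1] for p in partials)

# two hypothetical edge nodes, each holding its own sensors
edge_a = [((0.0, 0.0), 10.0), ((1.0, 0.0), 12.0)]
edge_b = [((0.0, 1.0), 14.0), ((1.0, 1.0), 16.0)]
estimate = merge([idw_partial(edge_a, (0.5, 0.5)),
                  idw_partial(edge_b, (0.5, 0.5))])
```

Only two numbers per node cross the network instead of the raw observations, which is the bandwidth saving that motivates pushing interpolation to the edge.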
Affiliation(s)
- Lauri Lovén
- Center for Ubiquitous Computing, University of Oulu, FI-90014 Oulu, Finland
- Tero Lähderanta
- Research Unit of Mathematical Sciences, University of Oulu, FI-90014 Oulu, Finland
- Leena Ruha
- Research Unit of Mathematical Sciences, University of Oulu, FI-90014 Oulu, Finland
- Natural Resources Institute Finland, FI-90014 Oulu, Finland
- Ella Peltonen
- Center for Ubiquitous Computing, University of Oulu, FI-90014 Oulu, Finland
- Ilkka Launonen
- Research Unit of Mathematical Sciences, University of Oulu, FI-90014 Oulu, Finland
- Mikko J. Sillanpää
- Research Unit of Mathematical Sciences, University of Oulu, FI-90014 Oulu, Finland
- Jukka Riekki
- Center for Ubiquitous Computing, University of Oulu, FI-90014 Oulu, Finland
- Susanna Pirttikangas
- Center for Ubiquitous Computing, University of Oulu, FI-90014 Oulu, Finland
61
Mijuskovic A, Chiumento A, Bemthuis R, Aldea A, Havinga P. Resource Management Techniques for Cloud/Fog and Edge Computing: An Evaluation Framework and Classification. Sensors 2021; 21:1832. [PMID: 33808037] [PMCID: PMC7961768] [DOI: 10.3390/s21051832] [Citations in RCA: 21] [Received: 01/30/2021] [Revised: 02/20/2021] [Accepted: 02/24/2021]
Abstract
Processing IoT applications directly in the cloud may not be the most efficient solution for every IoT scenario, especially for time-sensitive applications. A promising alternative is to use fog and edge computing, which address the issue of managing the large data bandwidth needed by end devices. These paradigms require processing the large amounts of generated data close to the data sources rather than in the cloud. One of the key considerations in cloud-based IoT environments is resource management, which typically revolves around resource allocation, workload balance, resource provisioning, task scheduling, and QoS to achieve performance improvements. In this paper, we review resource management techniques that can be applied to cloud, fog, and edge computing. The goal of this review is to provide an evaluation framework of metrics for resource management algorithms aimed at cloud/fog and edge environments. To this end, we first address research challenges in resource management techniques in that domain. We then classify current research contributions to support the construction of the evaluation framework. One of the main contributions is an overview and analysis of research papers addressing resource management techniques. In conclusion, this review highlights opportunities for using resource management techniques within the cloud/fog/edge paradigm. This practice is still in early development, and several barriers need to be overcome.
Affiliation(s)
- Adriana Mijuskovic
- Department of Pervasive Systems, University of Twente, 7522 NB Enschede, The Netherlands
- Correspondence: ; Tel.: +315-3489-8227
- Alessandro Chiumento
- Department of Pervasive Systems, University of Twente, 7522 NB Enschede, The Netherlands
- Rob Bemthuis
- Department of Pervasive Systems, University of Twente, 7522 NB Enschede, The Netherlands
- Adina Aldea
- Department of Industrial Engineering and Business Information Systems, University of Twente, 7522 NB Enschede, The Netherlands
- Paul Havinga
- Department of Pervasive Systems, University of Twente, 7522 NB Enschede, The Netherlands
62
Blockchain-Enabled Edge Intelligence for IoT: Background, Emerging Trends and Open Issues. Future Internet 2021. [DOI: 10.3390/fi13020048] [Citations in RCA: 15]
Abstract
Blockchain, a distributed ledger technology (DLT), refers to a list of records with consecutive time stamps. This decentralization technology has become a powerful model to establish trust among trustless entities, in a verifiable manner. Motivated by the recent advancement of multi-access edge computing (MEC) and artificial intelligence (AI), blockchain-enabled edge intelligence has become an emerging technology for the Internet of Things (IoT). We review how blockchain-enabled edge intelligence works in the IoT domain, identify the emerging trends, and suggest open issues for further research. To be specific: (1) we first offer some basic knowledge of DLT, MEC, and AI; (2) a comprehensive review of current peer-reviewed literature is given to identify emerging trends in this research area; and (3) we discuss some open issues and research gaps for future investigations. We expect that blockchain-enabled edge intelligence will become an important enabler of future IoT, providing trust and intelligence to satisfy the sophisticated needs of industries and society.
63
Recent Advances in Collaborative Scheduling of Computing Tasks in an Edge Computing Paradigm. Sensors 2021; 21:779. [PMID: 33498910] [PMCID: PMC7865659] [DOI: 10.3390/s21030779] [Citations in RCA: 26] [Received: 12/28/2020] [Revised: 01/10/2021] [Accepted: 01/12/2021]
Abstract
In edge computing, edge devices can offload their overloaded computing tasks to an edge server. This takes advantage of an edge server's strengths in computing and storage, allowing computing tasks to execute efficiently. However, if all devices offload their overloaded computing tasks to the same edge server, the server itself can become overloaded, resulting in high processing delays for many computing tasks and unexpectedly high energy consumption. On the other hand, the resources of idle edge devices may be wasted and resource-rich cloud centers may be underutilized. Therefore, it is essential to explore a collaborative scheduling mechanism for computing tasks across an edge server, a cloud center and edge devices, according to task characteristics, optimization objectives and system status. Such a mechanism can help realize efficient collaborative scheduling and precise execution of all computing tasks. This work analyzes and summarizes the scenarios of the edge computing paradigm and classifies the computing tasks that arise in them. Next, it formulates the optimization problem of computation offloading for an edge computing system. According to the problem formulation, the collaborative scheduling methods of computing tasks are then reviewed. Finally, future research issues for advanced collaborative scheduling in the context of edge computing are indicated.
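The device/edge/cloud trade-off described above can be captured in a tiny greedy cost model: each task is placed wherever its estimated delay is lowest, with per-site load tracked as tasks accumulate. All constants (CPU speeds, link bandwidths, task sizes) are hypothetical, and real formulations in the surveyed literature also optimize energy and use far more sophisticated solvers:

```python
def place_tasks(tasks, sites):
    """Greedy collaborative scheduling: assign each task (cycles, bits)
    to the site minimizing compute delay + transfer delay, with queueing
    modeled as accumulated busy time per site."""
    busy = {name: 0.0 for name in sites}
    placement = []
    for cycles, bits in tasks:
        best_site, best_delay = None, float("inf")
        for name, (speed_hz, bandwidth_bps) in sites.items():
            transfer = bits / bandwidth_bps if bandwidth_bps else 0.0
            delay = busy[name] + cycles / speed_hz + transfer
            if delay < best_delay:
                best_site, best_delay = name, delay
        speed, bw = sites[best_site]
        busy[best_site] += cycles / speed + (bits / bw if bw else 0.0)
        placement.append(best_site)
    return placement

# hypothetical three-tier system: (CPU speed in Hz, uplink in bit/s);
# bandwidth 0 means the task stays on the local device (no transfer)
sites = {"device": (1e8, 0), "edge": (1e9, 1e7), "cloud": (1e10, 1e6)}
plan = place_tasks([(1e8, 1e5), (1e9, 1e6), (1e7, 1e4)], sites)
```

Even this toy model reproduces the qualitative behavior the review discusses: medium tasks go to the edge, heavy tasks to the cloud, and light tasks stay on the device once the shared servers accumulate load.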