
Virtual Node

Virtual Node-Driven Cloud–Edge Collaborative Resource Scheduling for Surveillance with Visual Sensors


For public security purposes, distributed surveillance systems are widely deployed in key areas. These systems comprise visual sensors, edge computing boxes, and cloud servers. Resource scheduling algorithms are critical to such systems' robustness and efficiency: they balance workloads to meet real-time monitoring and emergency-response requirements. Existing work has primarily focused on optimizing Quality of Service (QoS), latency, and energy consumption in edge computing under resource constraints.

However, the issue of task congestion due to insufficient physical resources has been rarely investigated. In this paper, we tackle the challenges posed by large workloads and limited resources in the context of surveillance with visual sensors. First, we introduce the concept of virtual nodes for managing resource shortages, referred to as virtual node-driven resource scheduling. Then, we propose a convex-objective integer linear programming (ILP) model based on this concept and demonstrate its efficiency. 
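The post does not spell out the virtual node's data structures, so the following is only a minimal illustrative sketch (the node fields, task names, and capacities are invented here): a virtual node can be modeled as a zero-capacity placeholder that absorbs tasks which no physical edge box can currently host, so congestion is managed rather than tasks being dropped.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    cpu: float              # available CPU cores
    mem: float              # available memory (GB)
    virtual: bool = False   # True => placeholder with no real capacity
    tasks: list = field(default_factory=list)

def offload(task_name, cpu_req, mem_req, edges, virtual_node):
    """Place a task on the first edge node that fits; otherwise park it
    on the virtual node until physical resources free up."""
    for node in edges:
        if node.cpu >= cpu_req and node.mem >= mem_req:
            node.cpu -= cpu_req
            node.mem -= mem_req
            node.tasks.append(task_name)
            return node.name
    virtual_node.tasks.append(task_name)  # congestion absorbed, not dropped
    return virtual_node.name

edges = [Node("edge-1", cpu=2, mem=4), Node("edge-2", cpu=1, mem=2)]
vnode = Node("virtual", cpu=0, mem=0, virtual=True)

print(offload("detect-A", 2, 3, edges, vnode))  # fits on edge-1
print(offload("detect-B", 2, 3, edges, vnode))  # no room -> virtual node
```

In this sketch, tasks parked on the virtual node form the backlog that a scheduler can retry once physical capacity is released.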

Additionally, we propose three alternative virtual node-driven scheduling algorithms, which extend a random algorithm, a genetic algorithm, and a heuristic algorithm, respectively. These algorithms serve as benchmarks for comparison with the proposed ILP model. Experimental results show that all the scheduling algorithms can effectively address the challenge of offloading multiple priority tasks under resource constraints. Furthermore, the ILP model shows the best scheduling performance among them.

In this study, we address the EUARS problem and make several significant contributions. First, we formally define the concept of the virtual node for addressing the EUARS problem. Compared with existing work, this concept provides a universal architecture for handling resource scarcity. The virtual node formulation enables more robust and efficient task allocation by abstracting physical resources into manageable units. Then we propose an optimal scheduling algorithm based on integer linear programming (ILP) that solves the EUARS problem exactly. By prioritizing tasks based on their importance, the ILP algorithm ensures that high-priority tasks receive the necessary resources promptly, enhancing the overall efficiency and responsiveness of the system.
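The exact ILP formulation is not reproduced in this summary, so the following is only a toy stand-in, not the paper's model: it maximizes the total priority of tasks placed on edge boxes subject to per-box CPU capacity, with unplaced tasks deferred to the virtual node. Brute-force enumeration takes the place of an ILP solver at this scale, and all priorities, demands, and capacities are invented for illustration.

```python
from itertools import product

def exact_schedule(tasks, capacities):
    """Toy exact solver: maximize total priority of placed tasks.
    tasks: list of (priority, cpu_demand); capacities: per-node CPU.
    Each task maps to a node index, or None (deferred to the virtual node)."""
    nodes = list(range(len(capacities)))
    best_val, best_plan = -1, None
    for plan in product([None] + nodes, repeat=len(tasks)):
        load = [0.0] * len(capacities)
        val = 0
        feasible = True
        for (prio, cpu), where in zip(tasks, plan):
            if where is None:
                continue  # task waits on the virtual node, earns no value
            load[where] += cpu
            if load[where] > capacities[where]:
                feasible = False
                break
            val += prio
        if feasible and val > best_val:
            best_val, best_plan = val, plan
    return best_val, best_plan

# two edge boxes (2 and 1 CPU); priority 3 > 2 > 1
tasks = [(3, 2.0), (2, 1.0), (1, 1.0)]
value, plan = exact_schedule(tasks, [2.0, 1.0])
print(value, plan)  # the lowest-priority task lands on the virtual node
```

The priority-weighted objective is what lets the exact solver sacrifice low-priority tasks first when physical capacity runs out, mirroring the prioritization described above.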

Through numerical simulation experiments, we have compared the performance of the ILP algorithm with three other algorithms, i.e., the VND-HA, VND-GA, and random algorithms. We show that the ILP algorithm significantly outperforms the other three in terms of task scheduling success rate and edge resource utilization. Specifically, the ILP algorithm demonstrates superior performance in utilizing CPU, GPU, memory, and network bandwidth resources on edge computing boxes.
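The benchmark heuristics (e.g., VND-HA) are not detailed in this summary, so the sketch below is only a generic example of the kind of priority-first greedy placement such a benchmark might use; the descending-priority ordering and headroom-based tie-break are assumptions, not the paper's rules.

```python
def greedy_schedule(tasks, capacities):
    """Priority-first greedy heuristic: sort tasks by descending priority,
    place each on the fitting node with the most remaining CPU; tasks that
    fit nowhere are deferred to the virtual node (mapped to None)."""
    remaining = list(capacities)
    plan = {}
    order = sorted(range(len(tasks)), key=lambda i: -tasks[i][0])
    for i in order:
        prio, cpu = tasks[i]
        candidates = [j for j, r in enumerate(remaining) if r >= cpu]
        if candidates:
            j = max(candidates, key=lambda k: remaining[k])  # most headroom
            remaining[j] -= cpu
            plan[i] = j
        else:
            plan[i] = None  # deferred to the virtual node
    return plan

tasks = [(1, 1.0), (3, 2.0), (2, 1.0)]  # (priority, cpu_demand)
print(greedy_schedule(tasks, [2.0, 1.0]))
```

Unlike the exact ILP, a greedy pass can miss better global assignments, which is consistent with the ILP's reported edge over the heuristic benchmarks.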

While the ILP algorithm achieves near-optimal resource allocation, it does incur higher computational costs, as evidenced by the elapsed CPU time. Despite this, the delay is still within acceptable limits for real-time applications, particularly for high-priority tasks that require immediate attention.

In summary, the ILP algorithm not only optimizes resource utilization but also ensures that critical tasks are prioritized effectively. Our study highlights the potential of ILP in solving complex resource allocation problems in edge computing environments, providing a robust framework for future research and practical implementations in real-world scenarios.

We acknowledge that our study still has some limitations. First, our experiments are conducted in a simulated environment, which may not fully capture the complexities of real-world edge computing systems. For example, we omit the virtual machine setup time on the edge computing box and reusable scheduling for identical tasks. Second, the proposed ILP algorithm assumes that task priorities are known in advance, which may not always be the case in practical scenarios.

In future work, we plan to validate our ILP algorithm in real-world edge computing offloading deployments to address the limitations of simulation-based evaluation. Additionally, we will explore the application of virtual nodes in MEC, smart city, and smart building scenarios to optimize the topology of the resource scheduling model and enhance the robustness of the algorithm. For example, in autonomous driving, safety detection can be assigned the highest priority, while tasks such as path planning are given lower priority. By introducing virtual nodes, tasks with different priorities can be scheduled robustly.
