ABSTRACT
Although data is mostly collected by IoT devices, 80% of its processing takes place in data centers and centralized computing facilities such as clouds. This paradigm relies heavily on the network infrastructure and incurs high communication latencies and significant energy consumption, while under-utilizing the computing resources embedded in IoT devices. Fostering architectures that empower end-users by leveraging the collaborative capacities of IoT, edge, and far-edge devices will reduce both the dependence on the Cloud and the network overload, and will enable IT systems with higher responsiveness, accuracy, and energy efficiency.
Currently, IoT and edge devices are treated as surrogates of the Cloud, and not only from an application-processing point of view: the control of both the application and the infrastructure is hosted in the Cloud as well. This has a strong impact on the design of hyper-distributed applications, since developers must concern themselves with the deployment of the different components of the application instead of focusing on the problems specific to their core business and area of expertise.
Novel data management and processing frameworks oppose these resource-greedy, centralized management mechanisms and establish new control planes distributed across the Continuum, integrating the compute, storage, connectivity, and cyber-physical capabilities of the devices with seamless management across providers, connectivity types, and network zones. These next-generation computing and data technologies pave the way for the rise of new programming environments that ease the programming, deployment, and maintenance of hyper-distributed applications on the Continuum.