Latency-aware and Resilient Stream Data Processing in Edge Computing
Recent years have witnessed a huge proliferation of low-cost devices connected to the Internet
of Things. Given the large amounts of data generated by these devices near the edge of the
network, there is an increasing need to process the data near the network edge to meet the strict
latency requirements of applications. For instance, Connected Autonomous Vehicle
applications such as collision warning, autonomous driving and traffic efficiency
have strict latency requirements, ranging from 10 to 100 milliseconds, to produce timely actions.
Edge computing improves the quality of service for such applications by filling the latency gaps
between the devices and the typical cloud infrastructures. While Micro Data Centers provide
computing resources that are geographically distributed, careful management of these resources
near the edge of the network is vital for ensuring efficient, cost-effective and resilient operation
of the system while providing low-latency access for applications executing near the network
edge. This talk will first introduce the notion of Micro Data Centers and the edge computing
architecture. We will then discuss the algorithms, techniques and design methodologies focusing
on efficient and resilient resource allocation for latency-sensitive stream data processing in edge
computing. Finally, we will discuss some open research problems in this area and outline
potential directions for future work.
Location
- 101 Crawfords Corner Road
- Holmdel Library
- Holmdel, New Jersey
- United States 07733
- Room Number: Township Meeting Room
Speakers
Dr Balaji Palanisamy
Biography: