
CLOUD FORWARD: From Distributed to Complete Computing, CF2016, 18-20 October 2016, Madrid, Spain

Resource aware placement of data analytics platform in fog computing

Mohit Taneja^a,*, Alan Davy^a,b

^a Telecommunications Software and Systems Group, Department of Computing and Mathematics, Waterford Institute of Technology, Waterford, Ireland
^b CONNECT / The Centre for Future Networks and Communications, Ireland

Abstract

Fog computing is an extension of cloud computing right to the edge of the network, and seeks to minimize service latency and average response time in applications, thereby enhancing the end-user experience. However, there is still a need to define where a service should run to attain maximum efficiency. Through the work proposed in this paper, we seek to develop a resource-aware placement of a data analytics platform in a fog computing architecture that adaptively deploys the analytics platform to run either on the cloud or on the fog, thus reducing network costs and response time for the user.

© 2016 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

Peer-review under responsibility of the organizing committee of the international conference on Cloud Forward: From Distributed to Complete Computing.

Keywords: Cloud computing; fog computing; virtual machine; analytics; Internet of Things (IoT)

Available online at www.sciencedirect.com


Procedia Computer Science 97 (2016) 153–156

1. Background and motivation

Cloud computing has been a pivotal revolution in computer science, changing the way software and applications work and greatly expanding the capabilities of the Internet in terms of hardware and resources. The approach has seen significant growth over the past decade, driving a shift from distributed towards more centralized network architectures. Data centers are at the helm of this shift; in effect, they are the definitive core of the functionality of cloud service providers.

* Corresponding author. Tel.: +353-89-985-6598; fax: +353-5134-1100.

E-mail address: mtaneja@tssg.org

1877-0509 © 2016 The Authors. Published by Elsevier B.V.

doi:10.1016/j.procs.2016.08.295

Cloud computing technology has reformed the way we use devices and allowed increased control over the way things work. With increased computation and storage now available, we have the leverage to collect data from a large number of end nodes. A collection of sensors and devices with Internet connectivity, capable of transmitting data to nodes higher in the structural hierarchy of the network architecture, can be defined as IoT (Internet of Things) devices.

Deployed IoT devices are often used merely to sense and send data to a sink, where the data are subsequently subjected to cloud-based analysis. These devices now cater to a wide application base, however, and with successive upgrades in the technology, response and computation time have become increasingly critical in some use case scenarios. To address the latency of communication with the cloud, and the inability of edge nodes to perform time-critical computation, the concept of fog computing has been introduced^1. The fog computing paradigm extends the cloud to the edge of the network in the form of fog devices: devices capable of computation, storage and network connectivity, sitting between the network edge and the cloud computing data centers. Being closer to the edge by design, fog devices allow distributed computing closer to the source, reducing service and computation latency and decreasing response time by virtue of automated responses or limited local computation.

The importance and applications of fog computing have previously been assessed at a preliminary level by Yannuzzi et al.^2 and Preden et al.^3

2. Fog computing architecture

A generic fog architecture can be thought of as a three-tier network structure^4, as shown in Figs. 1 and 2.

The fog architecture for IoT applications can be classified on a three-tier basis, as depicted in Fig. 1. The first tier (Tier 1) corresponds to the end points of the network, comprising the raw data generated by the sensors that act as data sources; this tier can thus be described as containing the terminal nodes, i.e. the IoT devices. The next tier (Tier 2) is the fog computing layer, also referred to as the fog/edge intelligence. It comprises devices such as routers, gateways and switches - devices capable of processing, computing and temporarily storing the received information. These fog devices are connected to the cloud framework and send data to the cloud periodically. The third and final tier (Tier 3) is the cloud computing layer, which corresponds to cloud intelligence and is capable of storing and processing enormous amounts of data, depending on the capability of the data centers.

Fig. 1. Tier division. Tier 3: cloud intelligence (cloud layer); Tier 2: edge/fog intelligence (fog layer); Tier 1: end-point sensor data (terminal nodes / IoT device layer).
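The three-tier hierarchy above can be sketched as a minimal data model. This is an illustrative sketch only; the node names and the uplink structure are assumptions, not part of the paper's system.

```python
from dataclasses import dataclass

# Hypothetical sketch of the three-tier fog topology of Fig. 1.
# Tier numbers follow the figure: 1 = IoT/terminal, 2 = fog, 3 = cloud.

@dataclass
class Node:
    name: str
    tier: int                      # 1, 2 or 3
    can_compute: bool              # fog and cloud nodes can host a VM
    uplink: "Node | None" = None   # next node towards the cloud

def path_to_cloud(node: Node) -> list[str]:
    """Follow uplinks from a terminal node up to the cloud layer."""
    hops = []
    while node is not None:
        hops.append(node.name)
        node = node.uplink
    return hops

cloud = Node("cloud-dc", tier=3, can_compute=True)
gateway = Node("fog-gateway", tier=2, can_compute=True, uplink=cloud)
sensor = Node("temp-sensor", tier=1, can_compute=False, uplink=gateway)

print(path_to_cloud(sensor))  # ['temp-sensor', 'fog-gateway', 'cloud-dc']
```

A sensor's data thus traverses one fog hop before reaching the cloud, which is the path length the fog layer exploits to cut latency.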

In a fog computing architecture, not every data packet is redirected to the cloud; instead, all real-time analysis and latency-sensitive applications run from the fog layer itself. Fog devices are essentially those that can instantiate a VM (virtual machine), and thus have some computing capacity.
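The dispatch behaviour described above can be illustrated with a simple rule: latency-sensitive requests stay in the fog when a fog VM has capacity, everything else goes to the cloud. The threshold and function names below are assumptions for illustration, not the authors' mechanism.

```python
# Illustrative dispatch rule: latency-sensitive traffic is handled in
# the fog layer, the rest is forwarded to the cloud.

LATENCY_BUDGET_MS = 50  # assumed cutoff for "latency sensitive"

def place_request(required_latency_ms: float,
                  fog_has_capacity: bool) -> str:
    """Return the layer that should serve a request."""
    if required_latency_ms <= LATENCY_BUDGET_MS and fog_has_capacity:
        return "fog"    # served by a VM on a nearby fog device
    return "cloud"      # non-critical, or fog overloaded

print(place_request(20, True))    # fog
print(place_request(20, False))   # cloud: fog has no spare VM capacity
print(place_request(500, True))   # cloud: latency budget is generous
```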

Fig. 2. Fog computing architecture

The general architecture, as shown in Fig. 2, thus comprises the cloud layer at the top level, the fog layer in the middle, and the terminal nodes - the IoT devices and sensors - at the base (granular) level.

3. Resource aware placement of analytics platform

The Internet itself comprises many en-route networking devices between the sensors/IoT devices and the cloud application, as shown in Fig. 3. However, the devices of prime consideration here in the fog computing architecture are the fog gateways that connect the IoT devices to the Internet.

Fig. 3. Networking devices and hierarchy

In parallel explorations, Madsen et al.^5 have considered various computing paradigms, including cloud computing, and investigated the feasibility of building a reliable and fault-tolerant fog computing platform. Do et al.^6 and Aazam and Huh^7,8 have inspected different minutiae of resource allocation in a fog computing framework. Based on a cloud computing network architecture, a virtual machine replication mechanism and a complementary merging mechanism, capable of exploiting locally available information to reduce communication and support cost in the network, were proposed and analyzed by Kavvadia et al.^9

The resource-aware algorithm proposed here for placement of a data analytics platform in fog computing would adaptively deploy the analytics platform to run either on the cloud or on the fog. The platform is assumed to be a service/facility running on a virtual machine, and the algorithm would deduce where the service (function) should be invoked: on the fog or on the cloud. Various parameters would be used to determine the optimal placement, such as the time to receive data (which can be measured from the deployment location), the cost of processing, the cost of memory, and priority. In a future scenario, the algorithm would also compute when the service in question should be migrated or replicated to one of the other available virtual machines in the fog architecture.

All of the above computation depends on the overall cost function, a primary factor that depends on both the maintenance cost (usually fixed) and the network cost. By exploiting the tradeoff between the fog and the cloud, we want to minimize the network cost, which in turn depends on the network topology and the user traffic demand. With this algorithm, we can reduce the cost function significantly, thereby attaining a higher degree of efficiency in network communication.
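A toy version of such a cost comparison can make the tradeoff concrete. All parameter names and values below are assumptions for illustration, not the authors' actual model: the cloud is assumed cheap to maintain but many hops away, the fog dearer to maintain but one hop from the source.

```python
# Illustrative cost model for the placement decision: total cost is a
# (usually fixed) maintenance cost plus a network cost that grows with
# traffic demand and hop distance. All numbers are assumed.

def total_cost(maintenance: float,
               traffic_gb: float,
               cost_per_gb_hop: float,
               hops: int) -> float:
    network_cost = traffic_gb * cost_per_gb_hop * hops
    return maintenance + network_cost

def choose_placement(traffic_gb: float) -> str:
    # Cloud: low maintenance, but traffic crosses many hops.
    cloud = total_cost(maintenance=1.0, traffic_gb=traffic_gb,
                       cost_per_gb_hop=0.05, hops=10)
    # Fog: higher maintenance, but traffic stays near the edge.
    fog = total_cost(maintenance=3.0, traffic_gb=traffic_gb,
                     cost_per_gb_hop=0.05, hops=1)
    return "fog" if fog < cloud else "cloud"

print(choose_placement(2.0))    # low demand  -> cloud
print(choose_placement(20.0))   # high demand -> fog
```

Under these assumed figures the break-even point falls between the two demands: at 2 GB the cloud costs 2.0 against the fog's 3.1, while at 20 GB the cloud costs 11.0 against the fog's 4.0, so the placement flips to the fog as user traffic grows.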

4. Conclusion

With the advent of cloud computing, processing became easier and the management of resources better. Then, as a new era of computing applications grew around the centralized approach of cloud computing, a need emerged to further cut down response time in latency-sensitive applications; thus, fog computing came into play. With this new technology, there remain challenges in fine-tuning the system so that network costs are minimized and efficiency is further increased. In the work proposed, we seek to develop a resource-aware placement of a data analytics platform in a fog computing architecture that would adaptively deploy the analytics platform to run either on the cloud or on the fog, thus reducing network costs and response time for the user.

Acknowledgements

This work has emanated from research conducted with the financial support of Science Foundation Ireland (SFI) and is co-funded under the European Regional Development Fund under Grant Number 13/RC/2077.

References

1. Bonomi F, Milito R, Zhu J, Addepalli S. Fog Computing and Its Role in the Internet of Things. Proc First Ed MCC Work Mob Cloud Comput 2012:13-6. doi:10.1145/2342509.2342513.

2. Yannuzzi M, Milito R, Serral-Gracia R, Montero D, Nemirovsky M. Key ingredients in an IoT recipe: Fog Computing, Cloud computing, and more Fog Computing. 2014 IEEE 19th Int Work Comput Aided Model Des Commun Links Networks 2014:325-9. doi:10.1109/CAMAD.2014.7033259.

3. Preden J, Kaugerand J, Suurjaak E, Astapov S, Motus L, Pahtma R. Data to decision: pushing situational information needs to the edge of the network. 2015:158-64.

4. Stojmenovic I. Fog computing: A cloud to the ground support for smart things and machine-to-machine networks. 2014 Australas Telecommun Networks Appl Conf ATNAC 2014 2015:117-22. doi:10.1109/ATNAC.2014.7020884.

5. Madsen H, Burtschy B. Reliability in the utility computing era: towards reliable fog computing. 2013:43-6.

6. Do CT, Tran NH, Pham C, Alam MGR, Son JH, Hong CS. A proximal algorithm for joint resource allocation and minimizing carbon footprint in geo-distributed fog computing. Int Conf Inf Netw (ICOIN) 2015:324-9. doi:10.1109/ICOIN.2015.7057905.

7. Aazam M, Huh EN. Fog computing and smart gateway based communication for cloud of things. Proc 2014 Int Conf Futur Internet Things Cloud (FiCloud) 2014:464-70. doi:10.1109/FiCloud.2014.83.

8. Aazam M, Hung P, Huh E. Smart gateway based communication for cloud of things. 2014 IEEE Ninth Int Conf Intell Sensors, Sens Networks Inf Process 2014:21-4.

9. Kavvadia E, Sagiadinos S, Oikonomou K, Tsioutsiouliklis G, Aïssa S. Elastic virtual machine placement in cloud computing network environments. Comput Netw 2015;93:435-47. doi:10.1016/j.comnet.2015.09.038.