Over the past decade, many questions have been raised about the services provided by cloud computing, in particular inefficient data processing at cloud data centers and the delays incurred when data travels to the cloud and back to the device (Marin-Tordera, Masip-Bruin, 2017). There is clearly a need to bridge the gap between cloud centers and end devices to provide reliability and improved performance. Fog computing, introduced a few years ago, may be a solution to these challenges. Used in conjunction with the Internet of Things (IoT), it brings the features of cloud computing closer to the edge of the network by allowing fog nodes to be deployed across different regions of the network (Marin-Tordera, Masip-Bruin, 2017). Extending cloud computing to the edges of the network plays an important role in reducing latency and network congestion. In this paper, I discuss the architecture of fog computing in detail, the benefits of fog computing, and the security aspects that are important in fog computing.
Fog computing is a newly introduced technology that extends the services offered by cloud computing to the edges of the network. With this approach, customers can run their applications on end devices without fully relying on the internet to access services. This is achieved by deploying fog capabilities on network components such as gateways and routers.
Its main purpose is to close the gap between the cloud and the network edges by using Internet of Things devices to host cloud computing features at fog nodes (devices that can be placed anywhere in the network architecture) for easy access and improved data processing (Atlam, Walters, Wills, 2018). It aims to close the gaps in cloud computing by providing an architecture that supports database decentralization, computation and data processing near the edges of the network, and device-to-device communication (Atlam, Walters, Wills, 2018). Fog computing makes use of Internet of Things devices capable of performing all of these tasks.
Definition of fog computing
Fog computing is an extension of cloud computing to the edges of the network: instead of keeping everything in the cloud, some resources and services are placed at the end nodes (Khan, Parkinson, Qin, 2017). Fog computing consists of components for managing storage and network resources that are distributed between end devices and data servers. When data need not be sent to the cloud, these components play a major role in keeping it near the edges of the network.
Fog computing supports quite a few features and services, such as reliable communication protocols, cloud integration, and computing resources, which make it easier for applications that need low latency and fast data processing to perform their tasks. Fog computing provides device-to-device (D2D) communication instead of sending data to a remote server. This communication mode relies on nearby devices, which promotes easy data sharing rather than requesting data from a remote server. Furthermore, D2D communication ensures that data is processed, stored, and manipulated closer to the requester (Veeramanikandan, Sankaranarayanan, 2017).
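The D2D retrieval pattern described above can be sketched as follows. This is a minimal, hypothetical illustration (all names and the dict-based stores are assumptions, not a real fog API): a device first asks nearby peers for a cached copy of the data and falls back to the remote cloud server only when no peer holds it.

```python
# Hypothetical sketch of device-to-device (D2D) data retrieval:
# prefer a nearby peer's cached copy over a round trip to the cloud.

def fetch(key, nearby_peers, cloud_store):
    """Return (value, source): a peer's cache if available, else the cloud."""
    for peer in nearby_peers:
        if key in peer:                  # a nearby device holds a cached copy
            return peer[key], "peer"
    return cloud_store[key], "cloud"     # fall back to the remote server

# Usage: two nearby devices, one of which already caches the road status.
peers = [{"road_status": "icy"}, {}]
cloud = {"road_status": "icy", "weather": "snow"}

print(fetch("road_status", peers, cloud))  # served locally by a peer
print(fetch("weather", peers, cloud))      # not cached nearby, so the cloud answers
```

The point of the sketch is the ordering: the fog node never contacts the remote server when the data already lives one hop away.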
In contrast to the centralized architecture of cloud computing, the geographical distribution of fog nodes plays an important role in delivering high-quality information among end devices through access points and proxies deployed along, for instance, highways. Vehicles moving on the same road can then share important information about road conditions. Notably, the information need not be of the same format, as fog computing supports processing of different kinds of data.
Fog computing capabilities are not limited to the characteristics mentioned above; quite a few services are offered to customers. Software as a Service (SaaS) allows end users to use applications running on a fog node. Platform as a Service (PaaS) allows end users to deploy applications onto the platform of a fog node. Lastly, Infrastructure as a Service (IaaS) provides customers with computing resources such as data processing, storage, and networks.
The architecture of fog computing
One of the features that distinguishes fog computing from cloud computing is the approach used to design the architecture of a fog computing system. The heterogeneous physical resources layer, orchestration layer, and abstraction layer are the main components of fog computing that give it functionality not present in the cloud computing architecture (Dhelim, Hu, Ning, Qiu, 2017). Figure 1 below illustrates a detailed architecture of fog computing along with the end devices and software residing on each layer.
Figure 1: The architecture of Fog Computing
The image (Openfogconsortium, 2018) shows the architecture of fog computing
At the top boundary of the fog computing architecture lies the orchestration layer, which is responsible for managing services on the fog nodes, including Software as a Service and Platform as a Service. It also provides Application Programming Interfaces (APIs) compatible with a wide range of devices, which may be used to query data against a distributed database and to deploy users' applications (Caganoff, 2013).
The orchestration layer ensures that resources are kept safe but is not capable of managing them; this is where its neighboring layer, the heterogeneous physical resources layer, comes into play. The two layers work together to ensure properties such as reliability, availability, and real-time behavior (Wobker, Seitz, Mueller, & Bruegge, 2018). The heterogeneous physical resources layer consists of components responsible for deploying the same application to devices with different capabilities; for instance, during video streaming the same video can be optimized and delivered to devices with different cameras. There is no need to worry about transferring large amounts of data to the cloud server, because the primary processing of data is performed locally at the fog node installed next to the sensors, which then transfers the results to the cloud servers. A three-level data model built on a distributed database is introduced to facilitate requesting and delivering data to the appropriate devices (Tsukasa, 2018). The first level, which saves data from the sensors, is placed in the fog nodes. The extracted data and its analysis are saved in the second and third levels, both of which reside in the cloud servers.
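The three-level data model can be illustrated with a short sketch. Everything here is an assumption for illustration (the list-based "stores", the threshold, and the simple "analysis"): level 1 keeps every raw reading on the fog node, while only filtered data and its analysis are pushed to the cloud-resident levels 2 and 3.

```python
# Minimal sketch of the three-level data model (assumed structure):
# level 1 (raw readings) lives on the fog node; levels 2 and 3
# (extracted data and analysis) live in the cloud.

fog_level1 = []      # level 1: raw sensor readings, kept at the fog node
cloud_level2 = []    # level 2: extracted (filtered) data, in the cloud
cloud_level3 = {}    # level 3: analysis of the extracted data, in the cloud

def ingest(reading, threshold=50):
    """Save the raw reading locally; push only significant values upstream."""
    fog_level1.append(reading)
    if reading > threshold:                     # primary processing at the fog node
        cloud_level2.append(reading)
        cloud_level3["max"] = max(cloud_level2)  # simple "analysis" at level 3

for r in [12, 75, 30, 90]:
    ingest(r)

# The fog node keeps all four readings; the cloud receives only two.
print(len(fog_level1), cloud_level2, cloud_level3)
```

The division of labor matches the text: requests for raw data are answered at the fog node, while aggregate queries go to the cloud levels.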
At the bottom of the fog computing architecture resides the abstraction layer, which is responsible for controlling and managing physical resources such as energy, memory, network, and CPU (Zijiang, Novak, Shanhe, Qun, 2017). Cloud computing cannot keep up with the rapid increase in the number of devices requesting large amounts of data that must be transmitted over the internet with limited bandwidth. This layer provides APIs for monitoring geo-allocated servers distributed across the network. Figure 2 illustrates some of the components of this layer.
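The kind of uniform monitoring API the abstraction layer exposes might look like the following sketch. The node records, resource names, and utilisation numbers are all illustrative assumptions; the point is that one query works across heterogeneous, geo-distributed nodes.

```python
# Hypothetical monitoring API of the abstraction layer: one uniform
# query over heterogeneous physical resources (CPU, memory, network,
# energy), hiding device-specific details. All values are assumed.

NODES = {
    "fog-node-1": {"cpu": 0.42, "memory": 0.61, "network": 0.10, "energy": 0.33},
    "fog-node-2": {"cpu": 0.90, "memory": 0.75, "network": 0.55, "energy": 0.80},
}

def monitor(resource):
    """Report one resource's utilisation across all geo-distributed nodes."""
    return {node: stats[resource] for node, stats in NODES.items()}

def overloaded(resource, limit=0.85):
    """List nodes whose utilisation exceeds the limit, e.g. for rebalancing."""
    return [n for n, v in monitor(resource).items() if v > limit]

print(monitor("cpu"))
print(overloaded("cpu"))   # only fog-node-2 exceeds the 0.85 limit
```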
Figure 2: Layout of components of the abstraction layer
The image (Cloud via abstraction layers, 2016) shows components of the abstraction layer
Scenarios that can benefit from fog Computing
Response Time and Latency
Sending a huge amount of data to cloud servers and waiting for a response is unbearable when these tasks are performed with limited resources, such as a single route for data transmission and limited bandwidth. Fog computing addresses these issues by introducing features and services that play an important role in ensuring that quality services are delivered to end users (Loannis, Panagiotis, Vasileios, 2018). The architecture reduces reliance on internet connectivity because some data is kept close to the local machines. Since more data is stored much closer to the Internet of Things devices, less data is transferred between the cloud server and the devices, which frees up bandwidth and improves response time (Dickson, 2016).
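A back-of-the-envelope calculation makes the latency argument concrete. The hop counts, per-hop delays, and processing time below are assumed, illustrative numbers, not measurements: the fog node wins simply because the request crosses far fewer network hops.

```python
# Illustrative latency comparison (all numbers assumed): a request
# served by a nearby fog node versus one crossing the internet to a
# cloud data center and back.

def round_trip_ms(hops, per_hop_ms, processing_ms):
    """Total response time: network transit both ways plus processing."""
    return 2 * hops * per_hop_ms + processing_ms

cloud_latency = round_trip_ms(hops=12, per_hop_ms=8, processing_ms=20)  # 212 ms
fog_latency = round_trip_ms(hops=2, per_hop_ms=8, processing_ms=20)     # 52 ms

print(f"cloud: {cloud_latency} ms, fog: {fog_latency} ms")
```

With identical processing time, the entire difference comes from transit, which is exactly the term fog computing shrinks by moving the service to the edge.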
Computation and Data Processing
Computation and data processing in cloud servers are among the problems facing cloud service providers, given the huge amount of data sent over the internet for computation and processing (Hu, Dhelim, Ning, Qui, 2017). These tasks require large, powerful cloud servers, which can deliver poor performance if not properly maintained. Deploying devices at the fog nodes will make life much easier for businesses that fully rely on cloud servers, because computation and data processing will be done locally.
There is less demand for bandwidth, since small chunks of data are stored on the edge servers and integrated at access points rather than sent directly to the cloud servers (Abdelshkour, 2015). This approach conserves a lot of network bandwidth, as data is stored and processed locally. As a result, business organizations will pay less for bandwidth and for data kept on cloud servers, and consumers can use IoT devices to access services without an internet connection.
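The bandwidth saving comes from aggregation at the edge, which can be sketched as follows. The batch size and the averaging scheme are assumptions for illustration: instead of forwarding every raw reading, the fog node sends one summary per batch upstream.

```python
# Sketch of edge aggregation (assumed scheme): the fog node collapses
# each batch of raw readings into a single summary, so far fewer
# messages cross the internet to the cloud.

def aggregate(readings, batch_size=10):
    """Replace each batch of raw readings with its average."""
    return [
        sum(readings[i:i + batch_size]) / len(readings[i:i + batch_size])
        for i in range(0, len(readings), batch_size)
    ]

raw = list(range(100))          # 100 raw sensor readings arrive at the edge
summaries = aggregate(raw)      # only 10 summary messages go upstream

print(len(raw), "->", len(summaries), "upstream messages")
```

Here a 10x reduction in upstream messages follows directly from the batch size; the raw data remains available locally for queries that need it.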
Security aspects of fog computing
The main security concern in fog computing is authentication at the different fog nodes distributed across the network. Device-to-device communication, which provides multiple routes between the same nodes, may lead to data leakage due to the lack of a centralized authority (Khan, Parkinson, Qin, 2017). Fog nodes are more likely to suffer from denial-of-service (DoS) attacks, since data is communicated through unauthorized gateways. A DoS attack, whether from fog nodes or an end-user device, can lead to data disclosure or delays in service delivery (Marin-Tordera, Masip-Bruin, Garcia-Alminana, Jukan, Ren & Zhu, 2018). As shown in Figure 3, an attacker can remotely access the decentralized database and take control of a legitimate user's computer, which might either slow down the connection or block it completely (Dickson, 2016).
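One common mitigation for traffic arriving through unknown gateways is to authenticate each request before serving it. The sketch below is a hypothetical illustration, not a real fog protocol: the shared key and message format are assumptions, and it shows only the general idea of rejecting requests whose HMAC tag does not verify.

```python
# Hypothetical mitigation sketch: a fog node verifies a shared-secret
# HMAC tag before serving a device-to-device request, so traffic from
# unauthorized gateways is dropped instead of processed. The key and
# message format are illustrative assumptions.
import hashlib
import hmac

SECRET = b"fog-shared-secret"   # assumed pre-provisioned key

def sign(message: bytes) -> str:
    """Compute the authentication tag a legitimate device would attach."""
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def handle_request(message: bytes, tag: str) -> str:
    """Serve only requests carrying a valid authentication tag."""
    if not hmac.compare_digest(sign(message), tag):
        return "rejected"       # drop unauthenticated traffic early
    return "served"

good = handle_request(b"read:sensor-7", sign(b"read:sensor-7"))
bad = handle_request(b"read:sensor-7", "forged-tag")
print(good, bad)
```

A real deployment would also need per-device keys and replay protection; the sketch only shows why unauthenticated routes need not reach the node's data at all.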
Figure 3: Scenario of the denial of service attack
The image (Denial-of-Service attack, 2018) shows a scenario of the denial-of-service attack
Another security issue is the storage of customers' data in local sensors that are far from the more secure cloud server. Since these sensors deliver sensitive data to end devices such as smart streetlights and smart grids, attackers can easily disclose user information and use it against them. Accessing a smart meter can reveal information about a family, for instance what time no one is at home or when the security system undergoes maintenance, which invades customers' privacy (Stojmenovic, Wen, 2018). The same applies to healthcare systems that use fog computing: through medical sensors that continuously transmit data to fog nodes, the system can be exploited to gain access to confidential information or to compromise data integrity during transmission.
Implementation of fog computing
Healthcare systems: Mirjana (2017) stated that “Fog computing technology is applied to health care systems, where the infrastructure could be modified to accommodate new features that can help to minimize power used by medical devices and to bring services much closer for fast data retrieval during life-critical situations”. In the case of a car accident, geo-allocated servers and Internet of Things devices can be used to detect the location where the accident took place and share that information with other devices.
Smart traffic lights and connected vehicles: Smart streetlights can interact with locally deployed sensors to detect the presence of bikers and pedestrians and to calculate the distance and speed of vehicles (Stojmenovic, Wen, 2018). Furthermore, they can send warning signals to other vehicles about the traffic status and suggest an alternative route. Video cameras can also be used to detect the siren or the flashing lights of a police car, ambulance, or fire truck and open a lane for it to pass through.
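The local decision logic such a traffic light might run can be sketched briefly. All thresholds and the decision rules are assumptions for illustration: the light estimates a vehicle's time to the intersection from the distance and speed its sensors report, and protects any detected pedestrian first.

```python
# Illustrative sketch (all thresholds assumed) of the decision logic a
# smart traffic light could run locally on sensor input, without any
# round trip to the cloud.

def decide(distance_m, speed_mps, pedestrian_present):
    """Return the light's action for an approaching vehicle."""
    if pedestrian_present:
        return "hold red"                      # protect the crossing first
    if speed_mps <= 0:
        return "no action"
    eta = distance_m / speed_mps               # seconds to the intersection
    return "turn green" if eta < 5 else "warn driver"

print(decide(40, 10, pedestrian_present=False))   # eta 4 s: close enough
print(decide(200, 10, pedestrian_present=False))  # eta 20 s: just warn
print(decide(40, 10, pedestrian_present=True))    # pedestrian overrides
```

Because the computation is this small, running it at the fog node next to the sensors is what makes the sub-second reaction times the scenario needs feasible.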
Fog computing is an ongoing research area that may provide a solution to challenges such as the delivery of reliable services from the cloud and data processing. This research has caught the attention of many research institutions and business organizations that might benefit from it.