Open access peer-reviewed chapter

A Survey on IoT Fog Resource Monetization and Deployment Models

Written By

Cajetan M. Akujuobi and Faith Nwokoma

Submitted: 08 April 2023 Reviewed: 12 September 2023 Published: 18 December 2023

DOI: 10.5772/intechopen.113174

From the Edited Volume

Internet of Things - New Insights

Edited by Maki K. Habib

Abstract

There has been immense growth in the number of applications and devices using the Internet of Things (IoT). In the fog computing (FC) architecture, fog nodes (FN) are placed between IoT devices and the cloud. Indeed, an IoT application can be fully serviced by local fog servers without propagating IoT data into the cloud core network; FC thus extends the cloud-computing paradigm to the network edge. This paper surveys fog resource monetization and shows how the wide use of IoT devices has made FC a paramount technology for achieving real-time computation for IoT applications. We examined the monetization architectures applied in the literature and found that the decentralized fog monetization architecture stands out, since it solves issues posed by the centralized architecture, such as unreflected variations in quality of service (QoS) and the additional fees charged by third-party payment gateways.

Keywords

  • Internet of Things (IoT)
  • fog computing (FC)
  • fog monetization architecture
  • blockchain
  • quality of service (QoS)

1. Introduction

The application of Internet-of-Things (IoT) technology to almost every sphere of life has resulted in immense growth in the number of IoT devices, the acceptance of IoT, IoT applications, and the volume of data uploaded to cloud systems. The International Data Corporation (IDC) forecasted that about 41 billion IoT-connected devices will be active in 2025, producing data surpassing 79 ZB [1]. Current cloud systems are not large enough to process and store this increase in IoT data traffic while meeting real-time demands [2], which affects all IoT systems. Routing so much traffic toward a distant cloud can cause high latency for sensitive IoT applications, such as healthcare [3], multimedia [4], and vehicular/drone applications [5, 6]. In addition, the centralization of the cloud may reduce the privacy of the uploaded IoT data [7].

Fog nodes (FN) are used between IoT devices and cloud computing in the fog computing (FC) architecture to reduce the distance data travels for processing in the cloud. Cloud-computing services are thereby brought from core network infrastructures to customer premises through fog nodes such as switches, private servers, cloudlets, and routers. The closeness of fog nodes to edge devices results in a large reduction in latency, improved energy efficiency, and optimal use of network bandwidth for applications in agriculture, smart cities, and similar domains [8]. Fog nodes also eliminate data duplication and empower fog-based applications with local and near real-time intelligence. Regarding processing and storage abilities, fog servers are much smaller than cloud data centers [9]. Still, fog servers' larger number and geo-distribution allow fog to alleviate cloud network congestion by servicing many IoT applications [8]. Indeed, an IoT application can be fully serviced by local fog servers without propagating IoT data into the cloud core network. Fog computing enables computational workload offloading through fog nodes, which can further reduce transmission latency and ease traffic congestion on the Internet. It also introduces many new services and applications that do not fit the traditional cloud-computing architecture well. For example, large-scale environmental monitoring systems can deploy computationally intensive applications at the sensors and utilize the fog computing architecture to achieve instantaneous response [9, 10].

This paper presents a survey on fog resource monetization, a payment system implemented for the services delivered by fog resources [11]. The wide use of IoT devices has made FC a paramount technology for achieving real-time computation for IoT devices. The basic unit of FC, the FN, is defined in this chapter, and the characteristics of FC are highlighted. The available deployment and revenue models are divided into four categories and discussed briefly. Fog resource monetization is divided into centralized and decentralized architectures. The centralized architecture has a central authority that determines the pricing model and quality of service (QoS), while the decentralized architecture has no central authority and no fixed pricing model.

2. Background

The term "Internet of Things," coined in 1999 by Kevin Ashton [12], refers to a network of physical devices, technologies, objects, and services interconnected through the Internet for the exchange, processing, and storage of data [13, 14]. IoT devices use sensors to gather information from the surrounding environment and regularly respond to this information through actuators. The application of IoT has evolved over the years, and it is widely used to implement a variety of groundbreaking smart devices and services. Unfortunately, such services often cannot be implemented directly on the IoT device. The large variety of heterogeneous data, otherwise known as big data [15], must be computed and stored. IoT devices are mostly battery-powered and have limited networking, processing, and storage resources; therefore, they cannot efficiently carry out these operations (computing, storing, and networking) [12]. The cloud-computing concept was introduced to compensate for IoT devices' deficiencies in computing, processing, and storing big data [2].

The exponentially growing demand for computationally intensive applications and services makes cloud computing necessary. Cloud computing gives users access to various on-demand services by judiciously using the hardware and software in cloud data centers [12]. Large-scale data centers are huge and costly; therefore, they are usually built in low-cost remote areas. Although cloud computing solves the computational and storage problems of IoT devices, it suffers from high latency due to its distance from the IoT devices located at the edge of the network and from poor quality of service (QoS) due to data traffic in the network between the cloud and edge devices [10]. A new framework called fog computing was proposed to resolve the issues associated with cloud computing.

The concept of fog computing can be traced back to 2009, when Satyanarayanan et al. [16] proposed using cloudlets to cope with the limits of cloud computing, especially its high and unpredictable latencies. These cloudlets provide the benefits of cloud computing close to the edge devices; when no cloudlets are available, the edge devices communicate with the cloud directly. Cloudlets are used to support edge devices in carrying out computational operations, a process called edge computing [17]. "Edge computing" and "fog computing" are often used interchangeably; however, despite their similarities, they are not identical. It is essential to recognize that edge computing does not view the overall service as a hierarchy of nodes that includes the cloud; instead, the overall service is performed by a nearby cloudlet [17]. For this reason, the OpenFog Consortium differentiates fog and edge computing, highlighting that fog works with the cloud and is hierarchical, whereas the edge works independently of the cloud and is restricted to a few layers. Hence, although there are similarities between the two concepts, they were designed for different contexts; nevertheless, they are both growing toward an inevitable convergence [18, 19, 20, 21].

FC extends the cloud-computing paradigm to the network edge. Formally, fog computing (FC) is defined as the virtualization of network architecture that “uses one or a collaborative multitude of end-user clients or near-user edge devices to carry out a substantial amount of storage (instead of stored primarily in cloud data centers), communication (instead of routed over backbone networks), control, configuration, measurement, and management” [22]. Fog computing is proposed to enable computing directly at the network’s edge, which can deliver new applications and services, especially for the future of the Internet of Things. Fog computing uses fog nodes at the edge to interact with IoT devices.

3. Definition of fog node

Fog computing overcomes the limitations of cloud computing to enable real-time analysis for smart devices at the edge of the network. Fog computing involves transferring data from edge devices to fog nodes for processing, temporary storage, and networking operations. Fog nodes are the basic units of fog computing. A network device that uses its processing capabilities, dedicated servers, or computational servers to coordinate underlying edge devices can be referred to as a fog node [21, 23]; put simply, a fog node is a physical device that performs fog computing. Examples of fog nodes given in [21] include wireless access points, routers, video surveillance cameras, switches, and Cisco Unified Computing System (UCS) servers. One uniform feature among all these devices is that they embed the storage, computing, and networking abilities necessary for IoT applications. A fog architecture is usually an aggregation of several levels of nodes. A processing application might be suited to a particular level because of its requirements for features such as latency, mobility, security/encryption, and quick scalability [24]. The position and number of levels of fog nodes in a hierarchical fog depend on the architecture involved. In the architecture described in [25], fog nodes are created near base stations in 5G networks. In contrast, in the architecture described in [26], end users contribute fog devices within residential areas and are rewarded with incentives for sharing them.
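
To make the notion of a hierarchical fog concrete, the sketch below models fog nodes as simple records with a level, a latency estimate, and a set of capabilities, and picks the lowest (closest) level that satisfies an application's requirements. The class, field names, and numbers are illustrative assumptions, not part of any cited architecture.

```python
from dataclasses import dataclass, field

@dataclass
class FogNode:
    """Illustrative model of a fog node: a device that embeds compute,
    storage, and networking and sits at some level of the fog hierarchy
    (level 1 = closest to the edge devices)."""
    name: str
    level: int
    latency_ms: float                                # typical round-trip latency to edge devices
    capabilities: set = field(default_factory=set)   # e.g. {"compute", "storage", "gpu"}

def pick_node(nodes, required_caps, max_latency_ms):
    """Pick the lowest-level (closest) node that offers the required
    capabilities within the latency budget; None means fall back to the cloud."""
    candidates = [n for n in nodes
                  if required_caps <= n.capabilities and n.latency_ms <= max_latency_ms]
    return min(candidates, key=lambda n: n.level, default=None)

if __name__ == "__main__":
    hierarchy = [
        FogNode("wifi-router", level=1, latency_ms=5, capabilities={"compute"}),
        FogNode("base-station-cloudlet", level=2, latency_ms=15,
                capabilities={"compute", "storage", "gpu"}),
        FogNode("regional-server", level=3, latency_ms=40,
                capabilities={"compute", "storage"}),
    ]
    # A video-analytics task needing a GPU within 20 ms lands on the cloudlet.
    print(pick_node(hierarchy, {"compute", "gpu"}, max_latency_ms=20))
```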

4. Fog computing characteristics

Fog computing is a highly virtualized platform that overcomes the limitations of direct interaction between end devices and the cloud. Fog computing devices are mostly located at the edge of the network; they bring the cloud-computing paradigm, including computing, networking, and storage services, to the network edge. It provides the benefits of the cloud to the edge devices and compensates for the limitations of edge and IoT devices. Fog computing enhances the performance of edge computing by reducing the time, bandwidth, and energy that would otherwise be expended in IoT-cloud communication. In this section, the characteristics of fog computing drawn from [12, 21, 27, 28, 29, 30] are highlighted.

4.1 Edge location

One of the major characteristics of fog computing is that its fog nodes are located at the edge of the network. The FNs are in the same environment and location where the IoT devices generate their data [31]. From the perspective of a communication service provider, FNs are the cloudlets attached to the base stations distributed with the service masts/towers [31]. These fog nodes enable the computational ability of the cloud to be exercised at the edge of the network.

4.1.1 Location-awareness

Fog computing supports location-aware applications. The ability of an edge device to be aware of its location through an application is known as location awareness. Location awareness enables location-specific services and information to be made available to users when a device enters or leaves a geographical region. This is particularly important for mobile edge IoT devices in applications such as the automotive, drone, and health industries. The location-awareness features of the FC network provide important information for resource planning and distribution to achieve equitable, even, and fair service distribution.

4.1.2 Low latency

The distance between edge devices and cloud systems causes high latency, which gives rise to the need for FC. FC alleviates this high latency by providing fog nodes at the network's edge. FNs support end devices with cloud services at the network's edge, including applications with low-latency requirements (e.g., gaming, video streaming, and augmented reality). FNs make the cloud's robust computational and storage capacities available to the edge devices in the shortest possible time, since they are located almost in the same network and environment as the IoT and other edge devices needing their services [32].

4.1.3 Geographical distribution

The cloud is centralized; in contrast, FC serves applications that require widely distributed deployments. FC contains a variety of widely distributed FNs placed in different locations such as highways, railway tracks, network infrastructure, and even residential buildings. Fog computing therefore consists of widely distributed fog nodes that enable data processing, storage, and computing close to IoT devices.

4.2 Large-scale sensor networks

Fog computing supports very large-scale sensor networks. These networks connect fog nodes, clouds, and endpoint devices. Because fog computing is widely distributed and operates at a large scale, it can be used for environmental monitoring, smart city designs, smart agriculture implementations, and smart grid applications. These large-scale sensors send data continuously to the FN for processing, analysis, decision-making, and storage. Typical sources include smart home devices, wearable devices, industrial sensors, connected appliances, smart healthcare devices, vehicles, environmental monitoring devices, and smart agriculture devices.

4.3 Large number of fog nodes

Fog computing supports a very large number of IoT devices with cloud paradigms. Because these IoT devices are widely distributed, FC requires a large number of FNs to support them. Fog nodes, sometimes called fog servers, include servers, routers, gateways, and IoT devices with routing, storage, and computing capabilities.

4.4 Support for mobility

It is essential for many fog applications to communicate directly with mobile devices and therefore support mobility techniques, since a good percentage of the devices at the edge of the network are not stationary, for example, wearables, drones, and self-driving cars. FC must have robust mobility support for efficient edge computing. Mobile devices, such as automobiles and drones, change location quickly and depend on the FN's critical services for operational efficiency and decision-making. This depends on the capability of the fog network to offer these services without a drop in the quality of service rendered as these devices move from one location to another [33].

4.5 Real-time interactions

An important feature of FC design is the need for real-time support for edge devices. FC is designed to greatly reduce the latency in communication between IoT and the cloud [34]. Fog applications involve real-time interactions rather than batch processing. Fog computing enables real-time interactions between end devices and fog nodes by ensuring it operates at the lowest possible latency. The real-time feature of the fog supports gaming, healthcare, automotive, aviation, streaming, and security systems.

4.6 Heterogeneity

IoT comprises many different devices, including fog nodes deployed in various environments and supplied by different vendors and technologies. Fog computing, as a highly virtualized platform, provides computation, storage, and networking services that bridge the gap between edge devices and the cloud. While standardization has not yet been achieved across FC paradigms such as deployment methods, orchestration strategies, and equipment designs, convergence is beginning among enterprises with similar interests [31]. Fog computing must continue to accommodate heterogeneous vendors of equipment and applications. Communication protocols that support interoperability are also necessary.

4.6.1 Scalability/flexibility

Resources and devices should be added dynamically to accommodate constant changes in the network system. Fog networks should be distributed and flexible enough to integrate easily with new devices and other networks. They should be scalable to meet the ever-growing deployment of IoT devices and to handle the enormous data generated by these devices for processing, analysis, and storage. FC has the scalability features of the cloud, allowing quick provisioning and an increase in available computing resources to handle spikes in service requests.

4.7 Interoperability and federation

For the seamless delivery of certain services, such as streaming, the cooperation of several fog providers will be required. For this reason, fog components must interoperate, and services must be able to federate across different domains. Like cloud services, FC is delivered across different layers of technologies and paradigms: ISPs, cloud service providers, payment industries, network equipment vendors, varying communication protocols, and security standards, to mention a few. To a large extent, federation already exists in most cloud paradigms, enhanced by standardization across these layers, which has improved interoperability, and this continues to develop. For FC to meet emerging IoT service needs, more must be done to federate vendors and the communication stack so that the different technologies can interoperate easily.

4.7.1 Filtering

Edge devices produce huge amounts of data in real time, and the fog nodes filter this data. Some filtered data are sent to the cloud for further processing or storage. The fog node that directly receives data from the IoT devices must be able to detect noise in the data and perform some level of processing and analysis that supports the IoT's immediate needs before offloading to the cloud. This enables real-time analysis of data by the fog and saves the bandwidth that would be wasted in offloading useless data to the cloud. The filtering capability of FC allows faster data analysis and decision-making at the edge.
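
As an illustration of this filtering role, the sketch below shows a fog node dropping out-of-range (noisy) sensor readings and forwarding only a compact summary to the cloud. The thresholds and the send_to_cloud stub are assumptions made for the example.

```python
import statistics

def filter_readings(readings, low=-40.0, high=85.0):
    """Discard readings outside the sensor's plausible range (treated as noise)."""
    return [r for r in readings if low <= r <= high]

def summarize(readings):
    """Local, near real-time analysis performed at the fog node."""
    return {"count": len(readings),
            "mean": statistics.mean(readings),
            "max": max(readings)}

def send_to_cloud(summary):
    # Stub: in a real deployment this would be an upload over the backhaul link.
    print("offloading to cloud:", summary)

if __name__ == "__main__":
    raw = [21.5, 22.0, -999.0, 23.1, 150.0, 22.7]   # two obviously bogus samples
    clean = filter_readings(raw)
    send_to_cloud(summarize(clean))                 # only the summary uses backhaul bandwidth
```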

5. Deployment models and revenue scenarios of fog computing

Understanding the different revenue and incentive structures employed through different deployment models is necessary to enable wide adoption of fog computing systems. The different revenue and incentive models provide a better understanding of why:

  1. Infrastructure providers would offer their resources to act as FN

  2. Users would want to make use of these FC resources.

Edge architectures, such as Wi-Fi deployments within cities controlled by different organizations, can be considered similar to FC deployments. Characteristics of FC, such as geographical distribution, security requirements, and FN heterogeneity, are related to the revenue models and play a major role in how FNs can generate a potential revenue stream for FN providers [12]. Without sufficient FNs, maintaining a suitable infrastructure with good computing performance and network power is unrealistic, so providing incentive models for the provision and maintenance of FNs is essential. We consider the following four types of deployment models. The descriptions below provide context for each deployment model based on the particular deployment approach used in [12, 31, 35, 36, 37].

5.1 Dynamic FN discovery supported revenue model

This model describes the dynamic discovery of an FN as an end device changes its location. The user device searches for an FN in its vicinity using the advertised profile of the node (which can include availability statistics, security credentials, and the types of available services). With this approach, the user is not guaranteed that a suitable FN will be found to sustain an application session; still, negotiation can take place if multiple fog nodes are found, and a user device can also cache previously seen fog nodes. The incentive for the provider is to gain revenue from each user session sustained using that FN, with the user charged based on connection time, data volume, or the range of services utilized. It is therefore in the provider's interest to make the FN discoverable so that users can connect, and the revenue earned in this way is the basis for the deployment model. This model also lets the user choose a fog node based on the service and subscription model the FN provides.
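
A minimal sketch of this model is given below: the user device filters advertised node profiles found in its vicinity, caches the chosen node, and the provider bills the session by connection time and data volume. All profile fields and tariff numbers are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class AdvertisedProfile:
    node_id: str
    services: set
    availability: float        # fraction of time the node reports being up
    price_per_minute: float
    price_per_mb: float

def discover(profiles, needed_services, min_availability=0.95):
    """Return candidate nodes in the vicinity that match the user's needs."""
    return [p for p in profiles
            if needed_services <= p.services and p.availability >= min_availability]

def negotiate(candidates):
    """If several nodes qualify, pick the cheapest one (a simple negotiation rule)."""
    return min(candidates, key=lambda p: p.price_per_minute + p.price_per_mb, default=None)

def session_charge(profile, minutes, megabytes):
    """Revenue earned by the provider for one user session."""
    return minutes * profile.price_per_minute + megabytes * profile.price_per_mb

if __name__ == "__main__":
    nearby = [
        AdvertisedProfile("fn-cafe", {"storage", "compute"}, 0.97, 0.02, 0.001),
        AdvertisedProfile("fn-mall", {"compute"}, 0.99, 0.01, 0.002),
    ]
    chosen = negotiate(discover(nearby, {"compute"}))
    cache = {chosen.node_id: chosen}          # device remembers previously seen nodes
    print(chosen.node_id, session_charge(chosen, minutes=30, megabytes=120))
```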

5.2 Pre-agreed contracts with fog providers

This deployment model relies on pre-agreed contracts with operators of specific FNs, negotiated at a set price. Hence, a user would preferentially select particular nodes if multiple choices are found. This also reduces user risk, as security credentials would be included in these pre-agreed contracts and could be configured beforehand (e.g., the use of particular encryption keys). These pre-agreed contracts must comply with service-level objectives (e.g., an availability profile) that the operator needs to meet; it is therefore possible that a fog node operator will outsource some tasks to a cloud provider. The incentive for the provider is to increase the number of potential subscribers by developing pre-agreed contracts. Capacity planning for such FNs depends on accurately predicting potential future demand. In this case, the user agrees to a cost for entering into a contract with a fog provider, and the contract provides preferential access to that provider's fog nodes.
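
The sketch below illustrates how a device might prefer nodes covered by a pre-agreed contract and check the contract's service-level objective before connecting; the contract fields, the availability check, and the fallback are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Contract:
    provider: str
    flat_price: float            # price agreed for the billing period
    min_availability: float      # service-level objective the operator must meet
    encryption_key_id: str       # pre-configured security credential

def choose_node(visible_nodes, contracts):
    """Prefer nodes whose operators hold a pre-agreed contract with the user."""
    contracted = {c.provider: c for c in contracts}
    for node_provider, measured_availability in visible_nodes:
        contract = contracted.get(node_provider)
        if contract and measured_availability >= contract.min_availability:
            return node_provider, contract
    return None, None            # fall back to on-the-spot negotiation or the cloud

if __name__ == "__main__":
    contracts = [Contract("fogco", flat_price=9.99, min_availability=0.99,
                          encryption_key_id="key-42")]
    visible = [("randomfn", 0.93), ("fogco", 0.995)]
    print(choose_node(visible, contracts))      # -> the contracted fogco node
```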

5.3 FNs federation

This deployment model involves multiple FN operators collaborating to share the workload, implying a federation between FNs within a particular geographical area to sustain potential revenue. There would be an agreed cost for sharing the workload with other providers, enabling revenue sharing between them. It is necessary to identify how workload "units" can be characterized to enable such an exchange. This is equivalent to alliances between airline companies, where specialist capability (and capacity) available along a particular route can be shared across multiple operators. In the same way, if an operator deploys specialist GPUs or video analytics capability within an FN at a particular location, other operators could seamlessly make use of this and similarly share other capabilities in other locations. This type of geography-centric specialization could enable localized investment within particular areas by operators.
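
As a rough sketch of such a federation, the code below characterizes workload "units", routes a job to the federated operator that has the required specialist capability, and splits the revenue between the serving operator and the one that accepted the job. The unit definition and the 80/20 split are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class WorkUnit:
    units: int                  # normalized measure of compute demand
    needs: set                  # e.g. {"gpu"} or {"video-analytics"}
    price_per_unit: float

@dataclass
class Operator:
    name: str
    capabilities: set
    earnings: float = 0.0

def dispatch(job, accepting_op, federation, share_for_server=0.8):
    """Route a job inside the federation and split the revenue."""
    server = next((op for op in federation if job.needs <= op.capabilities), None)
    if server is None:
        return None
    revenue = job.units * job.price_per_unit
    server.earnings += revenue * share_for_server
    accepting_op.earnings += revenue * (1 - share_for_server)
    return server.name

if __name__ == "__main__":
    a = Operator("op-a", {"compute"})
    b = Operator("op-b", {"compute", "gpu", "video-analytics"})
    job = WorkUnit(units=50, needs={"video-analytics"}, price_per_unit=0.01)
    print(dispatch(job, accepting_op=a, federation=[a, b]))   # served by op-b
    print(a.earnings, b.earnings)     # op-a keeps ~20%, op-b earns ~80% of the 0.5 revenue
```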

5.4 Fog-cloud exchange

This deployment model involves a user device that is not aware of the existence of any FN. Instead, the user device interacts with a cloud operator, who then attempts to find an FN near the user. The cloud operator therefore needs to keep track of the user's location and discover suitable FN operators that could support the session at that location. In this instance, the cloud operator will always try to complete the user request first; however, if a QoS target is unlikely to be met due to latency constraints, it can outsource the user request to a regional FN. The incentive here is to enable fog-cloud exchange contracts to be negotiated between providers.
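
A minimal sketch of this exchange logic, under assumed latency figures, is shown below: the cloud operator serves the request itself unless the QoS latency target cannot be met, in which case it looks up a regional FN near the user's reported location.

```python
# Estimated round-trip latencies in milliseconds (illustrative numbers).
CLOUD_LATENCY_MS = 120
REGIONAL_FOG = {                 # location -> (fog node id, latency)
    "downtown": ("fn-downtown", 12),
    "airport": ("fn-airport", 18),
}

def handle_request(user_location, qos_target_ms):
    """Cloud operator logic: serve from the cloud if the QoS target allows it;
    otherwise outsource to a fog node near the user."""
    if CLOUD_LATENCY_MS <= qos_target_ms:
        return "cloud"
    node = REGIONAL_FOG.get(user_location)
    if node and node[1] <= qos_target_ms:
        return node[0]           # a fog-cloud exchange contract comes into play here
    return "cloud-best-effort"   # no suitable FN: the cloud serves as well as it can

if __name__ == "__main__":
    print(handle_request("downtown", qos_target_ms=200))  # cloud is fast enough
    print(handle_request("downtown", qos_target_ms=20))   # outsourced to fn-downtown
    print(handle_request("suburb", qos_target_ms=20))     # no FN known: best effort
```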

6. Evaluation of fog computing resources monetization architecture

Fog computing is well known for making cloud processing, storage, and computational ability available at the network edge through fog nodes. Cloud capabilities are thereby moved from distant data centers to fog nodes near the end devices, so high computational power and low latency are achieved simultaneously. The fog architecture plays a large role in fog computing. In most related works, the fog computing architecture is described by the structure shown in Figure 1 [25, 38, 39, 40].

Figure 1.

Abstract model of fog computing.

This structure involves an end device communicating with the fog nodes and the cloud to request or send data for processing or storage. The monetization aspect of this architecture introduces a fourth block, either a third-party operator or a smart contract. Fog node resource monetization varies across the literature. In this work, the architectures are divided into two categories based on the monetization and pricing model employed, namely:

  1. Centralized monetization architecture

  2. Decentralized monetization architecture

7. Centralized monetization architecture

According to [35, 41, 42], the centralized monetization architecture is shown in Figure 2. This architecture comprises the cloud, fog nodes, edge devices, and a third-party payment gateway. The third-party payment gateway is an entity that helps the fog providers receive payment online for the services rendered to end users. This is called a centralized monetization architecture because the fog provider has firm control and authority over the kind of services rendered and determines how the fog services are monetized, irrespective of the QoS provided. The third-party payment gateway implements the pricing and monetization strategies between edge devices and fog nodes, which leads to a subscription-based pricing model [41]. Such a fixed payment is only advantageous when the fog service providers deliver the promised quality of service. In practice, there is usually variation in the quality of service (QoS) that is not reflected in subscription-based pricing models. The promised QoS is not delivered every time the customer accesses the fog service, resulting in mistrust between the fog node provider and the customer. In addition, the service charges of third parties increase the cost of using the fog services for consumers. Because the architecture is centralized and embeds a fixed pricing model, failure to meet the promised quality of service might lead to customer churn, as well as vendor lock-in in situations where it is difficult for customers to migrate to another vendor due to sole dependence on that vendor [35].

Figure 2.

Central monetization architecture of fog nodes.
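
The sketch below captures the essence of this centralized model: a fixed subscription fee, a third-party gateway surcharge added on top, and a price that does not change when the delivered QoS drops below what was promised. The fee values and the gateway rate are illustrative assumptions.

```python
def monthly_charge(subscription_fee, gateway_fee_rate=0.03):
    """Centralized model: the provider fixes the price and a third-party
    payment gateway adds its own surcharge on top."""
    return subscription_fee * (1 + gateway_fee_rate)

def delivered_value(promised_qos, measured_qos):
    """The customer's grievance: the price stays the same even if QoS drops."""
    return min(measured_qos / promised_qos, 1.0)

if __name__ == "__main__":
    fee = monthly_charge(subscription_fee=20.0)        # about 20.60 with the gateway cut
    print(fee)
    # Even in a month where QoS dipped to ~70% of what was promised, the bill is unchanged:
    print(delivered_value(promised_qos=0.99, measured_qos=0.69), fee)
```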

8. Decentralized monetization architecture

The decentralized monetization architecture shown in Figure 3 consists of cloud servers, public fog nodes, edge devices, and a smart contract whose major function is ensuring monetization and a structured, logical revenue exchange between fog nodes and edge devices. In this kind of monetization architecture, the cloud or fog provider has no unilateral control or authority over the monetization and pricing model of the services rendered by the fog nodes. The monetization strategy is shared between the fog service provider and the end user through a smart network, for example, a blockchain. The blockchain network assesses the quality of service the fog devices provide and determines the pricing model to apply between the end user and the fog service provider.

Figure 3.

Decentralized monetization architecture of fog node.

In [35], an Ethereum smart contract implements the monetization. The smart contract is divided into fog-node-provider-only, device-only, and fog-only layers of authorization, and each entity, as the name of the layer suggests, can only access the layers available to it. The edge devices are registered in the smart contract, and each device deposits an initial amount. The interaction between the fog node, the Ethereum smart contract, and the edge devices enables money to be paid to the fog node for services rendered, or refunded by the fog node if there is a breach of trust between the fog nodes and the device. Also, an individual or a particular organization may own a set of fog nodes. These fog nodes generate, curate, and process raw data in a specific geographical area, and the organization may not need all of these processed data. Vega et al. [43] present a way of monetizing these fog nodes by offering the processed data to other fog nodes owned by individuals or organizations that need them. Guevara et al. [24] propose a digital marketplace where fog nodes requiring a specific data set can connect with others to obtain the required data using blockchain and FIWARE technology. The blockchain ensures trust and nonrepudiation between fog nodes during data exchange and implements the pricing model configured in the smart network while data is being exchanged, while the FIWARE technologies ensure the interoperability of data between the connected fog nodes. Furthermore, fog computing depends on joint action between several infrastructure operators and service providers, and managing and operating this shared infrastructure poses a major challenge; resource allocation also remains a major problem for fog computing deployments. A user-participatory fog computing architecture was proposed in [26]. This fog architecture is similar to a Wi-Fi architecture in which users connect to the network with their devices; likewise, in this model, users benefit from the services provided by the fog when they install fog devices into the network, while fog container placement is controlled by fog managers to keep the scheme feasible. After successfully connecting to the network, the user registers the fog devices in the fog portal, which plays an intermediary role between the corresponding resources.

The decentralized fog monetization architecture eliminates the QoS and third-party fee issues faced by the centralized architecture. It employs smart-contract technology, with the monetization and pricing model encoded in algorithms. The interactions between the public fog nodes and the edge devices are made visible to all network members through a public ledger. Trust between the customer and the fog node providers is restored, and the quality of service is tracked at each stage; once it drops below a certain standard, there is a breach of trust between the fog node and the edge device, and a refund is demanded [23]. Fog nodes with repeated trust issues are flagged by the blockchain network and avoided by other fog devices, and their reputation in the system is monitored, which keeps the fog providers in check.
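
To make the escrow flow concrete, the sketch below mimics, in plain Python rather than Solidity, the deposit, QoS tracking, payment, and refund-on-breach logic attributed to the smart contract in [35]; the QoS threshold, the amounts, and the reputation bookkeeping are illustrative assumptions.

```python
class MonetizationContract:
    """Toy stand-in for the on-chain escrow: every call here would be a
    publicly visible transaction on the ledger."""

    def __init__(self, qos_threshold=0.9):
        self.qos_threshold = qos_threshold
        self.deposits = {}        # device -> escrowed amount
        self.reputation = {}      # fog node -> number of trust breaches

    def register_device(self, device, deposit):
        self.deposits[device] = deposit

    def settle(self, device, fog_node, price, measured_qos):
        """Pay the fog node if QoS held up; refund the device otherwise."""
        self.reputation.setdefault(fog_node, 0)
        if measured_qos >= self.qos_threshold:
            self.deposits[device] -= price
            return f"paid {price} to {fog_node}"
        self.reputation[fog_node] += 1           # breach is flagged network-wide
        return f"refund: {fog_node} delivered QoS {measured_qos}"

if __name__ == "__main__":
    c = MonetizationContract()
    c.register_device("sensor-17", deposit=5.0)
    print(c.settle("sensor-17", "fog-node-3", price=0.5, measured_qos=0.95))
    print(c.settle("sensor-17", "fog-node-3", price=0.5, measured_qos=0.60))
    print(c.reputation)            # nodes with repeated breaches can be avoided
```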

9. Challenges

Although there has been vast improvement in ideas related to fog node monetization, some challenges remain to be addressed. To make fog computing a reality, to the extent of the demands postulated, some of the open challenges of fog are listed below:

  1. Fog Networking: The heterogeneous nature of the fog network placed at the Internet edge poses a challenge in managing and controlling services, such as maintaining connectivity between heterogeneous devices. Many vendors and major players currently provide fog networks with silo technologies. This makes interoperability difficult and makes it hard for minor players to contribute to or enter the fog network ecosystem. The convergence of the major providers, like Amazon, Microsoft, Ericsson, etc., toward a common standard would boost the expansion of the fog market. As seen in [31], some bodies are beginning to come together to create common ground for technology convergence. These include the 5G Alliance for Connected Industries and Automation (5G-ACIA), the Automotive Edge Computing Consortium (AECC), the Industrial Internet Consortium, and many others. As noted in [31], while standardization bodies like 3GPP, ETSI, and TM Forum are pushing out standards for fog and IoT, some open-source forums are also contributing to fog convergence, such as the Cloud Native Computing Foundation (CNCF), the Open Network Automation Platform (ONAP), and LF Edge [29]. More research must be conducted to show ways of providing these services more flexibly.

  2. Task Scheduling: Task scheduling is not an easy problem in the fog, because tasks can move between various physical devices such as fog nodes, back-end cloud servers, and client devices. Seamless task scheduling requires efficient communication between the administrative, data, user interface, and many other planes and processes. Caprolu et al. [44] explored similarities with Docker for containerization. Since many fog technologies, especially in the open-source forums [31], are moving toward autonomous distributed systems, more work must be done in designing an efficient system that distributes service requests among the many serving nodes while considering the features of such services. For instance, a pool of IoT devices seeking data offload and analysis must be matched with a fog node that has sufficient compute and storage capacity, and the location of the serving nodes must also be accounted for with respect to latency requirements (a toy sketch of such matching is given after this list).

  3. Management: For fog computing to be feasible, potentially billions of small edge devices and fog nodes must be configured, so the fog will rely heavily on decentralized (scalable) management mechanisms that have yet to be tested at such an unprecedented scale. The technology can only deliver its best when decentralized; like blockchain, FC needs to support technologies such as smart cities, smart agriculture, automotive, and surveillance systems without the control of a central firm. The management plane must be flexible so that new players do not find it difficult to enter the market, and service requests and integrations must be made seamless for end users. Such an ecosystem will lead to a rise in fog technologies and service quality. Some of these issues are partly laid out in [45].

  4. Location of Fog Nodes: Fog nodes are the basic unit of fog computing. As the number of IoT devices increases exponentially, more fog nodes will be needed. The problem becomes: where should these fog nodes be placed to ensure optimal functionality? Many fog node providers leverage telecommunication site locations for fog servers. This is not efficient, as telecom sites are not evenly distributed, and this type of deployment would disenfranchise industries like agriculture that are usually located in rural areas. Fog integration technologies need to make room for user-contributed nodes so that individual users can contribute FNs to locations with a deficiency of FNs. Incentives for node contributions must be lucrative enough to attract sufficient contributions.
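
As a toy illustration of the scheduling problem raised in challenge 2 above, the sketch below matches offload requests to fog nodes by required capacity and a latency bound; real schedulers must additionally handle mobility, pricing, and multi-plane coordination. All numbers and field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ServingNode:
    name: str
    free_capacity: int        # available compute units
    latency_ms: float

@dataclass
class OffloadRequest:
    device: str
    demand: int               # compute units needed
    max_latency_ms: float

def schedule(requests, nodes):
    """Greedy matching: give each request the lowest-latency node that still
    has enough free capacity and meets the request's latency bound."""
    placements = {}
    for req in sorted(requests, key=lambda r: r.max_latency_ms):   # tightest deadline first
        feasible = [n for n in nodes
                    if n.free_capacity >= req.demand and n.latency_ms <= req.max_latency_ms]
        if feasible:
            node = min(feasible, key=lambda n: n.latency_ms)
            node.free_capacity -= req.demand
            placements[req.device] = node.name
        else:
            placements[req.device] = "cloud"      # fall back to the back-end cloud
    return placements

if __name__ == "__main__":
    nodes = [ServingNode("fn-1", free_capacity=10, latency_ms=8),
             ServingNode("fn-2", free_capacity=4, latency_ms=20)]
    reqs = [OffloadRequest("cam-1", demand=6, max_latency_ms=10),
            OffloadRequest("drone-7", demand=3, max_latency_ms=25),
            OffloadRequest("sensor-9", demand=8, max_latency_ms=15)]
    print(schedule(reqs, nodes))   # sensor-9 overflows to the cloud
```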

10. Conclusion

Fog computing has immensely alleviated the challenges faced by edge computing and introduced new possibilities for real-time analysis. The applications of fog computing are vast, ranging from health care and agriculture to sports, housing, and general computation. Fog computing makes up for the deficiencies of IoT networks in computing and storage and reduces the latency of IoT-cloud communication. Monetization of fog computing has been a major challenge, since no single monetization system appears to favor all participants. In this work, we surveyed the characteristics of fog computing, such as the large number of nodes, edge location, and low latency. We briefly explained the working relationships, the importance of each characteristic, and how fog computing brings cloud capabilities to the edge. We also explored the monetization architectures applied in the literature. We found that the decentralized fog monetization architecture stands out, since it solves some of the issues posed by the centralized fog monetization architecture, such as unreflected QoS variations and the additional fees charged by third-party payment gateways.

References

  1. Wright A. The growth in connected IoT devices is expected to generate 79.4ZB of data in 2025, according to a new IDC forecast. Framingham, MA, USA, Rep. PrUS45213219: International Data Corporation (IDC); 2019. Accessed: April 2020. [Online]. Available from: https://www.idc.com/getdoc.jsp?containerId=prUS45213219
  2. Fog Computing and the Internet of Things: Extend the Cloud to Where the Things Are. San Jose, CA, USA, Rep. C11-734435-00: CISCO; 2015. Accessed: April 2019. [Online]. Available from: https://www.cisco.com/c/dam/en_us/solutions/trends/iot/docs/computing-overview.pdf
  3. Santos GL et al. Analyzing the availability and performance of an e-health system integrated with edge, fog and cloud infrastructures. Journal of Cloud Computing. 2018;7(1):16
  4. Do CT, Tran NH, Pham C, Alam MGR, Son JH, Hong CS. A proximal algorithm for joint resource allocation and minimizing carbon footprint in geo-distributed fog computing. In: Proceedings of the International Conference on Information Networking (ICOIN). IEEE; 2015. pp. 324-329. [Online]. Available from: https://ieeexplore.ieee.org/document/7057905
  5. Yu C, Lin B, Guo P, Zhang W, Li S, He R. Deployment and dimensioning of fog computing-based internet of vehicle infrastructure for autonomous driving. IEEE Internet of Things Journal. 2019;6(1):149-160
  6. Loke SW. The internet of flying-things: Opportunities and challenges with airborne fog computing and mobile cloud in the clouds. arXiv:1507.04492. 2015. [Online]. Available from: https://arxiv.org/abs/1507.04492
  7. Vaquero LM, Rodero-Merino L. Finding your way in the fog: Towards a comprehensive definition of fog computing. ACM SIGCOMM Computer Communication Review. 2014;44(5):27-32
  8. Bonomi F, Milito R, Zhu J, Addepalli S. Fog computing and its role in the internet of things. In: Proceedings of the 1st ACM MCC Workshop on Mobile Cloud Computing. ACM; 2012. pp. 13-16. DOI: 10.1145/2342509.2342513. Available from: https://dl.acm.org/doi/10.1145/2342509.2342513
  9. Vahid Dastjerdi A, Gupta H, Calheiros RN, Ghosh SK, Buyya R. Fog computing: Principles, architectures, and applications. arXiv:1601.02752. 2016. Available from: https://arxiv.org/abs/1601.02752
  10. Yi S, Hao Z, Qin Z, Li Q. Fog Computing: Platform and Applications
  11. Farshad F, Bahar F, Mahmoud D, Cesare P. Guest editorial: Special issue on AI-driven IoT data monetization: A transition from value islands to value ecosystems. IEEE Internet of Things Journal. 2022;9(8):5578
  12. Bassi A, Bauer M, Fiedler M, Kramp T, Kranenburg RV, Lange S, et al. Enabling Things to Talk. NY, USA: Springer Heidelberg, Springer Link; 2013. DOI: 10.1007/978-3-642-40403
  13. Ray PP. A survey on internet of things architectures. Journal of King Saud University-Computer and Information Sciences. 2016;30(3):291-319
  14. Ni J, Zhang K, Lin X, Shen XS. Securing fog computing for internet of things applications: Challenges and solutions. IEEE Communications Surveys & Tutorials. 2018;20(1):601-628
  15. Zhou Z, Yu H, Xu C, Chang Z, Mumtaz S, Rodriguez J. BEGIN: Big data enabled energy-efficient vehicular edge computing. IEEE Communications Magazine. 2018;12:82-88
  16. Satyanarayanan M, Bahl P, Caceres R, Davies N. The case for VM-based cloudlets in mobile computing. IEEE Pervasive Computing. 2009;8(4):14-23
  17. Ha K, Satyanarayanan M. OpenStack++ for Cloudlet Deployment. Technical Report. CMU School of Computer Science; 2015. Available from: http://elijah.cs.cmu.edu/DOCS/CMU-CS-15-123.pdf
  18. Satyanarayanan M. The emergence of edge computing. Computer. 2017;50(1):30-39
  19. Satyanarayanan M, Chen Z, Ha K, Hu W, Richter W, Pillai P. Cloudlets: At the leading edge of mobile cloud convergence. In: Proceedings of the 6th International Conference on Mobile Computing, Applications, and Services (MobiCASE'14). Ericsson; 2014. pp. 1-9
  20. Satyanarayanan M, Simoens P, Xiao Y, Pillai P, Chen Z, Ha K, et al. Edge analytics in the internet of things. IEEE Pervasive Computing. 2015;14(2):24-31
  21. Yi S, Li C, Li Q. A survey of fog computing: Concepts, applications and issues. In: Proceedings of the 2015 Workshop on Mobile Big Data (Mobidata '15). ACM; 2015. pp. 37-42
  22. Chiang M. Fog Networking: An Overview on Research Opportunities. arXiv:1601.00835. [Online]. Available from: http://arxiv.org/pdf/1601.00835
  23. Marín-Tordera E, Masip-Bruin X, García-Almiñana J, Jukan A, Ren GJ, Zhu J. Do we all really know what a fog node is? Current trends towards an open definition. Computer Communications. 2017;109:117-130
  24. Guevara JC, Bittencourt LF, da Fonseca NLS. Class of service in fog computing. In: Proceedings of the 2017 IEEE 9th Latin-American Conference on Communications (LATINCOM), Guatemala City, Guatemala. IEEE; 2017. pp. 1-6. DOI: 10.1109/LATINCOM.2017.8240187
  25. Vilalta R, Lopez L, Giorgetti A, Peng S, Orsini V, Velasco L, et al. TelcoFog: A unified flexible fog and cloud computing architecture for 5G networks. IEEE Communications Magazine. 2017;55:36-43
  26. Kim W, Chung S. User-participatory fog computing architecture and its management schemes for improving feasibility. IEEE Access. 2018;6:20262-20278
  27. Kaur M, Bharti M. Securing user data on cloud using fog computing and decoy technique. International Journal of Advance Research in Computer Science and Management Studies. 2014;2(10):104-110
  28. Bonomi F. Connected vehicles, the internet of things, and fog computing. In: The Eighth ACM International Workshop on Vehicular InterNetworking (VANET). Las Vegas, USA: Sigmobile; 2011
  29. ETSI. ETSI and OpenFog Consortium Collaborate on Fog and Edge Applications. 2017. Available from: https://www.etsi.org/newsroom/news/1216-2017-09-news-etsi-and-openfog-consortium-collaborate-on-fog-and-edge-applications
  30. Petri I, Rana OF, Bignell J, Nepal S, Auluck N. Incentivizing resource sharing in edge computing applications. In: Proceedings of the International Conference on the Economics of Grids, Clouds, Systems, and Services (GECON'17). Springer; 2017. pp. 204-215. Available from: https://link.springer.com/chapter/10.1007/978-3-319-68066-8_16
  31. Bravo C, Backstrom H. Edge computing and deployment strategies for communication service providers. White Paper GFMC-20:000097; 2020. Available from: https://www.ericsson.com/en/reports-and-papers/white-papers/edge-computing-and-deployment-strategies-for-communication-service-providers
  32. Badreddine W, Zhang K, Talhi C. Monetization using Blockchains for IoT Data Marketplace. Montreal, Quebec, Canada: Ecole de Technologie Superieure
  33. Poggi A, Tomaiuolo M. Mobile agents: Concepts and technologies. In: Handbook of Research on Mobility and Computing: Evolving Technologies and Ubiquitous Impacts. IGI Global; 2011. pp. 343-355. Available from: https://www.igi-global.com/chapter/mobile-agents-concepts-technologies/50597
  34. Nguyen D-D, Ali MI. Enabling on-demand decentralized IoT collectability marketplace using blockchain and crowdsensing. In: Proceedings of the IEEE. IEEE; 2019
  35. Debe M, Salah K, Rehman MHU, Svetinovic D. Monetization of services provided by public fog nodes using blockchain and smart contracts. IEEE Access. 2020;8:20118-20128
  36. Yu Y, Liu S, Guo L, Yeoh PL, Vucetic B, Li Y. CrowdR-FBC: A distributed fog-blockchains for mobile crowdsourcing reputation management. IEEE Internet of Things Journal. 2020;7(9):8722-8735
  37. Rejiba Z, Masip-Bruin X, Marín-Tordera E. Analyzing the deployment challenges of beacon stuffing as a discovery enabler in fog-to-cloud systems. In: Proceedings of the 2018 European Conference on Networks and Communications (EuCNC), Ljubljana, Slovenia; 2018. pp. 1-276. DOI: 10.1109/EuCNC.2018.8442471. Available from: https://ieeexplore.ieee.org/document/8442471
  38. Zhang Y, Zhang H, Long K, Zheng Q, Xie X. Software-defined and fog-computing-based next generation vehicular networks. IEEE Communications Magazine. 2018;9:34-41
  39. Cao B, Zhang L, Li Y, Feng D, Cao W. Intelligent offloading in multi-access edge computing: A state-of-the-art review and framework. IEEE Communications Magazine. 2019;57(3):56-62
  40. Cui Q, Gong Z, Ni W, Hou Y, Chen X, Tao X, et al. Stochastic online learning for mobile edge computing: Learning from changes. IEEE Communications Magazine. 2019;57(3):63-69
  41. Kim D, Lee H, Song H, Choi N, Yi Y. Economics of fog computing: Interplay among infrastructure and service providers, users, and edge resource owners. IEEE Transactions on Mobile Computing. 2020;19:2609-2622. DOI: 10.1109/tmc.2019.2925797
  42. Li X, Zhang C, Gu B, Yamori K, Tanaka Y. Optimal pricing and service selection in the mobile cloud architectures. IEEE Access. 2019;7:43564-43572. DOI: 10.1109/access.2019.2908223
  43. Vega F, Soriano J, Jimenez M, Lizcano D. A peer-to-peer architecture for distributed data monetization in fog computing scenarios. Wireless Communications and Mobile Computing. 2018;2018:Article ID 5758741. DOI: 10.1155/2018/5758741
  44. Caprolu M, Di Pietro R, Lombardi F, Raponi S. Edge computing perspectives: Architectures, technologies, and open security issues. 2019 IEEE International Conference on Edge Computing. IEEE Access. 2020;8:231825-231847
  45. AbdulKareem KH, Mohammed MA, Gunasekaran SS, Al-Mhiqani MN, Mutlag AA, Mostafa SA, et al. A review of fog computing and machine learning: Concepts, applications, challenges, and open issues. IEEE Access. 2019;7:141274-141308
