Internet of Things and Cloud-Fog-Edge Computing

A special issue of Information (ISSN 2078-2489). This special issue belongs to the section "Internet of Things (IoT)".

Deadline for manuscript submissions: 30 September 2024 | Viewed by 7402

Special Issue Editors

Department of Computer Engineering and Management, Faculty of Engineering, University of Mons, 7000 Mons, Belgium
Interests: internet of things; cloud computing; edge computing; artificial intelligence; internet of medical things; smart farming
Communication Networks Department, University Mohammed V – ENSIAS, Rabat BP 713, Morocco
Interests: parallel and distributed systems; high performance computing; virtualisation; cloud computing
Faculty of Science, Technology and Medicine, University of Luxembourg, L-4364 Esch-sur-Alzette, Luxembourg
Interests: cloud computing; parallel and grid computing; distributed systems & middleware; optimisation techniques
Faculty of Science, Technology and Medicine, University of Luxembourg, L-4364 Esch-sur-Alzette, Luxembourg
Interests: artificial intelligence; machine learning; cloud computing; decision-making; internet of things

Special Issue Information

Dear Colleagues,

The MDPI Journal Information invites submissions to a Special Issue on “Internet of Things and Cloud/Fog/Edge Computing”.

The ever-increasing number of connected objects requires more and more processing resources. Cloud computing has shown its limits, with latency problems and link congestion related to the volume of data to be transferred. To remedy this, some of the processing has been shifted to the intermediate levels between the cloud and the sensors (fog computing) or onto the sensors themselves (edge computing). New challenges have emerged, related to the distribution of processing across the different layers, the need to ensure end-to-end security to protect sensitive data, and the preservation of privacy.

The goal of this Special Issue is to invite high-quality, state-of-the-art research papers that deal with challenging issues in Cloud/Fog/Edge Computing across the different parts of the IoT ecosystem. We solicit original papers of unpublished and completed research that are not currently under review by another conference/journal. Topics of interest include but are not limited to the following:

  • Internet of medical things (IoMT)
  • Mobile edge computing
  • Osmotic computing
  • IoT security
  • Confidential computing
  • Mobile systems and applications
  • Smart communities and ubiquitous systems
  • IoT in healthcare
  • IoT in business and industry
  • IoT for resilient organizations

Papers should be 9–15 pages in length and formatted according to the MDPI template. Complete instructions for authors can be found at: https://www.mdpi.com/journal/information/instructions.

Firm deadline: 30 September 2024

Dr. Olivier Debauche
Dr. Mostapha Zbakh
Prof. Dr. Pascal Bouvry
Dr. Caesar Wu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Information is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • internet of things
  • cloud IoT architecture
  • cloud-fog-edge computing
  • distributed architecture

Published Papers (4 papers)


Research

17 pages, 1117 KiB  
Article
Design of a Meaningful Framework for Time Series Forecasting in Smart Buildings
by Louis Closson, Christophe Cérin, Didier Donsez and Jean-Luc Baudouin
Information 2024, 15(2), 94; https://doi.org/10.3390/info15020094 - 07 Feb 2024
Viewed by 913
Abstract
This paper aims to provide discernment toward establishing a general framework dedicated to data analysis and forecasting in smart buildings. It constitutes an industrial return of experience from a company specializing in IoT, supported by the academic world. With the necessary improvement of energy efficiency, discernment is paramount for facility managers to optimize daily operations and prioritize renovation work in the building sector. Given the scale of buildings and the complexity of Heating, Ventilation, and Air Conditioning (HVAC) systems, artificial intelligence is deemed the cheapest tool holding the highest potential, even if it requires IoT sensors and a deluge of data to establish genuine models. However, the wide variety of buildings, users, and data hinders the development of industrial solutions, as studies of specific buildings often lack relevance for other buildings, possibly with different types of monitored data. The relevance of the modeling can also fade over time, as buildings are dynamic systems evolving with their use. In this paper, we study the forecasting ability of the widely used Long Short-Term Memory (LSTM) network, which is well suited to time series modeling, on an instrumented building. In doing so, we assessed the consistency of its performance across several tasks by comparing it to the no-prediction case, a comparison that is lacking in the literature. The insight provided lets us examine the quality of AI models and of the data needed for forecasting tasks. Finally, we deduced that efficient models and smart choices about data allow meaningful insight into developing time series modeling frameworks for smart buildings. For reproducibility, we also provide our raw data, which come from one “real” smart building, as well as significant information regarding this building. In summary, our research aims to develop a methodology for exploring, analyzing, and modeling data from the smart buildings sector. Based on our experiment on forecasting temperature sensor measurements, we found that a bigger AI model (1) does not always imply a longer training time, (2) can have little impact on accuracy, and (3) that the benefit of using more features is tied to the data processing order. We also observed that providing more data is irrelevant without a deep understanding of the problem physics.
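
As a rough illustration of the kind of window-based LSTM forecasting studied in this article, the sketch below trains a small Keras LSTM on a synthetic temperature-like series and compares it with a naive persistence ("no prediction") baseline. The window length, layer sizes, and synthetic signal are illustrative assumptions, not the authors' setup or data.

```python
# Minimal LSTM forecasting sketch on a synthetic temperature-like series.
# Assumptions: 24-sample windows, one LSTM layer of 32 units, synthetic data.
import numpy as np
import tensorflow as tf

def make_windows(series, window=24):
    """Split a 1-D sensor series into (window -> next value) training pairs."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., np.newaxis], np.array(y)

# Synthetic hourly signal standing in for a real building's temperature sensor.
t = np.arange(24 * 60, dtype=np.float32)
series = 20 + 2 * np.sin(2 * np.pi * t / 24) + 0.3 * np.random.randn(t.size)

X, y = make_windows(series, window=24)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(24, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# "No prediction" baseline: persist the last observed value of each window.
baseline_mse = np.mean((X[:, -1, 0] - y) ** 2)
model_mse = model.evaluate(X, y, verbose=0)
print(f"persistence MSE={baseline_mse:.3f}  LSTM MSE={model_mse:.3f}")
```

Comparing against the persistence baseline mirrors the paper's point that a forecasting model is only meaningful relative to the no-prediction case.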

32 pages, 1146 KiB  
Article
Online Task Scheduling of Big Data Applications in the Cloud Environment
by Laila Bouhouch, Mostapha Zbakh and Claude Tadonki
Information 2023, 14(5), 292; https://doi.org/10.3390/info14050292 - 15 May 2023
Cited by 2 | Viewed by 1410
Abstract
The development of big data has generated data-intensive tasks that are usually time-consuming, with a high demand on cloud data centers for hosting big data applications. It becomes necessary to consider both data and task management to find the optimal resource allocation scheme, which is a challenging research issue. In this paper, we address the problem of online task scheduling combined with data migration and replication in order to reduce the overall response time as well as ensure that the available resources are efficiently used. We introduce a new scheduling technique, named Online Task Scheduling algorithm based on Data Migration and Data Replication (OTS-DMDR). The main objective is to efficiently assign online incoming tasks to the available servers while considering the access time of the required datasets and their replicas, the execution time of the task on different machines, and the computational power of each machine. The core idea is to achieve better data locality by performing an effective data migration while handling replicas. As a result, the overall response time of the online tasks is reduced, and the throughput is improved with enhanced machine resource utilization. To validate the performance of the proposed scheduling method, we run in-depth simulations with various scenarios, and the results show that our proposed strategy performs better than the other existing approaches. In fact, it reduces the response time by 78% when compared to the First Come First Served scheduler (FCFS), by 58% compared to Delay Scheduling, and by 46% compared to the technique of Li et al. Consequently, the present OTS-DMDR method is very effective and convenient for the problem of online task scheduling.
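
The following sketch illustrates, in a highly simplified form, the kind of locality-aware online placement that OTS-DMDR builds on: each incoming task goes to the machine minimizing an estimated (data access + execution) time, and a replica of the dataset is kept where the task ran. The cost model, machine parameters, and task sizes are assumptions for illustration, not the algorithm evaluated in the paper.

```python
# Simplified locality-aware online task placement sketch (not OTS-DMDR itself).
from dataclasses import dataclass, field

@dataclass
class Machine:
    name: str
    power: float                      # relative compute speed
    bandwidth: float                  # MB/s for fetching a remote dataset
    datasets: set = field(default_factory=set)
    busy_until: float = 0.0           # time at which the machine becomes free

@dataclass
class Task:
    name: str
    work: float                       # abstract compute units
    dataset: str
    dataset_size: float               # MB

def estimated_finish(machine, task, now):
    """Estimated completion time: queueing + data transfer + execution."""
    transfer = 0.0 if task.dataset in machine.datasets else task.dataset_size / machine.bandwidth
    start = max(now, machine.busy_until)
    return start + transfer + task.work / machine.power

def schedule(machine_pool, task, now):
    """Greedy online assignment to the machine with the earliest estimated finish."""
    best = min(machine_pool, key=lambda m: estimated_finish(m, task, now))
    finish = estimated_finish(best, task, now)
    best.busy_until = finish
    best.datasets.add(task.dataset)   # keep a replica locally for later tasks
    return best, finish

machines = [Machine("m1", power=2.0, bandwidth=100, datasets={"d1"}),
            Machine("m2", power=1.0, bandwidth=200, datasets={"d2"})]
tasks = [Task("t1", 50, "d1", 500), Task("t2", 30, "d2", 800), Task("t3", 40, "d1", 500)]
for i, task in enumerate(tasks):
    m, f = schedule(machines, task, now=float(i))
    print(f"{task.name} -> {m.name}, estimated finish {f:.1f}")
```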

13 pages, 573 KiB  
Article
Security Verification of an Authentication Algorithm Based on Verifiable Encryption
by Maki Kihara and Satoshi Iriyama
Information 2023, 14(2), 126; https://doi.org/10.3390/info14020126 - 15 Feb 2023
Viewed by 1369
Abstract
A new class of cryptosystems called verifiable encryption (VE) that facilitates the verification of two plaintexts without decryption was proposed in our previous paper. The main contributions of our previous study include the following. (1) Certain cryptosystems such as the one-time pad belong to the VE class. (2) We constructed an authentication algorithm for unlocking local devices via a network that utilizes the property of VE. (3) As a result of implementing the VE-based authentication algorithm using the one-time pad, the encryption, verification, and decryption processing times are less than 1 ms even with a text length of 8192 bits. All the personal information used in the algorithm is protected by Shannon’s perfect secrecy. (4) The robustness of the algorithm against man-in-the-middle attacks and plaintext attacks was discussed. However, the discussion of the security of the algorithm was insufficient from the following two perspectives: (A) its robustness against other theoretical attacks such as ciphertext-only, known-plaintext, chosen-plaintext, adaptive chosen-plaintext, chosen-ciphertext, and adaptive chosen-ciphertext attacks was not discussed; (B) a formal security analysis using security verification tools was not performed. In this paper, we analyze the security of the VE-based authentication algorithm by discussing its robustness against the above theoretical attacks and by validating the algorithm using a security verification tool. These security analyses show that known attacks are ineffective against the algorithm.
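
As a purely didactic illustration of the one-time-pad property that places it in the VE class, the sketch below checks whether two plaintexts are equal by comparing their ciphertexts, without any decryption. It is not the authentication algorithm analyzed in the paper; the messages and pad size are arbitrary examples.

```python
# Toy illustration: under the same one-time pad, equal plaintexts yield equal
# ciphertexts, so equality can be verified without decrypting anything.
import secrets

def otp_encrypt(message: bytes, pad: bytes) -> bytes:
    """XOR the message with a pad at least as long as the message."""
    assert len(pad) >= len(message)
    return bytes(m ^ k for m, k in zip(message, pad))

pad = secrets.token_bytes(32)                               # shared one-time pad
c_stored  = otp_encrypt(b"my device passphrase", pad)       # enrolled credential
c_attempt = otp_encrypt(b"my device passphrase", pad)       # authentication attempt
c_wrong   = otp_encrypt(b"another passphrase!!", pad)

# Verification compares ciphertexts directly; plaintexts are never recovered.
print(c_stored == c_attempt)   # True  -> same plaintext
print(c_stored == c_wrong)     # False -> different plaintext
```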

Review

21 pages, 445 KiB  
Review
Literature Review: Clinical Data Interoperability Models
by Rachida Ait Abdelouahid, Olivier Debauche, Saïd Mahmoudi and Abdelaziz Marzak
Information 2023, 14(7), 364; https://doi.org/10.3390/info14070364 - 27 Jun 2023
Cited by 1 | Viewed by 2291
Abstract
A medical entity (hospital, nursing home, rest home, revalidation center, etc.) usually includes a multitude of information systems that allow for quick decision-making close to the medical sensors. The Internet of Medical Things (IoMT) is an area of IoT that generates a lot of data of different natures (radiography, CT scans, medical reports, medical sensor data). However, these systems need to share and exchange medical information in a seamless, timely, and efficient manner with systems that are either within the same entity or in other healthcare entities. The lack of inter- and intra-entity interoperability causes major problems in the analysis of patient records and leads to additional financial costs (e.g., redone examinations). Developing a medical data interoperability architecture model that allows providers and the different actors in the medical community to exchange patient summary information with other caregivers and partners, and thereby improve the quality of care, the level of data security, and the efficiency of care, first requires taking stock of the state of knowledge. This paper discusses the challenges faced by medical entities in sharing and exchanging medical information seamlessly and efficiently. It highlights the need for inter- and intra-entity interoperability to improve the analysis of patient records, reduce financial costs, and enhance the quality of care. The paper reviews existing solutions proposed by various researchers and identifies their limitations. The analysis of the literature has shown that the HL7 FHIR standard is particularly well adapted for exchanging and storing health data, while DICOM, CDA, and JSON can be converted to HL7 FHIR, or HL7 FHIR to these formats, for interoperability purposes. This approach covers almost all use cases.
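
For readers unfamiliar with HL7 FHIR, the sketch below builds a minimal FHIR (R4) Observation resource for a body-temperature reading, the kind of IoMT payload an interoperability layer might exchange or convert from DICOM or CDA. The identifiers, codes, and field values are illustrative assumptions and should be checked against the FHIR specification before use.

```python
# Minimal sketch of an HL7 FHIR Observation for an IoMT temperature reading.
import json

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "8310-5",               # LOINC code for body temperature
            "display": "Body temperature",
        }]
    },
    "subject": {"reference": "Patient/example"},   # illustrative patient reference
    "effectiveDateTime": "2023-06-27T10:30:00Z",
    "valueQuantity": {
        "value": 37.2,
        "unit": "degrees Celsius",
        "system": "http://unitsofmeasure.org",
        "code": "Cel",                      # UCUM code for Celsius
    },
}

# Serialized as JSON, the same resource can be exchanged over a FHIR REST API
# or produced by a conversion layer from DICOM or CDA payloads.
print(json.dumps(observation, indent=2))
```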
