Editorial

Emerging Technologies for Next-Generation Applied Science Systems

Department of Software and Communication Engineering, Hongik University, Sejong 30016, Korea
Appl. Sci. 2022, 12(4), 1801; https://doi.org/10.3390/app12041801
Submission received: 15 December 2021 / Accepted: 16 December 2021 / Published: 9 February 2022
Over the past decade, Information Technology has advanced at a pace few other industries can match. In the network area, for example, even though 5G has not yet been fully commercialized and deployed, research is already moving toward 6G, which targets speeds roughly 50 times faster than 5G communications. Likewise, advances in intelligent technologies such as Artificial Intelligence, Machine Learning, and Deep Learning are expected to significantly change and enrich our lives. As the computing capability of small devices increases and massive amounts of information are generated, accurate prediction, estimation, and decision making become possible anywhere.
Even as these technologies mature, people and industries continue to pursue better technologies for the next generation. Indeed, pursuing such technologies and providing milestones for the future is the role of researchers.
To identify the direction of next-generation technologies and to give researchers an opportunity to interact with one another, an international academic conference, the Ninth International Conference on Green and Human Information Technology (ICGHIT 2021), was held.
ICGHIT 2021 was held on 13–15 January 2021 on Jeju Island, Korea. The conference was a unique global venue for researchers, industry professionals, and academics interested in the latest developments in green and human information technology, and it formed a platform to advance green technology and human-related IT in an interdisciplinary manner. The conference included plenary sessions, technical sessions, and workshops with special sessions. The topics included, but were not limited to, the following: green information technology and energy-saving green computing; green IT convergence and applications; communication and IoT, including communications, optical networks, ad hoc and visible light communication, M2M/IoT, sensor networks, and ubiquitous computing; network security, including wireless and mobile security, Internet of Things security, applied cryptography, and security in big data and cloud computing; multimedia and signal processing, including smart media technology, speech and signal processing, and computer vision and image processing; control and intelligent systems, including automatic control, neural networks and fuzzy systems, and artificial intelligence; HCI, intelligent robotics and transportation, HRI, brain science, and bioengineering; and SW/HW design, architecture, and development, including architecture and protocols, sustainable sensor networks, information-centric sensor networks, blockchain-based secure sensor networks, AI-based self-evolving sensor networks, sensor/RFID circuits and design, system on a chip (SoC), and IC systems for communication.
The editorial board of ICGHIT 2021 selected six high-quality papers from among 125 submissions. The selected papers were required to be extended by at least 50% relative to their conference versions, and after an extensive peer-review process, including second and even third rounds of revision, they were finally published in this Special Issue. The six published papers are briefly introduced below.
Although many researchers have recently been trying to push artificial intelligence to the edge to fully realize the potential of the fog computing paradigm, this effort is still at an early stage. Owing to the extensive use of Internet of Things (IoT) systems, massive amounts of data are being generated, and more and better intelligent services are required to analyze and mine these data. As a consequence, compute-intensive applications such as Deep Learning (DL) and low-latency processing are in demand. To meet these requirements, network architectures that locate computing devices close to users have been studied. As part of this line of research, the paper titled “A Fog Computing Architecture with Multi-Layer for Computing-Intensive IoT Applications” [1] proposes a multi-layered architecture to perform latency-sensitive analysis on IoT data coming from smart IoT-based devices, such as intelligent CCTVs. The three tiers consist of cloud computing, edge-fog computing, and sensors, which work together with one another. The first layer (the bottom of the architecture), named the “physical layer”, comprises a set of IoT devices, including various sensors, which collect data and send them to edge-fog gateways via offloading. In the second layer, consisting of edge gateways, the arriving data are filtered and pre-processed, and 30–70% of meaningless data (according to data analytics) is deleted. This process reduces the data-transmission burden and increases the data-analysis processing speed. The next layer is composed of edge-fog servers, where the data are distributed among the various edge devices according to their computation requirements in order to reduce latency in real-time scenarios. The proposed architecture was evaluated with a surveillance-camera scenario in iFogSim, and the results showed that it significantly improves network performance compared to a cloud-based system. A minimal sketch of this layered filter-and-distribute flow is given below.
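The paper does not publish reference code, so the following is only a conceptual sketch, in Python, of the filter-then-distribute idea described above: sensor readings are filtered at a hypothetical edge gateway (dropping records flagged as meaningless) and the remainder is assigned to fog servers by a simple load heuristic. All class and function names are illustrative assumptions, not part of the published architecture.

```python
import random
from dataclasses import dataclass

@dataclass
class Reading:
    device_id: str
    payload_kb: int
    meaningful: bool          # would come from lightweight analytics in practice

class EdgeGateway:
    """Hypothetical second-layer gateway: filters and pre-processes IoT data."""
    def filter(self, readings):
        # Keep only records flagged as meaningful (the paper reports that
        # roughly 30-70% of raw data can be discarded at this stage).
        return [r for r in readings if r.meaningful]

class FogLayer:
    """Hypothetical third layer: distributes work across edge-fog servers."""
    def __init__(self, n_servers):
        self.load = [0] * n_servers   # accumulated cost per server

    def dispatch(self, reading):
        # Least-loaded assignment as a stand-in for the paper's
        # computation-requirement-aware distribution policy.
        target = min(range(len(self.load)), key=lambda i: self.load[i])
        self.load[target] += reading.payload_kb
        return target

if __name__ == "__main__":
    random.seed(0)
    raw = [Reading(f"cctv-{i}", random.randint(10, 200), random.random() > 0.5)
           for i in range(20)]
    kept = EdgeGateway().filter(raw)
    fog = FogLayer(n_servers=3)
    for r in kept:
        print(r.device_id, "-> fog-server", fog.dispatch(r))
    print("per-server load (KB):", fog.load)
```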
As mentioned above, since tremendous amounts of data are being generated, not only analyzing the data but also preserving their privacy has received considerable attention. As a result, privacy-preserving data publishing (PPDP) is being actively researched to provide methods to publish data while preserving privacy. Most existing PPDP algorithms follow a process in which the data publisher anonymizes the requested data containing personal information offline and then publishes them. However, anonymizing data during the query-processing phase adds significant overhead and consequently degrades query performance. To enhance the performance of PPDP, Kim [2] proposes a novel method to efficiently anonymize query results online. The proposed algorithm is composed of three steps. In the first step, given a query, the master node estimates the generalization level of each quasi-identifier attribute needed to satisfy the k-anonymity property over the query result datasets and sends it to each slave node along with the user query. In the second step, each slave node executes the user query, anonymizes its query results based on the generalization information received from the master node, and sends the anonymized query results to the master node. In the final step, the master node aggregates the anonymized query results from all slave nodes and returns the aggregated results to the user. The proposed approach effectively estimates the generalization level of each attribute for achieving the k-anonymity property in the query result datasets based on statistical information. The method was evaluated through extensive experiments, and the results show that significant processing-time gains were achieved while avoiding a significant loss of information in the released microdata. A toy illustration of the generalization step is sketched after this paragraph.
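As a rough illustration only, and not the algorithm from [2], the sketch below shows how a numeric quasi-identifier (here a hypothetical `age` column) can be generalized into progressively wider ranges until every group contains at least k records, which is the kind of generalization-level decision the master node makes before the slave nodes anonymize their partial results.

```python
from collections import Counter

def generalize(ages, width):
    """Map each age to a range label of the given width, e.g. 34 -> '30-39'."""
    lows = [(a // width) * width for a in ages]
    return [f"{b}-{b + width - 1}" for b in lows]

def smallest_k_anonymous_width(ages, k, widths=(1, 5, 10, 20, 50)):
    """Pick the smallest generalization width whose every group has >= k rows.

    This mimics (very loosely) estimating a per-attribute generalization
    level; the real method in [2] relies on statistics kept at the master node.
    """
    for w in widths:
        groups = Counter(generalize(ages, w))
        if min(groups.values()) >= k:
            return w
    return widths[-1]          # fall back to the coarsest level

if __name__ == "__main__":
    ages = [21, 23, 25, 34, 35, 36, 55, 58, 59]
    w = smallest_k_anonymous_width(ages, k=3)
    print("chosen width:", w)                    # 10 for this toy dataset
    print("anonymized column:", generalize(ages, w))
```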
As with the previous work, the privacy and security of health and medical records, known as electronic health records (EHRs), have also received attention from the research community. Blockchain-based EHR management systems are one solution for securing EHRs. However, since most blockchain systems are operated by outsourced companies, EHRs may be leaked to those companies. To resolve this issue, Park et al. [3] proposed a blockchain-based EHR management scheme with proxy re-encryption. The proposed method enables a proxy server to re-encrypt ciphertext between file servers to address EHR-sharing issues. In addition, this method prevents outsourced companies from manipulating the server or accessing the records. The sketch below illustrates the general proxy re-encryption pattern on which such schemes rely.
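This is not the cryptographic construction used in [3]; it is only a deliberately insecure toy sketch of the general proxy re-encryption pattern, in which a proxy transforms a ciphertext from the owner's key to a recipient's key without ever seeing the plaintext. Real schemes use proper public-key constructions; here XOR key-wrapping stands in purely to show the data flow, and every name is hypothetical.

```python
import os

KEY_LEN = 32

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# --- Data owner (e.g., hospital) -------------------------------------------
owner_key   = os.urandom(KEY_LEN)
data_key    = os.urandom(KEY_LEN)                 # key protecting the EHR file
wrapped_key = xor_bytes(data_key, owner_key)      # stored next to the ciphertext

# --- Re-encryption key handed to the proxy ---------------------------------
recipient_key = os.urandom(KEY_LEN)               # e.g., another clinic
re_key        = xor_bytes(owner_key, recipient_key)

# --- Proxy server: transforms the wrapped key WITHOUT learning data_key ----
rewrapped_key = xor_bytes(wrapped_key, re_key)    # now wrapped under recipient_key

# --- Recipient unwraps and can then decrypt the shared EHR -----------------
recovered = xor_bytes(rewrapped_key, recipient_key)
print("recipient recovered the data key:", recovered == data_key)
```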
Although there are many file encryption/decryption algorithms, they still face threats such as hacking. To explore a stronger approach, Ko et al. [4] propose a quantum-gate-based Advanced Encryption Standard (AES) algorithm, that is, an implementation of the AES algorithm using quantum computing. The proposed method was compared with a conventional AES-128 algorithm on files of various sizes. The evaluations were performed in Qiskit, a quantum computing platform, and the results showed that the quantum-computing-based implementation requires approximately 7518 gates and 64 qubits per iteration. Since no sufficiently capable quantum computer exists yet, the performance measurements are not directly meaningful in practice; rather, the paper demonstrates the feasibility of implementing a quantum-computing-based AES algorithm. A small Qiskit fragment illustrating the flavor of such circuits is shown below.
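The paper's full circuit is not reproduced here; as a hedged illustration only, the fragment below builds the one AES step that maps most directly onto quantum gates, AddRoundKey, which is a bitwise XOR and therefore becomes a layer of CNOT gates from the key register onto the state register. The 8-bit width and register names are arbitrary choices for the example, not values from [4].

```python
from qiskit import QuantumCircuit, QuantumRegister

N = 8                                  # toy width; real AES-128 state is 128 bits
state = QuantumRegister(N, "state")    # holds the (toy) plaintext/state bits
key   = QuantumRegister(N, "key")      # holds the (toy) round-key bits
qc = QuantumCircuit(state, key)

# Load a fixed state byte 0b10110010 and key byte 0b01101100 with X gates.
for i, bit in enumerate(reversed("10110010")):
    if bit == "1":
        qc.x(state[i])
for i, bit in enumerate(reversed("01101100")):
    if bit == "1":
        qc.x(key[i])

# AddRoundKey: state <- state XOR key, one CNOT per bit position.
for i in range(N):
    qc.cx(key[i], state[i])

print("qubits:", qc.num_qubits)        # 16 for this toy example
print("gate counts:", qc.count_ops())  # e.g. {'cx': 8, 'x': ...}
```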
In research on future networking and the Internet, several new paradigms have been proposed, such as Information-Centric Networking (ICN), Software-Defined Networking (SDN), P4, osmotic computing, and Computing in the Network (COIN). Among them, SDN is already implemented and used for managing enterprise networks, and P4 is approaching real-world deployment. ICN, considered a paradigm that could replace IP-based networks, is also being actively studied with a view to realistic implementations. This Special Issue contains two papers on ICN.
The first ICN-related paper, authored by Ahmad et al. [5], applies the ICN paradigm to 5G-enabled Tactile Internet services and proposes a method to reduce the latency of the routing process in ICN. The Tactile Internet is defined by the International Telecommunication Union (ITU) as an internet network that combines ultra-low latency with extremely high availability, reliability, and security. The paper formulates the ICN routing problem as a Markov decision process (MDP) and adopts a Q-learning algorithm to explore and exploit the different routing paths within the ICN infrastructure. The proposed method was compared with a random-routing protocol and a history-aware routing protocol (HARP), and the results show that it reduces delay by 33.33% and 33.69% relative to random routing and HARP, respectively. A toy tabular Q-learning routing loop is sketched below.
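The following is not the formulation from [5] but a minimal tabular Q-learning sketch on a tiny hypothetical topology, where states are routers, actions are next hops, and the (negative) reward is the per-hop delay, just to make the MDP framing of routing concrete.

```python
import random

# Hypothetical topology: per-link delay in ms; 'D' is the content source.
DELAY = {
    ("A", "B"): 4, ("A", "C"): 2,
    ("B", "D"): 1, ("C", "D"): 1,
    ("B", "C"): 3, ("C", "B"): 3,
}
NEIGHBORS = {"A": ["B", "C"], "B": ["D", "C"], "C": ["D", "B"], "D": []}

ALPHA, GAMMA, EPS, EPISODES = 0.5, 0.9, 0.2, 2000
Q = {(s, a): 0.0 for s in NEIGHBORS for a in NEIGHBORS[s]}

random.seed(1)
for _ in range(EPISODES):
    state = "A"
    while state != "D":
        actions = NEIGHBORS[state]
        # epsilon-greedy choice of next hop
        if random.random() < EPS:
            action = random.choice(actions)
        else:
            action = max(actions, key=lambda a: Q[(state, a)])
        reward = -DELAY[(state, action)]          # minimize delay
        nxt = action
        future = max((Q[(nxt, a)] for a in NEIGHBORS[nxt]), default=0.0)
        Q[(state, action)] += ALPHA * (reward + GAMMA * future - Q[(state, action)])
        state = nxt

# Greedy path after training: expected A -> C -> D under these delays.
path, node = ["A"], "A"
while node != "D":
    node = max(NEIGHBORS[node], key=lambda a: Q[(node, a)])
    path.append(node)
print("learned path:", " -> ".join(path))
```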
The other ICN-related paper in this Special Issue proposes a solution to the producer mobility problem in mobile ICN networks. Producer mobility is one of the well-known issues in ICN: once a producer moves and changes its location while providing content, the content-forwarding mechanism of ICN incurs large delays and inefficiencies. Hussaini et al. [6] propose a method to support producer mobility while reducing network overhead. The routing prefix stored in the FIB of intermediate routers becomes outdated when the producer moves, so a mechanism to update the FIB with the mobile producer's new prefix is required. For this purpose, the authors introduce a new packet, named the Mobility Interest packet, which modifies the current Interest packet by adding fields that carry the new routing prefix and the mobility status, together with a broadcasting strategy for this packet. With the proposed method, the conventional triangular routing path is replaced by the shortest path from the producer to a consumer, reducing latency and overhead. The simulation results show that, compared to the conventional method, the proposed method reduces the signaling overhead by 25% and increases throughput by up to 75%. A toy FIB-update sketch follows.
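For illustration only, and without reproducing the exact packet format from [6], the sketch below models routers with a tiny FIB and shows how a hypothetical Mobility Interest carrying a prefix and mobility flag could be flooded so that stale next-hop entries are replaced by entries pointing toward the producer's new attachment point. Field and class names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class MobilityInterest:
    """Toy stand-in for the Mobility Interest packet: an Interest extended
    with the producer's routing prefix and a mobility-status flag."""
    prefix: str
    mobile: bool = True

@dataclass
class Router:
    name: str
    faces: list                                  # neighboring router names
    fib: dict = field(default_factory=dict)      # prefix -> next-hop face

    def receive(self, pkt, from_face, network, seen):
        if (self.name, pkt.prefix) in seen:      # suppress duplicate broadcasts
            return
        seen.add((self.name, pkt.prefix))
        if pkt.mobile:
            # Point the FIB entry back toward wherever the update came from,
            # i.e., toward the producer's new attachment point.
            self.fib[pkt.prefix] = from_face
        for f in self.faces:                     # keep flooding the update
            if f != from_face:
                network[f].receive(pkt, self.name, network, seen)

if __name__ == "__main__":
    net = {
        "R1": Router("R1", ["R2", "R3"], {"/video": "R2"}),   # stale entry
        "R2": Router("R2", ["R1"],       {"/video": "oldAP"}),
        "R3": Router("R3", ["R1"],       {}),
    }
    # The producer re-attaches behind R3 and announces its prefix.
    net["R3"].receive(MobilityInterest("/video"), "newAP", net, set())
    for r in net.values():
        print(r.name, r.fib)   # R1 now points to R3, R2 to R1, R3 to newAP
```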

Funding

This research received no external funding.

Acknowledgments

The author expresses his appreciation for the work of the anonymous reviewers and the Applied Sciences editorial team, and for their cooperation, suggestions, and advice.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Muneeb, M.; Ko, K.-M.; Park, Y.H. A Fog Computing Architecture with Multi-Layer for Computing-Intensive IoT Applications. Appl. Sci. 2021, 11, 11585.
  2. Kim, J.W. Efficiently Supporting Online Privacy-Preserving Data Publishing in a Distributed Computing Environment. Appl. Sci. 2021, 11, 10740.
  3. Park, Y.H.; Kim, Y.; Lee, S.-O.; Ko, K. Secure Outsourced Blockchain-Based Medical Data Sharing System Using Proxy Re-Encryption. Appl. Sci. 2021, 11, 9422.
  4. Ko, K.-K.; Jung, E.-S. Development of Cybersecurity Technology and Algorithm Based on Quantum Computing. Appl. Sci. 2021, 11, 9085.
  5. Ahmad, H.; Islam, M.Z.; Ali, R.; Haider, A.; Kim, H. Intelligent Stretch Optimization in Information Centric Networking-Based Tactile Internet Applications. Appl. Sci. 2021, 11, 7351.
  6. Hussaini, M.; Naeem, M.A.; Kim, B.-S. OPMSS: Optimal Producer Mobility Support Solution for Named Data Networking. Appl. Sci. 2021, 11, 4064.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
