Federated Learning: Challenges, Applications and Future

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Computer Science & Engineering".

Deadline for manuscript submissions: closed (31 March 2022) | Viewed by 9705

Special Issue Editor


Dr. Manzoor Ahmed Khan
Guest Editor
1. College of Information Technology, United Arab Emirates University, Abu Dhabi 15551, United Arab Emirates
2. Network and Mobility Competence Center at DAI Labor, Technical University of Berlin, Berlin, Germany
Interests: machine learning; autonomous driving; future mobile networks

Special Issue Information

Dear Colleagues,

The rise of big data has led businesses to adopt a data-driven approach to decision making. Such an approach empowers companies to discover unique insights about their customers, which can be intelligently utilized to provide better services and user experiences. However, not all “valuable” data can be captured by companies and analyzed in the traditional manner, namely in a centralized location. Most “valuable” data are private and can rarely be shared by their owners. Moreover, these data are generated in a distributed fashion on users’ mobile phones or IoT devices. Examples of such private data are users’ financial transactions, patients’ health records, or footage from a street camera. To retain the data-driven approach while respecting the non-negotiable privacy-preservation requirement, federated learning (FL) was introduced by Google in 2016. The idea behind FL is to train a model collaboratively among distributed actors without sharing the data or violating privacy agreements. Federated learning is one of the key enabling technologies of future intelligent applications in domains such as autonomous driving, smart manufacturing, and healthcare.
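
To make this idea concrete, the following is a minimal sketch of one communication round of the canonical FedAvg protocol: each client runs a few steps of training on its own private data, and only the resulting model parameters and sample counts (never the raw data) are sent to the server for weighted averaging. The logistic-regression model, synthetic data, and hyperparameters are illustrative placeholders, not a reference implementation.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: plain logistic-regression SGD on private data."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid
        grad = X.T @ (preds - y) / len(y)      # gradient of the log-loss
        w -= lr * grad
    return w, len(y)                           # only weights and sample count leave the device

def fedavg_round(global_weights, clients):
    """Server step: sample-size-weighted average of the client models (FedAvg)."""
    updates = [local_update(global_weights, X, y) for X, y in clients]
    total = sum(n for _, n in updates)
    return sum(n * w for w, n in updates) / total

# Toy usage: three clients with private synthetic data, ten communication rounds.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(50, 3)), rng.integers(0, 2, size=50)) for _ in range(3)]
w_global = np.zeros(3)
for _ in range(10):
    w_global = fedavg_round(w_global, clients)
```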

This Special Issue aims to bring together the solutions and enabling technologies that help realize the vision of future distributed AI applications through federated learning, to provide the community with an overview of the current state of the art, to highlight the open challenges, and to study the envisioned ecosystems and the integration of heterogeneous technologies. Potential topics include, but are not limited to, the following:

  • Implementation challenges of FL in eHealth, autonomous driving, or manufacturing;
  • Definitions of envisioned future use cases or applications enabled by FL (and 6G);
  • The interplay between 6G and FL to enable distributed intelligence;
  • Shortcomings of FL in preserving privacy (or possible new attacks);
  • Application of blockchain technology or differential privacy to support the FL process;
  • Algorithms/experiments/prototypes for tackling data or system heterogeneity, or both, in FL;
  • Algorithms/experiments/prototypes for tackling resource allocation in FL;
  • Communication efficiency in FL processes;
  • Model convergence in FL, challenges, and possible solutions;
  • Dynamic orchestration of FL (dynamic controlling of FL training process);
  • Applications of deep reinforcement learning and meta-learning to support the FL process;
  • Variants of FedAvg, their focus, approach, and shortcomings (a minimal example of one such variant is sketched after this list);
  • The architectural paradigm shift from existing traditional centralized applications to distributed FL-enabled ones. The shift can include infrastructural change, data flow change, regulations, risk assessment, and the business model. This can be explained for a specific industry application or use case by describing and recommending a step-by-step migration to the new paradigm;
  • Impact of FL on privacy regulations, data owners, and role of businesses (emergence of new business models or opportunities?).
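
On the FedAvg-variants topic above, one widely cited example is FedProx, which keeps FedAvg's server-side averaging but adds a proximal term (mu/2)·||w − w_global||² to each client's local objective to tolerate system and statistical heterogeneity. The following is a hedged sketch of such a local update for a logistic-regression client; it could replace the plain local update in the FedAvg sketch above, and all hyperparameters are illustrative.

```python
import numpy as np

def fedprox_local_update(weights, X, y, lr=0.1, epochs=5, mu=0.01):
    """FedProx-style local step: FedAvg's local SGD plus a proximal term that
    penalises drift from the current global model (mu = 0 recovers FedAvg)."""
    w_global = weights.copy()
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))
        grad = X.T @ (preds - y) / len(y)
        grad += mu * (w - w_global)   # gradient of (mu/2) * ||w - w_global||^2
        w -= lr * grad
    return w, len(y)
```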

Dr. Manzoor Ahmed Khan
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (3 papers)


Research

13 pages, 1132 KiB  
Article
Federated Learning with Dynamic Model Exchange
by Hannes Hilberger, Sten Hanke and Markus Bödenler
Electronics 2022, 11(10), 1530; https://doi.org/10.3390/electronics11101530 - 11 May 2022
Cited by 3 | Viewed by 2893
Abstract
Large amounts of data are needed to train accurate, robust machine learning models, but the acquisition of these data is complicated by strict regulations. While many business sectors often have unused data silos, researchers face the problem of not being able to obtain large amounts of real-world data. This is especially true in the healthcare sector, since transferring these data is often associated with bureaucratic overhead because of, for example, increased security requirements and privacy laws. Federated Learning is intended to circumvent this problem and allow training to take place directly on the data owner’s side without sending the data to a central location such as a server. Currently, several frameworks exist for this purpose, such as TensorFlow Federated, Flower, or PySyft/PyGrid. These frameworks define models for both the server and the client, since the coordination of the training is performed by a server. Here, we present a practical method that introduces a dynamic exchange of the model, so that the model is not statically stored in the source code. During this process, the model architecture and training configuration are defined by the researchers and sent to the server, which passes the settings to the clients. In addition, the model is transformed by the data owner to incorporate Differential Privacy. To compare federated with centralised learning and to assess the impact of Differential Privacy, performance and security evaluation experiments were conducted. It was found that Federated Learning can achieve results on par with centralised learning and that the use of Differential Privacy can improve the robustness of the model against Membership Inference Attacks in an honest-but-curious setting.
(This article belongs to the Special Issue Federated Learning: Challenges, Applications and Future)
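
As a rough illustration of the dynamic model exchange described in the abstract, where the model architecture and training configuration are defined by the researcher and distributed to clients rather than hard-coded in the client source, a client could rebuild the network from a received specification. The JSON-like spec format and layer vocabulary below are assumptions of this sketch, not the authors' protocol or the APIs of TensorFlow Federated, Flower, or PySyft/PyGrid, and the Differential Privacy transformation is omitted.

```python
import torch.nn as nn

# Hypothetical model specification a researcher would send to the server,
# which forwards it to each client; field names are illustrative only.
model_spec = {
    "layers": [
        {"type": "linear", "in": 30, "out": 64},
        {"type": "relu"},
        {"type": "linear", "in": 64, "out": 2},
    ],
    "training": {"lr": 0.01, "epochs": 5, "batch_size": 32},
}

def build_model(spec):
    """Client side: reconstruct the network from the received spec instead of
    keeping a statically defined model in the client source code."""
    layers = []
    for layer in spec["layers"]:
        if layer["type"] == "linear":
            layers.append(nn.Linear(layer["in"], layer["out"]))
        elif layer["type"] == "relu":
            layers.append(nn.ReLU())
        else:
            raise ValueError(f"unknown layer type: {layer['type']}")
    return nn.Sequential(*layers)

model = build_model(model_spec)   # each client builds the same architecture locally
```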

17 pages, 3413 KiB  
Article
Fed2A: Federated Learning Mechanism in Asynchronous and Adaptive Modes
by Sheng Liu, Qiyang Chen and Linlin You
Electronics 2022, 11(9), 1393; https://doi.org/10.3390/electronics11091393 - 27 Apr 2022
Cited by 12 | Viewed by 2384
Abstract
Driven by emerging technologies such as edge computing and the Internet of Things (IoT), recent years have witnessed an increasing shift of data processing towards distributed settings. Federated Learning (FL), a novel decentralized learning paradigm that can unify massive numbers of devices to train a global model without compromising privacy, is drawing much attention from both academia and industry. However, the performance degradation of FL in heterogeneous and asynchronous environments hinders its wide application, for example in autonomous driving and assistive healthcare. Motivated by this, we propose a novel mechanism called Fed2A: Federated learning mechanism in Asynchronous and Adaptive Modes. Fed2A supports FL by (1) allowing clients and the collaborator to work separately and asynchronously, (2) uploading shallow and deep layers of deep neural networks (DNNs) adaptively, and (3) aggregating local parameters by jointly weighing the freshness of information and the representational consistency of model layers. Moreover, the effectiveness and efficiency of Fed2A are analyzed on three standard datasets, i.e., FMNIST, CIFAR-10, and GermanTS. Compared with the best performance among three baselines, i.e., FedAvg, FedProx, and FedAsync, Fed2A reduces the communication cost by over 77% and improves model accuracy and learning speed by over 19% and 76%, respectively.
(This article belongs to the Special Issue Federated Learning: Challenges, Applications and Future)
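
The freshness-based aggregation described in the abstract can be approximated in spirit by a staleness-discounted mixing step in the style of FedAsync: the older the global model an update was computed against, the less weight it receives. The sketch below is illustrative only; Fed2A's layer-wise weighting by representational consistency is not reproduced, and the decay function and mixing rate are assumed values.

```python
import numpy as np

def staleness_weight(current_round, client_round, a=0.5):
    """Polynomial staleness discount: updates computed on an older global model
    count less (staleness 0 -> weight 1)."""
    staleness = current_round - client_round
    return (1.0 + staleness) ** (-a)

def async_aggregate(global_weights, client_weights, current_round, client_round, base_mix=0.6):
    """Mix a single asynchronous client update into the global model,
    scaling the mixing rate by the update's freshness."""
    alpha = base_mix * staleness_weight(current_round, client_round)
    return (1.0 - alpha) * global_weights + alpha * client_weights

# Toy usage: an update computed three rounds ago is blended in with a reduced weight.
w_global = np.zeros(4)
w_client = np.ones(4)
w_global = async_aggregate(w_global, w_client, current_round=10, client_round=7)
```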

17 pages, 4143 KiB  
Article
Multi-Party Privacy-Preserving Logistic Regression with Poor Quality Data Filtering for IoT Contributors
by Kennedy Edemacu and Jong Wook Kim
Electronics 2021, 10(17), 2049; https://doi.org/10.3390/electronics10172049 - 25 Aug 2021
Cited by 5 | Viewed by 2030
Abstract
Nowadays, the Internet of Things (IoT) is used to generate data in several application domains. Logistic regression, a standard machine learning algorithm with a wide application range, is built on such data. Nevertheless, building a powerful and effective logistic regression model requires large amounts of data. Thus, collaboration between multiple IoT participants has often been the go-to approach. However, privacy concerns and poor data quality are two challenges that threaten the success of such a setting. Several studies have proposed different methods to address the privacy concern, but, to the best of our knowledge, little attention has been paid to addressing the poor-data-quality problem in multi-party logistic regression. Thus, in this study, we propose a multi-party privacy-preserving logistic regression framework with poor-quality data filtering for IoT data contributors that addresses both problems. Specifically, we propose a new metric, gradient similarity, in a distributed setting, which we employ to filter out parameters from data contributors with poor-quality data. To solve the privacy challenge, we employ homomorphic encryption. Theoretical analysis and experimental evaluations using real-world datasets demonstrate that our proposed framework is privacy-preserving and robust against poor-quality data.
(This article belongs to the Special Issue Federated Learning: Challenges, Applications and Future)
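
The intuition behind gradient-similarity filtering, i.e., discarding contributors whose gradients point in a markedly different direction from the consensus, can be sketched with a simple cosine-similarity check against the mean gradient. This is an illustrative simplification: the threshold is an assumption, the gradients are handled in plaintext here, and the paper's actual metric and homomorphic-encryption pipeline are not reproduced.

```python
import numpy as np

def filter_contributors(gradients, threshold=0.0):
    """Keep only contributors whose gradient has above-threshold cosine
    similarity with the mean gradient of all contributors. Plaintext gradients
    are used purely for illustration; the paper performs the computation under
    homomorphic encryption."""
    grads = np.stack(gradients)
    mean_grad = grads.mean(axis=0)
    kept = []
    for g in grads:
        cos = g @ mean_grad / (np.linalg.norm(g) * np.linalg.norm(mean_grad) + 1e-12)
        if cos > threshold:
            kept.append(g)
    return kept

# Toy usage: the third "contributor" pushes in the opposite direction and is filtered out.
good = [np.array([1.0, 0.9]), np.array([0.9, 1.1])]
bad = [np.array([-1.0, -1.0])]
kept = filter_contributors(good + bad)   # keeps only the two consistent gradients
```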
