Deep Learning for Facial Emotion Analysis and Human Activity Recognition

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Artificial Intelligence".

Deadline for manuscript submissions: 15 January 2025

Special Issue Editors


Dr. Shasha Mao
Guest Editor
Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi’an 710071, China
Interests: facial expression analysis; pain assessment; depression detection; partial label learning; multi-instance learning

Prof. Dr. Shuiping Gou
Guest Editor
Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi’an 710071, China
Interests: medical image classification and recognition; big data analysis and mining; artificial intelligence algorithms

Dr. Ruimin Li
Guest Editor
Academy of Advanced Interdisciplinary Research, Xidian University, Xi’an 710071, China
Interests: video-based action recognition; action quality assessment; computer-aided diagnosis of developmental coordination disorder

Dr. Nuo Tong
Guest Editor
Academy of Advanced Interdisciplinary Research, Xidian University, Xi’an 710071, China
Interests: multi-organ segmentation; motion-compensated 4DCBCT reconstruction

Special Issue Information

Dear Colleagues,

We are pleased to announce a Special Issue on "Deep Learning for Facial Emotion Analysis and Human Activity Recognition" in Electronics. This Special Issue aims to explore the advancements and applications of deep learning in facial emotion analysis and human activity recognition, with a focus on their significance in the domains of health, interaction, and security.

Facial emotion and human activity are two of the most important human biological characteristics: they are the most direct and powerful signals through which people express their emotional states and intentions. Facial emotion analysis and human activity recognition are therefore key to intelligently perceiving human emotions and activities, and they have received extensive attention in fields such as disease-assisted diagnosis, human–computer interaction, autonomous driving, national defence and security, intelligent education, and intelligent surveillance. Deep learning techniques have demonstrated remarkable performance in extracting discriminative features and modelling complex patterns from facial images, enabling accurate and robust facial emotion analysis.

This Special Issue aims to bring together researchers and experts from diverse fields, such as computer vision, psychology, healthcare, human–computer interaction, and security, to present their original research, review articles, and technical reports on topics related to deep learning for facial emotion analysis and human activity recognition.

The scope of this Special Issue includes, but is not limited to, the following topics:  

  • Deep learning for facial expression recognition;
  • Deep learning for facial pain assessment;
  • Deep learning-based depression detection;
  • Driver fatigue detection using facial emotion analysis;
  • Multi-modal fusion for enhanced facial emotion analysis;
  • Real-time facial emotion analysis for interactive systems;
  • Transfer learning and domain adaptation for facial emotion analysis;
  • Facial emotion analysis in virtual reality and augmented reality environments;
  • Explainable deep learning models for facial emotion analysis;
  • Deep learning for human action recognition;
  • Spatiotemporal action localization;
  • Action quality assessment;
  • Emotion generation.

By exploring deep learning for facial emotion analysis and human activity recognition, together with their applications in health, interaction, and security, this Special Issue aims to provide valuable insights into the potential of deep learning techniques for understanding and utilizing facial expressions and human activity. The contributions to this Special Issue will foster advancements in healthcare diagnostics, human–computer interaction, and security systems, leading to improved well-being, enhanced user experiences, and better safety measures.

We look forward to receiving your contributions.

Dr. Shasha Mao
Prof. Dr. Shuiping Gou
Dr. Ruimin Li
Dr. Nuo Tong
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • deep learning
  • facial emotion analysis
  • facial expression recognition
  • pain estimation
  • depression detection
  • affective computing
  • human activity recognition
  • human behaviour analysis

Published Papers

This special issue is now open for submission.