
Sensor-Based Human Activity Recognition

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: 15 December 2025 | Viewed by 2077

Special Issue Editor


Dr. Kimiaki Shirahama
Guest Editor
Department of Information Systems Design, Doshisha University, 1-3 Tatara Miyakodani, Kyotanabe 610-0394, Kyoto, Japan
Interests: multimedia information processing; machine learning; data mining; sensor-based human activity recognition

Special Issue Information

Dear Colleagues,

Sensor-based human activity recognition (HAR) aims to recognise activities by integrating data obtained from various sensors, such as wearable, object, ambient, and vision sensors. Here, activities are high-level descriptions that can be extracted from low-level sensor data; they include not only a person's physical behaviours but also their mental states (e.g., emotions) and health conditions. Sensor-based HAR for recognising such activities can lead to innovative and useful applications in various domains, such as healthcare, smart homes, human–computer interaction, and autonomous driving. Several sensor-based HAR systems are already in practical use thanks to recent advances in sensor technologies, wireless communication networks, and machine learning techniques. However, devising truly useful sensor-based HAR systems still requires solving many problems, such as the scarcity of large-scale datasets, the handling of missing or defective sensors, the compositional nature of activities, and the personalisation and calibration of a system for each user. This Special Issue is dedicated to collecting the latest research achievements and findings on the following topics:

  • System architecture for sensor-based HAR;
  • Sensing devices and technologies;
  • Signal processing for sensor recordings;
  • Machine learning for sensor-based HAR;
  • Multimodal deep learning for sensor-based HAR;
  • Knowledge discovery and data mining for sensor-based HAR;
  • Personalisation and calibration of a sensor-based HAR system;
  • Explainability of a sensor-based HAR system;
  • Sensor-based HAR using large language models (LLMs);
  • Visualisation and user interface for sensor-based HAR;
  • Datasets to benchmark sensor-based HAR systems;
  • Real-world applications of sensor-based HAR;
  • Surveys on sensor-based HAR.

Dr. Kimiaki Shirahama
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • human activity recognition (HAR)
  • sensor technology
  • artificial intelligence
  • pervasive computing
  • wearable computing

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

20 pages, 18281 KiB  
Article
IMU Sensor-Based Worker Behavior Recognition and Construction of a Cyber–Physical System Environment
by Sehwan Park, Minkyo Youm and Junkyeong Kim
Sensors 2025, 25(2), 442; https://doi.org/10.3390/s25020442 - 13 Jan 2025
Cited by 1 | Viewed by 784
Abstract
According to South Korea’s Ministry of Employment and Labor, approximately 25,000 construction workers suffered various injuries between 2015 and 2019. Additionally, about 500 fatalities occur annually, and multiple studies are being conducted to prevent these accidents and quickly identify their occurrence to secure the golden time for the injured. Recently, AI-based video analysis systems for detecting safety accidents have been introduced. However, these systems are limited to areas where CCTV is installed, and locations like construction sites have numerous blind spots due to the limitations of CCTV coverage. To address this issue, there is active research on the use of MEMS (micro-electromechanical systems) sensors to detect abnormal conditions in workers. In particular, methods such as using the accelerometers and gyroscopes within MEMS sensors to acquire data based on workers’ angles, utilizing three-axis accelerometers and barometric pressure sensors to improve the accuracy of fall detection systems, and measuring the wearer’s gait using the x-, y-, and z-axis data from accelerometers and gyroscopes are being studied. However, most methods involve the use of MEMS sensors embedded in smartphones, typically attaching the sensors to one or two specific body parts. Therefore, in this study, we developed a novel miniaturized IMU (inertial measurement unit) sensor that can be simultaneously attached to multiple body parts of construction workers (head, body, hands, and legs). The sensor integrates accelerometers, gyroscopes, and barometric pressure sensors to measure various worker movements in real time (e.g., walking, jumping, standing, and working at heights). Additionally, incorporating PPG (photoplethysmography), body temperature, and acoustic sensors enables the comprehensive observation of both physiological signals and environmental changes.
The collected sensor data are preprocessed using Kalman and extended Kalman filters, among others, and an algorithm is proposed to evaluate workers’ safety status and update health-related data in real time. Experimental results demonstrated that the proposed IMU sensor can classify work activities with over 90% accuracy even at a low sampling rate of 15 Hz. Furthermore, by integrating internal filtering, communication modules, and server connectivity within an application, we established a cyber–physical system (CPS), enabling real-time monitoring and immediate alert transmission to safety managers. Through this approach, we verified improved performance in terms of miniaturization, measurement accuracy, and server integration compared to existing commercial sensors. Full article
(This article belongs to the Special Issue Sensor-Based Human Activity Recognition)
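As a rough illustration of the preprocessing step described in the abstract, the sketch below smooths a noisy single-axis accelerometer stream with a scalar Kalman filter before any activity classification. Only the 15 Hz sampling rate comes from the abstract; the noise level, the filter variances `q` and `r`, and the step between two activity levels are invented for this demo and are not taken from the paper.

```python
import numpy as np

def kalman_1d(z, q=0.02, r=0.16):
    """Scalar Kalman filter: q = process noise variance,
    r = measurement noise variance (both chosen for this demo)."""
    x, p = z[0], 1.0
    out = np.empty_like(z)
    for i, meas in enumerate(z):
        p += q                   # predict: uncertainty grows between samples
        k = p / (p + r)          # Kalman gain
        x += k * (meas - x)      # update: move estimate towards the measurement
        p *= 1.0 - k
        out[i] = x
    return out

# Simulated single-axis accelerometer channel at 15 Hz (the rate from the
# abstract): a step between two activity levels plus Gaussian sensor noise.
rng = np.random.default_rng(0)
t = np.arange(0.0, 4.0, 1.0 / 15.0)      # 4 s of samples
truth = np.where(t < 2.0, 0.0, 1.0)      # e.g. standing -> walking
z = truth + rng.normal(0.0, 0.4, t.size)

smoothed = kalman_1d(z)
# The filtered signal tracks the activity change with much less noise,
# which makes a downstream windowed classifier's job easier.
```

The extended Kalman filter mentioned in the abstract generalises this update to nonlinear sensor models; the scalar version above only shows the predict/update structure.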

20 pages, 5455 KiB  
Article
A New Iterative Algorithm for Magnetic Motion Tracking
by Tobias Schmidt, Johannes Hoffmann, Moritz Boueke, Robert Bergholz, Ludger Klinkenbusch and Gerhard Schmidt
Sensors 2024, 24(21), 6947; https://doi.org/10.3390/s24216947 - 29 Oct 2024
Viewed by 780
Abstract
Motion analysis is of great interest to a variety of applications, such as virtual and augmented reality and medical diagnostics. Hand movement tracking systems, in particular, are used as a human–machine interface. In most cases, these systems are based on optical or acceleration/angular speed sensors. These technologies are already well researched and used in commercial systems. In special applications, it can be advantageous to use magnetic sensors to supplement an existing system or even replace the existing sensors. The core of a motion tracking system is a localization unit. In magnetic systems, the relatively complex localization algorithms present a problem, leading to a large computational complexity. In this paper, a new approach for the pose estimation of a kinematic chain is presented. The new algorithm is based on spatially rotating magnetic dipole sources. A spatial feature is extracted from the sensor signal: the dipole direction for which the maximum magnitude value is detected at the sensor. This is introduced as the “maximum vector”. A relationship between this feature, the location vector (pointing from the magnetic source to the sensor position), and the sensor orientation is derived and subsequently exploited. By modelling the hand as a kinematic chain, the posture of the chain can be described in two ways: via the magnetic correlations and via the structure of the kinematic chain. Both are bundled in an iterative algorithm with very low complexity. The algorithm was implemented in a real-time framework and evaluated in simulation and in first laboratory tests. In tests without movement, no significant deviation between the simulated and estimated poses was found. In tests with periodic movements, an error in the range of 1° was found. Of particular interest is the required computing power, which was evaluated in terms of the required computing operations and the required computing time. Initial analyses have shown that a computing time of 3 μs per joint is required on a personal computer. Lastly, the first laboratory tests confirm the functionality of the proposed methodology. Full article
(This article belongs to the Special Issue Sensor-Based Human Activity Recognition)
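The "maximum vector" feature in the abstract rests on a standard property of the magnetic dipole field: the field magnitude at a fixed sensor position grows with the alignment between the dipole moment and the location vector, so a rotating dipole produces its maximum reading when it points along the source-to-sensor direction. The sketch below checks this numerically; the sensor position and the brute-force direction search are illustrative stand-ins, not the paper's iterative algorithm.

```python
import numpy as np

def dipole_field(m, r_vec):
    """Field of a magnetic dipole m at offset r_vec (mu0/4pi factor dropped)."""
    r = np.linalg.norm(r_vec)
    r_hat = r_vec / r
    return (3.0 * r_hat * np.dot(r_hat, m) - m) / r**3

# Arbitrary sensor position relative to the source (illustrative)
r_vec = np.array([0.3, -0.1, 0.2])
r_hat = r_vec / np.linalg.norm(r_vec)

# Brute-force search over random unit dipole directions, mimicking a
# spatially rotating dipole source sweeping through orientations
rng = np.random.default_rng(1)
dirs = rng.normal(size=(5000, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
mags = [np.linalg.norm(dipole_field(m, r_vec)) for m in dirs]
max_vector = dirs[int(np.argmax(mags))]

# The maximising dipole direction aligns (up to sign) with the location
# vector, so detecting it constrains the direction from source to sensor.
alignment = abs(np.dot(max_vector, r_hat))
```

This alignment is the relationship between the maximum vector and the location vector that the abstract describes; the paper then combines such constraints along the kinematic chain of the hand.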
