Article

Inertial Motion Capture-Driven Digital Human for Ergonomic Validation: A Case Study of Core Drilling

1 School of Future Technology, China University of Geosciences, Wuhan 430074, China
2 School of Mechanical & Electronic Information, China University of Geosciences, Wuhan 430074, China
3 Hubei Intelligent Geological Equipment Engineering Technology Research Center, Wuhan 430074, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Sensors 2024, 24(18), 5962; https://doi.org/10.3390/s24185962
Submission received: 13 August 2024 / Revised: 2 September 2024 / Accepted: 11 September 2024 / Published: 13 September 2024
(This article belongs to the Special Issue Advances in Human Locomotion Using Sensor-Based Approaches)

Abstract: In the evolving field of ergonomics, there is a growing demand for enhanced comfort, visibility, and accessibility in the operation of engineering machinery. This study introduces an approach to assessing the ergonomics of a driller’s cabin using a digital human. Inertial motion capture sensors allow a virtual driller to be animated by real human movements, producing more precise and realistic human–machine interaction data. The study also develops a simplified model of the human upper limbs, enabling the calculation of joint forces and torques. An ergonomic analysis platform comprising a virtual driller’s cabin and a digital human model is constructed in Unity 3D, supporting the quantitative evaluation of comfort, visibility, and accessibility. Its versatility extends beyond the current scope, offering substantial support for product development and enhancement.

1. Introduction

Ergonomics, through the analysis of human–environment interface interactions, assesses how factors such as the working environment, workstation layout, working posture, and procedures affect human physiology, psychology, and productivity [1,2,3]. The objective is to enhance human–machine systems. As the field evolves, increasing emphasis is placed on protective functions, visibility, accessibility, and comfort in engineering machinery operations. The core drilling rig, a sophisticated geotechnical drilling apparatus, comprises various motion mechanisms, which add complexity to its operational system [4]. The driller’s cabin, an advanced control center, manages the core drilling rig and relays safety information. Ergonomic validation in this context aims to refine the internal structure and layout of the cabin, thereby reducing labor intensity and improving drilling efficiency and quality.
In the product development stage, traditional ergonomic evaluation typically relies on physical models. Bazazan et al. [5] used questionnaires and rapid upper limb assessments to compare operators’ perceptions and workloads in varying postures. However, this method is often time-consuming, expensive, and inflexible. Integrating virtual human technology into ergonomic design removes the reliance on physical equipment, offering a realistic multi-sensory virtual experience for effective performance evaluation [6]. This approach addresses challenges in data collection, working-condition simulation, and safety risk. Recent studies [7,8,9] have developed virtual human models and simulations that closely replicate human characteristics, enabling 3D behavior simulations in virtual environments for ergonomic assessment in various postures. Nevertheless, these simulations are predominantly static, requiring manual posture adjustments that limit their realism.
Dynamic, interactive virtual human–machine simulations let operators adjust themselves adaptively in immersive virtual environments, enhancing the natural interactivity between humans and the virtual realm and improving the simulations’ reliability and efficiency. Such evaluations fall into two categories: computer-vision-based and sensor-based. Nadeau [10] used cameras to record the working processes of aviation de-icing workers, analyzing the impact of varying postures and different weights of de-icing nozzles on human joints and intervertebral discs. Ma [11] employed electromyography to investigate physical strain, uncovering potential sources of high workload. Ellegast [12] integrated motion capture technology with surface electromyography to evaluate driving posture comfort. Caputo [13] applied motion capture to analyze limb movements, improving work comfort and efficiency through the assessment of limb movement angles. Gong [14] mapped wearers’ real-time motion data onto robot joints, enabling direct and natural remote control of the robot’s movements. These techniques allow the virtual human model to be driven in real time by the operator through a comprehensive motion capture system, turning the virtual entity into a motion avatar of the operator; the resulting work data are authentic, and the evaluation outcomes are more accurate and trustworthy.
This study addresses the design of the driller’s cabin in a core drilling rig, introducing a computer-assisted method to evaluate drillers’ upper limb comfort using digital human technology. It leverages inertial motion capture sensors so that the virtual driller’s actions follow real human movements, yielding more precise and realistic human–machine analysis data. The study establishes a simplified human upper limb model for calculating joint forces and torques, thereby assessing the rationality of the virtual driller’s cabin design. This approach offers more authentic and reliable evaluation results for the cabin design and contributes significantly to the design and enhancement of engineering machinery equipment.

2. Digital Human Model Generation

2.1. Establishment of Three-Dimensional Virtual Human Model

Three-dimensional virtual human models are digitized human representations for human–machine engineering simulations; they accurately imitate various human postures and support the simulation of human behaviors in three-dimensional environments. They are primarily used to assess product usability, ergonomic design, and comfort [15]. The human–machine efficiency software Jack version 8.0.1 offers precise biomechanical human models and allows digital human dimensions to be customized to create highly realistic virtual humans. Using inverse kinematics for model adjustment, Jack simplifies the process, requiring modifications only at crucial points instead of at every joint. The model is developed with the 95th percentile adult Chinese male as the reference standard, and its key parameters are detailed in Table 1.
For subsequent bone rigging in Blender version 4.0 software, the model is subjected to format conversion, refined by removing superfluous vertices and faces, and completed with detailed facial data, which culminates in an optimized human body model, as depicted in Figure 1.

2.2. Motion Capture Device-Driven Generation of a Digital Human

The motion of the digital human body is orchestrated by a sequence of skeletal movements, which in turn drive the dynamics of the skin network. To ensure the computational accuracy of motion capture data, the motion capture driving points must be aligned with the virtual human’s joint points, which necessitates constructing a skeletal model. In this study, the VDsuit-Full full-body motion capture system (VIRDYN, Guangzhou, China) is employed (Figure 2), equipped with 27 highly sensitive sensor nodes that capture and output data across a wide range of postures. Regarding the accuracy required for ergonomic assessment, the attitude accuracy of the VDsuit-Full system is roll < 0.5°, pitch < 0.5°, and yaw < 1°; the accelerometer range is ±8 g; the gyroscope range is ±2000°/s; and the magnetometer range is ±4.9 Gs. These specifications are comparable to those of the Perception Neuron inertial motion capture system (NOITOM, Beijing, China), whose error is within 5° when evaluated against the gold-standard (ISO 13485:2016) [16] Vicon system (Vicon Motion Systems Ltd., Oxford, UK) [17]. The assessments follow ergonomic criteria, particularly ISO 11228-1:2003 (manual handling) and ISO 9241-210:2019 (human-centered design for interactive systems), together with their posture and movement guidance [18,19]. The system is therefore accurate enough for ergonomic evaluation.
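As a rough illustration of such an accuracy comparison, the following Python sketch computes the angular difference between an inertial orientation estimate and an optical reference, assuming both are available as unit quaternions; the sample values are hypothetical, not measurements from the cited study.

```python
import numpy as np

def quat_angle_error(q_test, q_ref):
    """Angular difference (degrees) between two unit quaternions (w, x, y, z)."""
    q_test = q_test / np.linalg.norm(q_test)
    q_ref = q_ref / np.linalg.norm(q_ref)
    # |dot| handles the double cover: q and -q encode the same rotation.
    d = abs(np.dot(q_test, q_ref))
    return 2.0 * np.degrees(np.arccos(min(d, 1.0)))

# Hypothetical samples: IMU estimate vs. optical reference for one body segment.
q_imu = np.array([0.999, 0.020, 0.015, 0.010])
q_vicon = np.array([1.0, 0.0, 0.0, 0.0])
err = quat_angle_error(q_imu, q_vicon)
print(f"orientation error: {err:.2f} deg")  # expected to stay within ~5 deg per [17]
```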
Based on the structural characteristics of the human body, its components can be organized into a tree-like system in which the pelvic segment serves as the root node and the head, hands, and feet serve as end nodes [20]. Using Blender version 4.0’s skeletal system, a skeleton is constructed by linking the model’s vertices to the skeletal structure, enabling the model to deform in response to skeletal movements. The core of the skeleton comprises 23 segments; its hierarchical and skeletal structures are illustrated in Figure 3, and the corresponding hierarchy of the upper limb skeleton is depicted in Figure 4.
Digital human movement involves both skeletal motion and deformation of the skin mesh model. To this end, this study associates the skin mesh vertices with adjacent skeletal segments, establishing a weight relationship between the skin and the skeleton. The skeleton undergoes automatic weight binding, followed by a manual check and adjustment in pose mode to align the skeleton with the mesh model accurately [21]. To ensure the wearer’s movements match those of the virtual human, the motion capture data typically undergo reorientation, in which the human movements collected by the sensors are replicated on the skeletal model (Figure 5).
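The following is a minimal Python sketch of the underlying idea: per-bone sensor orientations are composed down a pelvis-rooted hierarchy (forward kinematics) to obtain world-space joint positions. The bone names, rest-pose offsets, and the example rotation are illustrative assumptions, not the paper’s actual 23-segment skeleton.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Illustrative upper-limb chain rooted at the pelvis; offsets are rest-pose
# bone vectors in metres (parent -> child), chosen for the example only.
BONES = [
    ("pelvis", None, np.zeros(3)),
    ("spine", "pelvis", np.array([0.0, 0.45, 0.0])),
    ("l_shoulder", "spine", np.array([-0.18, 0.25, 0.0])),
    ("l_upper_arm", "l_shoulder", np.array([-0.30, 0.0, 0.0])),
    ("l_lower_arm", "l_upper_arm", np.array([-0.28, 0.0, 0.0])),
    ("l_hand", "l_lower_arm", np.array([-0.18, 0.0, 0.0])),
]

def forward_kinematics(local_rot):
    """Compose per-bone sensor rotations down the tree to world positions."""
    world = {}
    for name, parent, offset in BONES:
        rot = local_rot.get(name, R.identity())
        if parent is None:
            world[name] = (rot, np.zeros(3))
        else:
            p_rot, p_pos = world[parent]
            g_rot = p_rot * rot                       # accumulate orientation
            world[name] = (g_rot, p_pos + p_rot.apply(offset))
    return {n: pos for n, (rot, pos) in world.items()}

# Example frame: raise the left upper arm 40 degrees about the body z-axis.
frame = {"l_upper_arm": R.from_euler("z", 40, degrees=True)}
for name, pos in forward_kinematics(frame).items():
    print(f"{name:12s} {np.round(pos, 3)}")
```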

3. Evaluation of Human Comfort

3.1. Calculation of Joint Angle in the Human Upper Limb

To address the challenge of setting a consistent zero point in testing, this paper utilizes the spatial vector method for defining points’ spatial positions, aiming to reduce errors arising from zero-point discrepancies. It adopts the left hand as a standard and simplifies the human upper limb into a ball-and-stick model, with the definitions for upper limb joint angles illustrated in Figure 6.
In this model, Points A, B, C, and D correspond to the spatial positions of the upper arm, lower arm, hand, and finger skeletal nodes, respectively. The shoulder joint angle α is the angle between the vertical line at Point A and the line connecting Points A and B. The elbow joint angle β is the angle between the extension of line AB and line BC. The wrist joint angle γ is the angle between the extension of line BC and line CD. θ1, θ2, and θ3 are the angles of the shoulder, elbow, and wrist joints relative to the horizontal direction, respectively. Let δ denote the angle between the upper arm and the lower arm, and φ the angle between the lower arm and the hand. These angles follow from the dot products of the corresponding position vectors:
$$\cos\delta=\frac{\overrightarrow{AB}\cdot\overrightarrow{BC}}{|\overrightarrow{AB}|\,|\overrightarrow{BC}|},\qquad \cos\varphi=\frac{\overrightarrow{BC}\cdot\overrightarrow{CD}}{|\overrightarrow{BC}|\,|\overrightarrow{CD}|} \tag{1}$$
$$\theta_1=\frac{\pi}{2}-\alpha,\qquad \theta_2=\pi-\theta_1-\delta=\beta-\theta_1,\qquad \theta_3=\pi+\theta_2-\varphi=\gamma+\theta_2 \tag{2}$$
From the differences between the spatial node coordinates in the bone structure, the corresponding position vectors are derived. Equation (1) then gives the angles formed at the shoulder, elbow, and wrist during upper limb movement, and the angle conversion in Equation (2) yields θ1, θ2, and θ3. Using the resulting joint angle data, MATLAB version R2020a is employed to compute the mean, standard deviation, maximum, and minimum of each joint angle.
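A minimal Python sketch of this pipeline, using the dot-product formula of Equation (1) and the conversions of Equation (2); the node positions and the assumption that y is the vertical axis are illustrative, not values from the study.

```python
import numpy as np

def angle_between(u, v):
    """Angle (rad) between two vectors via the normalized dot product, Eq. (1)."""
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(c, -1.0, 1.0))

def upper_limb_angles(A, B, C, D):
    """A, B, C, D: upper arm, lower arm, hand, finger node positions (metres)."""
    delta = angle_between(B - A, C - B)   # upper arm vs. lower arm
    phi = angle_between(C - B, D - C)     # lower arm vs. hand
    up = np.array([0.0, 1.0, 0.0])        # assumes y is the vertical axis
    alpha = angle_between(B - A, -up)     # shoulder angle from the vertical at A
    theta1 = np.pi / 2 - alpha            # Eq. (2)
    theta2 = np.pi - theta1 - delta
    theta3 = np.pi + theta2 - phi
    return np.degrees([theta1, theta2, theta3, delta, phi])

# Illustrative node positions from one captured frame (not the paper's data).
A = np.array([-0.18, 1.17, 0.00])
B = np.array([-0.29, 0.93, 0.01])
C = np.array([-0.13, 0.84, 0.19])
D = np.array([-0.12, 0.84, 0.20])
print(np.round(upper_limb_angles(A, B, C, D), 2))
```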

3.2. Calculation of Upper Extremity Joint Moments

Evaluating human comfort exclusively through joint angles is insufficient to uncover the actual sources of discomfort. Calculating joint torques provides an intuitive assessment of joint loading and hence a more precise evaluation of joint comfort [22]. In the upper limb moment analysis, because the hand’s mass is relatively small, its mass is lumped with the lower arm. Consequently, a separate wrist joint force analysis is omitted, and attention is directed to the potential load on the lower arm. The stress analysis of the human upper limb during motion is shown in Figure 7.
In this analysis, Point A is the shoulder joint center, B the upper arm’s center of mass, C the elbow joint center, and D the center of mass of the forearm and hand. G1 and G2 are the gravitational forces on the forearm and hand, and on the upper arm, respectively. RS is the reaction force at the shoulder joint center generated by the entire upper limb. RE and RE’ are the interaction forces at the elbow joint during upper limb movement, and Rp and Rp’ are the interaction forces at the wrist joint during hand movement. θE and θS are the angles of the lower arm and the upper arm relative to the horizontal plane, respectively. Ft1 and Fr1 are the tangential and radial forces when the forearm and hand rotate about the elbow joint; Ft2 and Fr2 are the tangential and radial forces when the upper arm rotates about the shoulder joint; and Ft0 and Fr0 are the tangential and radial forces when the forearm and hand rotate about the shoulder joint. L0, L1, and L2 are the distances from the shoulder joint center to the forearm’s center of mass, from the elbow joint center to the forearm’s center of mass, and from the upper arm’s center of mass to the shoulder joint center, respectively. MS is the moment of the entire upper limb at the shoulder joint A, ME is the moment of the forearm at the elbow joint C, and Mp is the moment of the hand at the wrist joint. For convenience in the subsequent calculations, the arm’s mass per unit length is assumed to be m0, and the quantities are computed as follows:
$$\ddot{\theta}=\frac{d\omega}{dt}=\frac{d}{dt}\left(\frac{d\theta}{dt}\right)=\frac{d^{2}\theta}{dt^{2}} \tag{3}$$
$$m_R=\int_{0}^{L} m_0\,r\,dr \tag{4}$$
$$F_t=m_R\,\ddot{\theta} \tag{5}$$
$$F_r=m\left(\frac{d\theta}{dt}\right)^{2}R=m\,\omega^{2}R \tag{6}$$
On this basis, the inverse-dynamic force and torque balance equations for the elbow and shoulder joints are formulated. From kinematics, the radial forces generated by upper limb movement all act along the limb’s line of extension, so they produce no moments at the joints. Equation (7) gives the force and torque balance for the forearm, and Equation (8) the force and torque balance for the upper arm.
$$\begin{cases} M_E+G_1 L_1\cos\theta_E+F_{t1}L_1=0\\ R_E+R_p+G_1+F_{t1}\cos\theta_E+F_{r1}\sin\theta_E=0 \end{cases} \tag{7}$$
$$\begin{cases} M_S+G_2 L_2\cos\theta_S+F_{t2}L_2+F_{t0}L_0=0\\ R_S+R_E+G_2+F_{t2}\cos\theta_S+F_{r2}\sin\theta_S=0 \end{cases} \tag{8}$$
Data processing and analysis were conducted in MATLAB version R2020a, yielding torque variation curves for the elbow and shoulder joints, plotted against frame number.
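A sketch of Equations (3)-(8) in Python: joint angles are differentiated numerically to obtain angular velocity and acceleration, tangential forces are formed, and the moment-balance equations are solved for the elbow and shoulder torques. The segment masses, lengths, and toy motion are illustrative placeholders, not the study's anthropometric values.

```python
import numpy as np

FPS = 60.0
GRAV = 9.81

# Illustrative segment parameters (not the paper's): masses in kg, lengths in m.
m_forearm_hand, m_upper_arm = 1.8, 2.1
L1 = 0.13   # elbow centre -> forearm+hand centre of mass
L2 = 0.14   # upper-arm centre of mass -> shoulder centre
L0 = 0.40   # shoulder centre -> forearm+hand centre of mass (posture-dependent)

def joint_torques(theta_E, theta_S):
    """theta_E, theta_S: per-frame lower/upper-arm angles from horizontal (rad)."""
    dt = 1.0 / FPS
    w_E, w_S = np.gradient(theta_E, dt), np.gradient(theta_S, dt)   # omega, Eq. (3)
    a_E, a_S = np.gradient(w_E, dt), np.gradient(w_S, dt)           # theta''
    G1, G2 = m_forearm_hand * GRAV, m_upper_arm * GRAV
    Ft1 = m_forearm_hand * L1 * a_E       # tangential forces, Eqs. (4)-(5)
    Ft2 = m_upper_arm * L2 * a_S
    Ft0 = m_forearm_hand * L0 * a_S       # forearm+hand about the shoulder
    M_E = -(G1 * L1 * np.cos(theta_E) + Ft1 * L1)                   # Eq. (7)
    M_S = -(G2 * L2 * np.cos(theta_S) + Ft2 * L2 + Ft0 * L0)        # Eq. (8)
    return M_E * 100, M_S * 100           # N*m -> N*cm, as reported in Table 3

# Toy motion: both segments rise smoothly over two seconds.
t = np.linspace(0, 2, int(2 * FPS))
M_E, M_S = joint_torques(np.radians(20 + 15 * t), np.radians(10 + 20 * t))
print(f"elbow torque range: {M_E.min():.1f} to {M_E.max():.1f} N*cm")
```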

4. Experiment

4.1. Data Acquisition

In the initial phase of the study, Jack version 8.0.1 and CATIA version V5-6R2022 were employed to design the driller’s cabin for the Chinese 5000 m core drilling rig and to verify its ergonomics. Figure 8 depicts the finalized cabin design. To evaluate the design’s efficacy, motion capture sensor data were analyzed on a virtual simulation platform for human–machine efficiency assessment developed in Unity3D. Integrating real-time motion capture data in Unity3D requires attention to hardware performance, data-processing load, and software scalability; the pipeline can be optimized with techniques such as multi-threading, and additional plugins or custom development may be needed for complex requirements to ensure data accuracy and real-time performance. During scene construction, resources were added using the plane model and lights provided by Unity3D, and position, rotation, and scale data were adjusted in the Transform window. If the spatial layout of the whole scene is ignored during these operations, resources may overlap, degrading the visual effect and the interaction design. Because resources also stand in parent–child relationships, frequent testing and previewing during adjustment are necessary to prevent layout confusion and to ensure the design requirements are met. After the driller’s cabin model and the rigged character model were imported into Unity3D, real-time motion data were broadcast to Unity3D using the VDMocap Studio software version 2.1.15 provided by the hardware vendor. Note that when motion capture data are transmitted in real time and synchronized with the 3D model, the integration may face processing overload, especially under high-frequency sampling and complex calculations; delay and frame loss can be avoided by reducing unnecessary computation and improving hardware performance.
To improve the realism of the driller’s cabin and the digital human within the virtual platform, collision detection was integrated between the cabin and human models: bounding-box components were created for the rigid bodies, and built-in triggers were used to detect overlaps between collision boxes. The Rokoko Studio for Blender plugin (Rokoko, San Francisco, CA, USA) facilitated the alignment and redirection of skeletons using data exported from VDMocap Studio (VIRDYN, Guangzhou, China). The process culminated in the export of files encompassing the human mesh model, the skeletal system with bindings, associated materials and textures, and skeletal motion data. The overall simulation testing platform is shown in Figure 9.
The driller predominantly engages in three actions: manipulating the joystick, adjusting the knob, and activating buttons. Data on these key actions are systematically collected. Throughout the testing phase, the digital human’s movements are visualized via animation frame playback. Within Unity3D, the transform component captures the skeletal nodes’ values frame by frame to gather positional data. Both data collection and subsequent processing run at a consistent 60 frames per second. Five adult males whose body dimensions roughly matched the 95th percentile standard used in this paper were selected for operational testing. Informed consent was obtained from the participants before data collection. During testing, the testers were kept as close as possible to the same initial zero-point posture. Because the operation sequence was essentially consistent across testers, the collected position data were found to be roughly the same, varying little within any intercepted one-second window. In addition, to make the test data broadly representative, the upper limb movement data were averaged across the five testers. Table 2 displays the average position data of the skeletal nodes over a randomly selected period.
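As a sketch of the averaging step described above, the following Python fragment forms the per-frame, per-node mean across the five testers and intercepts a random one-second (60-frame) window; the synthetic arrays merely stand in for the recorded data.

```python
import numpy as np

FPS = 60  # capture and playback rate used in the experiment

# Hypothetical recordings: positions[tester, frame, node, xyz] for five testers,
# aligned to the same initial zero-point posture and action sequence.
rng = np.random.default_rng(0)
n_testers, n_frames, n_nodes = 5, 10 * FPS, 4   # hand, lower/upper arm, shoulder
positions = rng.normal(0.0, 0.01, (n_testers, n_frames, n_nodes, 3))

# Comprehensive average across testers, per frame and node (as used for Table 2).
mean_track = positions.mean(axis=0)             # shape: (frames, nodes, 3)

# Intercept a random one-second window from the averaged track.
start = rng.integers(0, n_frames - FPS)
window = mean_track[start:start + FPS]
print(window.shape)   # (60, 4, 3): one second of averaged node positions
```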

4.2. Calculation of Upper Extremity Joint Angles and Torques

From the positional data of the three typical arm movements, five key parameters are computed: the shoulder, elbow, and wrist joint angles, and the shoulder and elbow joint torques. The variation curves of these parameters are depicted in Figure 10, Figure 11 and Figure 12, and Table 3 lists the average, maximum, and minimum values of the upper limb joint angles and torques for these movements.

4.3. Assessment Criteria

To verify the rationality of the cabin design, the drillers’ operational comfort is evaluated next. Among the many evaluation modes, an averaging mode is selected to evaluate the comfort of this design [23]. According to the ergonomic design of the geological core drilling rig, human joint angles have a limited range and a comfortable range under different activity modes [24,25]. The comfortable posture adjustment ranges for different body parts are shown in Table 4.
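A small Python sketch of how the averaged angles can be checked against the comfort zones of Table 4; the zone bounds are taken from that table, and the sample angles come from Table 3.

```python
# Comfort zones from Table 4 (degrees).
COMFORT = {"shoulder": (40, 90), "elbow": (80, 110), "wrist": (10, 30)}

def in_comfort_zone(joint, angle_deg):
    lo, hi = COMFORT[joint]
    return lo <= angle_deg <= hi

# Average joint angles for the operating-rocker action (Table 3).
for joint, avg in [("shoulder", 66.04), ("elbow", 86.32), ("wrist", 31.02)]:
    flag = "within" if in_comfort_zone(joint, avg) else "outside"
    print(f"{joint:8s} mean {avg:6.2f} deg -> {flag} the comfort zone")
```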
For force comfort, upper limb comfort evaluation is itself a subjective mapping of upper limb muscle strength; it can be assessed by combining the testers’ subjective impressions with the upper limb muscle comfort grades provided in the literature [26]. The corresponding muscle strength Xi is obtained by converting the torque. Because muscle fatigue increases with Xi and comfort decreases accordingly, the muscle strength Xi is input into the following upper limb muscle comfort function:
$$\sigma=\frac{\displaystyle\int_{0}^{t}\left(X_{\max}-X_i\right)dt}{\displaystyle\int_{0}^{t}\left(X_{\max}-X_i'\right)dt},\quad i\in N \tag{9}$$
In this analysis, Xmax is the maximum muscle strength value calculated above, Xi is the muscle strength of the joint under test during the simulated operation, and Xi’ is the muscle strength of the same joint in the natural state over the same number of consecutive frames. Following this process, the average comfort index σ for the experimental test data is shown in Table 5.
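The following Python sketch evaluates Equation (9) by numerical integration over equal frame spans; the muscle strength series and Xmax are illustrative values, not measured data.

```python
import numpy as np

def comfort_index(x_sim, x_nat, x_max, fps=60.0):
    """Eq. (9): ratio of integrated strength reserves over equal time spans.

    x_sim: per-frame muscle strength during the simulated operation (Xi),
    x_nat: per-frame muscle strength in the natural state (Xi'),
    x_max: maximum muscle strength for the joint under test (Xmax).
    """
    dt = 1.0 / fps
    num = np.sum(x_max - np.asarray(x_sim)) * dt   # rectangle-rule integration
    den = np.sum(x_max - np.asarray(x_nat)) * dt
    return num / den

# Illustrative values (N): operating load vs. a lower natural-state baseline.
x_max = 120.0
x_sim = 40 + 10 * np.abs(np.sin(np.linspace(0, np.pi, 300)))
x_nat = np.full(300, 25.0)
sigma = comfort_index(x_sim, x_nat, x_max)
print(f"sigma = {sigma:.3f}")   # graded against Table 6 (e.g., >= 0.8 -> very comfortable)
```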
The force comfort is then graded against the comfort levels in Table 6, following the literature [26].

4.4. Analysis of Visible and Reachable Domains

The analysis of the visual domains is grounded in the eye ellipsoid principle. This study employs the Unity3D camera system to emulate the human eye’s visual cone, ensuring a consistent relative positioning of both the visual cone and the visual field concerning the head skeleton. By aligning the camera’s visual cone and the visual field with the virtual eye’s position, this study defines the scope of vision, as demonstrated in Figure 13.
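A minimal Python sketch of the visual cone test: a point is considered visible if it lies within a cone anchored at the virtual eye. The half-angle, range, and sample positions are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def in_visual_cone(eye, gaze_dir, target, half_angle_deg=60.0, max_dist=2.0):
    """True if target lies inside a cone emulating the eye's visual field.

    eye: eye position, gaze_dir: unit viewing direction, target: point to test.
    half_angle_deg and max_dist are illustrative field-of-view parameters.
    """
    v = np.asarray(target) - np.asarray(eye)
    dist = np.linalg.norm(v)
    if dist == 0 or dist > max_dist:
        return False
    cos_ang = np.dot(v / dist, gaze_dir)
    return cos_ang >= np.cos(np.radians(half_angle_deg))

eye = np.array([0.0, 1.2, 0.0])           # roughly head height in the cabin
gaze = np.array([0.0, -0.3, 1.0])
gaze = gaze / np.linalg.norm(gaze)        # normalize the viewing direction
rocker = np.array([-0.12, 0.85, 0.19])    # a control position in the style of Table 2
print(in_visual_cone(eye, gaze, rocker))
```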
The position of the hand bone is tracked and recorded in each frame, and the resulting trajectory is rendered as line segments with the Unity3D engine. Analyzing the spatial relationship between the hand trajectory curve and the range of the manipulation device model enables reachable-range analysis, as depicted in Figure 14.
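As a sketch of this spatial relationship check, the following Python fragment tests whether a recorded hand trajectory enters an axis-aligned bounding box around a control; the box extents and tolerance are illustrative.

```python
import numpy as np

def reachable(trajectory, device_min, device_max, tol=0.02):
    """Check whether a recorded hand trajectory enters a device's bounding box.

    trajectory: (frames, 3) hand-node positions; device_min/max define an
    axis-aligned bounding box around the control; tol widens it slightly.
    """
    lo, hi = np.asarray(device_min) - tol, np.asarray(device_max) + tol
    inside = np.all((trajectory >= lo) & (trajectory <= hi), axis=1)
    return bool(inside.any()), int(inside.sum())

# Hand positions in the style of Table 2 and an illustrative box around the rocker.
traj = np.array([[-0.124, 0.843, 0.194],
                 [-0.108, 0.873, 0.188],
                 [-0.089, 0.938, 0.199]])
hit, frames = reachable(traj, [-0.15, 0.80, 0.15], [-0.05, 0.95, 0.25])
print(hit, frames)   # whether the hand reaches the control, and for how many frames
```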

5. Discussion

By combining motion capture technology with real-time data analysis, the study achieves accurate capture and real-time analysis of joint angles and torques in a highly dynamic environment, providing strong support for ergonomic improvements in complex work scenarios and revealing more design details. The experimental findings indicate that the elbow joint angle is typically the largest, followed by the shoulder joint angle, with the wrist joint angle the smallest. A flat segment of an angle curve indicates no movement during that period: the joint angle remains constant, the upper limb’s acceleration is zero, and the joint bears only gravitational torque. This is corroborated by the torque curves.
The data (Table 3 and Table 4) show that, during the three typical movements, the average shoulder and elbow joint angles fall within the comfort range, and the average wrist joint angle lies close to it. Notably, the maximum and minimum angles of the shoulder, elbow, and wrist joints all fall within the human body’s range-of-motion limits. Similarly, the data (Table 5 and Table 6) show that the force comfort indices of the shoulder and elbow are at or above the “more comfortable” level, indicating that the operational comfort measured in the experiment meets the design requirements of the core drilling rig.
This experiment employs a virtual driller’s cabin model designed in Jack version 8.0.1 and evaluated ergonomically with CATIA version V5-6R2022. Using motion capture sensors to create a digital human, the study enables the quantitative analysis of upper limb comfort, the visual field, and the reachable range, thereby affirming the validity of the earlier cabin design.

6. Conclusions

Traditional ergonomic evaluation methods suffer from lengthy processes, high cost, and inflexible adjustment. Virtual ergonomics and digital human modeling are pivotal to digital manufacturing and product development and are advancing swiftly. This paper introduces a digital human-based method for ergonomic analysis of the driller’s cabin of a core drilling rig. Operations are captured through inertial motion capture sensors, the virtual driller is controlled by a real human operator, and the operations yield more precise and lifelike human–machine analysis data. Concurrently, the study establishes a model of the human upper limb to calculate joint forces and torques and to quantitatively evaluate the rationality of the cabin design. Compared with the simplified treatment of human posture in many existing studies, this work provides a more nuanced analysis of joint angles and torques, which improves data accuracy while quantitatively validating the cabin design. The study contributes to the integration of digital human technology with dynamic, interactive virtual human–machine simulation, enhancing the operator’s natural interaction with the virtual environment and augmenting the simulation’s reliability and efficiency. Digital human technology also addresses challenges such as difficult data collection and the complexity of simulating working conditions. Future research can explore digital human technology further in ergonomics evaluation, for example by integrating higher-precision sensors or developing more sophisticated algorithms to achieve more accurate, realistic digital human simulations. Likewise, for the highly dynamic and complex environments of aviation, automotive, and other fields, the combination of motion capture technology and real-time data analysis demonstrated in this case study can improve stable operation under extreme conditions, and real-time analysis results can improve the system’s adaptability. Ergonomics spans many fields, and applications in those fields require deep integration and collaboration with these technologies to achieve the best evaluation effect. While this paper analyzes the direct impact of the approach, its long-term impact on ergonomic design and human health remains unexplored. Future research should investigate the sustainability of the approach over time, assessing how continued use affects ergonomic outcomes and contributes to overall health and well-being, and should improve the system’s adaptability across environments to achieve more accurate and efficient ergonomic evaluations.

Author Contributions

Conceptualization, Q.Z. and T.L.; Methodology, Q.Z. and T.L.; Software, T.L. and M.T.; Validation, Q.Z., T.L. and S.C.; Formal analysis, T.L.; Investigation, Q.Z. and M.T.; Resources, S.C. and G.W.; Writing—original draft, Q.Z. and T.L.; Writing—review & editing, Q.Z. and T.L.; Visualization, T.L.; Supervision, S.C. and G.W.; Project administration, S.C. and G.W.; Funding acquisition, G.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China under grants no. 52205611 and no. 41972325, the National Undergraduate Training Program for Innovation and Entrepreneurship under grant no. 202410491059, and the Exploration and Conjecture Program Project of China University of Geosciences (Wuhan) under grant no. EGP-202215.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Written informed consent has been obtained from the patient(s) to publish this paper.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Jennifer, L.; Hans, R. Visual ergonomics on-the-go. Work 2019, 63, 321–324. [Google Scholar] [CrossRef]
  2. Qiu, B.B.; Zhang, Y.F.; Shen, H.; Zhou, J.; Chu, L. Ergonomic researches in agricultural machinery—A systematic review using the PRISMA method. Int. J. Ind. Ergon. 2023, 95, 103446. [Google Scholar] [CrossRef]
  3. Fabio, G.; Margherita, P.; Luca, Z.; Marcello, P. An automatic procedure based on virtual ergonomic analysis to promote human-centric manufacturing. Procedia Manuf. 2019, 38, 488–496. [Google Scholar] [CrossRef]
  4. Manocha, M.; Soni, S.; Beck, M.; Ramirez, X. Scenario Based Training in Drilling Rig Simulator Improves Drillers Core Competency. In Proceedings of the SPE/IADC International Drilling Conference and Exhibition, Stavanger, Norway, 7–9 March 2023. [Google Scholar] [CrossRef]
  5. Bazazan, A.; Dianat, I.; Feizollahi, N.; Mombeini, Z.; Shirazi, A.M.; Castellucci, H.I. Effect of a posture correction-based intervention on musculoskeletal symptoms and fatigue among control room operators. Appl. Ergon. 2019, 76, 12–19. [Google Scholar] [CrossRef] [PubMed]
  6. Dai, M.Y.; Lu, Y.L.; Wang, S.; Zhang, H. Research progress on engineering mechanical ergonomic design based on virtual reality technology. J. Mach. Des. 2023, 37, 116–123. [Google Scholar] [CrossRef]
  7. Li, F.Y.; Li, Y.F. A Computer Aided Conceptual Evaluation of Excavator Cab Based on Ergonomics. Appl. Mech. Mater. 2012, 121–126, 1844–1848. [Google Scholar] [CrossRef]
  8. Chatterjee, T.; Bhattacharyya, D.; Majumdar, D.; Pal, M. Ergonomic Assessment of Multi Calibre Individual Weapon System in Virtual Reality Platform. Def. Sci. J. 2019, 69, 240–248. [Google Scholar] [CrossRef]
  9. Rahal, R.; Matarese, G.; Gabiccini, M.; Artoni, A.; Prattichizzo, D.; Giordano, P.R.; Pacchierotti, C. Caring about the human operator: Haptic shared control for enhanced user comfort in robotic telemanipulation. IEEE Trans. Haptics 2020, 13, 197–203. [Google Scholar] [CrossRef] [PubMed]
  10. Nadeau, S.; Salmanzadeh, H.; Ahmadi, M.; Landau, K. Aviation deicing workers, global risk assessment of musculoskeletal injuries. Int. J. Ind. Ergon. 2019, 71, 8–13. [Google Scholar] [CrossRef]
  11. Ma, Q.G.; Sun, X.L.; Fu, H.J.; Zhao, D.C.; Guo, J.F. Manufacturing process design based on mental and physical workload analysis. Appl. Mech. Mater. 2013, 345, 482–485. [Google Scholar] [CrossRef]
  12. Ellegast, R.P.; Kraft, K.; Groenesteijn, L.; Krause, F.; Berger, H.; Vink, P. Comparison of four specific dynamic office chairs with a conventional office chair: Impact upon muscle activation, physical activity and posture. Appl. Ergon. 2012, 43, 296–307. [Google Scholar] [CrossRef] [PubMed]
  13. Caputo, F.; D’Amato, E.; Spada, S.; Sessa, F. Upper Body Motion Tracking System with Inertial Sensors for Ergonomic Issues in Industrial Environments. In Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2016; Volume 489. [Google Scholar] [CrossRef]
  14. Gong, L.; Chen, B.; Xu, W.; Liu, C.; Li, X.; Zhao, Z.; Zhao, L. Motion Similarity Evaluation between Human and a Tri-Co Robot during Real-Time Imitation with a Trajectory Dynamic Time Warping Model. Sensors 2022, 22, 1968. [Google Scholar] [CrossRef] [PubMed]
  15. Dahibhate, G.; Shinde, R.; Sakle, N. The Use of Digital Human Modeling in Ergonomic Design and Product Development. J. Inst. Eng. India Ser. C 2023, 104, 1133–1138. [Google Scholar] [CrossRef]
  16. ISO 13485:2016; Medical Devices—Quality Management Systems—Requirements for Regulatory Purposes. International Organization for Standardization: Geneva, Switzerland, 2016.
  17. Sers, R.; Forrester, S.; Moss, E.; Ward, S.; Ma, J.; Zecca, M. Validity of the Perception Neuron inertial motion capture system for upper body motion analysis. Measurement 2020, 149, 107024. [Google Scholar] [CrossRef]
  18. ISO 11228-1:2003; Ergonomics—Manual Handling—Part 1: Lifting and Carrying. International Organization for Standardization: Geneva, Switzerland, 2003.
  19. ISO 9241-210:2019; Ergonomics of Human-System Interaction—Part 210: Human-Centered Design for Interactive Systems. International Organization for Standardization: Geneva, Switzerland, 2019.
  20. Li, S.; Hao, Y.; Mo, H.; Li, Q.; Lyu, Y.; Wang, X.; Li, H. Fast non-rigid human motion 3D reconstruction. J. Comput.-Aided Des. Comput. Graph. 2018, 30, 1505–1513. [Google Scholar] [CrossRef]
  21. Kavan, L.; Collins, S.; Zara, J.; O’Sullivan, C. Geometric Skinning with Approximate Dual Quaternion Blending. ACM Trans. Graph. 2008, 27, 105-1–105-23. [Google Scholar] [CrossRef]
  22. Tanaka, Y.; Nishikawa, K.; Yamada, N.; Tsuji, T. Analysis of Operational Comfort in Manual Tasks Using Human Force Manipulability Measure. IEEE Trans. Haptics 2015, 8, 8–19. [Google Scholar] [CrossRef] [PubMed]
  23. Gates, D.H.; Walters, L.S.; Cowley, J.; Wilken, J.M.; Resnik, L. Range of Motion Requirements for Upper-Limb Activities of Daily Living. Am. J. Occup. Ther. 2016, 70, 1–10. [Google Scholar] [CrossRef] [PubMed]
  24. GB/T14779-1993; Sitting Posture and Human Body Template Design Requirements. China Standards Press: Beijing, China, 1993.
  25. Zhang, R.M.; Liu, Z.Q.; Zhou, Q.X.; Shen, Y.H.; Li, C.M. Comfortableness evaluation of range of motion of human upper limb joint. China Saf. Sci. J. 2018, 28, 75–80. [Google Scholar] [CrossRef]
  26. Qing, T.; Li, Z.; Kang, J. Evaluation of human upper limb motion comfort based on motion capture system. In Proceedings of the IEEE International Conference on Automation and Computing, Newcastle upon Tyne, UK, 6–7 September 2018. [Google Scholar] [CrossRef]
Figure 1. Virtual human model.
Figure 2. Motion capture sensor and wear effect.
Figure 3. Parent–child relationship of human body skeleton.
Figure 4. Parent–child relationship of human upper limb bones.
Figure 5. Weight of the partial bones in the skeleton.
Figure 6. Angle analysis diagram of upper limb joint.
Figure 7. Torque analysis diagram of upper limb joints.
Figure 8. The 3D structure of the core drilling driller’s cabin.
Figure 9. Test platform.
Figure 10. Curves of joint angle and torque under operating rocker.
Figure 11. Curves of joint angle and torque under twist knob.
Figure 12. Curves of joint angle and torque under push button.
Figure 13. Analysis of visible domains.
Figure 14. Analysis of reachable domains.
Table 1. Main parameters of the human body (95th percentile adult Chinese male).

Parameter                      Value
Stature (cm)                   177.5
Weight (kg)                    75.0
Head length (cm)               19.8
Acromion height (cm)           147.1
Biacromial breadth (cm)        39.7
Arm length (cm)                79.9
Elbow span (cm)                135.0
Buttock-popliteal length (cm)  49.0
Thigh clearance (cm)           16.2
Table 2. Random position data at 60 frames per second (average position of each node, in m, at sampling nodes 1 to 10).

Operating rocker, Left Hand:
  x: −0.124 −0.122 −0.118 −0.112 −0.108 −0.103 −0.098 −0.093 −0.090 −0.089
  y: 0.843 0.847 0.854 0.864 0.873 0.884 0.896 0.910 0.925 0.938
  z: 0.194 0.192 0.191 0.189 0.188 0.186 0.187 0.189 0.193 0.199
Operating rocker, Left Lower Arm:
  x: −0.126 −0.123 −0.119 −0.114 −0.110 −0.105 −0.100 −0.095 −0.092 −0.091
  y: 0.839 0.843 0.849 0.859 0.867 0.878 0.890 0.904 0.919 0.931
  z: 0.188 0.186 0.184 0.183 0.182 0.181 0.181 0.184 0.189 0.194
Operating rocker, Left Upper Arm:
  x: −0.292 −0.292 −0.292 −0.293 −0.294 −0.295 −0.297 −0.299 −0.301 −0.303
  y: 0.931 0.930 0.930 0.929 0.930 0.930 0.931 0.933 0.935 0.938
  z: 0.011 0.010 0.009 0.009 0.010 0.013 0.018 0.026 0.035 0.044
Operating rocker, Left Shoulder:
  x: −0.183 −0.184 −0.184 −0.184 −0.184 −0.184 −0.184 −0.184 −0.184 −0.184
  y: 1.175 1.174 1.173 1.173 1.173 1.172 1.172 1.172 1.172 1.172
  z: −0.001 −0.002 −0.002 −0.003 −0.004 −0.004 −0.005 −0.005 −0.005 −0.005
Twist knob, Left Hand:
  x: −0.092 −0.092 −0.091 −0.091 −0.091 −0.092 −0.092 −0.093 −0.093 −0.093
  y: 0.836 0.836 0.837 0.837 0.838 0.836 0.839 0.839 0.839 0.840
  z: 0.034 0.034 0.038 0.039 0.041 0.042 0.043 0.044 0.045 0.045
Twist knob, Left Lower Arm:
  x: −0.092 −0.092 −0.092 −0.092 −0.092 −0.092 −0.092 −0.092 −0.092 −0.093
  y: 0.831 0.831 0.831 0.832 0.832 0.833 0.833 0.833 0.834 0.834
  z: 0.028 0.028 0.032 0.033 0.035 0.036 0.037 0.038 0.039 0.040
Twist knob, Left Upper Arm:
  x: −0.231 −0.231 −0.230 −0.230 −0.229 −0.229 −0.228 −0.228 −0.228 −0.229
  y: 0.929 0.929 0.928 0.927 0.927 0.927 0.927 0.927 0.926 0.926
  z: −0.166 −0.166 −0.164 −0.163 −0.163 −0.162 −0.162 −0.161 −0.161 −0.160
Twist knob, Left Shoulder:
  x: −0.136 −0.136 −0.136 −0.135 −0.135 −0.135 −0.135 −0.134 −0.134 −0.134
  y: 1.166 1.166 1.166 1.166 1.166 1.165 1.165 1.165 1.165 1.165
  z: −0.090 −0.090 −0.091 −0.091 −0.090 −0.090 −0.090 −0.090 −0.090 −0.090
Push button, Left Hand:
  x: −0.359 −0.360 −0.360 −0.359 −0.359 −0.358 −0.356 −0.354 −0.352 −0.349
  y: 0.868 0.868 0.869 0.869 0.869 0.869 0.869 0.870 0.870 0.871
  z: 0.198 0.201 0.202 0.204 0.204 0.204 0.205 0.205 0.205 0.206
Push button, Left Lower Arm:
  x: −0.357 −0.357 −0.358 −0.357 −0.356 −0.355 −0.353 −0.351 −0.349 −0.347
  y: 0.863 0.863 0.864 0.864 0.864 0.865 0.865 0.866 0.866 0.866
  z: 0.192 0.194 0.196 0.197 0.198 0.198 0.198 0.198 0.198 0.199
Push button, Left Upper Arm:
  x: −0.296 −0.297 −0.297 −0.298 −0.298 −0.298 −0.299 −0.299 −0.299 −0.301
  y: 0.974 0.975 0.976 0.976 0.977 0.977 0.976 0.975 0.975 0.975
  z: 0.011 0.010 0.009 0.009 0.010 0.013 0.018 0.026 0.035 0.044
Push button, Left Shoulder:
  x: −0.128 −0.128 −0.128 −0.129 −0.129 −0.129 −0.130 −0.130 −0.130 −0.131
  y: 1.171 1.171 1.172 1.173 1.173 1.172 1.172 1.172 1.172 1.172
  z: −0.097 −0.097 −0.097 −0.096 −0.096 −0.095 −0.095 −0.096 −0.094 −0.094
Table 3. Data processing results during the three actions.

Operating rocker:
  Shoulder angle (°): average 66.04, minimum 48.01, maximum 98.91
  Elbow angle (°): average 86.32, minimum 67.98, maximum 113.62
  Wrist angle (°): average 31.02, minimum 22.99, maximum 43.47
  Shoulder torque (N·cm): average 253.39, minimum 0, maximum 857.79
  Elbow torque (N·cm): average 119.66, minimum 0, maximum 483.62
Twist knob:
  Shoulder angle (°): average 95.46, minimum 65.79, maximum 130.06
  Elbow angle (°): average 108.65, minimum 82.81, maximum 133.29
  Wrist angle (°): average 78.82, minimum 58.12, maximum 104.85
  Shoulder torque (N·cm): average 296.78, minimum 0, maximum 1108.45
  Elbow torque (N·cm): average 189.35, minimum 0, maximum 746.83
Push button:
  Shoulder angle (°): average 66.04, minimum 48.01, maximum 98.91
  Elbow angle (°): average 82.61, minimum 50.83, maximum 117.43
  Wrist angle (°): average 36.30, minimum 14.32, maximum 63.18
  Shoulder torque (N·cm): average 213.80, minimum 0, maximum 738.64
  Elbow torque (N·cm): average 101.33, minimum 0, maximum 250.21
Table 4. Postural and angle ranges of the shoulder, elbow, and wrist joints.

Joint           Mode of motion              Limiting angle   Comfort zone
Shoulder joint  Forward and backward swing  140°~40°         40°~90°
Elbow joint     Flexion and extension       140°~40°         80°~110°
Wrist joint     Flexion and extension       80°~70°          10°~30°
Table 5. Average comfort index under the three movements.

Joint           Operating rocker   Twist knob   Push button
Shoulder joint  0.782              0.758        0.800
Elbow joint     0.833              0.797        0.815
Table 6. Comfort rating table.

Comfort level        I                    II                III                    IV                 V
Comfort index        0 ≤ σ < 0.2          0.2 ≤ σ < 0.4     0.4 ≤ σ < 0.6          0.6 ≤ σ < 0.8      σ ≥ 0.8
Comfort description  Very uncomfortable   Not comfortable   Generally comfortable  More comfortable   Very comfortable