Article

CREBAS: Computer-Based REBA Evaluation System for Wood Manufacturers Using MediaPipe

1 Department of Electronics Information System Engineering, Sangmyung University, 31 Sangmyungdae-gil, Dongnam-gu, Cheonan-si 31066, Chungcheongnam-do, Republic of Korea
2 Department of Information Security Engineering, Sangmyung University, 31 Sangmyungdae-gil, Dongnam-gu, Cheonan-si 31066, Chungcheongnam-do, Republic of Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(2), 938; https://doi.org/10.3390/app13020938
Submission received: 14 December 2022 / Revised: 1 January 2023 / Accepted: 4 January 2023 / Published: 10 January 2023

Abstract

Recently, musculoskeletal disorders (MSDs) caused by repetitive working postures at industrial sites have emerged as one of the biggest problems in the field of industrial health. The risk of MSDs caused by workers' repetitive working postures is quantitatively evaluated using the NIOSH Lifting Equation (NLE), the Ovako Working-posture Analysis System (OWAS), Rapid Upper Limb Assessment (RULA), Rapid Entire Body Assessment (REBA), etc. Methods used for working posture analysis include vision-based analysis and motion capture analysis. In vision-based analysis, an expert with ergonomics knowledge watches and manually analyzes recorded working images. Although this analysis is inexpensive, it takes a lot of time, and the analyst's subjective opinions or mistakes may be reflected in the results, making it somewhat unreliable. On the other hand, motion capture analysis can obtain more accurate and consistent results, but its measurement equipment is very expensive and it requires a large space for measurement. In this paper, we propose a computer-based automated REBA system that can evaluate working postures automatically and consistently in order to supplement the shortcomings of these existing methods. The CREBA system uses MediaPipe's body detection learning model to detect the worker's area in the recorded images and sets the body area based on the position of the face, detected using the face tracking learning model. In the set area, the positions of the joints are tracked using the posture tracking learning model, the joint angles are calculated from the joint positions using inverse kinematics, and the degree of load of the working posture is then calculated automatically with the REBA evaluation method. In order to verify the accuracy of the evaluation results of the CREBA system, we compared them with experts' vision-based REBA evaluation results. The experiment showed a slight difference of about 1.0 points between the evaluation results of the expert group and those of the CREBA system. It is expected that the ergonomic analysis method for the working posture used in this study will reduce workers' labor intensity and improve their safety and efficiency.

1. Introduction

As the proportion of repetitive tasks increases due to rapid industrial development and the automation of processes, the incidence of musculoskeletal disorders among industrial workers is increasing. In order to reduce occupational musculoskeletal disorders in workers, the Korea Occupational Safety and Health Agency (KOSHA) put industries under the obligation to prevent musculoskeletal disorders from July 2003 (Item 5 of Para. 1 of Art. 24 of the Occupational Safety and Health Act, Chapter 12 of the Rules on Occupational Safety and Health Standards). It issued and has enforced the scope of work burdening the musculoskeletal system (Ministry of Employment and Labor Notice No. 2003-24) and adopted a method of investigating harmful factors (KOSHA GUIDE H-9-2018), but cases of musculoskeletal disorders continue to occur [1]. Among domestic industries, the musculoskeletal disorders occurring in the top 10 industries with the most musculoskeletal disorders accounted for 64.9% of all musculoskeletal disorders, and they occurred mostly in manufacturing, wholesale and retail trade, consumer goods repair, and construction [2].
Figure 1 compares the incidence of occupational diseases and the incidence of musculoskeletal diseases based on the annual occurrence of accident cases reported by the Occupational Safety and Health Agency [3]. The report states that, as of 2021, there were 11,868 people suffering from musculoskeletal disorders, accounting for about 87.4% of the 13,578 people with occupational diseases, and the number has been on a steady rise since 2016. On the other hand, the rate of deaths per 10,000 workers from occupational accidents decreased by 0.18‱ (14%) from 1.25‱ in 2013 to 1.07‱ in 2021, whereas the musculoskeletal disorder rate per 10,000 workers increased by 2.59‱ (73%) from 3.53‱ in 2013 to 6.12‱ in 2021. According to these statistics, the number of deaths from industrial accidents has decreased, but the number of people with musculoskeletal disorders is on the rise.
In particular, as the reported musculoskeletal disorders are mainly caused by repeatedly working in a fixed position, many studies on work improvement and preventive measures have been conducted in the manufacturing and construction industries. Accordingly, as part of the policy to reduce and prevent musculoskeletal disorders, the Ministry of Employment and Labor legislated the duty of employers to prevent musculoskeletal disorders and made it mandatory to investigate related harmful factors and implement measures to improve the working environment. In addition, it is stipulated that workplace health managers direct the working environment to prevent musculoskeletal disorders by improving work methods and securing ergonomic working spaces. The respective roles of employers, health managers, and workers in preventing musculoskeletal disorders are also specified, and the preventive and management measures implemented for each are periodically evaluated and supplemented. However, despite such preventive measures at the workplace, the frequency of musculoskeletal disorders continues to increase, so the need for discussion of occupational accidents related to musculoskeletal disorders is constantly being raised.
Harmful factor surveys for the prevention of musculoskeletal disorders are conducted in only 11.4% of workplaces, even among manufacturing workplaces with five or more employees, where workers are more likely to develop musculoskeletal disorders [4]. In addition, the application of objective, quantitative ergonomic evaluation techniques is very low, at about 16%, and the low proficiency of non-experts in using evaluation tools makes the ergonomic evaluation results inaccurate and unreliable. In this study, we propose a method that uses MediaPipe to track the worker's joint information and automatically calculate the risk through the Rapid Entire Body Assessment (REBA) algorithm. The proposed CREBA method produces more objective and accurate evaluations and lowers the evaluation cost. The system is designed to be easily usable by non-experts without ergonomic knowledge by intuitively visualizing the data with a 3D human body model, along with joint angles and risk factors. This enables small businesses that cannot easily obtain evaluations from ergonomic experts to conduct musculoskeletal risk assessments with little cost and effort, and it provides ergonomics experts with an auxiliary tool for reliable evaluation.
In this study, REBA was selected as the ergonomic precision evaluation tool. It is difficult to obtain a uniform REBA evaluation because many values must be measured across the whole body, but, unlike other evaluation tools that assess only specific areas, REBA is suitable for measuring the musculoskeletal workload of manufacturing workers. To compare the accuracy of the evaluation, the evaluation group was composed of three nurses from a spinal and joint hospital with ergonomic expertise. Each evaluator watched a pre-recorded work video of a wood worker, and the results obtained through the ergonomics-based precision evaluation were compared with the evaluation results obtained through this system. The experiments showed an acceptable difference, with a mean score bias of about 1.0 points between the evaluation results of the REBA evaluation system proposed in this paper and those of the ergonomic experts. Therefore, the Computer-based REBA Evaluation System (CREBAS) is expected to be useful for the prevention of musculoskeletal disorders, even in small businesses where it is difficult to obtain a professional evaluation, by automatically evaluating workers' musculoskeletal risk.

2. Background

There are two major technologies related to the automation of musculoskeletal risk assessment. The first is musculoskeletal risk evaluation; the evaluation techniques include the NIOSH Lifting Equation (NLE), the Ovako Working-posture Analysis System (OWAS), Rapid Upper Limb Assessment (RULA), and REBA. Among them, REBA, which covers the whole body including the upper and lower body, arms, and wrists, was selected in light of the characteristics of manufacturing workers. The second is vision-based posture estimation. Depth-based posture estimation has the highest accuracy among vision-based techniques, but the required depth camera makes it difficult to apply in real field settings. Since MediaPipe shows high accuracy for posture estimation, the system was designed using it.

2.1. Musculoskeletal Risk Posture Analysis Technique

KOSHA suggests various precise evaluation methods for evaluating the harmful factors of workers' musculoskeletal disorders, and various studies are also being conducted for this purpose in many other countries. In the study [5], a statistical analysis of 1920 employees of 35 manufacturing companies selected through cluster sampling was conducted using questionnaires on demographic characteristics, ergonomics-related factors, occupational types, labor intensity, insomnia, and work-related musculoskeletal disorders (WMSDs), determining their correlation with musculoskeletal disorders. In the study [6], musculoskeletal disorders were evaluated for workers of petrochemical companies; the evaluation was also conducted through a questionnaire, and the Statistical Package for the Social Sciences (SPSS) was used to evaluate the risk of each job. However, with these evaluation methods, the subjective opinion of the evaluator may hamper the ability to draw accurate results, and therefore an expert or a systematic objective evaluation method is required.
Precise evaluation of work posture determines the risk of musculoskeletal disorders by applying evaluation techniques such as OWAS, RULA, and REBA based on the posture of the worker. The applied evaluation technique varies depending on the environment or method [7,8,9].
OWAS is a representative work posture evaluation technique developed by the steel company Ovako Oy in Finland in the mid-1970s and later jointly modified by Ovako Oy and the Finnish Institute of Occupational Health. OWAS is widely used because it is easy to learn and apply in the field, but detailed analysis is difficult because it oversimplifies the working posture. The working posture is simply classified into four levels, and the results are also not specific, requiring additional detailed analysis procedures [8,10].
RULA is a working posture assessment technique developed by McAtamney and Corlett at the University of Nottingham, UK, in 1993. It was designed to quickly and easily evaluate the workload caused by working posture by focusing on the upper limbs, such as the shoulders, wrists, and neck. RULA supports compliance with the EU's minimum safety and health requirements for Visual Display Unit (VDU) workplaces and the UK's guidelines for the prevention of occupational upper limb disorders. RULA can quickly and easily determine the proportion of workers with upper limb disorders caused by poor working posture, and it can evaluate the muscle load caused by a task by examining how working posture affects muscle fatigue, static or repetitive tasks, and the force required for the task. Although RULA was developed to provide comprehensive ergonomic evaluation results, its focus on the upper limbs limits its ability to evaluate various working postures [8,9,10,11,12,13].
REBA assesses an individual worker's exposure to harmful factors associated with musculoskeletal disorders. Compared to RULA, which focuses on the upper limbs, its measurement range is wide, so it is suitable for analyzing the degree of body burden and exposure to harmful factors in the automobile industry. It supplements the shortcoming of RULA, which covers only the upper limbs, by evaluating the wrists, forearms, elbows, shoulders, neck, trunk, waist, legs, knees, etc. [8,9,10,12,14]. Therefore, in order to evaluate the exact workload for the wood manufacturing industry studied in this paper, it is necessary to define the working posture and the appropriate evaluation method.
Table 1 shows the characteristics and reliability of each evaluation tool. RULA, which has been used in many previous studies, can evaluate the workload on the whole body, but it focuses on the workload on the arms. In the case of OWAS, the whole body is evaluated, but its oversimplification of posture prevents it from subdividing and evaluating various working postures. Lastly, in the case of REBA, although its evaluation reliability is rather low, it has a detailed evaluation system for the whole body, so it is suitable for evaluating the whole-body workload [13,14]. Therefore, this study evaluates the workload using REBA, an observation technique suitable for evaluating the workload on the worker's whole body.

2.2. Posture Estimation Techniques

The most commonly used approach in vision-based posture estimation is to use depth. Kinect, released by Microsoft in 2010, is the most representative example. Kinect is a device that can extract a person's posture using depth information obtained through an infrared beam projector and a monochrome Complementary Metal-Oxide-Semiconductor (CMOS) sensor. When the infrared laser beam is projected, the CMOS sensor receives the reflected laser points and measures the distance at each pixel, and the image processor processes these data to obtain the user's 3D information. The results of the study [15] verify its usefulness in acquiring the worker's joints with Kinect and Augmented Reality (AR) markers and evaluating the working posture. However, since it requires capturing depth images with Kinect and attaching AR markers to the worker's body, it is not practical in the real field.
The Forces [16] system enables automated musculoskeletal risk estimation through a workstation featuring motion capture, but it is also difficult to apply to real industrial fields because major joints need to be equipped with markers or sensors for motion capture.
Learning with deep neural networks is called deep learning. Among deep learning models, the Convolutional Neural Network (CNN) was introduced by LeCun to process images more effectively [17], and the modern form of the CNN was proposed by LeCun et al. in 1998 [18]. The underlying ideas were initially studied only in research on the visual cortex of the cerebrum, but since the 1990s they have been applied to image recognition. CNNs are widely used in image search, autonomous vehicles, and automatic image classification systems, and also in other fields beyond vision, such as speech recognition and natural language processing. Figure 2 shows the CNN architecture designed by LeCun. This architecture consists of convolution layers that extract features and pooling layers that subsample the features extracted by the convolution layers. The convolution layers apply a filtering operation to the image, and the pooling layers reduce the size of the image by converting local parts of the image into a representative scalar value. The image processing technology using the learning model in this study is based on this CNN.
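As a concrete illustration of the two layer types described above, the following minimal NumPy sketch (not part of the authors' system; the toy image, kernel, and pool size are arbitrary illustrative choices) shows how a convolution layer filters an image and how a max-pooling layer reduces each local region to a representative value.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D convolution: slide the kernel over the image and sum element-wise products."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    """Reduce each non-overlapping size x size block to its maximum value."""
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size
    return feature_map[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

image = np.random.rand(8, 8)             # toy grayscale "image"
edge_kernel = np.array([[1., 0., -1.],   # illustrative vertical-edge filter
                        [1., 0., -1.],
                        [1., 0., -1.]])
features = conv2d(image, edge_kernel)    # convolution layer: feature extraction
pooled = max_pool(features)              # pooling layer: spatial downsampling
print(features.shape, pooled.shape)      # (6, 6) (3, 3)
```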
OpenPose is the most widely used library for estimating posture by vision. It is a posture estimation model introduced by a team at Carnegie Mellon University (CMU) in 2017, and was released in 2019 with further improvements. OpenPose is a bottom-up model that estimates poses in real time through Part Affinity Fields (PAF) [19].
Posture estimation methods are divided into top-down and bottom-up methods, depending on whether the body or the joints are detected first. The top-down method is more accurate than the bottom-up method, but it is slower because it first detects the area of the person in the image and then estimates the person's posture within their bounding box. The bottom-up method is less accurate than the top-down method because it first detects the joints in the image and then analyzes the joints' correlations to estimate the posture by connecting them. However, it has the advantage of being fast because there is no human body tracking process. Figure 3 shows the system architecture of OpenPose [20].
Because of this bottom-up approach, OpenPose is fast but lacks accuracy. When a REBA evaluation is performed using OpenPose, the image is forced to be resized to 432 × 368 during preprocessing, so a vertical input image could be deformed, and overlapping or occluded areas could go unmeasured [21].
MediaPipe, another posture estimation technology, is an AI framework provided by Google. It is an open-source, cross-platform framework that provides various vision AI functions in the form of a pipeline using various types of perceptual data, such as video and audio. It can estimate a person's face, body parts, and fingers in real time. Implementing the posture estimation function, which is the key point of this study, requires a model that can provide the body's skeletal coordinates. These models have the following characteristics: they use a top-down method, they are light enough to enable real-time inference even with CPU operation, and they can also be used for commercial purposes. In the posture estimation process, if the user's posture is complicated or occluded areas occur, the position of the hidden joint needs to be estimated. There are two methods for this: heatmap and regression. The heatmap method estimates the position of each joint by generating a heatmap per joint and adjusting an offset for each joint. It probabilistically calculates the locations of the major joints through a learning model and estimates the position of the most likely parts in the form of a heatmap (Figure 4a). Although it has the advantage of being able to recognize the postures of several people, it takes a long time to compute due to the large amount of computation. On the other hand, the regression method (Figure 4b) requires few computing resources and is highly scalable, but it is vulnerable to occlusion. MediaPipe uses a mixture of these two methods to provide high-accuracy posture estimation with a small amount of computation.
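The difference between the two approaches can be sketched as follows (an illustrative NumPy example, not MediaPipe's internal code; the heatmap size, peak position, and regressed values are made up for illustration): a heatmap model outputs a per-joint probability map whose peak gives the joint location, while a regression model outputs the coordinates directly.

```python
import numpy as np

# Heatmap approach: the network outputs a probability map per joint;
# the joint position is taken at the map's peak (argmax).
heatmap = np.zeros((64, 64))
heatmap[40, 22] = 1.0                      # illustrative peak for one joint
y, x = np.unravel_index(np.argmax(heatmap), heatmap.shape)
joint_from_heatmap = (x / 64.0, y / 64.0)  # normalized image coordinates

# Regression approach: the network directly outputs normalized (x, y)
# coordinates per joint, e.g. a flat vector of length 2 * num_joints.
regressed = np.array([0.34, 0.62])         # illustrative output for the same joint
joint_from_regression = tuple(regressed)

print(joint_from_heatmap, joint_from_regression)
```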
MediaPipe's pipeline (Figure 5) consists of a body detector and a pose tracker. First, the body detector determines the body area, the key-point coordinates, and whether a person is present; then, the pose tracker estimates the person's Region of Interest (ROI) within the body area. When it is determined that there is a person in the image, only the tracker operates, and when it is determined that there is no person, the detector operates again to detect the body area [22].
Recent object detection technologies perform post-processing with Non-Maximum Suppression (NMS). However, NMS can cause malfunctions in complex gestures with overlapping key points, such as hugging or shaking hands. In order to make up for this shortcoming, MediaPipe uses a face detector, as shown in Figure 6, to detect a face in the image and track the posture based on the face. The face can be estimated quickly because it is the most distinctive of the body parts and has little variation. Based on the position of the face, additional parameters are derived, such as the midpoint of the pelvis, the size of the circle enclosing the human body, and the inclination according to the angle of the head.
The MediaPipe pose pipeline combines the key-points used by BlazeFace, BlazePalm, and COCO to track 33 joint key-points (see Figure 7). Unlike OpenPose or Kinect, it estimates only the minimum key-points necessary to estimate the ROI position, rotation, and size [23].
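For reference, the 33 landmarks described above can be obtained through MediaPipe's Python Pose solution as in the minimal sketch below (the system in this paper accesses MediaPipe from Unity3D, so this Python snippet is only an assumed, illustrative equivalent; the video file name is a placeholder).

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

# static_image_mode=False runs the detector once and then tracks,
# mirroring the detector/tracker pipeline described above.
with mp_pose.Pose(static_image_mode=False, model_complexity=1) as pose:
    cap = cv2.VideoCapture("worker_video.mp4")   # placeholder path
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # 33 landmarks, each with normalized x, y, z and a visibility score
            for idx, lm in enumerate(results.pose_landmarks.landmark):
                print(idx, lm.x, lm.y, lm.z, lm.visibility)
    cap.release()
```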
In this study, a posture estimation technique that can accurately estimate the posture of a worker in real time is required. A study analyzing the performance of OpenPose and MediaPipe compared two datasets of 1000 images each, in which every image contains one or two people [24]. The first (AR) dataset consisted of a variety of people and backgrounds without a specific topic, while the second (yoga) dataset consisted only of yoga and fitness postures. For a consistent comparison, only the 17 joints used in common by OpenPose and MediaPipe were used in the evaluation. The evaluation metric was the Percentage of Correct Keypoints (PCK) with a 20% tolerance, and the BlazePose model, the posture model used in MediaPipe, was compared by capacity (Full, Lite). BlazePose Full is a high-capacity model used on a desktop, and BlazePose Lite is a low-capacity model used on a mobile smartphone. As shown in the comparison results in Table 2, OpenPose had the highest accuracy on the AR dataset, and BlazePose Full showed the highest accuracy on the yoga dataset. As for the frame rate, BlazePose Full was 25 times faster than OpenPose in a desktop environment (20-core CPU), and BlazePose Lite was 70 times faster than OpenPose in a mobile environment (Google Pixel).
Table 3 shows the frames per second (FPS) results when the methods are compared using the experimental data of this study: the video of a wood manufacturing worker. The hardware used in the experiment was an AMD Ryzen 7 2700X CPU and an NVIDIA GeForce RTX 2070 GPU. In the experiment, MediaPipe ran at an average of 30 fps with CPU operation, whereas OpenPose showed slow performance, averaging 0.3 fps with CPU operation and 10 fps with GPU operation despite the high-performance PC environment.
In general, for whole-body posture images that do not involve occlusion or intersection of body parts, OpenPose is more accurate than MediaPipe, but it is significantly slower. Because the posture of a worker must be estimated in real time in this study, MediaPipe was used.
As for the video (iPhone11 Pro shot, 1080p resolution) of the working posture of a wood manufacturing worker used as experimental data in this study, the worker’s posture was estimated normally in the range of 2.5 to 6.0 m between the worker and the camera (See Figure 8).

3. REBA Evaluation System Using MediaPipe

The evaluation system proposed in this paper uses Unity3D, a 3D rendering engine, as an authoring tool to handle the worker’s joints in a 3D virtual environment [25]. Figure 9 shows the architecture of this evaluation system. First, it detects a body area from an image input through MediaPipe and extracts a 3D joint landmark by tracking the posture in the detected area. The measured joint information is visualized in three dimensions through a virtual skeletal model. Based on the relative position of each joint, the joint angle is calculated and input into the REBA evaluation module.

3.1. System Design

Figure 10 shows the graphical user interface (GUI) of this evaluation system. For visualization, the basic GUI provided by Unity3D was used, and expressing the detected body areas and joints as a skeleton overlaid on the input image enhances visibility. For the GUI, the image, text, toggle, and button components provided by Unity3D were used, and Render Texture and Line Renderer were used to visualize the detected areas and the 3D skeleton model.
  • Visualization of Human Pose Estimation results: The working posture image is encoded in RGB32 format and 640 × 480 resolution and output through the Image UI. The Bounding Box of the body area obtained by MediaPipe’s Human Detector and the joint obtained by Pose Tracking is overlaid on the image.
  • Visualization of Inverse Kinematics (IK) modeling: The position coordinates of each joint calculated through Human Pose Estimation are input to the target of the inverse kinematics (IK) model, and the applied result is displayed.
  • Details of Group A (waist, neck, legs): The bending angles of the hip, neck, and leg joints included in group A are displayed in white, and the scores for each part are displayed in green based on the REBA rule.
  • Details of Group B (upper arm, forearm, wrist): The bending angles of the shoulder, elbow, and wrist joints included in group B are displayed in white, and the scores for each part are displayed in green based on the REBA rule.
  • Selection of additional risk factors: Additional factors are selected based on group A's weight/force classification, group B's grip type, and group C's behavioral score.
  • Final results (score A, score B, score C, REBA score): The result of group A is score A, the result of group B is score B, and score C is obtained by combining score A and score B. The final REBA score is displayed in yellow for the left and right sides of the body, respectively.
  • Top menu (pause, image selection, end): The pause button is to stop the video being played, and the image selection button is to call up another image. The end button is to end the program.

3.2. Joint Position Estimation

The TensorFlow-based BlazeFace learning model detects the face area in the input video image. The body area is visualized by finding the body center-point based on the face position and drawing a rectangular boundary based on the face position and the center-point (Figure 11).
As shown in Figure 7, 33 joint landmark values are calculated by the BlazePose [26] learning model, which takes as input the cropped texture of the body region obtained from the human body detection. The position and connection information of each joint are visualized with points and lines.

3.3. Calculation of the Angles of Joints Using Inverse Kinematics (IK)

The input data required for the REBA evaluation method are the angles of six joints: waist, neck, leg, upper arm, forearm, and wrist. However, the joint landmarks calculated by MediaPipe do not include the waist and neck, and they contain only coordinate information, not angles.
Equation (1) finds the midpoint M between two connected points and is used to obtain the positions of the waist and neck. The position of the pelvis is the center of the left hip (Landmark 23) and the right hip (Landmark 24), the position of the chest is the center of the left shoulder (Landmark 11) and the right shoulder (Landmark 12), and these two positions allow the position of the waist to be determined. Equation (1) gives the midpoint M of two points in three-dimensional space, and Figure 12 shows the concept behind this equation.
$$ M = \left( \frac{x_1 + x_2}{2},\ \frac{y_1 + y_2}{2},\ \frac{z_1 + z_2}{2} \right) \tag{1} $$
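As a brief illustration of Equation (1) (a hypothetical sketch, not the authors' Unity3D code; the landmark container is assumed to be a list of 33 (x, y, z) tuples indexed as in Figure 7), the pelvis and chest positions can be computed as follows:

```python
def midpoint(p1, p2):
    """Midpoint M of two 3D points, as in Equation (1)."""
    return tuple((a + b) / 2.0 for a, b in zip(p1, p2))

def pelvis_and_chest(landmarks):
    """landmarks: assumed list of 33 (x, y, z) tuples indexed as in Figure 7."""
    pelvis = midpoint(landmarks[23], landmarks[24])  # left hip, right hip
    chest = midpoint(landmarks[11], landmarks[12])   # left shoulder, right shoulder
    return pelvis, chest                             # used to locate the waist segment
```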
Inverse Kinematics (IK) was used to obtain the joint angles required for the REBA evaluation. IK is mainly used in computer animation and robotics and is the opposite of Forward Kinematics. In Forward Kinematics, when the position and direction of a higher-level object in the hierarchy change, the position and direction of the lower-level objects are affected and determined; conversely, in Inverse Kinematics, the position and direction of a child object affect the position and direction of its parent object.
In this study, Final IK provided by Unity Asset Store was used for IK calculation [27]. Final IK provides Full Body IK based on Cyclic Coordinate Descent (CCD) [28]. CCD is suitable for calculating the Inverse Kinematics of joints with complex structures. Figure 13 shows the process of CCD. Assume there are five joints, and when P1 is the Base Joint and E is the End Effector, the angles of all joints are sequentially changed so that the difference in distance and direction between the End Effector and the Target (T) is minimized.
Equation (2) finds the angle of each joint through the dot product of two vectors (u, v) obtained from the target position and the positions of the two joints.
$$ \theta_i = \cos^{-1}\!\left( \frac{u_i \cdot v_i}{\lvert u_i \rvert\,\lvert v_i \rvert} \right) \tag{2} $$
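The following minimal Python sketch (an illustrative assumption, not the Final IK implementation; the joint coordinates are made-up numbers) evaluates Equation (2) for one joint, where u points from the joint to the end effector and v points from the joint to the target:

```python
import numpy as np

def joint_angle(joint, end_effector, target):
    """Equation (2): angle between u (joint -> end effector) and v (joint -> target)."""
    u = np.asarray(end_effector, dtype=float) - np.asarray(joint, dtype=float)
    v = np.asarray(target, dtype=float) - np.asarray(joint, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Illustrative values: an elbow at the origin, the wrist as end effector, and a target.
print(joint_angle((0, 0, 0), (0.3, 0, 0), (0.2, 0.2, 0)))  # ~45 degrees
```

In a CCD solver, this angle is computed for each joint in turn, from the end effector toward the base, and each joint is rotated by that amount so that the end effector approaches the target.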
Figure 14 shows a model with Full Body IK applied; the ball at the model's right hand is a target object used to calculate the right-hand joint. Based on IK, moving the ball object appropriately changes the positions of the remaining joints, creating a natural posture.
Figure 15 shows a human body model in which the body parts are connected hierarchically. The Human IK model is basically configured in a top-down manner based on the spine. The model is divided into the shoulder and hip chains based on the spine: the shoulder chain consists of the head, arm, and hand, and the hip chain consists of the leg, knee, and foot.
As shown in Figure 16, the joint landmarks of MediaPipe were matched to the scale factor of the IK model, and a total of nine joints were linked to end effectors. Setting the maximum weight for all linked effectors increases the response sensitivity of the connected joints. Figure 16 shows the angle GUI of the joints through the linked IK model.

3.4. REBA Posture Evaluation Algorithm

The working posture evaluation procedure of REBA is shown in Figure 17. In order to evaluate the working posture, the load value for the working posture needs to be calculated. Scores are given depending on the angles of the skeletons belonging to each group, and the load values are calculated using the given scores [8,9,12,14].

3.4.1. Group A Evaluation

First, the parts belonging to Group A are the trunk (waist), neck, and legs. Scores for each part are calculated based on its posture and then adjusted depending on body twisting or lateral bending. The scores are determined by the classification systems for the trunk, neck, and leg postures in Table 4, Table 5 and Table 6. For the neck and waist, one point is added for twisting or side bending, and for the legs, one or two points are added depending on the knee bending angle.
Since the scores determined by the classification system represent the individual load level of each part, it is necessary to combine them. Therefore, using REBA Table A in Table 7, the scores of the three parts are combined, and the evaluation of the handling weight in Table 8 is added.

3.4.2. Group B Evaluation

As for Group B, scores are calculated based on the postures of the upper arm, lower arm, and wrist, and the scores are adjusted depending on the posture of the upper arm and wrist, as shown in Table 9, Table 10 and Table 11. In the case of the upper arm, one point is added when the arm is abducted (opened) or rotated or the shoulder is raised, and one point is subtracted when the arm is supported. In the case of the wrist, one point is added for a twist.
The score determined by the classification system of upper arm, lower arm, and wrist posture is combined using REBA Table B in Table 12, and the evaluation of the handle in Table 13 is added.

3.4.3. Determination of the Overall Workload

Finally, posture score A and posture score B are combined using REBA Table C in Table 14, and the behavior (activity) scores in Table 15 are then added to calculate the REBA score. The calculated REBA score determines the required action level according to the decision criteria in Table 16.
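The overall combination logic of Figure 17 can be sketched as follows (a hypothetical Python sketch; the lookup tables are placeholder stubs that would need to be filled with the actual values of Tables 7, 12 and 14 from Hignett and McAtamney [14], and the function and variable names are illustrative only):

```python
# Placeholder lookup tables: fill with the full REBA Table A, Table B, and Table C
# values (Tables 7, 12 and 14); only one illustrative entry is shown for each.
TABLE_A = {(1, 1, 1): 1}   # (trunk, neck, legs) -> posture score A
TABLE_B = {(1, 1, 1): 1}   # (upper_arm, lower_arm, wrist) -> posture score B
TABLE_C = {(1, 1): 1}      # (score_a, score_b) -> score C

def reba_score(trunk, neck, legs, upper_arm, lower_arm, wrist,
               load_score=0, coupling_score=0, activity_score=0):
    score_a = TABLE_A[(trunk, neck, legs)] + load_score                # Group A + load/force (Table 8)
    score_b = TABLE_B[(upper_arm, lower_arm, wrist)] + coupling_score  # Group B + coupling (Table 13)
    score_c = TABLE_C[(score_a, score_b)]                              # combined via Table C (Table 14)
    return score_c + activity_score                                    # final REBA score (Table 15 added)

# Example with the placeholder entries: neutral postures, no extra load or activity.
print(reba_score(1, 1, 1, 1, 1, 1))  # -> 1 with the stub tables above
```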

4. Experiments and Results

In order to verify the accuracy of the system, the seven types of postures (cutting machines, circular saws, drilling machines, saws, chisels, planers, loading/unloading of heavy objects) in Figure 18 that are most commonly used in actual wood and wood product manufacturing plants were selected, and the working scenes were recorded as videos. Based on the recorded images, the accuracy was evaluated by comparing the evaluation results of experts with the results of the proposed system.

4.1. Experimental Environments

The precision evaluation experiment in this study consists of three stages: evaluation preparation, expert evaluation, and feedback. First, the consent of the ergonomic experts to participate in the experiment was obtained, and detailed information about the task to be evaluated (work description, photos, weight of the task object, and number of task repetitions) was provided. Second, the ergonomics experts watched the recorded videos and evaluated them by applying REBA. Finally, feedback was provided on the results that differed from those of this system.
The six working posture images were shot with the iPhone 11 Pro's default camera at a resolution of 1080p. During shooting, the distance between the worker and the camera was set to 2.5 to 3.0 m, and the worker's face was made to appear on the screen (Figure 19).

4.2. Evaluation Results by REBA Experts

The Golden Reference for the task to be evaluated in this study was determined by three ergonomic experts (nurses at a spine and joint hospital). The participating ergonomic experts independently performed the ergonomic precision evaluation (REBA), shown in Figure 20, through six images, and the average value of the calculated evaluation results was determined as the golden reference.

4.2.1. Evaluation of Cutting Machine Working Posture

Figure 21 shows the results of the REBA evaluation by the three evaluators of the working posture using the cutting machine. The left and right parts of group B (upper arm, forearm, wrist) were calculated as average values, and two behavioral points were added because body parts were fixed and the work was repeated within a narrow range, resulting in 7 points, 6 points, and 5.5 points. As a result, the action level was found to be Level 2 (Further Investigate). In the detailed results for the cutting machine working posture, score A was high, at 2.5 points, 4 points, and 5 points, respectively, and the waist and leg posture scores accounted for a large part of this.

4.2.2. Evaluation of Circular Saw Working Posture

Figure 22 shows the results of the REBA evaluation by the three evaluators for the circular saw working posture. The left and right parts of group B (upper arm, forearm, and wrist) were calculated as average values, and two behavioral points were added because body parts were fixed and the work was repeated within a narrow range, resulting in three points, four points, and five points. As a result, the action levels were found to be Level 1 (Change may be needed) and Level 2 (Further Investigate/Change Soon). In the detailed results for the circular saw working posture, score B was slightly higher than the other scores, at two points, two points, and three points, but the overall score was good, and it is considered to be a working posture with low musculoskeletal risk.

4.2.3. Evaluation of Drilling Machine Working Posture

Figure 23 shows the results of the REBA evaluation by the three evaluators of the drilling machine working posture. The left and right parts of group B (upper arm, forearm, wrist) were calculated as average values, and two behavioral points were added because body parts were fixed and the work was repeated within a narrow range, resulting in three points, four points, and five points. As a result, the action levels were found to be Level 1 (Change may be needed) and Level 2 (Further Investigate/Change Soon). In the detailed results for the drilling machine working posture, score B was high, at 1.5 points, 4 points, and 5.5 points, and the shoulder score in particular accounted for a large part of this.

4.2.4. Evaluation of Saw Working Posture

Figure 24 shows the results of the REBA evaluation by the three evaluators for the saw working posture. The left and right parts of group B (upper arm, forearm, and wrist) were calculated as average values, and two behavioral points were added because body parts were fixed and the work was repeated within a narrow range, resulting in 5.5 points, 6 points, and 4 points. As a result, the action level was found to be Level 2 (Further Investigate/Change Soon). In the detailed results for the saw working posture, score A was high, at four points, three points, and five points, and the scores of the waist, neck, and legs in particular accounted for a large part of this.

4.2.5. Evaluation of Chisel Working Posture

Figure 25 shows the results of the REBA evaluation by the three evaluators for the chisel working posture. The left and right parts of group B (upper arm, forearm, and wrist) were calculated as average values, and two behavioral points were added because body parts were fixed and the work was repeated within a narrow range, resulting in six points, six points, and five points. As a result, the action level was found to be Level 2 (Further Investigate/Change Soon). In the detailed results for the chisel working posture, score A was high, at four points, four points, and two points, and the scores of the waist, neck, and legs in particular accounted for a large part of this.

4.2.6. Evaluation of Planer Working Posture

Figure 26 shows the results of the REBA evaluation by the three evaluators for the planer working posture. The left and right parts of group B (upper arm, forearm, and wrist) were calculated as average values, and two behavioral points were added because body parts were fixed and the work was repeated within a narrow range, resulting in 6 points, 8 points, and 4.5 points. As a result, the action levels were found to be Level 2, Medium Risk (Further Investigate/Change Soon), and Level 3, High Risk (Investigate/Implement Change). In the detailed results for the planer working posture, the scores of the waist, neck, and shoulder were high.

4.2.7. Evaluation of the Posture for Heavy Load Work

Figure 27 shows the results of the REBA evaluation by the three evaluators for the heavy load working posture. The left and right parts of group B (upper arm, forearm, and wrist) were calculated as average values, and two behavioral points were added because body parts were fixed and the work was repeated within a narrow range, resulting in 13 points, 10.5 points, and 10 points. As a result, the action level was found to be Level 4, Very High Risk (Implement Change). In the detailed results for the heavy load working posture, the scores of the waist, neck, and shoulder were very high.

4.3. Evaluation Results by Computer-Based REBA

The seven working postures (cutters, circular saws, drilling machines, saws, chisels, planers, and heavy loads) used in the REBA evaluation using MediaPipe are commonly repeated postures in the wood manufacturing industry. In the evaluation system, the REBA score of the worker’s motion and posture is indicated.

4.3.1. Evaluation of Cutting Machine Working Posture

Posture analysis was conducted by inputting the same working posture images used for the expert evaluation into the vision AI-based REBA evaluation system developed in this study. Figure 28 shows the result of evaluating the cutting machine working image with this system. The bending scores of the waist and legs were good, at one point each, but the bending score of the neck was high, at three points. On the other hand, the results for the shoulder, elbow, and wrist were generally good. Combining the results of groups A and B, score C was low, at three points on the left and two points on the right, but because more than one body part is fixed and repetitive tasks are performed within a narrow range, two behavioral points were added. The final REBA score was five on the left and four on the right (Medium Risk, Further Investigate).

4.3.2. Evaluation of Circular Saw Working Posture

Figure 29 is the result of evaluating the circular saw machine working image. The joint scores were good overall, but the risk score of the right shoulder was high due to the lifting posture. Because more than one body part is fixed and repetitive tasks are performed in a narrow range, two behavioral points were added. The final REBA evaluation results are five points on both sides (Medium Risk, Further Investigate/Change soon).

4.3.3. Evaluation of Drilling Machine Working Posture

Figure 30 is the result of evaluating the drilling machine working image. The joint scores were good overall, but the elbow angle was not good and the risk score of the right shoulder was high due to the lifting posture. Because more than one body part is fixed and repetitive tasks are performed in a narrow range, two behavioral points were added. The final REBA evaluation result is three points on both sides (Low Risk, Change may be needed).

4.3.4. Evaluation of Saw Working Posture

Figure 31 shows the result of evaluating the saw working image. The scores of the lower back, neck, and legs were higher due to the bent lower back posture. The risk scores of the shoulder, elbow, and wrist were good, but because more than one body part is fixed and repetitive tasks are performed in a narrow range, two behavioral points were added. The REBA evaluation result is five points on both sides (Medium Risk, Further Investigate/Change Soon).

4.3.5. Evaluation of Chisel Working Posture

Figure 32 shows the result of evaluating the chisel working image. The risk scores of the waist, neck, and legs were slightly higher due to the bent waist and head posture, and one point was added to the strength/weight score due to hammering. In addition, since more than one body part is fixed and repetitive tasks are performed in a narrow range, two behavioral points were added. The final REBA result was six points on the left and seven points on the right (Medium Risk, Further Investigate/Change Soon).

4.3.6. Evaluation of Planer Working Posture

Figure 33 is the result of evaluating the planer working image. The risk scores of wrist, neck, and leg were rather high due to the posture of bending the body and the head. In addition, more than one body part is fixed and repetitive tasks are performed in a narrow range, so two behavioral points were added. The final REBA results were seven points on the left (Medium Risk, Further Investigate/Change Soon) and eight points on the right (High Risk, Investigate and Implement Change).

4.3.7. Evaluation of the Posture for Heavy Load Work

Figure 34 is the result of evaluating the image of the heavy load working posture. The scores were high overall due to the posture of bending the body sideways; the head and the legs also greatly influenced the results. The weight of the load was 20 kg, and two points were added to the weight score, and the handle score was high because there was no grip. In addition, since more than one body part is fixed, and repetitive tasks are performed in a narrow range, two behavioral points were added. The final REBA results were 10 points on the left (High Risk, Investigate and Implement Change) and 11 points on the right (High Risk, Investigate and Implement Change).

4.4. Comparison of Evaluation Results

REBA evaluation by experts shows differences depending on the proficiency of the evaluator. In addition, it takes about 20 min to explain the REBA evaluation method and the evaluation video to the subject, and about 10 to 15 min to fill out the evaluation table. On the other hand, in the CREBA evaluation, the average time was measured by performing the evaluation 100 times for each of the seven postures, and the results are shown in Table 17.
To verify the accuracy of the method proposed in this study, the evaluation results of the implemented system for the selected working postures were compared with those of the evaluators. Table 18 shows the average scores of the three evaluators for the six types of working postures (cutting machine, circular saw, drilling machine, saw, chisel, and planer), and Table 19 shows the evaluation scores obtained using MediaPipe.
Overall, the comparison of the cutting machine working posture scores (Figure 35) showed similar results between the evaluation scores of the ergonomic expert and the evaluation scores of the proposed system. As for the scores for each joint, the MediaPipe scores were lower in all parts except the neck, and the part showing the biggest difference was the trunk, with an error of 1.3 points. The REBA scores were 4.5 points and 6.7 points, respectively, indicating the same action level (Level 2).
The comparison of the circular saw working posture scores (Figure 36) showed similar results overall. The scores were the same for the trunk and upper arms, and the MediaPipe scores were lower in other parts except the neck. The part showing the biggest difference was the neck, with an error of one point. The REBA scores were three points and four points, respectively, indicating the action level of Level 2 and Level 3.
As for the comparison of the drilling machine working posture scores (Figure 37), the scores for each joint were the same for the trunk, legs, and wrist, and the MediaPipe scores were lower in the other parts except for the neck and lower arms. The part showing the biggest difference was the upper arms, with an error of 1.5 points. The REBA scores were 3.5 points and 4 points, indicating the same action level (Level 2).
As a result of the comparison of the saw working posture scores (Figure 38), scores for each joint were similar overall, with a maximum error of one point, and MediaPipe scores were higher in all parts except for the trunk. The REBA scores were six points and five points, indicating the same action level (Level 2).
As a result of the comparison of the chisel working posture scores (Figure 39), scores for each joint were similar overall, with a maximum error of one point, and MediaPipe scores were higher in all parts, except for the upper and lower arms. The REBA scores were five points and six points, indicating the same action level (Level 2).
As a result of the comparison of the planer working posture scores (Figure 40), the scores for each joint were similar overall, with a maximum error of 1.3 points, and the REBA scores were all 6 points, indicating the same action level (Level 2).
As a result of the comparison of the scores of the heavy load working posture (Figure 41), the scores for each joint were similar, with a maximum error of 1.3 points, and the REBA scores were all 10–11, indicating the same action level (Level 4).
As a result of comparing the measurement and evaluation times, it took an average of 40 min for the ergonomic experts to evaluate all six working postures in the precision evaluation, whereas the precision evaluation using MediaPipe was performed in real time. Next, a comparison of the evaluation scores of the ergonomic experts with those of this system shows that the overall scores were the same or differed only slightly, and the musculoskeletal characteristics of each posture were the same. However, in the case of the cutting machine, the left arm was judged to be at a safer angle than the right arm by the implemented system, but at a more dangerous angle than the right arm in the evaluation of the ergonomic experts. Feedback revealed that the left arm was occluded during the experts' evaluation, making it difficult to identify its posture accurately, so MediaPipe's results could be considered more accurate.

5. Conclusions

Reports and research on domestic and foreign industrial sites show that musculoskeletal disorders are a significant problem, causing not only economic and time losses but also reduced production efficiency. In this study, a method was proposed to measure the working posture of a worker using MediaPipe's posture tracking technology and to evaluate the working posture ergonomically by applying the REBA technique.
First, the joint landmarks were obtained by measuring the worker's working posture using MediaPipe, and the joint landmark data were applied to a 3D skeleton model and visualized in a virtual environment. Using the relative positions of the joint landmarks, the joint angles required by the REBA evaluation technique were calculated and scores were assigned. In addition, the system's results for the same working posture images were compared with the evaluation results of the ergonomic experts, and it was found that differences arose mainly in cases of occlusion.
Nevertheless, the joint angle visualization of MediaPipe is expected to help with expert evaluation, thereby reducing the time and cost consumed by evaluation. In addition, it is expected to be useful at general industrial sites because it can determine the degree of load of the working posture.
However, due to the environmental characteristics of various industrial sites, the recognition accuracy may be lowered, and it is difficult to apply the method to image frames in which the worker's face is covered. Therefore, future studies aim to use images taken from multiple angles with multiple cameras and combine them into one image to minimize occlusion and increase the evaluation accuracy.
It is expected that the ergonomic working method for the working posture analyzed through this study will reduce work effort and improve worker safety and efficiency.

Author Contributions

Conceptualization, S.-o.J. and J.K.; methodology, S.-o.J. and J.K.; software, S.-o.J.; validation, S.-o.J. and J.K.; formal analysis, S.-o.J.; investigation, S.-o.J.; resources, S.-o.J.; data curation, S.-o.J. and J.K.; writing—original draft preparation, S.-o.J. and J.K.; writing—review and editing, J.K.; visualization, S.-o.J.; supervision, J.K.; project administration, J.K.; funding acquisition, J.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

This research was funded by a 2021 research Grant from Sangmyung University.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Korea Occupational Safety & Health Agency (KOSHA). Musculoskeletal disorder Protection Business Manual. Available online: https://kosha.or.kr/kosha/data/musculoskeletalPreventionData_G.do?mode=download&articleNo=296739&attachNo=230707 (accessed on 6 May 2022).
  2. Yoo, C. An Analysis of Characteristics of Musculoskeletal Disorders Risk Factors. J. Ergon. Soc. Korea 2009, 28, 17–25. [Google Scholar] [CrossRef] [Green Version]
  3. Korea Occupational Safety & Health Agency (KOSHA). The Status of Industrial Accident (2012–2021). Available online: http://www.kosha.or.kr/kosha/data/industrialAccidentStatus.do (accessed on 6 May 2022).
  4. Korea Occupational Safety & Health Agency (KOSHA). 2019 Working Environment Factual Survey. Available online: http://www.kosha.or.kr/kosha/business/inspection.do (accessed on 6 May 2022).
  5. Hao, P.; Li, Y.B.; Wu, S.S.; Yang, X.Y. Investigation and analysis of work-related occupational musculoskeletal disorders and associated risk factors of manufacturing workers. Zhonghua Laodong Weisheng Zhiyebing Zazhi 2020, 38, 187–192. Available online: https://www-ncbi-nlm-nih-gov.libproxy.smu.ac.kr/pubmed/32306691 (accessed on 1 January 2023). [PubMed]
  6. Anonymous. Prevalence of Musculoskeletal Symptoms and Assessment of Working Conditions in an Iranian Petrochemical Industry. J. Health Sci. Surveill. Syst. 2013, 1, 33–40. Available online: https://explore.openaire.eu/search/publication?articleId=doajarticles::8228592723614b3d27cc21a38d1cc7a9 (accessed on 31 December 2022).
  7. Korea Occupational Safety & Health Agency (KOSHA). Musculoskeletal Risk Assessment Tool Manual. Available online: https://www.kosha.or.kr/kosha/business/musculoskeletal_c_d.do (accessed on 6 May 2022).
  8. Lee, K.; Shin, Y.; Koo, H.; Gwon, S. Comparison of Posture Evaluation Methods of OWAS, RULA and REBA in Orchards. Proc. Ergon. Soc. Korea 2011, 59–62. Available online: https://www-dbpia-co-kr.libproxy.smu.ac.kr/journal/articleDetail?nodeId=NODE01815744 (accessed on 1 January 2023).
  9. Nelfiyanti; Mohamed, N.; Rashid, M.F.F.A. Analysis of Measurement and Calculation of MSD Complaint of Chassis Assembly Workers Using OWAS, RULA and REBA Method. Int. J. Automot. Mech. Eng. 2022, 19, 9681. [Google Scholar] [CrossRef]
  10. Cheon, W.; Jung, K. Analysis of Accuracy and Reliability for OWAS, RULA, and REBA to Assess Risk Factors of Work-related Musculoskeletal Disorders. J. Korea Saf. Manag. Sci. 2020, 22, 31–38. [Google Scholar]
  11. Gómez-Galán, M.; Callejón-Ferre, Á.; Pérez-Alonso, J.; Díaz-Pérez, M.; Carrillo-Castrillo, J. Musculoskeletal Risks: RULA Bibliometric Review. Int. J. Environ. Res. Public Health 2020, 17, 4354. [Google Scholar] [CrossRef]
  12. Gorde, M.S.; Borade, A.B. The Ergonomic Assessment of Cycle Rickshaw Operators Using Rapid Upper Limb Assessment (Rula) Tool and Rapid Entire Body Assessment (Reba) Tool. Syst. Saf. 2019, 1, 219–225. [Google Scholar] [CrossRef] [Green Version]
  13. McAtamney, L.; Nigel Corlett, E. RULA: A survey method for the investigation of work-related upper limb disorders. Appl. Ergon. 1993, 24, 91–99. [Google Scholar] [CrossRef]
  14. Hignett, S.; McAtamney, L. Rapid Entire Body Assessment (REBA). Appl. Ergon. 2000, 31, 201–205. [Google Scholar] [CrossRef] [PubMed]
  15. Kim, J.; Park, H. Working Posture Analysis for Preventing Musculoskeletal Disorders using Kinect and AR Markers. Korean J. Comput. Des. Eng. 2018, 23, 19–28. [Google Scholar] [CrossRef]
  16. Marín, J.; Marín, J.J. Forces: A Motion Capture-Based Ergonomic Method for the Today’s World. Sensors 2021, 21, 5139. [Google Scholar] [CrossRef]
  17. LeCun, Y.; Boser, B.; Denker, J.S.; Henderson, D.; Howard, R.E.; Hubbard, W.; Jackel, L.D. Backpropagation Applied to Handwritten Zip Code Recognition. Neural Comput. 1989, 1, 541–551. [Google Scholar] [CrossRef]
  18. Lecun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324. [Google Scholar] [CrossRef]
  19. OpenPose. Available online: https://github.com/CMU-Perceptual-Computing-Lab/openpose (accessed on 6 May 2022).
  20. Zhe, C.; Simon, T.; Shih-En, W.; Sheikh, Y. Realtime Multi-person 2D Pose Estimation Using Part Affinity Fields. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 1302–1310. [Google Scholar]
  21. Komilov, D.; Jung, K. Development of a Semi-Automatic Rapid Entire Body Assessment System using the Open Pose and a Single Working Image. Proc. Korean Inst. Ind. Eng. 2020, 84, 1503–1517. Available online: https://www-dbpia-co-kr.libproxy.smu.ac.kr/journal/articleDetail?nodeId=NODE10505750 (accessed on 27 December 2022).
  22. MediaPipe Object Detection. Available online: https://google.github.io/mediapipe/solutions/object_detection.html (accessed on 6 May 2022).
  23. MediaPipe Pose. Available online: https://google.github.io/mediapipe/solutions/pose.html (accessed on 6 May 2022).
  24. Detection of Human Body Landmarks-MediaPipe and OpenPose Comparison. Available online: https://www.hearai.pl/post/14-openpose/ (accessed on 6 May 2022).
  25. Unity3D. Available online: https://unity.com/ (accessed on 6 May 2022).
  26. Bazarevsky, V.; Grishchenko, I.; Raveendran, K.; Zhu, T.; Zhang, F.; Grundmann, M. BlazePose: On-device Real-Time Body Pose Tracking. 2020. Available online: https://arxiv.org/abs/2006.10204 (accessed on 27 December 2022).
  27. FinalIK Document. Available online: http://www.root-motion.com/finalikdox/html/pages.html (accessed on 6 May 2022).
  28. CCD IK. Available online: http://www.root-motion.com/finalikdox/html/page5.html (accessed on 6 May 2022).
Figure 1. Occupational accidents in Korea in 2012–2021.
Figure 2. Convolutional Neural Network.
Figure 3. OpenPose architecture. When the image in (a) is input, joints are detected with the VGG-19 backbone, one of the CNN models, producing the heatmaps in (b). From the heatmaps, the Part Affinity Fields (PAFs) in (c) are constructed to analyze the associations between joints, and the joints are then grouped and connected as shown in (d) to obtain the result in (e).
Figure 4. Posture estimation technique: (a) Heatmap; (b) Regression.
Figure 5. MediaPipe pipeline.
Figure 6. Conceptual diagram of the body area tracking by BlazePose.
Figure 7. MediaPipe Pose Landmark.
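For readers reproducing the pipeline, the 33 landmarks of Figure 7 can be read with the MediaPipe Pose Python API roughly as follows. This is a minimal sketch only: the input file name is a placeholder, and CREBAS itself streams frames from the recorded working videos into its Unity front end rather than single images.

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

with mp_pose.Pose(static_image_mode=True, model_complexity=1) as pose:
    frame = cv2.imread("worker_frame.png")  # placeholder input frame
    results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.pose_landmarks:
        landmarks = results.pose_landmarks.landmark
        shoulder = landmarks[mp_pose.PoseLandmark.LEFT_SHOULDER]
        # Normalized image coordinates plus a relative depth and a visibility score.
        print(shoulder.x, shoulder.y, shoulder.z, shoulder.visibility)
```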
Figure 8. Posture detection and estimation based on the distance: (a) 7.0 m; (b) 6.0 m; (c) 2.5 m; (d) 1.0 m.
Figure 9. REBA evaluation system using MediaPipe.
Figure 10. GUI of REBA evaluation system using MediaPipe.
Figure 11. Detection and body estimation: (a) Visualization of the detected body area; (b) body estimation and visualization.
Figure 12. Procedure of finding midpoints of two points.
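A worked example of the midpoint construction in Figure 12, assuming landmarks are available as normalized (x, y, z) tuples; the helper name and the shoulder coordinates are illustrative, not taken from the CREBAS source.

```python
def midpoint(p1, p2):
    """Midpoint of two 3D points given as (x, y, z) tuples."""
    return tuple((a + b) / 2.0 for a, b in zip(p1, p2))

# Example: a chest/neck reference point halfway between the two shoulder landmarks.
left_shoulder = (0.42, 0.35, -0.10)
right_shoulder = (0.58, 0.36, -0.08)
print(midpoint(left_shoulder, right_shoulder))  # (0.5, 0.355, -0.09)
```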
Figure 13. Procedure of CCD IK.
Figure 14. The position change of joints depending on the position change of the target.
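CREBAS relies on the CCD solver of FinalIK inside Unity [27,28]; the standalone Python sketch below only illustrates the cyclic coordinate descent idea behind Figures 13 and 14 (planar chain, no joint limits) and is not the FinalIK implementation.

```python
import math

def ccd_ik(joints, target, iterations=10, tolerance=1e-3):
    """Minimal 2D CCD IK. 'joints' is a list of [x, y] positions from root to end effector.
    Each pass rotates the sub-chain about every joint (end to root) so the end effector
    moves toward 'target'; bone lengths are preserved because whole sub-chains are rotated."""
    joints = [list(j) for j in joints]
    for _ in range(iterations):
        for i in range(len(joints) - 2, -1, -1):  # pivot joints, from just before the end effector to the root
            end = joints[-1]
            jx, jy = joints[i]
            # Angle between joint->end-effector and joint->target.
            a_end = math.atan2(end[1] - jy, end[0] - jx)
            a_target = math.atan2(target[1] - jy, target[0] - jx)
            rot = a_target - a_end
            cos_r, sin_r = math.cos(rot), math.sin(rot)
            # Rotate every joint after the pivot around the pivot.
            for k in range(i + 1, len(joints)):
                dx, dy = joints[k][0] - jx, joints[k][1] - jy
                joints[k][0] = jx + dx * cos_r - dy * sin_r
                joints[k][1] = jy + dx * sin_r + dy * cos_r
        if math.dist(joints[-1], target) < tolerance:
            break
    return joints

# Example: a three-bone chain reaching for a nearby target.
print(ccd_ik([[0, 0], [1, 0], [2, 0], [3, 0]], target=[1.5, 1.5]))
```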
Figure 15. Avatar model information to which IK will be applied: (a) Avatar rigging model; (b) Avatar skeleton mapping; (c) Joint point information.
Figure 16. Angle GUI of the joint through the linked IK model.
Figure 17. REBA evaluation procedure.
Figure 18. Working posture depending on wood processing tools: (a) cutting machines; (b) circular saws; (c) drilling machines; (d) saws; (e) chisels; (f) planers; (g) loading/unloading of heavy objects.
Figure 19. Camera angle and distance.
Figure 20. Examples of filling out the evaluation sheet.
Figure 21. Precision evaluation results of cutter working posture.
Figure 22. Precision evaluation results of circular saw working posture.
Figure 23. Precision evaluation results of drilling machine working posture.
Figure 24. Precision evaluation results of saw working posture.
Figure 25. Precision evaluation results of chisel working posture.
Figure 26. Precision evaluation results of planer working posture.
Figure 27. Precision evaluation results of heavy load working posture.
Figure 28. Evaluation of cutting machine working posture.
Figure 29. Evaluation of circular saw working posture.
Figure 30. Evaluation of drilling machine working posture.
Figure 31. Evaluation of saw working posture.
Figure 32. Evaluation of chisel working posture.
Figure 33. Evaluation of planer working posture.
Figure 34. Evaluation of the posture for heavy load work.
Figure 35. Comparison of cutter working posture scores.
Figure 36. Comparison of circular saw working posture scores.
Figure 37. Comparison of drilling machine working posture scores.
Figure 38. Comparison of saw working posture scores.
Figure 39. Comparison of chisel working posture scores.
Figure 40. Comparison of planer working posture scores.
Figure 41. Comparison of heavy load working posture scores.
Table 1. Characteristics and the reliability (standard deviation, t(160)) of each evaluation tool.

OWAS
Characteristics: Easy and simple to apply quickly in the field; detailed analysis is difficult because of its oversimplification.
Reliability (SD): trunk 0.96; lower arms 0.95; upper arms 0.17; weight 0.14

RULA
Characteristics: Measures the overall workload, but the evaluation is focused on the upper limbs; its evaluation accuracy has the highest reliability among the three tools.
Reliability (SD): Score A 1.13; Score B 1.28; trunk 1.35; neck 1.35; legs 0.49; upper arms 0.86; forearm 0.73; wrist 0.90; wrist twist 0.26

REBA
Characteristics: Compensates for the shortcomings of RULA, which is confined to the upper limbs; improves the whole-body load measurement.
Reliability (SD): Score A 1.44; Score B 1.71; trunk 0.73; neck 0.70; lower arms 0.83; upper arms 0.82; forearm 0.50; wrist 0.65
Table 2. Frame rate and accuracy comparison between OpenPose and MediaPipe (BlazePose) on the AR and Yoga datasets.

Model | FPS | AR dataset, PCK@0.2 | Yoga dataset, PCK@0.2
OpenPose (CPU) | 0.4 | 87.8 | 83.4
BlazePose Full | 10 | 84.1 | 84.5
BlazePose Lite | 31 | 79.6 | 77.6
Table 3. The FPS comparison of the posture of a wood manufacturing worker.

MediaPipe (CPU) | OpenPose (CPU) | OpenPose (GPU)
30.0 fps | 0.3 fps | 10.0 fps
Table 4. Classification based on the body (trunk) posture.

Index | Working Posture
1 | Upright posture
2 | 0~20° bending or 0~20° reclining
3 | 20~60° bending or more than 20° reclining
4 | More than 60° bending
+1 | Trunk is twisted or bent sideways
Table 5. Classification based on the neck posture.

Index | Working Posture
1 | 0~20° bending
2 | More than 20° bending
+1 | Neck is twisted or bent sideways
Table 6. Classification based on the leg posture.

Index | Working Posture
1 | Both legs kept side by side or walking/sitting
2 | Only one foot is supported on the ground
+1 | Knee bent 30° to 60°
+2 | Knee bent more than 60°
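For illustration, the angle thresholds of Tables 4–6 reduce to a few conditional checks. The Python sketch below is an assumed mapping only (the function names, the ±5° tolerance for the upright band, and the boolean adjustment flags are ours); it is not the exact CREBAS implementation, which derives the angles from the IK-driven avatar.

```python
def trunk_score(flexion_deg, twisted_or_side_bent=False):
    """Trunk score per Table 4; flexion_deg > 0 is forward bending, < 0 is reclining.
    The 'upright' band is approximated as within +/-5 degrees (assumption)."""
    if abs(flexion_deg) <= 5:
        score = 1
    elif -20 <= flexion_deg <= 20:
        score = 2
    elif (20 < flexion_deg <= 60) or flexion_deg < -20:
        score = 3
    else:
        score = 4
    return score + (1 if twisted_or_side_bent else 0)

def neck_score(flexion_deg, twisted_or_side_bent=False):
    """Neck score per Table 5."""
    score = 1 if flexion_deg <= 20 else 2
    return score + (1 if twisted_or_side_bent else 0)

def leg_score(one_foot_supported, knee_flexion_deg):
    """Leg score per Table 6."""
    score = 2 if one_foot_supported else 1
    if knee_flexion_deg > 60:
        score += 2
    elif knee_flexion_deg >= 30:
        score += 1
    return score
```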
Table 7. Scoreboard for REBA Table A (waist, neck, legs).

Neck | Legs | Waist 1 | Waist 2 | Waist 3 | Waist 4 | Waist 5
1 | 1 | 1 | 2 | 2 | 3 | 4
1 | 2 | 2 | 3 | 4 | 5 | 6
1 | 3 | 3 | 4 | 5 | 6 | 7
1 | 4 | 4 | 5 | 6 | 7 | 8
2 | 1 | 1 | 3 | 4 | 5 | 6
2 | 2 | 2 | 4 | 5 | 6 | 7
2 | 3 | 3 | 5 | 6 | 7 | 8
2 | 4 | 4 | 6 | 7 | 8 | 9
3 | 1 | 3 | 4 | 5 | 6 | 7
3 | 2 | 3 | 5 | 6 | 7 | 8
3 | 3 | 5 | 6 | 7 | 8 | 9
3 | 4 | 6 | 7 | 8 | 9 | 9
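Table 7 is essentially a three-dimensional lookup, so it maps naturally onto a nested list. The sketch below simply transcribes Table 7; the indexing convention and function name are our assumptions, not the CREBAS source.

```python
# REBA Table A (Table 7), indexed as TABLE_A[neck - 1][legs - 1][waist - 1].
TABLE_A = [
    [[1, 2, 2, 3, 4], [2, 3, 4, 5, 6], [3, 4, 5, 6, 7], [4, 5, 6, 7, 8]],  # neck = 1
    [[1, 3, 4, 5, 6], [2, 4, 5, 6, 7], [3, 5, 6, 7, 8], [4, 6, 7, 8, 9]],  # neck = 2
    [[3, 4, 5, 6, 7], [3, 5, 6, 7, 8], [5, 6, 7, 8, 9], [6, 7, 8, 9, 9]],  # neck = 3
]

def table_a(waist, neck, legs):
    """Posture score A before adding the load score of Table 8."""
    return TABLE_A[neck - 1][legs - 1][waist - 1]
```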
Table 8. Evaluation of handling weight.

<5 kg | 5–10 kg | >10 kg | Shock or Sudden Force
0 | 1 | 2 | +1
Table 9. Classification based on the upper arm posture.

Index | Working Posture
1 | 20° reclining or 20° lifting forward
2 | Reclining more than 20° or 20~45° lifting forward
3 | 45~90° lifting forward
4 | More than 90° lifting forward
+1 | Upper arm is stretched or rotated
+1 | Shoulders lifted
-1 | Arm is supported or leaned on something
Table 10. Classification based on the lower arm posture.

Index | Working Posture
1 | 60~100° lifting
2 | More than 100° lifting or 0~60° lifting
Table 11. Classification based on the wrist.

Index | Working Posture
1 | 0~15° bending or lifting
2 | More than 15° bending or lifting
+1 | Wrist is twisted
Table 12. Scoreboard for REBA Table B (upper arm (shoulder), lower arm, wrist).

Lower Arm | Wrist | Upper Arm 1 | Upper Arm 2 | Upper Arm 3 | Upper Arm 4 | Upper Arm 5
1 | 1 | 1 | 2 | 2 | 3 | 4
1 | 2 | 2 | 3 | 4 | 5 | 6
1 | 3 | 3 | 4 | 5 | 6 | 7
2 | 1 | 1 | 3 | 4 | 5 | 6
2 | 2 | 2 | 4 | 5 | 6 | 7
2 | 3 | 3 | 5 | 6 | 7 | 8
Table 13. Classification of handle types.

Good | With a strong and well-fixed handle located at the center of gravity
Acceptable | With an acceptable handle, or a part of the object can be used like a handle
Bad | Not suitable to hold by hand even though it can be lifted, or with an inappropriate handle
Very bad | No handle, or with a dangerous type of handle
Table 14. Scoreboard for REBA Table C.

Posture Score B \ Posture Score A | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12
1 | 1 | 1 | 2 | 3 | 4 | 6 | 7 | 8 | 9 | 10 | 11 | 12
2 | 1 | 2 | 3 | 4 | 4 | 6 | 7 | 8 | 9 | 10 | 11 | 12
3 | 1 | 2 | 3 | 4 | 4 | 6 | 7 | 8 | 9 | 10 | 11 | 12
4 | 2 | 3 | 3 | 4 | 5 | 7 | 8 | 9 | 10 | 11 | 11 | 12
5 | 3 | 4 | 4 | 5 | 6 | 8 | 9 | 10 | 10 | 11 | 12 | 12
6 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 10 | 11 | 12 | 12
7 | 4 | 5 | 6 | 7 | 8 | 9 | 9 | 10 | 11 | 11 | 12 | 12
8 | 5 | 6 | 7 | 8 | 8 | 9 | 10 | 10 | 11 | 12 | 12 | 12
9 | 6 | 6 | 7 | 8 | 9 | 10 | 10 | 10 | 11 | 12 | 12 | 12
10 | 7 | 7 | 8 | 9 | 9 | 10 | 11 | 11 | 12 | 12 | 12 | 12
11 | 7 | 7 | 8 | 9 | 9 | 10 | 11 | 11 | 12 | 12 | 12 | 12
12 | 7 | 8 | 8 | 9 | 9 | 10 | 11 | 11 | 12 | 12 | 12 | 12
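Putting the pieces together, the final score follows the REBA procedure of Figure 17: the Table 7 value plus the load score (Table 8) gives Score A, the Table 12 value plus the coupling score (Table 13) gives Score B, Table 14 combines them into Score C, and the activity score (Table 15) is added last. The following is a minimal sketch that transcribes Table 14 and reuses the table_a() idea shown earlier; a corresponding table_b() lookup built from Table 12 is assumed, and none of this is claimed to be the CREBAS source code.

```python
# REBA Table C (Table 14), indexed as TABLE_C[score_b - 1][score_a - 1].
TABLE_C = [
    [1, 1, 2, 3, 4, 6, 7, 8, 9, 10, 11, 12],
    [1, 2, 3, 4, 4, 6, 7, 8, 9, 10, 11, 12],
    [1, 2, 3, 4, 4, 6, 7, 8, 9, 10, 11, 12],
    [2, 3, 3, 4, 5, 7, 8, 9, 10, 11, 11, 12],
    [3, 4, 4, 5, 6, 8, 9, 10, 10, 11, 12, 12],
    [3, 4, 5, 6, 7, 8, 9, 10, 10, 11, 12, 12],
    [4, 5, 6, 7, 8, 9, 9, 10, 11, 11, 12, 12],
    [5, 6, 7, 8, 8, 9, 10, 10, 11, 12, 12, 12],
    [6, 6, 7, 8, 9, 10, 10, 10, 11, 12, 12, 12],
    [7, 7, 8, 9, 9, 10, 11, 11, 12, 12, 12, 12],
    [7, 7, 8, 9, 9, 10, 11, 11, 12, 12, 12, 12],
    [7, 8, 8, 9, 9, 10, 11, 11, 12, 12, 12, 12],
]

def reba_score(posture_a, load, posture_b, coupling, activity):
    """Score A = Table A value + load (Table 8); Score B = Table B value + coupling (Table 13);
    Score C comes from Table C; the activity score (Table 15) is added at the end."""
    score_a = min(posture_a + load, 12)
    score_b = min(posture_b + coupling, 12)
    return TABLE_C[score_b - 1][score_a - 1] + activity
```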
Table 15. Static/repetitive behavior scores.

Index | Behavior Score
+1 | One or more body parts are held statically (ex: held for more than a minute)
+1 | Repetitive tasks in a narrow range (ex: repeated more than 4 times per minute, except for walking)
+1 | Rapidly changing behavior over a wide range, or an unstable lower body posture
Table 16. REBA decision-making criteria (action levels).

Level | REBA Score | Risk Level | Measure (Further Investigation)
0 | 1 | Very low | No change
1 | 2–3 | Low | Change may be needed
2 | 4–7 | Medium | Change soon
3 | 8–10 | High | Investigate and change soon
4 | 11–15 | Very high | Implement change
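The action levels of Table 16 reduce to a simple threshold function; the sketch below only restates the table (the function name and returned strings are ours, not part of CREBAS).

```python
def action_level(reba_score):
    """Map a final REBA score to the action level and risk category of Table 16."""
    if reba_score <= 1:
        return 0, "Very low", "No change"
    if reba_score <= 3:
        return 1, "Low", "Change may be needed"
    if reba_score <= 7:
        return 2, "Medium", "Change soon"
    if reba_score <= 10:
        return 3, "High", "Investigate and change soon"
    return 4, "Very high", "Implement change"
```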
Table 17. Evaluation time required for each posture of CREBA.

Type | Data Size (MB) 1 | Minimum | Maximum | Average
Cutter6.51112.6
Circular saw4.98112
Drilling machine5.111.311.3
Saw6.760.300
Chisel6.161.614
Planer4.9312.61
Heavy load8.681.31.31.3
1 All videos are encoded at 1280 × 720, 30 fps.
Table 18. REBA evaluation results by experts (average).

Group | Body Part | Cutter | Circular Saw | Drilling Machine | Saw | Chisel | Planer | Heavy Load
A | Waist | 2.3 | 1 | 1 | 2.6 | 1.6 | 2.3 | 3
A | Neck | 1.3 | 1 | 1 | 2 | 2 | 1.6 | 3.6
A | Legs | 2.3 | 1.3 | 1 | 1.3 | 1 | 1.3 | 2.3
A | Weight | 0 | 0.3 | 0 | 0 | 1 | 1 | 2
A | Score A | 4 | 1.6 | 1 | 4 | 3.3 | 3.6 | 7.3
B | Upper Arms | 2 | 1 | 2.6 | 1 | 0.6 | 2.6 | 3
B | Lower Arms | 1.6 | 1.3 | 1.3 | 1.3 | 1 | 1.6 | 1
B | Wrist | 2 | 1 | 1 | 1.3 | 1 | 1 | 1
B | Handle | 1.3 | 1 | 1 | 1.3 | 0.6 | 1.3 | 3
B | Score B (Left) | 4 | 2 | 3.6 | 0.6 | 1.6 | 4 | 3
B | Upper Arms | 1.6 | 3 | 3.3 | 1 | 1.6 | 2.3 | 3
B | Lower Arms | 1.6 | 1.6 | 1.3 | 1 | 1.6 | 1.6 | 1
B | Wrist | 2 | 1.6 | 1.3 | 1 | 1 | 1 | 1
B | Handle | 0.6 | 0 | 0 | 0 | 0.6 | 1 | 3
B | Score B (Right) | 2.3 | 4 | 3.6 | 1 | 2 | 3.3 | 3
REBA Score | Left | 6.6 | 3.6 | 3.6 | 5.3 | 5 | 7 | 13
REBA Score | Right | 5.6 | 4.3 | 4 | 5 | 5 | 6.6 | 13
Table 19. REBA evaluation results by CREBA (average).

Group | Body Part | Cutter | Circular Saw | Drilling Machine | Saw | Chisel | Planer | Heavy Load
A | Waist | 1 | 1 | 1 | 2 | 2 | 2 | 2
A | Neck | 3 | 2 | 2 | 3 | 3 | 3 | 5
A | Legs | 2 | 1 | 1 | 2 | 2 | 2 | 3
A | Weight | 0 | 1 | 0 | 0 | 1 | 0 | 2
A | Score A | 3 | 2 | 1 | 5 | 6 | 5 | 8
B | Upper Arms | 1 | 1 | 1 | 1 | 1 | 1 | 2
B | Lower Arms | 1 | 1 | 2 | 2 | 2 | 1 | 2
B | Wrist | 1 | 1 | 1 | 1 | 1 | 1 | 1
B | Handle | 1 | 1 | 1 | 0 | 0 | 1 | 2
B | Score B (Left) | 2 | 2 | 2 | 1 | 1 | 2 | 5
B | Upper Arms | 2 | 3 | 3 | 2 | 2 | 2 | 3
B | Lower Arms | 1 | 1 | 2 | 2 | 2 | 1 | 2
B | Wrist | 1 | 1 | 1 | 1 | 1 | 1 | 1
B | Handle | 0 | 0 | 0 | 0 | 0 | 1 | 2
B | Score B (Right) | 1 | 3 | 4 | 2 | 2 | 2 | 7
REBA Score | Left | 5 | 3 | 3 | 6 | 6 | 6 | 10
REBA Score | Right | 4 | 3 | 4 | 6 | 6 | 6 | 11