Article

Effects of Industrial Maintenance Task Complexity on Neck and Shoulder Muscle Activity During Augmented Reality Interactions

by Mohammed H. Alhaag 1,*,†, Faisal M. Alessa 2,*,†, Ibrahim M. Al-harkan 2, Mustafa M. Nasr 3, Mohamed Z. Ramadan 2 and Saleem S. AlSaleem 4
1 Department of Industrial Engineering, College of Engineering and Computer Science, Mustaqbal University, Buraydah 52547, Saudi Arabia
2 Department of Industrial Engineering, College of Engineering, King Saud University, Riyadh 11421, Saudi Arabia
3 Industrial Engineering Department, College of Engineering, Taibah University, Medina 41411, Saudi Arabia
4 Department of Civil Engineering, College of Engineering, Qassim University, Buraydah 52571, Saudi Arabia
* Authors to whom correspondence should be addressed.
† These authors contributed equally to this work.
Electronics 2024, 13(23), 4637; https://doi.org/10.3390/electronics13234637
Submission received: 16 October 2024 / Revised: 21 November 2024 / Accepted: 23 November 2024 / Published: 25 November 2024
(This article belongs to the Special Issue Applications of Virtual, Augmented and Mixed Reality)

Abstract

Extensive studies have demonstrated the advantages of augmented reality (AR) in improving efficiency and quality in industry. Yet, the corresponding physical strain on individuals poses a significant challenge. This study explores the effects of task difficulty (complex versus simple maintenance activities) and instruction method (paper-based versus AR via HoloLens) on physical strain, body discomfort ratings, perceived exertion, and mental effort. A 2 × 2 mixed design was employed, involving a total of 28 participants with an average age of 32.12 ± 2.45 years. Physical strain was evaluated by measuring the normalized root mean square (RMS) of electromyography (EMG) indicators, expressed as a percentage of maximum voluntary contraction (%MVC), from six muscles: the right flexor carpi radialis (RFCR), right middle deltoid (RMD), right upper trapezius (RUT), right cervical extensor (RCE), and right and left splenius (RSPL and LSPL). The results indicated that AR instruction, particularly in complex tasks, led to higher physical strain in the neck and shoulder muscles (RCE and RUT) compared with paper-based methods. However, AR significantly reduced strain in the RSPL, LSPL, RMD, and RFCR muscles during both simple and complex tasks. This study highlights that while AR can lower physical strain in certain muscle groups, it also introduces increased strain in the neck and shoulders, particularly during more demanding tasks. These findings underscore the need for ergonomic considerations when designing and implementing AR technologies, especially for complex tasks that inherently demand more from the user, both physically and cognitively.

1. Introduction

By integrating digital content into users’ actual surroundings, augmented reality (AR) improves individuals’ real-world perception [1], allowing them to effortlessly access relevant information without distraction from their immediate surroundings. Compared with conventional input interfaces, AR technologies offer user interfaces that are more natural and intuitive [2,3]. Technological advancements in hardware and optics have led to the development of market-ready head-mounted displays (HMDs) that are wearable and offer AR applications [4,5,6]. Previous research has demonstrated that AR systems impose not only physical strain but also significant cognitive demands and visual fatigue [7,8,9,10,11,12,13,14]. Understanding the correlation between these latter factors and physical strain is crucial for developing ergonomic and user-friendly AR applications for industrial environments. AR HMDs, which frequently come with hand-tracking systems, built-in cameras, and sensors, enable individuals to engage with 3D surroundings by utilizing natural body gestures that offer extensive freedom of movement and easy access to relevant content [15,16,17,18]. Due to these benefits, AR has received extensive attention in different fields such as education, manufacturing, training, aerospace, military, gaming, healthcare, construction, and safety [19,20,21,22].
Moreover, for challenging and intensive maintenance tasks, AR technologies are now widely utilized as an effective instructional method [23,24,25]. AR tools have been found to be more engaging than traditional methods for delivering instructions and job-relevant information. This increased engagement has been linked to higher productivity and enhanced quality control [26,27,28,29]. In addition, for complex tasks involving strategic planning and extensive problem solving, the use of AR offers significant advantages over traditional approaches. AR can enhance cognitive understanding of the tasks at hand [23]. Prior research has primarily focused on assessing the usefulness of AR, based on time to complete tasks, rates of success, and subjective evaluations [30,31,32,33,34,35,36,37,38].
Despite the promising applications of AR in many fields, potential effects such as the risk of musculoskeletal injuries have also been reported [39,40,41,42,43,44,45]. Interactions with AR HMDs have been found to cause strain in the neck and shoulders [40,45]. According to Knight and Baber [39] and Sardar et al. [42], musculoskeletal injuries can be caused by the strain that HMDs put on the neck and upper extremities. According to research by Kim et al. [45], Kong et al. [40], and Penumudi et al. [41], users may experience discomfort and strain in the shoulders due to constant movement of the upper extremities and maintaining static postures for extended periods while interacting with augmented reality.
Currently, only a few studies have assessed the impact of AR technology on neck muscle activation. No research has objectively measured the effects of the complexity of maintenance tasks on neck and shoulder strain during AR interactions. Prior studies have shown that interface design and errors in AR interactions can affect perceived physical demand [44,45]. Previous research comparing AR with traditional instruction methods has been limited by a lack of comprehensive user studies assessing the potential negative impacts of AR-based interactions. In addition, evaluations and comparisons have primarily been conducted using non-industrial items, involving activities of low complexity and limited variability. This may not accurately reflect real-world industrial situations. As a result, the conclusions derived from those studies may not be relevant to the broad spectrum of activities in industrial environments, given the considerable variety of products involved. To fully comprehend the characteristics of AR-based instructional systems compared with paper-based systems, it is crucial to extensively examine the potential adverse effects of AR systems, utilizing objective methodologies.
In order to accurately represent real-world industrial scenarios, the experiment’s design should incorporate a diverse range of intricate operations [46]. Therefore, this study is crucial as it will enhance our knowledge of how to assess the impact of AR systems on the physical strain experienced by users during maintenance activities, thereby adding to the existing knowledge base.
The main goal of this study was to assess the impact of different instructional methods (specifically, AR HMD and paper-based instructions) and task complexity on physical strain in the neck and shoulders. Physical strain was evaluated through body discomfort ratings, perceived exertion, mental efforts, and by measuring the normalized root mean square (RMS) of electromyography (EMG) indicators. These indicators were expressed as percentages of maximum voluntary contraction (%MVC) from the neck, shoulder, and forearm muscles. While this study primarily investigated the physical strain associated with neck and shoulder muscle activity, it is part of a broader research effort that also examines cognitive load and physical stress in AR-assisted industrial maintenance tasks, as detailed in our previous studies [21,46]. These complementary studies provide a holistic view of AR’s multifaceted impact on users.
This study seeks to address the following question: What is the impact of AR-based guidance strategies on body discomfort ratings, perceived exertion, mental efforts, and upper muscular activity across various task difficulties? It is hypothesized that AR interactions will lead to less neck and shoulder strain compared with interactions involving paper.

2. Materials and Methods

2.1. Participants

A group of twenty-eight male undergraduates from King Saud University were recruited. Their mean (SD) age, weight, and height were 32.12 (2.45) years, 71 (9.80) kg, and 168.13 (3.72) cm, respectively. A self-reported screening survey confirmed that all individuals were healthy, had normal vision, and had no nervous system illnesses or history of neck, shoulder, or other musculoskeletal ailments. None had experienced sleep disruption in the previous two weeks. Participants needed to be in good health to ensure the study’s validity and minimize potential confounding factors, since pre-existing health conditions could have influenced muscle activity or discomfort ratings and introduced variability into the data. This requirement also safeguarded the participants while they performed the maintenance tasks, some of which required moderate physical effort.
Participants were required to follow specific guidelines before data collection. They were asked to abstain from food and smoking for at least two hours and to avoid strenuous physical activity prior to the trial. Additionally, they were instructed to maintain their usual sleep routine, ensuring a minimum of six hours of sleep each night. Only those with minimal or no prior experience in the activity were eligible to participate.

2.2. Experiment Variables

2.2.1. Maintenance Tasks

This study of piston pump overhaul maintenance focused on two activities: repairing the gearbox (the complex task) and inspecting the pump housing seals (the simple task). Figure 1 shows the parts and the fully assembled state of the piston pump gearbox used for the complex task. Participants were required to repair the piston pump gearbox and thoroughly examine all of its seals. This task involved assembling 32 sets of components in 40 steps, using a soft hammer, screwdriver, jig, puller, and open-ended wrench. Figure 2 shows a detailed picture of the pump housing, with its components laid out separately for the simple task. This task required participants to verify all of the pump housing seals while disassembling its parts. It was less demanding than the complex task, as it involved fewer steps (26), non-specialized hand tools, and lightweight components. The complexity levels and specific procedures for the two tasks are described in detail and have been validated experimentally in studies by Alhaag et al. [47] and Alessa et al. [21,46].

2.2.2. Instruction Methods

For this study, instructions were developed in both paper-based and AR-based formats to guide individuals in performing the maintenance operations accurately. The instructions were printed on A4 paper and organized in a folder. Each page of the folder contained instructions for two steps. These instructions were directly adapted from the handbook provided with the piston pump. The printed instructions provided visual aids and written explanations and listed the essential items needed for each step.
The paper-based manual for the complex task consisted of 22 pages covering the 40 individual steps to be executed, while the manual for the simple task consisted of 14 pages covering 26 discrete steps. Figure 3 shows a participant using the paper-based instruction method to complete the simple task.
The AR instructions were created using Creo Illustrate (2019 version) and Vuforia Studio (2019 version) to generate the maintenance sequence and the AR experience, respectively. Creo Illustrate and Vuforia Studio were selected for their advanced capabilities and streamlined workflow from design to AR deployment. Their compatibility ensured efficient linking of the maintenance sequences with the AR environment. Creo Illustrate excels in creating precise sequences with complex 3D models and integrates seamlessly with various CAD software programs. Vuforia Studio offers a user-friendly interface with minimal coding and robust features for immersive AR experiences. These tools provided an efficient workflow for the AR-based instructional methods. A 3D-animated maintenance sequence was displayed in the participants’ surroundings using the first-generation Microsoft HoloLens (Microsoft, 2019). This approach allowed participants to engage with the digital model while still interacting with their physical environment; for example, the model was anchored to the maintenance desk, at a distance of 130 cm from the participant, using the recognized marker. Figure 4 presents examples of AR instructions for one step of the complex and simple tasks.
This AR application employs head movement tracking as the primary interaction method, allowing users to control the 3D maintenance animation sequences through 3D buttons. To initiate the maintenance procedures, users move a pointer by tilting their heads and directing it toward a 3D button. Using these buttons, they can receive guidance on the current step, move forward to the next step, return to the previous one, or jump directly to any desired step. Figure 5 shows participants performing the tasks using the Microsoft HoloLens 1 (Microsoft, 2019).
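The step-navigation behaviour described above can be summarized in a short sketch. The class and method names below are hypothetical, chosen only to illustrate the control logic; the actual experience was authored with Vuforia Studio rather than hand-written code.

```python
# Minimal sketch of the 3D-button step navigation described above.
# Hypothetical names; the real application was built in Vuforia Studio, not coded by hand.
class MaintenanceSequence:
    def __init__(self, animations: list):
        self.animations = animations  # one 3D animation per maintenance step
        self.current = 0

    def play_current(self):
        """Triggered when the head pointer dwells on the 'current step' 3D button."""
        return self.animations[self.current]

    def next_step(self):
        """'Next' button: advance, clamped to the final step."""
        self.current = min(self.current + 1, len(self.animations) - 1)
        return self.play_current()

    def previous_step(self):
        """'Previous' button: go back, clamped to the first step."""
        self.current = max(self.current - 1, 0)
        return self.play_current()

    def jump_to(self, step_index: int):
        """Jump directly to any desired step."""
        self.current = max(0, min(step_index, len(self.animations) - 1))
        return self.play_current()

# e.g., the complex gearbox task would hold 40 step animations:
# gearbox_task = MaintenanceSequence([f"step_{i:02d}" for i in range(1, 41)])
```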

2.3. Experimental Design

This experiment followed a two-way mixed design. The instruction method, with two levels (paper guidance and AR guidance), was a between-subjects variable, and the maintenance activity, with two levels (simple and complex tasks), was a within-subjects variable. The dependent variables were the root mean square (RMS) values of the EMG signals, normalized as a percentage of maximum voluntary contraction (%MVC), from six muscles: the right flexor carpi radialis (RFCR), right middle deltoid (RMD), right upper trapezius (RUT), right cervical extensor (RCE), and right and left splenius (RSPL and LSPL).

2.4. Response Variables

2.4.1. EMG Responses

The EMG signals were recorded using an eight-channel Mega amplifier (ME 6000, Mega Electronics Ltd., Kuopio, Finland). The system operated at a sampling rate of 1000 Hz with 14-bit resolution, and the electrode impedance was maintained below 10 kΩ. Before electrode placement, the skin at each recording site was prepared by shaving any excess hair and cleaning the area with a 70% isopropyl alcohol swab. After the skin had dried, pre-gelled 20-mm silver/silver chloride (Ag/AgCl) adhesive electrodes (Blue Sensor, Ambu A/S, Ballerup, Denmark) were applied to measure muscle activity, as illustrated in Figure 6. EMG signals were collected from six muscles (RSPL, LSPL, RCE, RUT, RMD, and RFCR), as illustrated in Figure 7. A band-pass filter with a frequency range of 15–450 Hz was used to eliminate low-frequency artifacts, including motion potentials, surrounding muscle activity, and respiration [48]. Subsequently, a 60-Hz notch filter was employed to remove the 60 Hz power line interference from the recorded EMG signals [48,49]. The muscle activity recorded at the maximal voluntary contraction (MVC) of each of the six muscles was used to normalize the muscle activity observed under the experimental conditions.
Muscle activity data are therefore expressed as a percentage of the highest muscle activity recorded at maximal voluntary contraction (%MVC). The %MVC of the RMS EMG for all sampled muscles was used as the dependent variable for the EMG response.
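For illustration, the signal-processing chain described above could be implemented roughly as follows. The filter orders, window handling, and function names are assumptions made for this sketch and are not taken from the authors’ actual processing code.

```python
# Illustrative sketch of the reported EMG chain: 15-450 Hz band-pass,
# 60 Hz notch, RMS amplitude, and normalization to %MVC.
# Assumed details (not stated in the paper): 4th-order Butterworth band-pass,
# Q = 30 notch, RMS computed over the whole task recording.
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

FS = 1000  # sampling rate in Hz, as reported for the ME 6000 amplifier

def preprocess_emg(raw: np.ndarray) -> np.ndarray:
    """Band-pass 15-450 Hz, then notch out 60 Hz power-line interference."""
    b_bp, a_bp = butter(4, [15, 450], btype="bandpass", fs=FS)
    filtered = filtfilt(b_bp, a_bp, raw)
    b_notch, a_notch = iirnotch(w0=60, Q=30, fs=FS)
    return filtfilt(b_notch, a_notch, filtered)

def rms(signal: np.ndarray) -> float:
    """Root mean square amplitude of an EMG segment."""
    return float(np.sqrt(np.mean(np.square(signal))))

def percent_mvc(task_raw: np.ndarray, mvc_raw: np.ndarray) -> float:
    """Task RMS EMG expressed as a percentage of the RMS at maximal voluntary contraction."""
    return 100.0 * rms(preprocess_emg(task_raw)) / rms(preprocess_emg(mvc_raw))

# Example (with hypothetical arrays of raw samples for one muscle):
# print(percent_mvc(task_signal, mvc_signal))
```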

2.4.2. Body Discomfort Ratings

Participants were asked to rate their subjective comfort levels during the experiment. The body part discomfort scale was used to measure discomfort; a higher score indicated greater discomfort and a lower score indicated greater comfort [50]. Discomfort was rated after participants completed each task. The visual analog scale representing overall discomfort is shown in Figure 8, with a score of 10 indicating unbearable discomfort, 1 indicating very comfortable, and a continuum of discomfort between 1 and 10. Participants verbally reported their comfort levels for each part of the body (neck, shoulder, arm, back, waist, hip, thigh, knee, foot), and these ratings were recorded and scored by the experimenter from 0 to 10 for local discomfort.

2.4.3. Mental Effort and Perceived Exertion Ratings

Mental effort was assessed using the mental effort rating scale developed by Paas [51]. Participants indicated the amount of mental effort they invested in each task, ranging from “1” (very low mental effort) to “9” (very, very high mental effort). Perceived exertion was assessed using the 10-point rating scale developed by Borg et al. [52]. Participants were asked to report how difficult they perceived each task to be, ranging from not at all difficult (2) to extremely difficult (10).

2.5. Experimental Procedure

Prior to participating in the study, all individuals were asked to review and sign an informed consent document approved by the Institutional Review Board at King Saud University (approval code: E-19-4467). Upon arrival, participants completed a health questionnaire and received a thorough briefing about the study’s goals and procedures. They also filled out surveys assessing their prior experience with AR in maintenance activities. Participants were then randomly divided into two groups. The first group received training in the maintenance procedures using augmented reality (AR) via the first-generation Microsoft HoloLens 1, in a brief 10–15 min session designed to familiarize them with the HoloLens and its functionalities [23,53]. The training involved a demonstration of disassembling and assembling a scooter, during which participants were instructed on the correct use of the toolkit. The second group was provided with the assignments printed on A4 paper. Once participants had familiarized themselves with their respective instructional approach, either paper-based or HoloLens guidance, they returned the next day to carry out the maintenance activities using the assigned method.
The next day, Ag/AgCl electrodes were attached to the participants for EMG recording. EMG data were first collected while the participants performed MVCs, and EMG signals were then recorded throughout every training procedure and maintenance activity. Participants in Group 1 (the AR guidance group) were allocated a task to complete using AR instructions; the task could be either complex or simple. Task allocation followed a counterbalancing strategy to minimize the impact of task order. Two sequences, AB and BA, were used, with A representing the complex task and B the simple task. In the AB sequence, half of the participants in Group 1 performed the complex task first; the remaining participants followed the BA sequence and performed the simple task first. During this session, participants completed 50% of the experiment, involving either the simple or the complex activity.
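A minimal sketch of the AB/BA counterbalancing described above is given below; the participant identifiers and alternating assignment rule are assumptions for illustration only.

```python
# Illustrative AB/BA counterbalancing (A = complex task, B = simple task).
def assign_sequences(participant_ids):
    """Alternate AB and BA orders across a group so half start with each task."""
    orders = {}
    for i, pid in enumerate(participant_ids):
        orders[pid] = ["complex", "simple"] if i % 2 == 0 else ["simple", "complex"]
    return orders

# e.g., a 14-participant group: 7 participants get AB, 7 get BA
print(assign_sequences(range(1, 15)))
```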
In the second phase of the experiment, participants were asked to return to the laboratory after two days to finish their remaining tasks, which were either simple or complex. Group 1 received instructions through augmented reality (AR), whereas Group 2 received instructions on paper. Upon completion of all sessions and removal of the electrodes, participants were formally recognized for their participation.

2.6. Statistical Analysis

The statistical analysis was performed using IBM SPSS Statistics software, version 23.0, with a significance level of 0.05. The normality assumption was verified for all experimental conditions using Kolmogorov–Smirnov tests [54]; none of the variables deviated significantly from normality (p > 0.05), indicating that the data met the normality assumption. Mauchly’s test was conducted to assess the sphericity of all responses; the assumption was met, as expected given that each within-subject factor had only two levels, for which sphericity holds by definition.
A two-way mixed-design analysis of variance (ANOVA) with repeated measures was conducted to investigate the main and interaction effects of instruction method and task difficulty on the percentage of maximum voluntary contraction (%MVC) of the RMS EMG for the six muscles. Significant interaction effects were examined further through simple-effects analyses. Paired t-tests were employed to detect differences between task difficulty levels, and independent-samples t-tests were used for pairwise comparisons between instruction methods.
The assumptions of normality, as assessed by the Shapiro–Wilk test, were violated for all subjective measures. Consequently, non-parametric tests were utilized. The Wilcoxon signed-rank test was employed to analyze task complexity as a within-subjects factor, meaning the same participants performed both simple and complex tasks. In contrast, the Mann–Whitney U test was used to evaluate instruction methods as a between-subjects factor, indicating that participants were divided into two groups: one using paper-based and the other using AR instructions.
Descriptive statistics including means and standard deviations were calculated for each dependent variable. Additionally, the magnitude of the effects was assessed using partial eta-squared (η2), representing the proportion of total variance in the dependent variables that can be attributed to the independent variables.
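As a rough illustration of this analysis workflow, the sketch below uses a long-format table with hypothetical column names (participant, method, task, rms_mvc, discomfort) and the pingouin and SciPy libraries; it is not the authors’ SPSS procedure, only an equivalent open-source outline.

```python
# Illustrative outline of the reported analyses (assumed data layout, not the authors' code).
import pandas as pd
import pingouin as pg
from scipy import stats

df = pd.read_csv("emg_results.csv")  # hypothetical long-format file

# 2 (instruction method, between) x 2 (task complexity, within) mixed ANOVA on %MVC,
# with partial eta-squared as the effect size
aov = pg.mixed_anova(data=df, dv="rms_mvc", within="task",
                     subject="participant", between="method", effsize="np2")
print(aov)

# Simple effects: paired t-tests across task complexity within each instruction method
for method, grp in df.groupby("method"):
    simple = grp[grp.task == "simple"].sort_values("participant")["rms_mvc"]
    complex_ = grp[grp.task == "complex"].sort_values("participant")["rms_mvc"]
    t, p = stats.ttest_rel(simple, complex_)
    print(f"{method}: simple vs. complex, t = {t:.2f}, p = {p:.3f}")

# Non-parametric tests for the subjective ratings
ar = df[(df.method == "AR") & (df.task == "simple")]["discomfort"]
paper = df[(df.method == "paper") & (df.task == "simple")]["discomfort"]
u_stat, p_u = stats.mannwhitneyu(ar, paper)        # between-subjects factor: instruction method

ar_grp = df[df.method == "AR"]
simple_d = ar_grp[ar_grp.task == "simple"].sort_values("participant")["discomfort"]
complex_d = ar_grp[ar_grp.task == "complex"].sort_values("participant")["discomfort"]
w_stat, p_w = stats.wilcoxon(simple_d, complex_d)  # within-subjects factor: task complexity
```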

3. Results

3.1. EMG Response

The statistical analysis revealed that the instruction method had a significant impact on muscle activation across various muscle groups. Specifically, AR-based instructions significantly reduced muscle activity (RMS EMG (%MVC)) in the RSPL, LSPL, RMD, and RFCR muscles compared with paper-based instructions (p < 0.05; Table 1). However, increased activity was observed in the RCE and RUT muscles with AR-based instructions (p = 0.04 and p = 0.01, respectively). Similarly, task complexity significantly affected muscle activation across all sampled muscles, with higher task complexity leading to greater activation (p < 0.05; Table 1). The interaction between instruction method and task complexity was also significant in several cases (i.e., for RSPL, LSPL, and RUT muscles (p < 0.05)), indicating that the effects of AR instructions were more pronounced for simpler tasks.
Additional analyses were conducted using paired and independent t-tests to examine the variations in % RMS EMG (%MVC) across different levels of task complexity and instructional approaches. Figure 9 shows the interaction effects between task complexity and instruction methods on %RMS EMG (%MVC) for RSPL, LSPL, and RUT muscles, respectively. For the RSPL muscle (Figure 9), pairwise comparisons at each level of the instruction method revealed that both AR and paper-based approaches resulted in significantly higher % RMS EMG (%MVC) during high-demand tasks compared with low-demand tasks (p < 0.05). Furthermore, within both task complexities, AR-based instruction led to significantly lower RSPL muscle activity compared with paper-based instruction (p < 0.05). The magnitude of this reduction in muscle activity with AR was more pronounced during high-demand tasks. These findings suggest that employing augmented reality (AR) instructions for both task complexities significantly decreased the physical strain on the RSPL muscle, with a greater effect observed during more complex tasks.
For the LSPL muscle (Figure 9), the results revealed a significant increase in % RMS EMG (%MVC) during high-demand tasks compared with low-demand tasks when using the paper-based instruction method (p = 0.03). In contrast, for the AR instruction method, there was no significant difference in % RMS EMG between the low- and high-demand tasks (p > 0.5), with values remaining relatively similar across both task complexities. Comparing the two instructional methods, AR-based instruction resulted in reduced LSPL muscle activity, with the difference being significant during the high-demand task (p < 0.05). These results indicate that using AR instruction for highly demanding tasks significantly reduced physical strain on the LSPL muscle compared with paper-based instruction.
For the RUT muscle (Figure 9), the results showed that performing the highly demanding task with paper-based instruction significantly increased RUT muscle activity (p < 0.05). A similar trend was observed with AR-based instruction, although the increase in muscle activity was not statistically significant. Further analyses compared %RMS EMG (%MVC) between the instruction methods at each level of task complexity. The results indicated a significant difference between the instruction methods during the low-demanding task (p < 0.05), with paper-based instruction leading to lower RUT muscle activity. Overall, these results suggest that paper-based instruction reduced physical stress on the RUT muscles, particularly during the less complex (i.e., low-demanding) tasks.
Figure 10 shows the main effects of instruction method on % RMS EMG (%MVC) for the RCE, RMD, and RFCR muscles. The results indicated that using AR instruction led to significantly higher muscle activity in the RCE muscle compared with paper-based instruction (p < 0.05). In contrast, AR instruction resulted in significantly lower muscle activity in both the RMD and RFCR muscles compared with paper-based instruction (p < 0.05). These findings suggested that while AR instruction increased strain on the RCE muscle, it effectively reduced strain on the RMD and RFCR muscles, regardless of the task being performed.

3.2. Body Discomfort Ratings

The findings revealed a statistically significant difference in participants’ discomfort ratings across most body parts when comparing simple and complex tasks, as shown in Table 2. According to the Wilcoxon test, participants perceived complex tasks as significantly more uncomfortable across various body parts compared with simple tasks, with the exception of the left leg and left foot, where task complexity did not significantly influence the discomfort ratings.
The Mann–Whitney U test was used to compare discomfort ratings between the AR and paper methods for both simple and complex tasks. The results indicated that for the complex tasks, the instruction method (AR vs. paper) did not significantly affect discomfort ratings. However, for simple tasks, the AR method had a significant positive impact, resulting in lower discomfort ratings for several body parts, including the overall body, the head, neck, shoulders, left trapezius, left elbow, forearms, hands, left hip, and left leg.
Additionally, neither task complexity nor the instruction method had a significant effect on discomfort ratings for the right trapezius, right upper arm, left upper arm, right erector spine, left erector spine, right thigh, or left thigh across either simple or complex tasks.
For some body parts, including the left wrist, knees, and legs, the AR method significantly reduced discomfort ratings for both simple and complex tasks compared with the paper method. The results also showed that for complex tasks, the AR method significantly reduced discomfort in the right wrist, hands, left hip, and feet compared with the paper method.

3.3. Perceived Exertion and Mental Workload

The findings revealed statistically significant differences in participants’ perceived exertion and mental effort ratings between simple and complex tasks. The Wilcoxon test indicated that complex tasks were rated as requiring significantly greater perceived exertion (p < 0.001) and mental effort (p < 0.0001) than simple tasks. The results also showed a statistically significant difference between instruction methods across both task types: for perceived exertion, the AR method demonstrated significant advantages over the paper method for both simple and complex tasks (p < 0.002 and p < 0.016, respectively).
For mental effort, no statistically significant difference was found between the AR and paper instruction methods for the complex tasks. However, there was a statistically significant difference for the simple tasks (p < 0.001), with the AR method producing lower ratings when participants performed tasks involving a low mental workload.

4. Discussion

This study aimed to evaluate and compare levels of physical strain, body discomfort, perceived exertion, and mental effort during maintenance activities. It focused on analyzing how different instructional methods, namely paper-based instructions and AR HMD, along with varying task complexities (simple versus complex), influenced body discomfort, perceived exertion, mental effort, and EMG %MVC measurements. Existing approaches for monitoring user interactions with AR systems are primarily based on subjective measures. Therefore, it is crucial to develop precise and reliable methods to assess the workload of users while performing routine maintenance tasks. The findings of this study provide valuable insights into the physical strain imposed by AR systems, particularly in comparison to traditional paper-based instructions.
As expected, AR-based instructions were associated with lower fatigue and reduced muscle strain for most muscles, especially during simple tasks. The results revealed that the instruction method had a statistically significant impact on RMS EMG (%MVC) for the RSPL, LSPL, RCE, RUT, RMD, and RFCR muscles. AR instruction generally led to lower %MVC values, except for RCE and RUT, which exhibited higher %MVC during AR-based tasks. Additionally, task difficulty had a substantial effect on %MVC for these muscles, with greater muscle strain observed during complex tasks compared with simple tasks, regardless of the instruction method used. Notably, the interaction between instruction method and task complexity was significant for RSPL, LSPL, and RUT, suggesting that the effect of AR versus paper instructions on muscle strain varied depending on task complexity for these muscles.
In addition to the objective EMG measurements, participants’ subjective reports provided valuable insights into their perceptions of physical and cognitive demand. The findings revealed that complex tasks led to significantly higher discomfort ratings across most body parts compared with simple tasks. This aligns with the increased muscle activity observed in the EMG data during complex tasks, suggesting greater physical demand. Interestingly, while the AR method significantly reduced discomfort ratings for several body parts during simple tasks, including the overall body, head, neck, shoulders, and arms, this effect was less pronounced during complex tasks. This suggests that AR instructions may offer ergonomic benefits during less demanding tasks by minimizing unnecessary movements and providing direct visual guidance, thereby reducing physical discomfort. The reduced activation in the RSPL and LSPL muscles can be explained by their anatomical and functional roles during head and arm movements. These muscles are primarily responsible for rotating and extending the head and neck. Acting unilaterally, they rotate the head to the same side, while bilateral activation leads to extension of the head and upper cervical spine [55,56]. Given that AR tasks often involve less dramatic head movements compared with traditional tasks, especially during simple instructions, the reduced demand for head rotation and extension may partly explain the lower %MVC observed in these muscles. Correspondingly, participants reported lower discomfort in the neck region during simple tasks with AR instructions, supporting the notion that AR can reduce physical strain through more efficient task execution.
Similarly, the reduced activation of the RMD and RFCR muscles during AR tasks can be attributed to the difference in how instructions were delivered. In the paper-based condition, participants had to frequently reach for the manual, flip through pages, and shift their attention between the manual and the task. These actions required shoulder abduction (engaging the RMD) and wrist flexion (engaging the RFCR), especially in the more complex task. In contrast, under the AR conditions, the instructions were overlaid onto the environment, allowing participants to navigate the instructions using head movements and gaze-based interactions. This reduced the need for arm and wrist movements, leading to lower activation of the RMD and RFCR muscles. Participants’ discomfort ratings for the shoulders and wrists were also lower in the AR condition during the simple tasks, further corroborating the EMG findings.
In contrast, the RCE and RUT muscles exhibited increased activation during AR-guided tasks. This can be attributed to their distinct anatomical and functional roles, particularly in postural stability and scapular control. The RCE muscles play a critical role in extending and stabilizing the cervical vertebrae, thereby maintaining an upright neck posture [55,57]. Similarly, the RUT muscle, which spans the neck and shoulders, is essential for shoulder elevation, neck extension, and scapular stabilization [56,58]. During AR interactions, users frequently perform head movements and arm gestures to engage with virtual elements, placing additional demands on these muscles. The weight of an AR HMD (620 g) contributes to this strain, as users must stabilize their head and neck while performing tasks that require looking up and down or rotating the head.
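To put the 620 g headset mass in perspective, a rough static estimate can be made of the extra gravitational moment the neck extensors must counteract. Assuming the headset’s centre of mass sits roughly 10 cm anterior to the atlanto-occipital axis in a neutral posture (an assumed moment arm, not a value measured in this study):

```latex
M_{\mathrm{HMD}} \approx m\,g\,d = 0.620\ \mathrm{kg} \times 9.81\ \mathrm{m/s^2} \times 0.10\ \mathrm{m} \approx 0.61\ \mathrm{N\,m}
```

Even this modest additional moment must be sustained continuously, and it grows as the head flexes forward, which is consistent with the elevated RCE and RUT activation discussed above.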
Participants’ perceived exertion and mental workload ratings provide additional context to these findings. Complex tasks significantly increased perceived exertion and mental effort, irrespective of the instruction method. However, the AR method demonstrated significant advantages over paper instructions in reducing perceived exertion during simple tasks, indicating that AR may ease the cognitive load when tasks are less complex. This suggests that AR instructions can streamline information processing by presenting contextual information directly within the user’s field of view, thereby reducing the need for cognitive shifts between different information sources. The increased activation of the RCE and RUT muscles observed in this study reflects the stabilization demands imposed by the AR environment, where users must often maintain their gaze on holographic content for extended periods. Prolonged postural deviations and repetitive shoulder movements can exacerbate this strain, leading to heightened musculoskeletal risk, particularly in the neck and shoulders [41,59,60,61,62]. Participants did not report significantly higher discomfort in these areas during complex tasks with AR instructions, which may indicate a disconnect between objective muscle activation and subjective discomfort that might emerge over longer exposure times.
Our findings are consistent with those of Lim et al. [63], who demonstrated that normalized electromyography (NEMG) (%MVC) values increased with greater movement distances (MD) and decreased target-to-user distances (TTU) during drag-and-drop (DND) tasks in AR environments. Similarly, we observed that AR-based instructions led to higher muscle strain in the neck and shoulders, particularly during complex tasks. However, unlike the work of Lim et al., who focused on upper extremity muscles, our study specifically highlights the redistribution of physical load, as evidenced by the reduced strain in the RSPL and LSPL muscles. This suggests that AR can differentially impact muscle groups depending on task complexity and ergonomic factors such as MD and TTU. In contrast to Marklin et al. [64], who found no significant differences in sEMG activity in the neck and shoulders between AR and non-AR conditions, our study revealed increased muscle strain in these areas during complex tasks with AR-based instructions. This discrepancy may be attributed to task complexity and duration differences between the two studies. While Marklin et al. [64] focused on utility workers performing procedural tasks of varying lengths, our study emphasizes the impact of instructional methods and task intricacies on muscle activation. Additionally, Marklin et al. [64] highlighted a significant reduction in blink rate with the HoloLens, raising concerns about eyestrain—a factor not extensively covered in our current study but acknowledged as a crucial aspect for future research.
Moreover, our findings align with previous studies on virtual reality (VR) interactions, which have reported similarly increased discomfort and fatigue in the upper extremities due to the postures and movements required in such environments [41,59,60,61,62]. This study supports previous research [65,66,67,68] that has identified interactions with AR HMDs as a source of strain in the neck and shoulders. According to Knight and Baber [39] and Sardar et al. [42], musculoskeletal injuries in the upper extremities are often caused by the strain that HMDs place on the neck and shoulders. Users may experience discomfort and strain due to the combination of repetitive movements in the upper extremities and maintaining static postures for extended periods while interacting with augmented reality systems [40,41,45]. The absence of sufficient upper-limb support during mid-air interactions can further heighten the risk of musculoskeletal strain and discomfort in the shoulders and arms [45,65,69]. These findings highlight the need for ergonomic considerations in the design and implementation of AR technologies, especially for complex tasks that inherently demand more from the user, both physically and cognitively.
Although this study has focused on neck and shoulder muscle activity, it is essential to consider the interplay between cognitive load, visual fatigue, and physical strain in AR environments. Our previous research [21,46] explored cognitive workload and physical stress, revealing that AR interactions not only influence muscle activity but also affect users’ mental workload and physiological stress indicators. Integrating these findings provides a more comprehensive understanding of the ergonomic and cognitive challenges posed by AR systems in industrial settings. Moreover, AR systems impose significant visual demands and psychological pressure on users, which can indirectly influence physical strain through altered posture and increased cognitive effort [7,8,11,14]. Understanding these interdependencies is crucial for developing ergonomic AR applications that mitigate both physical and cognitive burdens.
One potential solution to mitigate the strain caused by AR HMDs involves reducing the weight and improving the ergonomic design of the devices. As Astrologo [70] demonstrated, heavier HMDs lead to higher cervical spine loading and discomfort due to increased torque on the cervical spine. Innovations in lightweight materials and better weight distribution could help decrease the physical burden on users, particularly during prolonged tasks. Additionally, integrating voice commands into the AR interface could reduce strain on the RUT and RCE muscles by allowing users to interact with holographic content without relying solely on gaze and hand gestures. This would decrease the need for prolonged postural deviations and repetitive upper limb movements, offering a more natural and less physically demanding way to engage with virtual elements.
Several limitations of this study should be acknowledged. First, we did not monitor participants’ upper-body movements; incorporating motion capture or biomechanical analysis in future research could provide deeper insights into kinematic variations and stiffness responses to complex tasks and different instructional methods. Second, we recognize the importance of future research incorporating assessments of visual fatigue, such as eye-tracking or oculomotor metrics. Such studies could offer a more comprehensive perspective on how AR influences the user experience, complementing our findings on physical strain. Third, since all participants were male, the generalizability of the current findings is limited; future studies should include female participants. Fourth, the participants had relatively limited expertise; involving more experienced individuals or experts in the subject matter could offer a more nuanced understanding of how expertise influences muscle activity and task performance under different instructional methods. Finally, results are highly dependent on specific choices of interface type (visual and auditory), technological instruments (e.g., paper manuals, head-mounted displays, laptops, handheld devices), and study design (e.g., between-subjects, within-subjects, or mixed designs); future research employing different combinations of these factors may yield contrasting outcomes, highlighting the contextual specificity of our current findings and underscoring the need to explore alternative technologies and designs to provide a broader perspective on the effects of AR systems on physical strain and performance.

5. Conclusions

This study aimed to assess the impact of instructional methods (paper-based vs. augmented reality) and task difficulty (simple vs. complex) on physical strain within the context of maintenance operations. The findings confirmed that augmented reality (AR) instructions resulted in a significant increase in strain on the neck and shoulder muscles (RCE and RUT) compared with paper-based instructions, particularly during highly demanding tasks. At higher levels of task difficulty, the differences in EMG features between instructional methods became more apparent.
For low-difficulty tasks, minimal disparities in physical strain were observed between the instructional approaches. However, as the level of difficulty increased (i.e., for the highly demanding task with 40 steps), the limitations of the paper-based method became evident. Therefore, in situations involving greater task complexity, AR may offer advantages in task performance despite altering the physical demands on certain muscles.
Using AR instructions for both high- and low-demand tasks significantly reduced strain on the RSPL, LSPL, RMD, and RFCR muscles but increased strain on the RCE and RUT muscles. This suggests that AR changed the distribution of physical strain across different muscle groups. While it may have alleviated strain on muscles involved in arm and wrist movements, it increased strain on the neck and shoulder muscles due to the demands of maintaining head posture and interacting with the AR interface.
Unlike previous studies that have primarily examined overall muscle strain or focused solely on upper extremity muscles, our research provides nuanced insights into how AR influences specific muscle groups based on task complexity and ergonomic factors. This differentiation is pivotal for developing targeted ergonomic interventions in industrial AR applications, ensuring that the benefits of AR do not come at the cost of increased strain in critical muscle areas.
In summary, the results of this study suggest that AR-based training redistributes physical strain among different muscle groups rather than simply increasing overall physical demand compared with paper-based instruction. The increased strain on the neck and shoulder muscles associated with AR use highlights the need for ergonomic considerations in the design and implementation of AR systems, especially for complex tasks that inherently demand more from the user, both physically and cognitively. The present study utilized EMG measurements to assess the physical strain experienced by AR users in maintenance settings, facilitating a better understanding of the advantages and disadvantages of augmented reality instructional techniques. These findings establish a basis for the appropriate use of this technology in various industrial sectors worldwide, emphasizing the importance of addressing ergonomic concerns to optimize user performance and safety.

Author Contributions

Conceptualization, M.H.A.; data curation, M.H.A. and F.M.A.; formal analysis, M.H.A.; funding acquisition, F.M.A.; investigation, M.H.A. and M.Z.R.; methodology, M.H.A., F.M.A., and M.M.N.; project administration, M.H.A.; resources, F.M.A. and I.M.A.-h.; supervision, F.M.A., M.Z.R. and I.M.A.-h.; validation, M.H.A., F.M.A., I.M.A.-h., M.M.N., M.Z.R. and S.S.A.; writing—original draft, M.H.A. and F.M.A.; writing—review and editing, M.H.A., F.M.A., I.M.A.-h., M.M.N., M.Z.R. and S.S.A. All authors have read and agreed to the published version of the manuscript.

Funding

This study was funded by King Saud University Researchers Supporting Project (RSPD2024R814).

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki and approved by the Human Participants Review Subcommittee of the Institutional Review Board at King Khalid University Hospital, College of Medicine, King Saud University (protocol code E-19-4467, date of approval 23 January 2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

All data supporting the reported results are included in the manuscript.

Acknowledgments

The authors appreciate the support from Researchers Supporting Project number (RSPD2024R814), King Saud University, Riyadh, Saudi Arabia.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Cipresso, P.; Giglioli, I.A.C.; Raya, M.A.; Riva, G. The Past, Present, and Future of Virtual and Augmented Reality Research: A Network and Cluster Analysis of the Literature. Front. Psychol. 2018, 9, 2086. [Google Scholar] [CrossRef] [PubMed]
  2. Cometti, C.; Païzis, C.; Casteleira, A.; Pons, G.; Babault, N. Effects of Mixed Reality Head-Mounted Glasses during 90 Minutes of Mental and Manual Tasks on Cognitive and Physiological Functions. PeerJ 2018, 6, e5847. [Google Scholar] [CrossRef]
  3. Hanna, M.G.; Ahmed, I.; Nine, J.; Prajapati, S.; Pantanowitz, L. Augmented Reality Technology Using Microsoft HoloLens in Anatomic Pathology. Arch. Pathol. Lab. Med. 2018, 142, 638–644. [Google Scholar] [CrossRef]
  4. Chen, Y.; Wang, Q.; Chen, H.; Song, X.; Tang, H.; Tian, M. An Overview of Augmented Reality Technology. Proc. J. Phys. Conf. Ser. 2019, 1237, 22082. [Google Scholar] [CrossRef]
  5. Kiyokawa, K. Trends and Vision of Head Mounted Display in Augmented Reality. In Proceedings of the 2012 International Symposium on Ubiquitous Virtual Reality, Daejeon, Republic of Korea, 22–25 August 2012; pp. 14–17. [Google Scholar]
  6. Xu, W.; Liang, H.-N.; He, A.; Wang, Z. Pointing and Selection Methods for Text Entry in Augmented Reality Head Mounted Displays. In Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Beijing, China, 14–18 October 2019; pp. 279–288. [Google Scholar]
  7. Chen, Y.; Wang, X.; Xu, H. Human Factors/Ergonomics Evaluation for Virtual Reality Headsets: A Review. CCF Trans. Pervasive Comput. Interact. 2021, 3, 99–111. [Google Scholar] [CrossRef]
  8. Wang, Y.; Zhai, G.; Chen, S.; Min, X.; Gao, Z.; Song, X. Assessment of Eye Fatigue Caused by Head-Mounted Displays Using Eye-Tracking. Biomed. Eng. Online 2019, 18, 111. [Google Scholar] [CrossRef] [PubMed]
  9. Mustonen, T.; Berg, M.; Kaistinen, J.; Kawai, T.; Häkkinen, J. Visual Task Performance Using a Monocular See-through Head-Mounted Display (HMD) While Walking. J. Exp. Psychol. Appl. 2013, 19, 333. [Google Scholar] [CrossRef] [PubMed]
  10. Zhuang, J.; Liu, Y.; Jia, Y.; Huang, Y. User Discomfort Evaluation Research on the Weight and Wearing Mode of Head-Wearable Device. In Advances in Human Factors in Wearable Technologies and Game Design, Proceedings of the AHFE 2018 International Conferences on Human Factors and Wearable Technologies, and Human Factors in Game Design and Virtual Environments, Orlando, FL, USA, 21–25 July 2018; Springer: Berlin/Heidelberg, Germany, 2019; pp. 98–110. [Google Scholar]
  11. Mai, C.; Steinbrecher, T. Evaluation of Visual Discomfort Factors in the Context of HMD Usage. Proc. IEEE VR 2018, 2018, 1–4. [Google Scholar]
  12. Drouot, M.; Le Bigot, N.; Bricard, E.; De Bougrenet, J.-L.; Nourrit, V. Augmented Reality on Industrial Assembly Line: Impact on Effectiveness and Mental Workload. Appl. Ergon. 2022, 103, 103793. [Google Scholar] [CrossRef]
  13. Suzuki, Y.; Wild, F.; Scanlon, E. Measuring Cognitive Load in Augmented Reality with Physiological Methods: A Systematic Review. J. Comput. Assist. Learn. 2024, 40, 375–393. [Google Scholar] [CrossRef]
  14. Souchet, A.D.; Philippe, S.; Lourdeaux, D.; Leroy, L. Measuring Visual Fatigue and Cognitive Load via Eye Tracking While Learning with Virtual Reality Head-Mounted Displays: A Review. Int. J. Hum.-Comput. Interact. 2022, 38, 801–824. [Google Scholar] [CrossRef]
  15. Ajanki, A.; Billinghurst, M.; Gamper, H.; Järvenpää, T.; Kandemir, M.; Kaski, S.; Koskela, M.; Kurimo, M.; Laaksonen, J.; Puolamäki, K.; et al. An Augmented Reality Interface to Contextual Information. Virtual Real. 2011, 15, 161–173. [Google Scholar] [CrossRef]
  16. Evans, G.; Miller, J.; Pena, M.I.; MacAllister, A.; Winer, E. Evaluating the Microsoft HoloLens through an Augmented Reality Assembly Application. In Degraded Environments: Sensing, Processing, and Display; SPIE: San Francisco, CA, USA, 2017; Volume 10197, p. 101970V. [Google Scholar]
  17. Andersen, D.; Popescu, V. Ar Interfaces for Mid-Air 6-Dof Alignment: Ergonomics-Aware Design and Evaluation. In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Virtual, 9–13 November 2020; pp. 289–300. [Google Scholar]
  18. Reipschlager, P.; Flemisch, T.; Dachselt, R. Personal Augmented Reality for Information Visualization on Large Interactive Displays. IEEE Trans. Vis. Comput. Graph. 2020, 27, 1182–1192. [Google Scholar] [CrossRef]
  19. Villagran-Vizcarra, D.C.; Luviano-Cruz, D.; Pérez-Domínguez, L.A.; Méndez-González, L.C.; Garcia-Luna, F. Applications Analyses, Challenges and Development of Augmented Reality in Education, Industry, Marketing, Medicine, and Entertainment. Appl. Sci. 2023, 13, 2766. [Google Scholar] [CrossRef]
  20. Gao, Y.; Gonzalez, V.A.; Yiu, T.W. The Effectiveness of Traditional Tools and Computer-Aided Technologies for Health and Safety Training in the Construction Sector: A Systematic Review. Comput. Educ. 2019, 138, 101–115. [Google Scholar] [CrossRef]
  21. Alessa, F.M.; Alhaag, M.H.; Al-harkan, I.M.; Nasr, M.M.; Kaid, H.; Hammami, N. Evaluating Physical Stress across Task Difficulty Levels in Augmented Reality-Assisted Industrial Maintenance. Appl. Sci. 2023, 14, 363. [Google Scholar] [CrossRef]
  22. Carroll, J.; Hopper, L.; Farrelly, A.M.; Lombard-Vance, R.; Bamidis, P.D.; Konstantinidis, E.I. A Scoping Review of Augmented/Virtual Reality Health and Wellbeing Interventions for Older Adults: Redefining Immersive Virtual Reality. Front. Virtual Real. 2021, 2, 655338. [Google Scholar] [CrossRef]
  23. Gavish, N.; Gutiérrez, T.; Webel, S.; Rodríguez, J.; Peveri, M.; Bockholt, U.; Tecchia, F. Evaluating Virtual Reality and Augmented Reality Training for Industrial Maintenance and Assembly Tasks. Interact. Learn. Environ. 2015, 23, 778–798. [Google Scholar] [CrossRef]
  24. Watanabe, M.; Kaneoka, K.; Okubo, Y.; Shiina, I.; Tatsumura, M.; Miyakawa, S. Trunk Muscle Activity While Lifting Objects of Unexpected Weight. Physiotherapy 2013, 99, 78–83. [Google Scholar] [CrossRef]
  25. Moghaddam, M.; Wilson, N.C.; Modestino, A.S.; Jona, K.; Marsella, S.C. Exploring Augmented Reality for Worker Assistance versus Training. Adv. Eng. Inform. 2021, 50, 101410. [Google Scholar] [CrossRef]
  26. Söderberg, C.; Johansson, A.; Mattsson, S. Design of Simple Guidelines to Improve Assembly Instructions and Operator Performance. In Proceedings of the 6th Swedish Production Symposium, Gothenburg, Sweden, 16–18 September 2014. [Google Scholar]
  27. Vanneste, P.; Huang, Y.; Park, J.Y.; Cornillie, F.; Decloedt, B.; den Noortgate, W. Cognitive Support for Assembly Operations by Means of Augmented Reality: An Exploratory Study. Int. J. Hum. Comput. Stud. 2020, 143, 102480. [Google Scholar] [CrossRef]
  28. Jeffri, N.F.S.; Rambli, D.R.A. A Review of Augmented Reality Systems and Their Effects on Mental Workload and Task Performance. Heliyon 2021, 7, e06277. [Google Scholar] [CrossRef] [PubMed]
  29. Tumler, J.; Doil, F.; Mecke, R.; Paul, G.; Schenk, M.; Pfister, E.A.; Huckauf, A.; Bockelmann, I.; Roggentin, A. Mobile Augmented Reality in Industrial Applications: Approaches for Solution of User-Related Issues. In Proceedings of the 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, Cambridge, UK, 15–18 September 2008; pp. 87–90. [Google Scholar]
  30. Kolla, S.S.V.K.; Sanchez, A.; Plapper, P. Comparing Effectiveness of Paper Based and Augmented Reality Instructions for Manual Assembly and Training Tasks. Available SSRN 3859970 2021. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3859970 (accessed on 10 October 2024).
  31. Havard, V.; Baudry, D.; Savatier, X.; Jeanne, B.; Louis, A.; Mazari, B. Augmented Industrial Maintenance (AIM): A Case Study for Evaluating and Comparing with Paper and Video Media Supports. In Proceedings of the International Conference on Augmented Reality, Virtual Reality and Computer Graphics, Lecce, Italy, 15–18 June 2016; pp. 302–320. [Google Scholar]
  32. Havard, V.; Baudry, D.; Jeanne, B.; Louis, A.; Savatier, X. A Use Case Study Comparing Augmented Reality (AR) and Electronic Document-Based Maintenance Instructions Considering Tasks Complexity and Operator Competency Level. Virtual Real. 2021, 25, 999–1014. [Google Scholar] [CrossRef]
  33. Illing, J.; Klinke, P.; Grünefeld, U.; Pfingsthorn, M.; Heuten, W. Time Is Money! Evaluating Augmented Reality Instructions for Time-Critical Assembly Tasks. In Proceedings of the 19th International Conference on Mobile and Ubiquitous Multimedia, Essen, Germany, 22–25 November 2020; pp. 277–287. [Google Scholar]
  34. Hoover, M.; Miller, J.; Gilbert, S.; Winer, E. Measuring the Performance Impact of Using the Microsoft Hololens 1 to Provide Guided Assembly Work Instructions. J. Comput. Inf. Sci. Eng. 2020, 20, 61001. [Google Scholar] [CrossRef]
  35. Brice, D.; Rafferty, K.; McLoone, S. AugmenTech: The Usability Evaluation of an AR System for Maintenance in Industry. In Proceedings of the International Conference on Augmented Reality, Virtual Reality and Computer Graphics, Lecce, Italy, 7–10 September 2020; pp. 284–303. [Google Scholar]
  36. Gonzalez-Franco, M.; Pizarro, R.; Cermeron, J.; Li, K.; Thorn, J.; Hutabarat, W.; Tiwari, A.; Bermell-Garcia, P. Immersive Mixed Reality for Manufacturing Training. Front. Robot. AI 2017, 4, 3. [Google Scholar] [CrossRef]
  37. Pringle, A.; Campbell, A.G.; Hutka, S.; Keane, M.T. Using an Industry-Ready AR HMD on a Real Maintenance Task: AR Benefits Performance on Certain Task Steps More than Others. In Proceedings of the ISMAR: The International Symposium on Mixed and Augmented Reality 2018, Munich, Germany, 16–20 October 2018. [Google Scholar]
  38. Jetter, J.; Eimecke, J.; Rese, A. Augmented Reality Tools for Industrial Applications: What Are Potential Key Performance Indicators and Who Benefits? Comput. Human Behav. 2018, 87, 18–33. [Google Scholar] [CrossRef]
  39. Knight, J.F.; Baber, C. Effect of Head-Mounted Displays on Posture. Hum. Factors 2007, 49, 797–807. [Google Scholar] [CrossRef]
  40. Kong, Y.-K.; Park, S.-S.; Shim, J.-W.; Choi, K.-H.; Shim, H.-H.; Kia, K.; Kim, J.H. A Passive Upper-Limb Exoskeleton Reduced Muscular Loading during Augmented Reality Interactions. Appl. Ergon. 2023, 109, 103982. [Google Scholar] [CrossRef]
  41. Penumudi, S.A.; Kuppam, V.A.; Kim, J.H.; Hwang, J. The Effects of Target Location on Musculoskeletal Load, Task Performance, and Subjective Discomfort during Virtual Reality Interactions. Appl. Ergon. 2020, 84, 103010. [Google Scholar] [CrossRef]
  42. Sardar, S.K.; Lim, C.H.; Yoon, S.H.; Lee, S.C. Ergonomic Risk Assessment of Manufacturing Works in Virtual Reality Context. Int. J. Hum.-Comput. Interact. 2024, 40, 3856–3872. [Google Scholar] [CrossRef]
  43. Akçayır, M.; Akçayır, G. Advantages and Challenges Associated with Augmented Reality for Education: A Systematic Review of the Literature. Educ. Res. Rev. 2017, 20, 1–11. [Google Scholar] [CrossRef]
  44. Kia, K.; Hwang, J.; Kim, I.-S.; Ishak, H.; Kim, J.H. The Effects of Target Size and Error Rate on the Cognitive Demand and Stress during Augmented Reality Interactions. Appl. Ergon. 2021, 97, 103502. [Google Scholar] [CrossRef] [PubMed]
  45. Kim, J.H.; Ari, H.; Madasu, C.; Hwang, J. Evaluation of the Biomechanical Stress in the Neck and Shoulders during Augmented Reality Interactions. Appl. Ergon. 2020, 88, 103175. [Google Scholar] [CrossRef]
  46. Alessa, F.M.; Alhaag, M.H.; Al-Harkan, I.M.; Ramadan, M.Z.; Alqahtani, F.M. A Neurophysiological Evaluation of Cognitive Load during Augmented Reality Interactions in Various Industrial Maintenance and Assembly Tasks. Sensors 2023, 23, 7698. [Google Scholar] [CrossRef]
  47. Alhaag, M.H.; Ramadan, M.Z.; Al-harkan, I.M.; Alessa, F.M.; Alkhalefah, H.; Abidi, M.H.; Sayed, A.E. Determining the Fatigue Associated with Different Task Complexity during Maintenance Operations in Males Using Electromyography Features. Int. J. Ind. Ergon. 2022, 88, 103273. [Google Scholar] [CrossRef]
  48. Potvin, J.R. Effects of Muscle Kinematics on Surface EMG Amplitude and Frequency during Fatiguing Dynamic Contractions. J. Appl. Physiol. 1997, 82, 144–151. [Google Scholar] [CrossRef]
  49. Khalaf, T.M.; Ramadan, M.Z.; Ragab, A.E.; Alhaag, M.H.; AlSharabi, K.A. Psychophysiological Responses to Manual Lifting of Unknown Loads. PLoS ONE 2021, 16, e0247442. [Google Scholar] [CrossRef]
  50. Corlett, E.N.; Bishop, R.P. A Technique for Assessing Postural Discomfort. Ergonomics 1976, 19, 175–182. [Google Scholar] [CrossRef]
  51. Paas, F.G.W.C. Training Strategies for Attaining Transfer of Problem-Solving Skill in Statistics: A Cognitive-Load Approach. J. Educ. Psychol. 1992, 84, 429. [Google Scholar] [CrossRef]
  52. Borg, G.; Bratfisch, O.; Dornič, S. On the Problems of Perceived Difficulty. Scand. J. Psychol. 1971, 12, 249–260. [Google Scholar] [CrossRef]
  53. Werrlich, S.; Daniel, A.; Ginger, A.; Nguyen, P.-A.; Notni, G. Comparing HMD-Based and Paper-Based Training. In Proceedings of the 2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Munich, Germany, 16–20 October 2018; pp. 134–142. [Google Scholar]
  54. Hanusz, Z.; Tarasińska, J. Normalization of the Kolmogorov–Smirnov and Shapiro–Wilk Tests of Normality. Biom. Lett. 2015, 52, 85–93. [Google Scholar] [CrossRef]
  55. Standring, S. Gray’s Anatomy E-Book; Elsevier Health Sciences: Amsterdam, The Netherlands, 2021. [Google Scholar]
  56. Moore, K.L.; Dalley, A.F. Clinically Oriented Anatomy; Wolters Kluwer India Pvt Ltd.: Gurgaon, India, 2018. [Google Scholar]
  57. Kendall, F.P.; McCreary, E.K.; Provance, P.G.; Rodgers, M.M.; Romani, W.A. Muscles: Testing and Function with Posture and Pain; Lippincott Williams & Wilkins Baltimore: Baltimore, MD, USA, 2005; Volume 5. [Google Scholar]
  58. Palastanga, N.; Field, D.; Soames, R. Anatomy and Human Movement: Structure and Function; Elsevier Health Sciences: Amsterdam, The Netherlands, 2006; Volume 20056. [Google Scholar]
  59. Kar, G.; Vu, A.; Juliá Nehme, B.; Hedge, A. Effects of Mouse, Trackpad and 3D Motion and Gesture Control on Performance, Posture, and Comfort. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Los Angeles, CA, USA, 26–30 October 2015; Volume 59, pp. 327–331. [Google Scholar]
  60. Penumudi, S.A.; Kuppam, V.A.; Kim, J.H.; Hwang, J. Biomechanical Exposures in the Neck and Shoulders During Virtual Reality Interactions. Graduate Research Theses & Dissertations, Northern Illinois University, DeKalb, IL, USA, 2019. [Google Scholar]
  61. Samani, A.; Pontonnier, C.; Dumont, G.; Madeleine, P. Shoulder Kinematics and Spatial Pattern of Trapezius Electromyographic Activity in Real and Virtual Environments. PLoS ONE 2015, 10, e0116211. [Google Scholar] [CrossRef] [PubMed]
  62. Ram, S.; Mahadevan, A.; Rahmat-Khah, H.; Turini, G.; Young, J.G. Effect of Control-Display Gain and Mapping and Use of Armrests on Accuracy in Temporally Limited Touchless Gestural Steering Tasks. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Rome, Italy, 28–30 November 2017; Volume 61, pp. 380–384. [Google Scholar]
  63. Lim, C.H.; Cha, M.C.; Lee, S.C. Physical Loads on Upper Extremity Muscles While Interacting with Virtual Objects in an Augmented Reality Context. Appl. Ergon. 2024, 120, 104340. [Google Scholar] [CrossRef] [PubMed]
  64. Marklin, R.W., Jr.; Toll, A.M.; Bauman, E.H.; Simmins, J.J.; LaDisa, J.F., Jr.; Cooper, R. Do Head-Mounted Augmented Reality Devices Affect Muscle Activity and Eye Strain of Utility Workers Who Do Procedural Work? Studies of Operators and Manhole Workers. Hum. Factors 2022, 64, 305–323. [Google Scholar] [CrossRef]
  65. Kang, H.; Shin, G. Hand Usage Pattern and Upper Body Discomfort of Desktop Touchscreen Users. Ergonomics 2014, 57, 1397–1404. [Google Scholar] [CrossRef]
  66. Kang, H.; Shin, G. Effects of Touch Target Location on Performance and Physical Demands of Computer Touchscreen Use. Appl. Ergon. 2017, 61, 159–167. [Google Scholar] [CrossRef]
  67. Boring, S.; Jurmu, M.; Butz, A. Scroll, Tilt or Move It: Using Mobile Phones to Continuously Control Pointers on Large Public Displays. In Proceedings of the 21st Annual Conference of the Australian Computer-Human Interaction Special Interest Group: Design: Open 24/7, Melbourne, Australia, 23–27 November 2009; pp. 161–168. [Google Scholar]
  68. Shin, G.; Zhu, X. User Discomfort, Work Posture and Muscle Activity While Using a Touchscreen in a Desktop PC Setting. Ergonomics 2011, 54, 733–744. [Google Scholar] [CrossRef]
  69. Syamala, K.R.; Ailneni, R.C.; Kim, J.H.; Hwang, J. Armrests and Back Support Reduced Biomechanical Loading in the Neck and Upper Extremities during Mobile Phone Use. Appl. Ergon. 2018, 73, 48–54. [Google Scholar] [CrossRef]
  70. Astrologo, A.N. The Effects of Head-Mounted Displays (HMDS) and Their Inertias on Cervical Spine Loading. Master’s Thesis, Northeastern University, Boston, MA, USA, 2022. [Google Scholar]
Figure 1. Parts and completely assembled condition of the gearbox for a piston pump.
Figure 2. Parts and completely assembled condition of the pump housing of the piston pump.
Figure 3. Participant using the paper-based instruction method to perform (a) the simple task; (b) the complex task.
Figure 4. Examples of the AR experience as viewed via Microsoft HoloLens: (a) AR instruction for removing the V-belt pulley from the crankshaft during the complex task; (b) AR instruction for removing the hex nuts and washers from the valve bridge stud bolts during the simple task.
Figure 5. Using Microsoft HoloLens 1 to complete steps of the complex and simple tasks.
Figure 6. EMG electrode placement.
Figure 7. EMG signals recorded from six muscles using Mega Win 3.0.1 software (Mega Electronics Ltd., Kuopio, Finland).
Figure 8. Overall discomfort and local discomfort rating scales.
Figure 9. Interaction effects of instruction method and task complexity on % RMS EMG (%MVC) of the RSPL, LSPL, and RUT muscles. Asterisks (*) indicate conditions that differ significantly from each other.
Figure 10. Main effects of instruction method on % RMS EMG (%MVC) of the RCE, RMD, and RFCR muscles. Asterisks (*) indicate significant differences between instruction methods.
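The % RMS EMG (%MVC) values plotted in Figures 9 and 10, and tabulated in Table 1 below, are derived from raw recordings of the kind shown in Figure 7. As a minimal, illustrative sketch of that processing chain (band-pass filtering followed by windowed RMS), the Python snippet below may help readers reproduce the general idea; the sampling rate, filter cutoffs, window length, and variable names are assumptions for illustration and are not parameters reported in this study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_emg(raw_emg, fs=1000.0, low=20.0, high=450.0, order=4):
    """Band-pass filter a raw surface EMG channel (cutoffs are illustrative)."""
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    return filtfilt(b, a, raw_emg)

def windowed_rms(signal, fs=1000.0, window_s=0.5):
    """RMS amplitude over consecutive, non-overlapping windows."""
    n = int(window_s * fs)
    usable = len(signal) - len(signal) % n
    windows = signal[:usable].reshape(-1, n)
    return np.sqrt(np.mean(windows ** 2, axis=1))

# Synthetic stand-in for one 60 s recorded channel sampled at 1 kHz.
rng = np.random.default_rng(0)
raw = rng.normal(0.0, 0.05, size=60_000)
filtered = bandpass_emg(raw)
rms_profile = windowed_rms(filtered)   # one RMS value per 0.5 s window
print(rms_profile.mean())
```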
Table 1. Mean (SD) and statistical outcomes for EMG responses across the experimental tasks. Significant differences are indicated by p-values in bold.

| Variable | Paper-Based, High Difficulty | Paper-Based, Low Difficulty | AR-Based, High Difficulty | AR-Based, Low Difficulty | Instruction: p (ηp²) | Complexity: p (ηp²) | Interaction: p (ηp²) |
|---|---|---|---|---|---|---|---|
| RSPL | 12.76 (2.36) | 9.86 (2.57) | 8.50 (2.88) | 7.21 (2.26) | 0.00 (0.36) | 0.00 (0.63) | 0.015 (0.21) |
| LSPL | 12.58 (4.53) | 9.07 (4.07) | 7.79 (1.48) | 6.93 (1.59) | 0.00 (0.30) | 0.00 (0.31) | 0.04 (0.14) |
| RCE | 15.50 (3.94) | 12.92 (3.61) | 19.36 (5.96) | 17.14 (6.78) | 0.04 (0.15) | 0.00 (0.369) | 0.77 (0.00) |
| RUT | 15.36 (3.56) | 10.64 (2.68) | 16.85 (4.82) | 16.36 (5.02) | 0.01 (0.21) | 0.00 (0.31) | 0.01 (0.23) |
| RMD | 29.57 (5.26) | 24.21 (8.69) | 21.57 (6.17) | 16.25 (6.69) | 0.00 (0.31) | 0.00 (0.54) | 0.92 (0.00) |
| RFCR | 35.53 (7.39) | 30.14 (5.82) | 24.71 (5.94) | 18.64 (5.76) | 0.00 (0.51) | 0.00 (0.51) | 0.57 (0.01) |

Cell values are mean (SD) of % RMS EMG expressed as %MVC; the statistical columns report p-values with partial eta squared (ηp²) in parentheses.
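To make the %MVC scale in Table 1 concrete, the short sketch below shows the arithmetic of expressing a task-phase RMS amplitude as a percentage of the RMS obtained during a maximum voluntary contraction, and of summarizing such values across participants into a condition mean (SD). All numbers and names here are hypothetical illustrations, not data from this study.

```python
import numpy as np

def percent_mvc(task_rms, mvc_rms):
    """Express a task-phase RMS amplitude as a percentage of the MVC reference RMS."""
    return 100.0 * task_rms / mvc_rms

# Hypothetical single-participant example: task RMS of 0.046 mV against an MVC RMS of 0.30 mV.
print(percent_mvc(0.046, 0.30))          # ~15.3 %MVC, the scale used in Table 1

# Hypothetical per-participant %MVC values for one condition; their mean and SD correspond
# to the kind of cell entries reported in Table 1.
values = np.array([15.4, 17.1, 14.2, 18.0])
print(values.mean(), values.std(ddof=1))
```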
Table 2. Statistical results of body discomfort ratings according to the Wilcoxon and Mann–Whitney U tests; subscript 1 refers to the complex task and subscript 2 to the simple task.

| Body Part | Instruction Method: Wilcoxon Signed-Rank Test (Z, p) | Task Complexity: Mann–Whitney U Test (U1 vs. U2, Z1 vs. Z2, p1 vs. p2) |
|---|---|---|
| Overall body | −4.183, 0.0001 | 70.0 vs. 32.5, −1.308 vs. −3.521, 0.191 vs. 0.0001 |
| Head | −2.529, 0.011 | 80.0 vs. 42.0, −0.927 vs. −3.243, 0.354 vs. 0.009 |
| Neck | −1.944, 0.05 | 79.0 vs. 41.0, −0.951 vs. −2.987, 0.342 vs. 0.003 |
| Right and left shoulder | −2.256, 0.024; −2.514, 0.012 | 86 vs. 33, −0.568 vs. −3.2, 0.57 vs. 0.001; 97.5 vs. 49, −0.026 vs. −2.975, 0.98 vs. 0.003 |
| Right and left trapezius | −3.942, 0.0001; −2.392, 0.017 | 73 vs. 75, −1.172 vs. −1.208, 0.241 vs. 0.227; 78 vs. 56, −1.046 vs. −2.696, 0.296 vs. 0.007 |
| Right and left upper arm | −2.884, 0.004; −2.725, 0.006 | 86 vs. 95.5, −0.58 vs. −0.137, 0.562 vs. 0.891; 96.5 vs. 93.5, −0.74 vs. −0.254, 0.941 vs. 0.800 |
| Right and left elbow | −2.380, 0.017; −2.373, 0.018 | 76 vs. 93, −1.109 vs. −0.305, 0.268 vs. 0.761; 84 vs. 63, −0.756 vs. −2.423, 0.450 vs. 0.015 |
| Right and left forearm | −2.825, 0.005; −2.288, 0.022 | 63 vs. 64, −1.706 vs. −1.974, 0.088 vs. 0.048; 62 vs. 70, −1.863 vs. −2.121, 0.062 vs. 0.034 |
| Right and left wrist | −2.741, 0.006; −2.224, 0.026 | 49.5 vs. 65.5, −2.299 vs. −1.762, 0.022 vs. 0.78; 57 vs. 63, −2.160 vs. −2.415, 0.031 vs. 0.016 |
| Right and left hand | −3.210, 0.001; −3.017, 0.003 | 73 vs. 42, −1.245 vs. −3.266, 0.213 vs. 0.001; 64.5 vs. 70, −1.759 vs. −2.115, 0.079 vs. 0.034 |
| Right and left erector spinae | −2.980, 0.003; −2.461, 0.014 | 69 vs. 94, −1.362 vs. −0.205, 0.173 vs. 0.837; 85 vs. 92, −0.616 vs. −0.311, 0.538 vs. 0.756 |
| Right and left hip | −2.546, 0.011; −2.530, 0.011 | 51.5 vs. 70, −2.586 vs. −2.115, 0.010 vs. 0.034; 66 vs. 70, −1.934 vs. −2.115, 0.053 vs. 0.034 |
| Right and left thigh | −2.714, 0.007; −2.53, 0.011 | 67 vs. 84, −1.795 vs. −1.441, 0.073 vs. 0.15; 68 vs. 77, −1.742 vs. −1.80, 0.082 vs. 0.072 |
| Right and left knee | −2.701, 0.007; −2.714, 0.007 | 54 vs. 91, −2.368 vs. −1.00, 0.018 vs. 0.317; 58 vs. 91, −2.152 vs. −1.00, 0.031 vs. 0.317 |
| Right and left leg | −2.63, 0.009; −1.913, 0.056 | 60 vs. 63, −2.12 vs. −2.423, 0.034 vs. 0.015; 71 vs. 77, −1.563 vs. −1.8, 0.118 vs. 0.072 |
| Right and left foot | −2.598, 0.009; −1.768, 0.077 | 68 vs. 63, −1.742 vs. −2.423, 0.082 vs. 0.015; 69 vs. 63, −1.562 vs. −2.412, 0.118 vs. 0.016 |

For bilateral body parts, values before the semicolon refer to the right side and values after the semicolon to the left side.
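As a hedged illustration of the two nonparametric tests named in Table 2, the sketch below applies the Wilcoxon signed-rank test (paired comparison across instruction methods) and the Mann–Whitney U test (independent comparison across task-complexity groups) to hypothetical discomfort ratings using scipy.stats. The ratings, group sizes, and variable names are invented for illustration; note also that scipy returns the W and U statistics, whereas the table reports normal-approximation Z values.

```python
import numpy as np
from scipy.stats import wilcoxon, mannwhitneyu

# Hypothetical overall-body discomfort ratings (higher = more discomfort).
# Paired samples: the same participants rated discomfort under both instruction methods.
paper_based = np.array([4, 6, 3, 5, 4, 7, 2, 5, 6, 3, 4, 5, 6, 4])
ar_based    = np.array([6, 7, 5, 6, 5, 8, 4, 6, 8, 5, 5, 7, 7, 6])

w_stat, p_method = wilcoxon(paper_based, ar_based)
print(f"Wilcoxon signed-rank (instruction method): W={w_stat}, p={p_method:.3f}")

# Independent samples: ratings from a complex-task group versus a simple-task group.
complex_task = np.array([6, 5, 7, 6, 4, 5, 6, 7, 5, 6, 4, 5, 6, 7])
simple_task  = np.array([3, 4, 2, 3, 4, 3, 2, 4, 3, 3, 2, 4, 3, 3])

u_stat, p_complexity = mannwhitneyu(complex_task, simple_task, alternative="two-sided")
print(f"Mann-Whitney U (task complexity): U={u_stat}, p={p_complexity:.3f}")
```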