Article

Combined Framework of Multicriteria Methods to Identify Quality Attributes in Augmented Reality Applications

1 Faculty of System Engineering, Universidad Santo Tomás, Tunja 15003, Colombia
2 Institute of Robotics and Information and Communication Technologies (IRTIC), Universitat de València, 46980 Valencia, Spain
3 Department of Computer Science and Engineering, Universidad del Norte, Barranquilla 081007, Colombia
4 Department of Design, School of Architecture, Urbanism and Design, Universidad del Norte, Barranquilla 081007, Colombia
5 Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Santiago 12455, Chile
* Author to whom correspondence should be addressed.
Mathematics 2023, 11(13), 2834; https://doi.org/10.3390/math11132834
Submission received: 31 May 2023 / Revised: 14 June 2023 / Accepted: 21 June 2023 / Published: 24 June 2023
(This article belongs to the Special Issue Multiple Criteria Decision Making, 2nd Edition)

Abstract

This study proposes a combined framework of multicriteria decision methods to describe, prioritize, and group the quality attributes related to the user experience of augmented reality applications. The attributes were identified based on studies from high-impact repositories. A hierarchy of the identified attributes was built through the multicriteria decision methods Fuzzy Cognitive Maps and DEMATEL. Additionally, a statistical cluster analysis was carried out to determine the most relevant attributes and apply these results in academic and industrial contexts. The main contribution of this study is the categorization of user-experience quality attributes in augmented reality applications, as well as the grouping proposal. Usability, Satisfaction, Stimulation, Engagement, and Aesthetics were found to be among the most relevant attributes. After carrying out the multivariate analysis, two clusters were found with the largest grouping of attributes, oriented to security, representation, social interaction, aesthetics, ergonomics of the application, and its relationship with the user’s emotions. In conclusion, the combination of the three methods helped to identify the importance of the attributes in training processes. The holistic and detailed view of the causal, impact, and similarity relationships between the 87 attributes analyzed was also considered. This framework will provide a baseline for the use of multicriteria methods in research into relevant aspects of augmented reality.

1. Introduction

Augmented reality (AR) is a variant of virtual environments (VE), more commonly known as virtual reality, in which VE technologies completely immerse the user within a synthetic environment. While immersed, the user cannot see the real world around them. AR, in contrast, allows virtual objects to be superimposed on the real world [1].
Studies in the educational field evidence the increasing use of AR in teaching and learning processes. In the 2020 EDUCAUSE Horizon Report™ Teaching and Learning Edition [2], AR is included in the category of “Emerging Technologies and Practices”, which gathers the technologies and practices expected to have a significant impact on teaching and learning in higher education. The report indicates that higher education is experimenting with AR as a vehicle for collaboration in the learning process. There is also evidence of an increase in the creation of laboratories and centers focused on the development of Extended Reality (XR) experiences (augmented reality (AR), virtual reality (VR), mixed reality (MR), and haptics).
In the last decade, the implementation of AR-based solutions has grown substantially. In the study carried out by [3], the authors identify nine AR application areas: Perception, Medical, Education, Entertainment and Gaming, Industrial Application, Navigation and Driving, Tourism and Exploration, Collaboration, and Interaction. Similarly, in the research developed by [4], seven areas are identified: Medical, Assembly, Education, Shopping, Entertainment, Military, and Furniture Design. In addition, in the work by [5], four areas are highlighted: Training and Education, Entertainment and Commerce, Navigation and Tourism, and Medical and Construction. These works demonstrate the continued growth of AR application development in areas such as health, education, games, tourism, and industry, among others.
Taking into account the relevance of AR in multiple areas, this article seeks to contribute to knowledge in the academic-business context by proposing a ranking of user-experience quality attributes in AR applications, which contributes to the evaluation of training processes in industrial environments mediated by augmented reality from a user experience approach.
Since the human factor is a crucial aspect for success in the adoption of this technology, the adequate preparation of workers must be considered. Practical training is important for many disciplines and professions, such as medical workers, mechanics, technicians, electricians, engineers, sailors, pilots, and firefighters, among others. According to the consulting firm ABI Research, compared with traditional training, AR/VR training can save between USD 2000 and USD 2500 per employee, improve the candidate/employee experience, and build a competitive employer brand [6].
The article focuses on the analysis of quality attributes from the perspective of user experience, combining technology, quality, and training processes in a way that allows companies to generate added value. This analysis will serve as input for subsequently proposed AR applications with an objective approach to the attributes.
Therefore, it is important to explore which attributes affect the industrial training process within the context of each worker and their environment. In this way, the following research question arises: Which user-experience quality attributes could be prioritized in applications mediated by augmented reality in training processes?
Section 2 describes the importance of multicriteria decision methods in various disciplines and the validity of their use. The entire methodological process of applying the framework to identify quality attributes in augmented reality applications is presented in Section 3. Section 4 describes the most relevant findings after applying each of the multicriteria methods. Section 5 analyzes the results obtained in light of their validity and applicability in usage scenarios. Finally, the conclusions consolidate the contributions of this study.

2. Literature Review

2.1. UX Attributes in AR Applications

In the literature, there is evidence of studies on the evaluation of user experience in augmented reality applications. These works address aspects such as measuring user experience through methods and models, evaluating devices, and describing the use of specific applications.
Ref. [7] identifies the fundamental areas for the development of AR applications using the DELPHI-AHP method, through five first-level indicators and twenty second-level indicators. The method is tested and verified with six model visualization applications, finding that the most important first-level indicators affecting the user experience are the functionality of a system and its visualization.
Ref. [8] defines a Uses and Gratifications theory-based model for the adoption of AR mobile games. The model is composed of eight constructs: enjoyment, fantasy, escapism, social interaction, social presence, achievement, self-presentation, and continuance intention. The case study is Pokémon Go. The results confirmed the positive influence of the aforementioned constructs on the intention to continue using the game.
Ref. [9] proposes the User Experience Measurement Index (UXMI). This indicator is calculated through different evaluation methods when testing the MAR Madness application. The tests were carried out before, during, and after using the application. In the first stage, a questionnaire with questions about age, gender, familiarity with AR applications, and tendency to suffer motion sickness was completed. In the second stage, the user’s physiological data were collected with the EMOTIV EPOC + EEG headset. In the last stage of the process the UEQ questionnaire was used.
Ref. [10] presents the results obtained from the analysis of the user experience (UX) of a digital educational material (DEM) based on augmented reality, using the AttrakDiff model for the evaluation.
Ref. [11] describes the behavior of a test group that uses an HMD application to perform an industrial task. Quantitative measures such as accuracy, task completion, consistency, and time taken are evaluated.
Other measurement models are presented in [12,13].
In works such as [14,15,16,17,18], specific applications in the industrial area, tourism, and education, among others, are assessed.
The evaluation of devices used in AR was also studied in the works by [11,19,20,21,22,23].
After the literature review, several research opportunities that may be addressed can be identified:
  • AR research has focused on the definition and measurement of quality indicators of user experience [24]. However, there is no general consensus on which elements should be analyzed when fully evaluating an AR solution;
  • There are studies on the evaluation of the use of AR applications that rely on tools such as surveys, interviews, expert evaluation, and biometric measurements. However, these mechanisms are external to the application and have a high degree of subjectivity;
  • No tools were identified that evaluate augmented reality applications in industrial contexts with quantitative techniques or methods integrated into the industrial prototype;
  • There is interest in the industrial sector in conducting training through AR applications. In the works by [25,26,27], training with AR applications is carried out, resulting in improvements in task completion time and a reduction in the number of errors (error counts).

2.2. Importance of Multicriteria Decision Methods

Multicriteria decision-making methods (MCDM), used in different areas, are an important tool to weigh factors where the experience and knowledge of experts are taken into account. Previous research shows the usability and advantages of integrating multicriteria decision methods in application areas such as engineering, medicine, finance, education, and even in the social sciences [28,29,30,31,32,33,34,35]. Here are some of the most commonly used methods.

2.2.1. Analytic Hierarchy Process (AHP)

The Analytic Hierarchy Process was proposed by Thomas Saaty in 1980. The fundamental scale of the AHP is an absolute number scale used to answer a basic question in each of the paired comparisons: how many times is one element more dominant than another with respect to a certain criterion or attribute? It models hierarchical relationships among criteria and alternatives [36].

2.2.2. Analytic Network Process (ANP)

The Analytic Network Process was proposed by Thomas Saaty in 1996 as a generalization of the AHP method. The essential characteristic of ANP, unlike AHP, is that it allows the inclusion of interdependence and feedback relationships between elements of the system [37].

2.2.3. Clustering

Clustering, also known as cluster analysis, numerical taxonomy, or pattern recognition, is a multivariate statistical technique whose purpose is to divide a set of objects into groups so that the profiles of objects in the same cluster are very similar to each other (internal cohesion of the group) and objects in different clusters are dissimilar (external isolation of the group) [38]. Three main groups of methods can be distinguished:
  • Hierarchical Methods: These types of algorithms do not require the user to specify the cluster number in advance (agglomerative clustering, divisive clustering);
  • Non-Hierarchical Method: These types of algorithms require the user to specify in advance the number of clusters to be created (K-means, K-medoids, CLARA);
  • Other methods: Methods that combine or modify the previous ones (hierarchical K means, fuzzy clustering, model-based clustering, and density-based clustering).

2.2.4. DEMATEL

The DEMATEL (Decision Making Trial and Evaluation Laboratory) technique was first introduced by Gabus and Fontela in 1972 to solve real-world problems. The DEMATEL technique is an effective way to analyze direct and indirect relationships between system components according to their type and intensity [39]. Analyzing the general relationship between the components with the DEMATEL technique allows a better understanding of structural relationships, as well as an ideal way to solve complex system problems [40].
The DEMATEL method can be summarized as the following steps: build the direct influence matrix, normalize the direct influence matrix, obtain the total relationships matrix, develop a causal diagram, establish the threshold value, and obtain the impact-digraph map.

2.2.5. Fuzzy Cognitive Map (FCM)

A Fuzzy Cognitive Map (FCM) is a graphical representation consisting of nodes indicating the most relevant factors of a decisional environment, and links between these nodes representing the relationships among those factors [41].
They are a tool that can be used in different situations or problems to identify, define and validate the constructs or items of a system and identify the cause–effect relationships that exist among them, in order to propose strategies and help the decision-making process [31].
Traditionally, only precise causal relationships had been considered, in which the same cause always produces the same effect. However, in everyday situations this is not usually the case: there are causal relationships that are imprecise, and the same cause does not always produce the same effect [31,42].
The methodology uses four matrices to represent the results obtained in each of its stages: the Initial Matrix of Success (IMS), the Fuzzified Matrix of Success (FZMS), the Strength of Relationships Matrix of Success (SRMS), and the Final Matrix of Success (FMS) [41].

2.2.6. Multi-Attributive Border Approximation Area Comparison (MABAC)

Multi-Attributive Border Approximation Area Comparison (MABAC), developed by Pamucar and Cirovic [43], is one of the recent approaches to multi-attribute decision-making (MCDM/MADM) problems. The basic principle of MABAC is that it divides the performance of each criterion/attribute function into an Upper Approximation Area (UAA), containing ideal alternatives, and a Lower Approximation Area (LAA), containing anti-ideal alternatives. In other words, this method provides a direct observation of the relative strengths and weaknesses of an alternative with respect to the others according to each criterion. In the classical MABAC method, the performance ratings and criteria weights are represented by crisp/deterministic numerical values [44].

2.2.7. TODIM

TODIM is an acronym in Portuguese for Interactive Multicriteria Decision Making. The idea of TODIM is to compare the alternatives with respect to each criterion in a pairwise fashion in terms of gains and losses. The gains and losses are then passed to the prospect function to obtain the partial dominances, which are aggregated to form the final dominance. The rank order of the alternatives is based on this final dominance. In the standard formulation, the TODIM method only deals with crisp numbers [45].

2.2.8. Technique for Order Performance by Similarity to Ideal Solution (TOPSIS)

The TOPSIS method was proposed by Chen and Hwang in 1992. It is based on an aggregation function representing “closeness to the ideal”, which originated from the compromise programming method. It determines a solution with the shortest distance to the ideal solution and the greatest distance from the negative-ideal solution [46].

2.2.9. VIKOR

The VIKOR method was proposed by Serafim Opricovic in 1990. This method focuses on ranking a set of alternatives to solve a decision-making problem with conflicting criteria, through the introduction of a multicriteria ranking index based on a measure of “closeness” to the ideal solution [47].

2.3. Use of Multicriteria Decision Methods in AR

As mentioned in Section 2.2, MCDM are useful in many disciplines. Within the framework of this research, it is important to highlight their use in AR environments.
In [42], a methodology for capturing IT project requirements by applying FCM and BPM (Business Process Model) through an AR mobile application is described.
Reference [48] presents the evaluation of adopting augmented reality systems for maintenance activities. The identification of advantages was carried out using Fuzzy TOPSIS. The four evaluated alternatives were HoloLens, Tablet, Smartphone and Vuzix.
Continuing with the evaluation of AR devices in maintenance operations, in [49] an AHP and Fuzzy-TOPSIS model is used.
Reference [35] proposes a new retrieval approach based on a multicriteria decision method to generate a compact descriptor that represents a signature for each 3D model. The aim is to obtain a combined score using the data envelopment analysis (DEA) method.
Evaluating the effectiveness of the use of AR applications in museums is also one of the environments where multicriteria decision methods were used. Data analysis was carried out via multiple correspondence analysis and hierarchical agglomerative clustering analysis [34].
Reference [40] proposes a model to select the most appropriate Industry 4.0 technologies in the supply chain. In this work, DEMATEL was used, and augmented reality was among the first-level technologies.

3. Methodology

Due to the nature of this research, the high number of variables to be analyzed, and the extensive work required of the experts linked to the study, the work is organized into four phases, as shown in Figure 1. The scope of the first phase was to identify the quality attributes that affect a user’s experience with augmented reality applications, resulting in 87 attributes.
In the second phase, the objective was to select the most appropriate multicriteria decision methods for the investigation, build the instruments for data collection and analysis and, finally, organize the panel of experts.
The third phase was the most demanding of the investigation because it comprised two stages. In the first stage, the intention was to find the direct and indirect causal relationships among the attributes, as well as their degrees of importance and polarity. In the second stage, the aim was to group the 87 attributes by degree of similarity.
The fourth and final phase aimed to reduce the dimensionality of the attributes, as well as build the ranking and interpret the influence relationships among the 19 attributes.
The following sections will describe the steps of the proposed methodology.

3.1. Phase 1. Selection of Quality Attributes to Include in the Study

A systematic literature review was carried out to identify the quality attributes that affect the user experience using augmented reality applications. This review took the protocol guidelines proposed by Kitchenham et al. [50] as a reference.
The research questions that guided this search were:
  • Q1: How is the user experience measured in AR applications?
  • Q2: What are the user experience quality metrics used in augmented reality applications?
The review protocol begins with the selection of search repositories. This selection was made based on the work by [51]. The repositories used in this study were IEEE Xplore, SCOPUS and ACM Digital Library. The search string that was established is presented in Figure 2.
Exclusion filters were applied: (1) duplicate documents were removed, and (2) the observation window was limited to studies published after 2009.
For the pre-selection of the studies, the metadata (title, abstract, and keywords) were verified. The pre-selection criteria included studies that addressed the subject of AR and defined concepts and variables affecting the user experience.
As Kitchenham et al. [50] mention, after obtaining the initial sample of papers, the relevance of each study for the research must be verified. For this reason, quality criteria were established to decide whether a study was a candidate to enter the base of primary studies of this research.
The quality criteria applied to select the works that are part of this study were: (1) the study presents evaluations of AR applications, (2) the study describes criteria associated with user experience, and (3) the study describes the results of research and/or application.

3.2. Phase 2. Selection of Multi-Criteria Methods for the Construction of the Ranking

3.2.1. MCDM Analysis

As mentioned in Section 2.2, multicriteria decision methods are a useful tool for decision-making when problems are complex and involve a large number of factors, variables and/or characteristics. The nature of this research, with its high number of attributes, implied a rigorous and exhaustive analysis of the influence of each attribute on industrial training processes with AR applications.
In this research, the AHP method is not used, since it assumes linear dependence relationships between the criteria and the alternatives, which may not be suitable for complex problems with direct and indirect relationships. This limits the ability of the AHP method to analyze the complexity of the system studied here. In addition, in AHP, marginal changes in the comparison matrices can have a significant impact on the results. The ANP method is also not used, because it can be more complex and require more effort compared to other methods, such as DEMATEL; in general, it may take longer to complete the analysis and obtain results, and the interpretation of its comparison matrices can be more complicated than with DEMATEL, which could limit its usefulness in some contexts. Methods such as TOPSIS, VIKOR, MABAC or TODIM were not applied either, since it was not necessary to select or weigh between different alternatives or choices. This research focuses instead on the weighting and selection of the most relevant attributes: the main objective was to assign weights or importance to the different attributes that comprise the system. This weighting of attributes allows a more complete vision of the augmented reality attributes, without the need to make a comparison between different alternatives.
Therefore, the criterion for selecting FCM was the capacity of the method to achieve consensus among the interested parties, since they actively contribute to the development of the solution. It is able to represent not only the direct relationships between the attributes but also the indirect ones and, if they exist, the inverse relationships can also be identified [52,53,54]. Fuzzy cognitive maps make it possible to represent and manage vague or ambiguous knowledge that cannot easily be expressed in exact or precise terms. In addition, they can be used for decision-making in systems with uncertainty, and they provide an intuitive and visual graphical representation of knowledge and of the relationships between concepts.
In the case of clustering, the reasons were compelling because this method allows grouping the elements that have common characteristics and separating those that are more dispersed among them. Thus, this method was essential to generate the attribute subgroups and find interest subgroups depending on the context of use. Clustering makes it possible to identify patterns associated with a set of observations. In addition, it permits the discovery of natural groups, which can represent observations with similar characteristics, facilitating their interpretation. It provides an overview of the structure and distribution of the data, and allows the identification of trends, anomalies, outliers and relationships that are not evident in the data. Finally, another advantage of clustering is that it allows a reduction in the data dimensionality by grouping similar objects and summarizing complex information in a set of representative groups.
In the case of DEMATEL, it is a widely used method among the scientific community, as evidenced by a large number of studies [40,55,56,57,58,59,60,61]. It has demonstrated effectiveness in various areas of knowledge. In addition to being easy to implement, the results obtained are significant when prioritizing the attributes, as well as in identifying the impact between the cause and effect attributes. One advantage of DEMATEL is its ability to identify causal relationships among the criteria, determining whether it is a cause or an effect. In addition, it permits obtaining its weighting or influence within the study context. Compared to ANP, DEMATEL is simpler in terms of its application according to the experts.

3.2.2. Construction of the Instruments to Be Evaluated by Experts

The objective of the evaluation was to identify the attributes that affect the user experience using applications with augmented reality (AR). The concept of user used in this research is presented below.
User: a person who must operate a machine/equipment after receiving a training process using an application with AR.
For the FCM method the type of assessment used is the Likert scale [62,63]. For the case study of this research, the extremes of the valuation are defined as follows:
  • Very Low: The attribute does not contribute significantly to the user experience using augmented reality applications.
  • Very High: The contribution of the attribute is essential to improve the user experience using augmented reality applications.
The scale defined in Table 1 was used to evaluate the attributes and find the influence relationships among them. This scale is the one generally used in fuzzy method applications [32]. In this stage, the “Qualitative Level of Influence” instrument (QLIi) is built.
For the DEMATEL method, the selected assessment scale is the one described in Table 2 [33]. This scale was used to evaluate the attributes and their influence among them. In this stage, the instrument is designed to identify the level of influence of the attributes.

3.2.3. Expert Panel Selection Process

The process to select the experts was based on a bank of profiles assembled from information about people linked to technology centers, development companies and recognized universities. The steps to recruit the experts were the following:
  • Review of profiles on academic and business platforms. The selected profiles met: (1) two or more years of experience in software development with AR, (2) two or more years of experience in productive-sector projects with AR, and (3) one or more years of experience in the academic environment;
  • Sending an email presenting the investigation and inquiring whether the person was interested in joining the panel of experts;
  • Review by the researchers of the people who gave an affirmative answer;
  • Sending an email with the Qualitative Level of Influence instrument;
  • In the second phase of the investigation, sending the DEMATEL assessment instrument.

3.3. Phase 3. Identification of Causality and Clustering between Attributes

3.3.1. Stage 1: FCM Method

The steps for the implementation of the fuzzy cognitive maps method in this research are described below, followed by a brief computational sketch.
  • Step 1: Generation of the initial matrix of success IMS.
This initial matrix is a linguistic assessment carried out by the experts using the scale of Table 1. The data collected in the QLIi instrument are transformed into numerical vectors associated with each of the identified attributes. Each vector element represents the value that the corresponding attribute has for each of the experts interviewed. Each element $O_{ij}$ of the matrix represents the importance that an expert $j$ gives to attribute $i$; the results are later transformed into a fuzzy set with values between 0 and 1. The elements $O_{i1}, O_{i2}, \ldots, O_{im}$ are the components of the vector $V_i$ associated with the attribute belonging to row $i$ of the matrix. Thresholds are applied to the IMS in order to adapt the information it contains to the real world and maintain the logical integrity of the data [41]. Thus, IMS is an $n \times m$ matrix, where $n$ is the number of attributes and $m$ is the number of experts.
  • Step 2: Obtaining the fuzzified matrix of success FZMS.
After obtaining the initial matrix of concepts, it was converted into a fuzzy matrix called FZMS, which contains the degree of membership of each component of a vector to a fuzzy set. The numerical vectors are converted into fuzzy sets with values within the interval [0, 1] based on Equations (1)–(3).
Find the maximum value in $V_i$ and assign $X_i = 1$ to it, that is:
$$\mathrm{MAX}(O_{iq}) \Rightarrow X_i(O_{iq}) = 1 \qquad (1)$$
Find the minimum value in $V_i$ and assign $X_i = 0$ to it, that is:
$$\mathrm{MIN}(O_{ip}) \Rightarrow X_i(O_{ip}) = 0 \qquad (2)$$
In this phase, the minimum and maximum of each row of the matrix must be found in order to then apply Equation (3), which returns the dataset in its fuzzy form:
$$X_i(O_{ij}) = \frac{O_{ij} - \mathrm{Min}(O_{ip})}{\mathrm{Max}(O_{iq}) - \mathrm{Min}(O_{ip})} \qquad (3)$$
where $X_i(O_{ij})$ is the degree of membership of the element $O_{ij}$ to the vector $V_i$.
  • Step 3: Obtaining the strength of relationships matrix of success SRMS.
The importance of each factor is calculated from the FZMS fuzzy matrix in order to generate the Strength of Relationships Matrix of Success (SRMS) using fuzzy cognitive maps. The SRMS matrix is an $n \times n$ matrix.
$S_{ij}$ can take values in the interval [−1, 1]. Each attribute is represented as a numerical vector $S_i$ containing $n$ components, one for each concept represented in the map.
  • Step 4: Indicators assessment.
With the results of the SRMS matrix, the indicators called outdegree, indegree, and centrality [31] are obtained. Additionally, the maximum value per row and the standard deviation are identified, and a scatter plot of outdegree versus standard deviation can be drawn.
  • The outdegree indicator is the sum of the values in the adjacent array associated with the connectors leaving a node or variable. A transmitter variable presents a high outdegree.
  • The indegree indicator is the sum of the values of the adjacent matrix associated with the connections entering a node. This indicator shows the degree of dependence of the variable. A receiving variable has a high indegree.
  • The centrality indicator is the sum of the outdegree and indegree indicators. This indicator shows the degree of participation or importance of the variable in the system.
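A minimal computational sketch of Equations (1)–(3) and of the Step 4 indicators is given below. This is an illustration only, not the code used in the study; it assumes the expert ratings (IMS) are stored in a NumPy array of shape n × m and the SRMS matrix in an n × n array.

```python
import numpy as np

def fuzzify_ims(ims: np.ndarray) -> np.ndarray:
    """Row-wise min-max fuzzification of the IMS matrix (Equations (1)-(3)).
    ims has shape (n_attributes, n_experts); the result (FZMS) lies in [0, 1]."""
    row_min = ims.min(axis=1, keepdims=True)
    row_max = ims.max(axis=1, keepdims=True)
    rng = np.where(row_max > row_min, row_max - row_min, 1.0)  # avoid division by zero
    return (ims - row_min) / rng

def fcm_indicators(srms: np.ndarray):
    """Outdegree, indegree and centrality of each attribute (Step 4)."""
    outdegree = srms.sum(axis=1)   # sum of each row: outgoing connections
    indegree = srms.sum(axis=0)    # sum of each column: incoming connections
    centrality = outdegree + indegree
    return outdegree, indegree, centrality
```

When a row of the IMS is constant (all experts give the same rating), the sketch simply returns zeros for that row instead of dividing by zero.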

3.3.2. Stage 2: Clustering Method

Due to the large number of attributes handled in this research, it was necessary not only to prioritize them but also to group them in order to identify common characteristics. The type of clustering carried out was non-hierarchical, using the K-means algorithm.
The input variables for the clustering implementation were those identified in the SRMS matrix, obtained through the fuzzy cognitive maps method. Therefore, the matrix that enters the clustering is $n \times n$. The steps of this procedure are detailed below.
  • Step 1: Defining the number of clusters.
Among the most used methods to define the number of clusters is the Elbow method, Equation (4), where $k$ is the number of clusters, $n_r$ is the number of points in group $r$, and $D_r$ is the sum of the distances between all the points in the group.
$$W_k = \sum_{r=1}^{k} \frac{1}{n_r} D_r \qquad (4)$$
In this investigation the number of points is high, so the cost of applying this method is not justified. For this reason, the definition of the number of clusters was based on a previous study [64].
  • Step 2: SRMS matrix processing.
The processing of the SRMS matrix to identify the clusters was carried out using the SPSS software [65], which handles algorithms such as between-groups linkage, within-groups linkage, nearest neighbor (single linkage), furthest neighbor (complete linkage), hierarchical clustering, K-medoids, and Ward’s method. An equivalent non-hierarchical analysis can also be reproduced programmatically, as sketched below.
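The following is a minimal Python sketch of such a non-hierarchical analysis. It is an illustration under the assumption that the rows of the SRMS matrix are used as feature vectors; the study itself used SPSS.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_attributes(srms: np.ndarray, k: int, seed: int = 0):
    """K-means clustering of the n attributes, using the rows of the
    n x n SRMS matrix as feature vectors."""
    km = KMeans(n_clusters=k, n_init=10, random_state=seed)
    labels = km.fit_predict(srms)          # cluster membership per attribute
    return labels, km.cluster_centers_

# Example: labels_k9, centers_k9 = cluster_attributes(srms, k=9)
```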

3.4. Phase 4. Attribute Prioritization and Cause-Effect Analysis by DEMATEL Method

In order to reduce the number of quality attributes to be analyzed through the DEMATEL method, the criterion of literary warrant was used. According to the ANSI/NISO Z-39.19 standard [66], literary warrant is defined as the justification for the representation of a concept in an indexing language or for the selection of a preferred term due to its frequent presence in literature. Based on the literary warrant criterion, the attributes with the highest number of references were selected.
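As an illustration only (the attribute names and mention counts below are placeholders, not the study’s data), the literary-warrant criterion amounts to ranking the attributes by their reference frequency and keeping the most cited ones:

```python
from collections import Counter

# Placeholder data: (study_id, attribute) mentions extracted from the literature review.
mentions = [("S1", "Usability"), ("S2", "Usability"), ("S2", "Satisfaction"), ("S3", "Aesthetics")]

counts = Counter(attr for _, attr in mentions)           # number of references per attribute
selected = [attr for attr, _ in counts.most_common(19)]  # keep the 19 most referenced attributes
```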
The steps of the DEMATEL method applied in this investigation are presented below, followed by a brief computational sketch.
  • Step 1: Generation of the direct relationship matrix A.
The evaluation of the direct relationships of the attributes was carried out by experts in software development with AR, using the assessment scale described in Table 2. The direct relationship matrix $A$ is described in Equation (5), where $a_{ij}$ represents the degree of influence of attribute $i$ on attribute $j$, and the diagonal elements are set to 0.
$$A = \begin{bmatrix} a_{11} & \cdots & a_{1j} & \cdots & a_{1n} \\ \vdots & & \vdots & & \vdots \\ a_{i1} & \cdots & a_{ij} & \cdots & a_{in} \\ \vdots & & \vdots & & \vdots \\ a_{n1} & \cdots & a_{nj} & \cdots & a_{nn} \end{bmatrix} \qquad (5)$$
  • Step 2: Normalization of the direct relationship matrix.
The normalized matrix M is generated using Equations (6) and (7). The objective of the transformation is to have a matrix with norm less than 1.
$$k = \min\left( \frac{1}{\max_{1 \le i \le n} \sum_{j=1}^{n} a_{ij}}, \; \frac{1}{\max_{1 \le j \le n} \sum_{i=1}^{n} a_{ij}} \right), \quad i, j \in \{1, 2, 3, \ldots, n\} \qquad (6)$$
$$M = k \cdot A \qquad (7)$$
  • Step 3: Obtaining the total relationship matrix.
Subsequently, the total relationship matrix S was generated using Equation (8). The S matrix contains the direct and indirect relationships among the attributes.
$$S = M + M^2 + M^3 + \cdots = \sum_{i=1}^{\infty} M^i = M (I - M)^{-1} \qquad (8)$$
  • Step 4: Determine the cause group and effect group.
Based on Equations (9)–(11), a vector $D$ was generated with the sum of the elements of each row of matrix $S$; then a vector $R$ was created with the sum of the elements of each column of matrix $S$.
$$S = [S_{ij}]_{n \times n}, \quad i, j \in \{1, 2, \ldots, n\} \qquad (9)$$
$$D = \left[ \sum_{j=1}^{n} S_{ij} \right]_{n \times 1} \qquad (10)$$
$$R = \left[ \sum_{i=1}^{n} S_{ij} \right]_{1 \times n} \qquad (11)$$
In this step, the values $D + R$ and $D - R$ and the group are calculated. The “$D + R$” column represents importance and the “$D - R$” column represents correlation [58].
The higher the $D + R$ value, the more important the attribute is for the study. If the correlation ($D - R$) is positive, the attribute belongs to the “Cause” group, because it serves as an origin for other attributes, whereas if the correlation value is negative, it belongs to the “Effect” group, which means that the attribute is a consequence of others. In the DEMATEL method, an indicator is considered a cause when it has a significant impact or influence on other indicators/factors within the system under study and is regarded as one of the sources or roots of the study. Conversely, an indicator is considered an effect when it is influenced by other indicators in the system and is subject to changes or impacts due to those influences.
  • Step 5: Weighting of the attributes.
In this step, the attributes that have the greatest weight in augmented reality applications are identified. For the calculation of the weighting coefficient, Equation (12) [55] is used. The standardized coefficients are obtained by applying Equation (13) [55].
$$W_i = \sqrt{(D_i + R_i)^2 + (D_i - R_i)^2} \qquad (12)$$
$$\overline{W}_i = \frac{W_i}{\sum_{i=1}^{n} W_i} \qquad (13)$$
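The computational sketch below illustrates Steps 2–5 (Equations (6)–(13)). It is a minimal illustration, not the study’s implementation, and assumes the averaged direct-relation matrix is available as an n × n NumPy array A with a zero diagonal.

```python
import numpy as np

def dematel(A: np.ndarray):
    """DEMATEL Steps 2-5 for a direct-relation matrix A (n x n, zero diagonal)."""
    n = A.shape[0]
    # Step 2: normalization factor (Equation (6)) and normalized matrix M (Equation (7))
    k = min(1.0 / A.sum(axis=1).max(), 1.0 / A.sum(axis=0).max())
    M = k * A
    # Step 3: total-relation matrix S = M (I - M)^-1 (Equation (8))
    S = M @ np.linalg.inv(np.eye(n) - M)
    # Step 4: row sums D, column sums R, importance (D + R) and correlation (D - R)
    D, R = S.sum(axis=1), S.sum(axis=0)
    importance, correlation = D + R, D - R   # "Cause" group when correlation > 0
    # Step 5: weighting coefficients (Equation (12)) and standardized coefficients (Equation (13))
    W = np.sqrt(importance ** 2 + correlation ** 2)
    return importance, correlation, W, W / W.sum()
```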

4. Case Study

4.1. Selected Quality Attributes

After applying the review protocol described in Section 3.1, 101 studies passed the quality evaluation phase of this research. The summary of the execution of the systematic literature review protocol is presented in Figure 3. Relevant information on basic AR concepts, UX quality criteria, and trends and challenges of AR applications was collected from these studies.
From the analysis of the studies in this investigation, four types of AR were identified: HMD, Mobile, PAR (Pervasive Augmented Reality) and SAR (Interactive Spatial Augmented Reality). Figure 4 presents the distribution: the most used type is mobile AR, accounting for 54% of the works.
Six AR application areas were identified: Education, Entertainment and Gaming, Industrial Application, Medical, Navigation and Driving, and Tourism and Exploration. The most widespread application area among the analyzed studies is Industrial Application. The studies focused on Industrial Application are presented in Table 3.
Overall, the 101 reviewed studies present information related to how UX-related features are evaluated in AR applications. From these works, 87 quality attributes were extracted, which are presented in Supplementary Material Table S1.

4.2. Selected Multicriteria Methods

4.2.1. Analysis of the Selected Multicriteria Methods

Based on the analysis performed in Section 3.2.1, three multicriteria methods were selected for this research. The FCM method was used to identify how much each attribute influences the user experience with augmented reality applications; the experts evaluated this influence using a linguistic scale from Very Low to Very High (Table 1). The second method was clustering, which helped to group the 87 attributes and identify similarities among them. In the last phase of the investigation, the DEMATEL method was applied to prioritize the attributes and analyze the causality among them.
The strength of the methodological process is the integration of the three methods to obtain the most appropriate solution.

4.2.2. Applied Instruments and Experts Profiles

After applying the protocol described in Section 3.2.3, the experts completed and sent their contributions to the investigation. The experts’ evaluations were tabulated and contrasted by the researchers in order to apply the corresponding statistical methods. In this way, the matrix of relationships among the attributes was evaluated by the expert panel.
Table 4 describes the profiles of the experts.

4.3. FCM Method Application

4.3.1. Indicator Analysis of the IMS and SRMS Matrices

After applying “Step 1: Generation of the initial matrix of success IMS”, the IMS matrix is obtained. Table 5 describes the analysis of the IMS matrix, identifying the average, the minimum score, and the maximum score obtained for each attribute.

4.3.2. Analysis of the FZMS Matrix Importance

As described in “Step 2: Obtaining the fuzzified matrix of success FZMS”, three equations are used to obtain the FZMS matrix. From the FZMS matrix, it is important to analyze the level of importance of the attributes. Table 6 presents the attributes grouped by level of importance. The importance is a value in the interval [0, 1], where 1 represents the highest value that can be given to an attribute. The attributes with the highest values were Usability and Perceived ease of use.

4.3.3. Analysis of the SRMS Matrix Results

The calculation to obtain the SRMS matrix described in “Step 3: Obtaining the strength of relationships matrix of success SRMS” was implemented in MATLAB (see Algorithm 1). The complexity of the algorithm is $\Theta(n^2 \cdot m)$. Considering $m$ as a constant, then
$$\Theta(n^2 \cdot m) = \Theta(n^2), \quad n \gg m$$
Algorithm 1 Implementation of matrix SRMS
1: Enter the FZMS matrix.
2: Calculate the size of the FZMS matrix (n × m).
3: k = 1
4: while (k ≤ n)
5:   for (j = 1, m)
6:     for (i = 1, n)
7:       if (FZMS(k, j) − FZMS(i, j) < 0)
8:         B(i, j) = FZMS(i, j) − FZMS(k, j)
9:       else
10:        B(i, j) = FZMS(k, j) − FZMS(i, j)
11:      end if
12:    end for
13:  end for
14:  for (i = 1, n)
15:    accumulated = 0
16:    for (j = 1, m)
17:      accumulated = accumulated + B(i, j)
18:    end for
19:    Average = accumulated/m
20:    P(i) = Average
21:    D(i) = 1 − P(i)
22:  end for
23:  for (j = 1, n)
24:    S(k, j) = D(j)
25:    if (k = j)
26:      S(k, j) = 0
27:    end if
28:  end for
29:  k = k + 1
30: end while
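For reference, a minimal Python equivalent of Algorithm 1 is sketched below. It is an illustration, assuming fzms is an n × m NumPy array of fuzzified ratings.

```python
import numpy as np

def srms_from_fzms(fzms: np.ndarray) -> np.ndarray:
    """SRMS[k, i] = 1 - mean_j |FZMS[k, j] - FZMS[i, j]|, with a zero diagonal,
    i.e., the similarity between the fuzzified profiles of attributes k and i."""
    diff = np.abs(fzms[:, None, :] - fzms[None, :, :]).mean(axis=2)  # (n, n) mean absolute difference
    srms = 1.0 - diff
    np.fill_diagonal(srms, 0.0)
    return srms
```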
The SRMS matrix is not presented in this article due to its large size. It can be seen at Supplementary Material Table S2. Next, the structural indicators of the SRMS matrix will be analyzed.

4.3.4. Indicators Analysis

As described in “Step 4: Indicators assessment”, the outdegree and indegree indicators are necessary to obtain centrality. Therefore, these two indicators determine the degree of importance of each of the variables presented in Table 5.
As can be seen in Table 5, the outdegree and indegree values are identical for every attribute. This is because they were calculated from the SRMS matrix, which is symmetric, so the sum of each row equals the sum of the corresponding column.
Figure 5a presents the relationship between the outdegree indicator and the standard deviation. The attributes with the highest standard deviation values are Enjoyment, Pleasant, Ergonomics, and Naturalness, which correspond to outdegree values around quartile 2 (the median). The attribute with the lowest deviation value is Effectiveness, located between the minimum and quartile 1.
The diagram in Figure 5b presents the distribution of the centrality indicator values in Table 5. The minimum value is 79.28, quartile 1 is 112.78, quartile 2 is 129.04, quartile 3 is 132.86, and the maximum value is 136.28. As can be seen in Figure 5b, the centrality indicator has no outliers.

4.3.5. Weighting of Attributes Based on the Centrality Criterium

According to the degree of importance of each variable, the 13 attributes with the highest weighting were identified. The three highest values were selected, and several attributes were found to share the same value. As a result, the attributes described in Figure 6 were selected.
The thirteen (13) attributes identified with the highest centrality values represent the most important attributes for the user experience of a worker who uses augmented reality applications in their industrial training process. Although Figure 6 represents the selection of attributes with the highest weighting, the prioritization of the first two groups (attributes 1–2 and 3–4) demarcates only the aspirational, compliance, and familiarity factors that workers will experience during training with AR. Thus, the user’s attention will be focused on learning, organizing, and intuitively addressing each of the requirements, with the aim of fulfilling certain types of tasks. In the next group, since factors 5 to 11 are more related to aspects of cognitive load and processing mediated by technologies, the motivational learning process will be focused on stimulating a perception of individual or collaborative achievement in the user. The last group, attributes 12 and 13, will stimulate the sensation of a safe environment in which learning is seen as a useful, necessary, reliable, and effective factor.
This concentric order reveals that the importance of AR in industrial training environments must respond to an ordering of attributes and priorities which, in deterministic terms, will positively condition the experiences that users live through in order to guarantee the training of any process.

4.4. Clustering Method Application

4.4.1. Distances and Cases by Clusters

As mentioned in Section 3.3.2, in [64] two categories and nine subcategories of quality attributes in augmented reality applications are identified; based on this proposal, in this work k = 2 and k = 9 are used.
The data matrix that enters SPSS is the SRMS matrix. After its processing, the distances between clusters and the number of cases per cluster for the two selected values of k are obtained; these results are presented in Figure 7 and Figure 8.

4.4.2. Clustering Method Results

As mentioned in “Step 1: Defining the number of clusters” of Section 3.3.2, two values of k were used. The initial cluster centers, cluster membership, final cluster centers, distances among final cluster centers, and the number of cases in each cluster were then calculated. After analyzing the results of the clusters obtained with $k = 2$ and $k = 9$, it was concluded that $k = 9$ allows a better analysis of the relationships among attributes. Therefore, only the results for the nine clusters are presented in this section. Figure 9, Figure 10, Figure 11, Figure 12, Figure 13, Figure 14, Figure 15, Figure 16 and Figure 17 describe the clusters identified when $k = 9$.
Figure 9 presents cluster 1, with four attributes related to the use of applications in its two dimensions: real and perceived. The connection that emerges from these attributes is based on technological determinism, where a factor such as camera height conditionally influences the cognitive dimension of users. The consumption of content assisted with augmented reality translates these reactions into collective evaluations, where the degrees of usability, the sensation of ease of use, and intuitiveness, although anticipated objectively by the developers, are perceived by the users as subjective evaluations whose effectiveness is represented by the quality of the lived experience.
Cluster 2, described in Figure 10, has 27 attributes and is the second largest in this iteration. The connection of the attributes that appear in this cluster seems to be predominantly focused on aesthetics and human factors. It emphasizes emotional stimuli over cognitive aspects, with the aim of generating experiences in the user that evoke sensations of wonder, naturalness, spontaneity, pleasure, and happiness.
Figure 11 presents cluster 3, with 12 attributes. In this cluster, features related to accuracy, time to complete the task, reliability, and understanding of the application content are found. The connection among these attributes is justified by the fact that the user can meet their needs for consumption and comprehension of digital content through instances and/or processes that demand the least cognitive effort. The attributes of this cluster point to precision and synthesis of the information, in order to achieve understanding, internalization, and learning, among other important aspects.
Cluster 4 (Figure 12) groups two attributes with the same distance, one focused on assistance and the other on autonomy. The relationship between them is based on the coherence that must exist between a sensation of choice, derived from or dependent on assistance and support, and offering users different ways or options to complete the same process easily and precisely.
Cluster 5 groups the attributes related to guidance and user interaction within an application, as well as the perceptions of that use (Figure 13). In this cluster, the attributes exclusively linked to avoiding user frustration due to inadequate user-flow planning are prioritized. In the relationship among these four attributes, what prevails is ease of navigation, the passage and transitions from one wireframe to another, and aspects leading to the perception of the practical utility of the content offered by the digital product with AR.
Cluster 6 in Figure 14 has four attributes that are responsible for attention and innovation, and it highly stimulates the feeling of novelty and wonder from innovative aspects in the interface and in the disruptive way in which the digital content with augmented reality is displayed.
Cluster 7 presented in Figure 15 is the most numerous with 31 attributes. It focuses on security, representation, and social interaction. These attributes are based on trust; hence, they are present in digital environments and products that offer security, practicality, transparency, ease of use, desires, group approval, and aspirational expectations, among other aspects that are subordinated to dominant contexts of social interaction. These cluster attributes are consistent with the weighting results of the FCM method presented in Figure 6. This fact emphasizes the importance of task fulfillment and adaptation to the work environment that the worker must develop.
Cluster 8 has two attributes with the same distance (Figure 16). When the algorithm finds attributes at the same distance, it groups them and does not include any other attribute with a close value. Performance is subject to preventing the bottlenecks to which users are exposed when interacting with an AR digital product, and the Identifiability attribute relates to the same principle in a different way. While the reduction in waiting times arises from simple and efficient algorithms composed of the smallest number of actions to process, if the graphic and chromatic identity derived from the Identifiability attribute is assertively designed, it will be understandable from the UI, and the user will perceive the application as efficient and performant.
Cluster 9 has only one attribute, with a distance value of zero (Figure 17). Although this solitary attribute is subject to the desire to progress and compete with others, unlike cluster 7 it does not take social interaction as its reference point, but rather the achievements, attainments, or growth that reflect a progressive individual improvement.

4.5. DEMATEL Method Application

4.5.1. Results Matrices

Applying the criterion of literary warrant, the dimensionality of the attributes is reduced from 87 to 19 (see Section 3.4). For this reason, the analysis of the experts for the DEMATEL method was built from a 19 × 19 attributes matrix.
The attributes that entered into this phase of the project are those described in Table 7. It is important to clarify that throughout the article the IDs of each attribute are those defined in Table 5.
According to “Step 1: Generation of the direct relationship matrix A”, the direct relationship matrix A among attributes, with dimensions 19 × 19, was generated based on the information recorded by the experts. This matrix was obtained as the average of the experts’ evaluations. Table 8 presents matrix A with the obtained averages.
Applying the procedure of “Step 2: Normalization of the direct relationship matrix”, the matrix M is obtained. A fragment of the results of M is presented in Table 9 (full data can be found in Table S3: Matrices generated with DEMATEL).
Applying Equation (8) presented in “Step 3: Obtaining the total relationship matrix”, the S matrix is obtained. Table 10 presents a fragment of the data in matrix S .
Applying Equations (10) and (11) described in “Step 4: Determine the cause group and effect group”, the vectors D and R are obtained. Table 11 presents the calculated values of D + R and D − R and the corresponding group. The table is ordered by the “D − R” column in descending order. Ten attributes were identified in the Cause group and nine in the Effect group.
From the analysis of these indicators, it can be inferred that the “Usability” attribute is the one with the highest value of importance, so it should be considered a priority to improve this criterion. It is followed by Satisfaction, Stimulation, Engagement and Aesthetics, with values higher than 14.13. An interesting finding is that four of the five most important attributes are located in the “Effect” group.
The attributes located in the Cause group such as Image quality, Novelty, Navigability, Performance, Attractiveness, Space Perception, Cognitive demand, Perspicuity, Aesthetics and Dependability are responsible for originating the effect attributes.
As described in “Step 5: Weighting of the attributes”, the weighted coefficients and standardized coefficients of the 19 attributes were calculated. Table 12 presents the weighting coefficients obtained for the 19 attributes. The standardized coefficients are described in Table 13.

4.5.2. Ranking by DEMATEL

After applying the weighting (Table 12) and standardized coefficients (Table 13), the attributes with the greatest weight in the augmented reality applications were identified and presented in Figure 18.
At the top of the ranking is the “Usability” attribute, corresponding to the measure of how well a specific user in a specific context can use a product/design to achieve a defined goal effectively, efficiently, and satisfactorily. Designers usually measure a design’s usability throughout the development process from wireframes to the final deliverable to ensure maximum usability [14,16,23,69,75,78,81,83,87,91,92,93,94,95,96,97,98,99].
In the second position is “Satisfaction”, which can be defined as that affective, emotional, and cognitive appeal that digital products and services arouse in the users for whom they are developed [20,68,77,78,87,95,100,103,107,109,119,133,134].
In the third position is “Stimulation”, defined as satisfying inexplicit desires by stimulating personal development, communicating a sense of belonging, and inspiring/evoking memories [9,17,101,102,104,105,106,108].

4.5.3. Influence Relationships among Attributes

According to the DEMATEL results, Figure 19 describes how much one attribute affects another and how much it is affected by another. The attribute “A05—Navigability” greatly influences the attribute “A01—Usability” which would be an expected result because if an application does not handle adequate navigability, it will be proportionally reflected in the usability. The coherence of a product with augmented reality is defined by the intuitiveness that highlights its navigability; this is perceived by the user as a sensation of ease of use.
Nine attributes affect the “A01—Usability” attribute, which indicates that this is the one that receives the greatest influence from the other factors. The attribute that most affects the other attributes is “A01”, with an occurrence of 11 out of 19. This bidirectional relationship of influence denotes the great importance of the Usability attribute, over all the other attributes, in the processes of creating and using augmented reality applications.
Although the DEMATEL method in Figure 18 reveals the extent to which an attribute causes a deterministic effect on or influences another, the attribute with the greatest weight or value continues to be Usability. The relationships of interaction, dependency, affectation, or subordination that emerge among the 19 attributes, also shown in Figure 19, are graphed as one-way or two-way relationships that the DEMATEL method manages to visualize. This method quantifies how the ease of use of a product with AR, in dynamic relation, by weight, with its 19 preponderant and hierarchical attributes, can guarantee the motivation for use in any type of user, which is one of the most important contributions of this research.
Figure 20 shows the influence relationship between attribute A01—Usability and attributes A07—Perspicuity, A19—Correctness and A53—Ergonomics. The graphed data correspond to the values of D + R and D − R of each attribute (see Table 11). The grouping was carried out based on the fact that the affected attributes were located in cluster 2 as a result of applying the clustering method of Section 3.3.2 of this investigation. The bidirectional relationship between A01 and A19 confirms the consistency that exists between design and use since the interface elements must be distinguishable and predictable so that the task workflow can be performed effectively, efficiently, and satisfactorily. A01 directly affects A07, because if the task can be carried out properly, in the end the process will be clear and easy to understand. Between A01 and A53, a two-way relationship is also revealed, because the interactions between the worker and the elements of the system proportionally affect the task development in the context of use.
Figure 21 shows the influence relationship between attribute A01—Usability and attributes A05—Navigability and A08—Dependability. The graphed data correspond to the values of D + R and D − R of each attribute (see Table 11). The grouping was carried out based on the fact that the affected attributes were located in cluster 5 as a result of applying the clustering method of Phase 3 Stage 2 of this investigation. The bidirectional relationship between A01 and A05 is due to the fact that the navigation concept is a basic aspect that configures usability and, in turn, usability affects the fulfillment of the objectives, establishing the best routes for the development of the task. Between A01 and A08 the relationship occurs because the interaction with the system/product must be predictable and safe in order to meet expectations, which leads to the system being usable.
In summary, A01—Usability is the most important attribute identified with the DEMATEL method, since it has the highest D + R value and obtained the greatest weight in the vector of weighted attributes.
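As an aid to readers unfamiliar with these indicators, the following minimal sketch reproduces the standard DEMATEL computations (normalized direct-influence matrix, total-relation matrix T = N(I − N)^-1, and the D + R and D − R values). The four attribute names and the influence scores are hypothetical placeholders, not the data elicited from the experts in this study.

```python
import numpy as np

# Hypothetical direct-influence matrix among four attributes (0-4 expert scale);
# the names and scores are placeholders, not the study's data.
attributes = ["Usability", "Navigability", "Perspicuity", "Dependability"]
A = np.array([
    [0, 3, 4, 2],
    [4, 0, 2, 1],
    [3, 1, 0, 2],
    [2, 1, 1, 0],
], dtype=float)

# Normalize by the largest row sum (a common DEMATEL normalization).
N = A / A.sum(axis=1).max()

# Total-relation matrix T = N (I - N)^-1.
T = N @ np.linalg.inv(np.eye(len(A)) - N)

# D: influence given (row sums); R: influence received (column sums).
D = T.sum(axis=1)
R = T.sum(axis=0)
for name, d, r in zip(attributes, D, R):
    role = "cause" if d - r > 0 else "effect"
    print(f"{name:13s}  D+R = {d + r:5.2f}   D-R = {d - r:+5.2f}   ({role})")
```

In this reading, attributes with a positive D − R act as net causes (transmitters of influence), while those with a negative D − R act as net effects (receivers); D + R expresses overall prominence, which is the criterion used above to single out Usability.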

5. Discussion

The results of this investigation are based on a process methodology organized into four phases of inquiry and carried out through iterative dynamics. The preliminary selection of attributes was made from sources with significant citation indices; hence, 1165 works from the IEEE Xplore, Scopus, and ACM Digital Library repositories formed the state of the art and provided the case studies addressed. The iterations between describing, grouping, and prioritizing the quality attributes that should prevail in AR-supported digital products revealed subgroups of attributes that connect systematically. These connections were justified by the hierarchies identified through the multicriteria decision methods Fuzzy Cognitive Maps, Clustering, and DEMATEL, and by a revealing matrix of cohesion and distance indices obtained from a contrasting analysis by experts in the front-end and back-end areas. These experts were selected for their knowledge and trajectory in digital interface design and user experience design, exclusively in AR applications that pursue skill formation through semi-immersive experiences.
In any case, although this work concludes that the literature analyzed is coherent with the results produced by the matrix, its most important contribution lies in the binding conditions among the concepts of UX, UI, and AR in the field of digital products. Other works have already shown that the UI + UX + AR triad raises skill-formation rates considerably more than products that implement only UX + UI, but the current literature still lacks standardized and precise guidance. Hence, the central contribution of this study is precisely such a categorization, consolidated in a grouping proposal that designers and developers of digital products can use when the objective is to foster skill development through AR-assisted applications. It should also be clarified that, although the theoretical framework of this work defines the differences between AR and VR based on Azuma [1] and other authors, the focus is exclusively on products assisted with AR and not with VR. The previous works taken into account include Nielsen and Budiu [148], Krug [149], and Gutiérrez et al. [64], among others.
Any digital product assisted with augmented reality responds to a typology of various functions, among which the “practical functionalities” [150] stand out; these relate the perceived reality to the design of the content superimposed on it in order to stimulate cognitive aspects [1]. Nonetheless, although AR can be classified as a software subproduct, the aesthetic effects it generates for users account for another type of relationship, one that implies a determinism focused on the emotional dimensions of people. Hence, content design with AR is causally linked to user interface (UI) design and, through it, to user experience (UX) design [148]. The triadic dynamic of these three aspects, operating systematically under partial immersion, seeks, through the mediation of one or several technological devices (hardware), to transfer knowledge and/or particular skills to those who consume the content [149]. Therefore, all content emerging from AR is deterministic and operates with the UI and UX in an aligned manner in the search to optimize experiences.
The methods and instruments that support UX are intended not only to describe, qualitatively and phenomenologically, the effects that digital products have on users, but also to measure their level of usability in quantitative terms. When AR content is conceived systematically together with the UI and the UX [1], improvements appear in aesthetic, practical, and operational aspects that can determine cognitive and emotional reactions in users thanks to the semi-immersive effects.
Concerning the application areas where the design, development, and consumption of AR content succeed in improving the user experience, there are HUDs (head-up displays), multisensory devices, and mobile devices, among others. The first two tend to be found on the dashboards of fighter jets and commercial aircraft [151], on ships and submarines [152], and on automobiles and high-end utility vehicles [153]. They offer data that are essential for navigation, whether by air, water, or land. The information provided by AR combined with reality takes into account human factors that, in anthropometric and ergonomic terms, avoid distractions toward traditional gauges and reduce the cognitive processing required to consume information. Various investigations have demonstrated that multisensory devices [154] and “fused interfaces” [155] prevent accidents and reduce risks [156]. In the third category, mobile devices (cell phones, tablets, or smart glasses) and “Spatial AR” [157], markers are used to display enveloping information that facilitates user tasks. These have been tested and implemented in market-economy contexts [158] and in education, in elementary schools [159,160], in universities [161], and also in the industrial sector [162].
A possible scenario in which the attributes ranked in this study can be applied is the implementation of information technology architectures. These architectures are used in academic and industrial contexts [163,164], evidencing the need to improve or innovate processes based on the implementation of business logic. Through information technology architectures, a broad understanding of specific domains is achieved, promoting the reuse of design experience and facilitating the development, standardization, and evolution of software systems [165].
Figure 22 represents a reference model of information technology architectures that is the product of a review of the works described in [163,166,167,168,169]. Five layers can be seen: the quality layer, the training-processes layer, the technological layer, the governance layer, and the AR layer. The attributes identified in this study feed the quality layer of this type of architecture, which is in charge of managing the UX quality metrics and their respective indicators in order to measure and evaluate the use of the AR training system.
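As a purely hypothetical sketch (the class names, fields, attribute codes other than A01, and weight values below are assumptions made here and are not part of the reference model in the cited works), the fragment shows one way the quality layer could expose the weighted attribute vector as indicators for the training-processes layer to consume.

```python
from dataclasses import dataclass, field

@dataclass
class QualityAttribute:
    """A UX quality attribute with a weight taken from the weighted attribute vector."""
    code: str      # e.g., "A01"; codes other than A01 below are placeholders
    name: str
    weight: float  # hypothetical normalized weight

@dataclass
class QualityLayer:
    """Quality layer of the reference architecture: manages UX metrics and indicators."""
    attributes: list = field(default_factory=list)

    def indicators(self) -> dict:
        # Expose attribute weights as indicators that other layers can query.
        return {a.code: a.weight for a in self.attributes}

# Hypothetical weights for three of the best-ranked attributes (not the study's values).
quality_layer = QualityLayer([
    QualityAttribute("A01", "Usability", 0.12),
    QualityAttribute("A0x", "Satisfaction", 0.09),
    QualityAttribute("A0y", "Stimulation", 0.07),
])
print(quality_layer.indicators())
```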
The results obtained by combining the three methods favor the identification of the importance of the attributes in training processes, because the holistic and detailed view of the causal, impact, and similarity relationships among the 87 attributes was taken into account. The proposed framework will provide a baseline for the use of these multicriteria methods in the identification of relevant aspects in the area of AR.

6. Conclusions, Limitations, and Future Work

6.1. Conclusions

This article presents a four-phase analysis of the user-experience quality attributes of augmented reality applications. The first phase sought to identify the attributes involved; this problem-recognition process was carried out through a systematic literature review in high-impact repositories. In the second phase, the multicriteria methods were selected, the instruments were built, and the panel of experts was formed. The third phase, made up of two stages, ranked the attributes: stage one, based on the FCM method, made it possible to identify the most relevant attributes in the process of training workers with augmented reality applications and the level of relationship among these attributes, while stage two, carried out with the Clustering method, formed the conglomerates or clusters. The fourth phase consisted of reducing the dimensionality from 87 attributes to 19 in order to apply the DEMATEL method and thus obtain a vector of weighted attributes.
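As a rough, purely illustrative sketch of phases three and four, the fragment below derives the FCM out-degree, in-degree, and centrality indicators from a small random weight matrix (standing in for the 87 × 87 expert matrix, which is not reproduced here), groups the attributes with k-means, and selects a reduced subset on which DEMATEL would then be applied. The clustering features and parameter choices are assumptions for illustration only and do not restate the study's exact settings.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical FCM weight matrix (values in [-1, 1]) for ten attributes;
# the study's matrix relates the 87 attributes elicited from the literature and experts.
n = 10
W = rng.uniform(-1, 1, size=(n, n))
np.fill_diagonal(W, 0.0)

# Phase 3, stage 1: FCM indicators.
outdegree = np.abs(W).sum(axis=1)   # influence each attribute transmits
indegree = np.abs(W).sum(axis=0)    # influence each attribute receives
centrality = outdegree + indegree   # overall importance in the map

# Phase 3, stage 2: group attributes with k-means (features chosen here for illustration).
features = np.column_stack([outdegree, indegree, centrality])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

# Phase 4: reduce dimensionality by keeping the best-ranked attributes
# (here, the top five by centrality) before applying DEMATEL.
selected = np.argsort(centrality)[::-1][:5]
print("cluster labels:", labels)
print("attributes retained for DEMATEL:", selected)
```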
The contribution of this research can be considered in three aspects: (1) a ranking of attributes obtained with multicriteria decision methods in the context of augmented reality applications; (2) a grouping of UX attributes in AR by similarity of their characteristics; and (3) a methodological framework for the use of the FCM, Clustering, and DEMATEL multicriteria methods.
The conclusions of this work are presented below according to the phases of the investigation.
The systematic literature review made it possible to obtain information about the attributes, such as concepts, uses, and metrics. These results show the interest of the academic community in presenting alternatives to implement strategies for improving this process.
According to the results obtained by applying the multicriteria techniques, the attributes “Information processing” and “Task completion” were identified by the FCM method as transmitters, with high weighting values. These results may indicate the importance of information processing in training processes and the need not to omit steps in an industrial process.
Based on the results of the Clustering method, cluster 7 is the cluster with the most attributes. These attributes are highly cohesive and are oriented toward security, representation, and social interaction; for this reason, the role of the designer becomes very important in developing these types of applications. It is noteworthy that the best-weighted attributes with FCM were all located in cluster 7, which reinforces the importance of this cluster.
The attribute rankings differ when the results of the two multicriteria decision methods, FCM and DEMATEL, are compared. With DEMATEL, the five best-weighted attributes are Usability, Satisfaction, Stimulation, Engagement, and Aesthetics. This difference arises because FCM worked with the relationships among all 87 attributes, which produces different causality values.
These results can be applied according to the context. For applications oriented toward intensive information handling and repetitive tasks, the attributes “Information processing” and “Task completion” would be ideal. For augmented reality applications oriented toward industrial training, however, Usability, Satisfaction, Stimulation, Engagement, and Aesthetics are fundamental requirements for generating a positive user experience.
There is evident coherence between the literature that supports the quality attributes presented in this study and the aspects that experts in the development area highlight as relevant when identifying weaknesses in the process.

6.2. Limitations and Future Work

As demonstrated by the evaluation results, the attributes have different impacts according to the application scenario. In the present research, the study scenario corresponds to training processes in industrial environments with AR. One of the objectives of applying the decision methods was therefore to identify which attributes carried more or less weight within this scope. In other words, the weighting process was a tangible result of applying the multicriteria decision methods; it allowed the attributes to be prioritized, organized, and, in some cases, eliminated according to their degree of importance, which was one of the objectives of this research. It should be taken into account that, if the methodology proposed in this study is applied in other contexts, the weightings are expected to change.
The metrics identified in the review will be the subject of a subsequent study; in the current investigation, they were not analyzed from the perspective of multicriteria decision methods.
In this phase of the investigation, workers/operators did not participate. This is a limitation, because employees are the ones directly involved in the training processes. Future studies could include their participation in order to continue validating the proposed model.
As a future line of work, it is proposed to implement the best-ranked attributes in an AR application that allows the results to be measured and compared at the three moments of an industrial training (before, during, and after), in order to determine the impact of using AR as a training-support technology, all within the framework of building an information technology architecture.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/math11132834/s1, Table S1: Description of the attributes; Table S2: SRMS Matrix; Table S3: Matrices generated with DEMATEL.

Author Contributions

Conceptualization, L.E.G., J.J.S., D.J., W.N., C.A.G. and H.A.L.-O.; Data curation, L.E.G., D.J., W.N., C.A.G. and M.M.B.; Formal analysis, L.E.G. and H.A.L.-O.; Investigation, L.E.G.; Methodology, L.E.G.; Software, L.E.G.; Validation, J.J.S., D.J. and W.N.; Visualization, L.E.G. and M.M.B.; Writing—original draft, L.E.G., C.A.G., M.M.B. and H.A.L.-O.; Writing—review and editing, J.J.S., D.J., W.N. and C.A.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Universidad del Norte grant number FOESPC 56883, Universidad Santo Tomás grant number ING.SIST.2022-01 and “Convocatoria Doctorados Nacionales No. 785 de 2017”. The APC was funded by Universidad del Norte.

Data Availability Statement

Not applicable.

Acknowledgments

The authors thank the universities involved in the project and the Institute of Robotics and Information and Communication Technologies—IRTIC—for allowing the articulation of an interdisciplinary team for the development of the research project. Thanks are due to the experts who provided valuable contributions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Azuma, R.T. A Survey of Augmented Reality. Presence Teleoper. Virtual Environ. 1997, 6, 355–385. [Google Scholar] [CrossRef]
  2. Brown, M.; Mccormack, M.; Reeves, J.; Brooks, D.C.; Grajek, S.; Bali, M.; Bulger, S.; Dark, S.; Engelbert, N.; Gannon, K.; et al. 2020 EDUCAUSE Horizon Report. Teaching and Learning Edition; Educause: Louisville, CO, USA, 2020. [Google Scholar]
  3. Dey, A.; Billinghurst, M.; Lindeman, R.W.; Swan, J.E. A Systematic Review of 10 Years of Augmented Reality Usability Studies: 2005 to 2014. Front. Robot. AI 2018, 5, 16–17. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Tatwany, L.; Ouertani, H.C. A Review on Using Augmented Reality in Text Translation. In Proceedings of the 2017 6th International Conference on Information and Communication Technology and Accessibility (ICTA), Muscat, Oman, 19–21 December 2017; pp. 1–6. [Google Scholar]
  5. Kim, S.K.; Kang, S.J.; Choi, Y.J.; Choi, M.H.; Hong, M. Augmented-Reality Survey: From Concept to Application. KSII Trans. Internet Inf. Syst. 2017, 11, 982–1004. [Google Scholar] [CrossRef]
  6. ABIresearch. AR and VR Solutions Are Key Enablers for a New Normal in Human Resources, Training, and Collaboration. Available online: https://www.abiresearch.com/press/ar-remote-expertise-and-training-applications-have-almost-60-million-active-users-2025/ (accessed on 6 April 2021).
  7. Wang, L.; Lv, M. Study on Assessing User Experience of Augmented Reality Applications. In HCII 2020: Virtual, Augmented and Mixed Reality. Design and Interaction; Springer: Berlin/Heidelberg, Germany, 2020; pp. 208–222. [Google Scholar]
  8. Bueno, S.; Gallego, M.D.; Noyes, J. Uses and Gratifications on Augmented Reality Games: An Examination of Pokémon Go. Appl. Sci. 2020, 10, 1644. [Google Scholar] [CrossRef] [Green Version]
  9. Satti, F.A.; Hussain, J.; Muhammad Bilal, H.S.; Khan, W.A.; Khattak, A.M.; Yeon, J.E.; Lee, S. Holistic User EXperience in Mobile Augmented Reality Using User EXperience Measurement Index. In Proceedings of the 2019 Conference on Next Generation Computing Applications (NextComp), Mauritius, 19–21 September 2019; pp. 1–6. [Google Scholar]
  10. Lovos, E.N. Augmented Educational Material. User Experience Analysis. Edutec. Rev. Electrónica de Tecnol. Educ. 2019, 57–67. [Google Scholar] [CrossRef]
  11. Pringle, A.; Hutka, S.; Mom, J.; van Esch, R.; Heffernan, N.; Chen, P. Ethnographic Study of a Commercially Available Augmented Reality HMD App for Industry Work Instruction. In Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments, Rhodes, Greece, 5–7 June 2019; ACM: New York, NY, USA, 2019; pp. 389–397. [Google Scholar]
  12. Greenfeld, A.; Lugmayr, A.; Lamont, W. Comparative Reality: Measuring User Experience and Emotion in Immersive Virtual Environments. In Proceedings of the 2018 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), Taichung, Taiwan, 10–12 December 2018; pp. 204–209. [Google Scholar]
  13. Irshad, S.; Awang Rambli, D.R.; Muhamad Nazri, N.I.A.; binti Mohd Shukri, S.R.; Omar, Y. Measuring User Experience of Mobile Augmented Reality Systems Through Non-Instrumental Quality Attributes. In Communications in Computer and Information Science; Abdullah, N., Wan Adnan, W.A., Foth, M., Eds.; Springer: Singapore, 2018; Volume 886, pp. 349–357. ISBN 978-981-13-1627-2. [Google Scholar]
  14. Heo, M.-H.; Kim, D.; Lee, J. Evaluating User Experience of Augmented Reality-Based Automobile Maintenance Content -Mobile Device and HoloLens Comparison-. Int. J. Control. Autom. 2018, 11, 187–196. [Google Scholar] [CrossRef]
  15. Werrlich, S.; Daniel, A.; Ginger, A.; Nguyen, P.-A.; Notni, G. Comparing HMD-Based and Paper-Based Training. In Proceedings of the 2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Munich, Germany, 16–20 October 2018; pp. 134–142. [Google Scholar]
  16. Sekhavat, Y.A.; Parsons, J. The Effect of Tracking Technique on the Quality of User Experience for Augmented Reality Mobile Navigation. Multimed. Tools Appl. 2018, 77, 11635–11668. [Google Scholar] [CrossRef]
  17. Han, D.I.; tom Dieck, M.C.; Jung, T. User Experience Model for Augmented Reality Applications in Urban Heritage Tourism. J. Herit. Tour. 2018, 13, 46–61. [Google Scholar] [CrossRef]
  18. Cheng, K.H. Parents’ User Experiences of Augmented Reality Book Reading: Perceptions, Expectations, and Intentions. Educ. Technol. Res. Dev. 2019, 67, 303–315. [Google Scholar] [CrossRef]
  19. Xu, W.; Liang, H.-N.; Zhao, Y.; Yu, D.; Monteiro, D. DMove: Directional Motion-Based Interaction for Augmented Reality Head-Mounted Displays. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; ACM: New York, NY, USA, 2019; pp. 1–14. [Google Scholar]
  20. Hamacher, A.; Hafeez, J.; Csizmazia, R.; Whangbo, T.K. Augmented Reality User Interface Evaluation—Performance Measurement of Hololens, Moverio and Mouse Input. Int. J. Interact. Mob. Technol. 2019, 13, 95. [Google Scholar] [CrossRef] [Green Version]
  21. Montuwy, A.; Cahour, B.; Dommes, A. Older Pedestrians Navigating With AR Glasses and Bone Conduction Headset. In Proceedings of the Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; ACM: New York, NY, USA, 2018; pp. 1–6. [Google Scholar]
  22. Merino, L.; Bergel, A.; Nierstrasz, O. Overcoming Issues of 3D Software Visualization through Immersive Augmented Reality. In Proceedings of the 2018 IEEE Working Conference on Software Visualization (VISSOFT), Madrid, Spain, 24–25 September 2018; pp. 54–64. [Google Scholar]
  23. Seok, A.; Choi, Y. A Study on User Experience Evaluation of Glasses-Type Wearable Device with Built-in Bone Conduction Speaker. In Proceedings of the 2018 ACM International Conference on Interactive Experiences for TV and Online Video, Seoul, Republic of Korea, 26–28 June 2018; ACM: New York, NY, USA, 2018; pp. 203–208. [Google Scholar]
  24. Dünser, A.; Grasset, R.; Billinghurst, M. A Survey of Evaluation Techniques Used in Augmented Reality Studies. In Proceedings of the ACM SIGGRAPH ASIA 2008 Courses—SIGGRAPH Asia ’08; ACM Press: New York, NY, USA, 2008; pp. 1–27. [Google Scholar]
  25. Westerfield, G.; Mitrovic, A.; Billinghurst, M. Intelligent Augmented Reality Training for Motherboard Assembly. Int. J. Artif. Intell. Educ. 2015, 25, 157–172. [Google Scholar] [CrossRef]
  26. Randeniya, N.; Ranjha, S.; Kulkarni, A.; Lu, G. Virtual Reality Based Maintenance Training Effectiveness Measures—A Novel Approach for Rail Industry. In Proceedings of the 2019 IEEE 28th International Symposium on Industrial Electronics (ISIE), Vancouver, BC, Canada, 12–14 June 2019; pp. 1605–1610. [Google Scholar]
  27. Webel, S.; Bockholt, U.; Engelke, T.; Gavish, N.; Olbrich, M.; Preusche, C. An Augmented Reality Training Platform for Assembly and Maintenance Skills. Robot. Auton. Syst. 2013, 61, 398–403. [Google Scholar] [CrossRef]
  28. Jahantigh, F.F.; Khanmohammadi, E.; Sarafrazi, A. Crisis Management Model Using Fuzzy Cognitive Map. Int. J. Bus. Excell. 2018, 16, 177. [Google Scholar] [CrossRef]
  29. Bakhtavar, E.; Shirvand, Y. Designing a Fuzzy Cognitive Map to Evaluate Drilling and Blasting Problems of the Tunneling Projects in Iran. Eng. Comput. 2019, 35, 35–45. [Google Scholar] [CrossRef]
  30. Kazemi, F.; Bahrami, A.; Abdolahi Sharif, J. Mineral Processing Plant Site Selection Using Integrated Fuzzy Cognitive Map and Fuzzy Analytical Hierarchy Process Approach: A Case Study of Gilsonite Mines in Iran. Miner. Eng. 2020, 147, 106143. [Google Scholar] [CrossRef]
  31. Infante-Moro, A.; Infante-Moro, J.C.; Gallardo-Pérez, J. Los Mapas Cognitivos Difusos y Su Aplicación En La Investigación de Las Ciencias Sociales: Estudio de Sus Principales Problemáticas. Educ. Knowl. Soc. (EKS) 2021, 22, e26380. [Google Scholar] [CrossRef]
  32. Malakoutikhah, M.; Alimohammadlou, M.; Jahangiri, M.; Rabiei, H.; Faghihi, S.A.; Kamalinia, M. Modeling the Factors Affecting Unsafe Behaviors Using the Fuzzy Best-Worst Method and Fuzzy Cognitive Map. Appl. Soft Comput. 2022, 114, 108119. [Google Scholar] [CrossRef]
  33. Gutiérrez, L.E.; Guerrero, C.A.; López-Ospina, H.A. Ranking of Problems and Solutions in the Teaching and Learning of Object-Oriented Programming. Educ. Inf. Technol. 2022, 27, 7205–7239. [Google Scholar] [CrossRef] [PubMed]
  34. Izzo, F.; Camminatiello, I.; Sasso, P.; Solima, L.; Lombardo, R. Creating Customer, Museum and Social Value through Digital Technologies: Evidence from the MANN Assiri Project. Socio-Econ. Plan. Sci. 2023, 85, 101502. [Google Scholar] [CrossRef]
  35. Bouksim, M.; Zakani, F.; Arhid, K.; Aboulfatah, M.; Gadi, T. New Approach for 3D Mesh Retrieval Using Data Envelopment Analysis. Int. J. Intell. Eng. Syst. 2018, 11, 1–10. [Google Scholar] [CrossRef]
  36. Saaty, T.L. Decision Making—The Analytic Hierarchy and Network Processes (AHP/ANP). J. Syst. Sci. Syst. Eng. 2004, 13, 1–35. [Google Scholar] [CrossRef]
  37. Saaty, T.L. Fundamentals of the Analytic Network Process—Dependence and Feedback in Decision-Making with a Single Network. J. Syst. Sci. Syst. Eng. 2004, 13, 129–157. [Google Scholar] [CrossRef]
  38. Rodríguez, M.D.; Ariza, Á.L.G.; Pérez, A.H.; Mora, M.E.D. Introducción Al Análisis Estadístico Multivariado Aplicado. Experiencia y Casos En El Caribe Colombiano; Editorial Universidad del Norte: Barranquilla, Colombia, 2016; ISBN 9789587419269. [Google Scholar]
  39. Wu, H.-H.; Chang, S.-Y. A Case Study of Using DEMATEL Method to Identify Critical Factors in Green Supply Chain Management. Appl. Math. Comput. 2015, 256, 394–403. [Google Scholar] [CrossRef]
  40. Sharifpour, H.; Ghaseminezhad, Y.; Hashemi-Tabatabaei, M.; Amiri, M. Investigating Cause-and-Effect Relationships between Supply Chain 4.0 Technologies. Eng. Manag. Prod. Serv. 2022, 14, 22–46. [Google Scholar] [CrossRef]
  41. Rodriguez-Repiso, L.; Setchi, R.; Salmeron, J.L. Modelling IT Projects Success with Fuzzy Cognitive Maps. Expert Syst. Appl. 2007, 32, 543–559. [Google Scholar] [CrossRef]
  42. Kotsopoulos, K.I.; Papadopoulos, A.; Babathanasis, A.; Axiotopoulos, S. A Research in the Design of Augmented Reality Gamified Mobile Applications for Promoting Traditional Products Implemented in Ambient Intelligence Environment of Small Shops. In Proceedings of the 2020 11th International Conference on Information, Intelligence, Systems and Applications (IISA), Piraeus, Greece, 15–17 July 2020; pp. 1–8. [Google Scholar]
  43. Pamučar, D.; Ćirović, G. The Selection of Transport and Handling Resources in Logistics Centers Using Multi-Attributive Border Approximation Area Comparison (MABAC). Expert Syst. Appl. 2015, 42, 3016–3028. [Google Scholar] [CrossRef]
  44. Roy, J.; Ranjan, A.; Debnath, A.; Kar, S. An Extended MABAC for Multi-Attribute Decision Making Using Trapezoidal Interval Type-2 Fuzzy Numbers. arXiv 2016, arXiv:1607.01254. [Google Scholar]
  45. Lourenzutti, R.; Krohling, R.A. TODIM Based Method to Process Heterogeneous Information. Procedia Comput. Sci. 2015, 55, 318–327. [Google Scholar] [CrossRef] [Green Version]
  46. Opricovic, S.; Tzeng, G.-H. Compromise Solution by MCDM Methods: A Comparative Analysis of VIKOR and TOPSIS. Eur. J. Oper. Res. 2004, 156, 445–455. [Google Scholar] [CrossRef]
  47. Shekhovtsov, A.; Sałabun, W. A Comparative Case Study of the VIKOR and TOPSIS Rankings Similarity. Procedia Comput. Sci. 2020, 176, 3730–3740. [Google Scholar] [CrossRef]
  48. Touami, O.; Djekoune, O.; Benbelkacem, S.; Mellah, R.; Guerroudji, M.A.; Zenati-Henda, N. An Application of a Fuzzy TOPSIS Multi-Criteria Decision Analysis Algorithm for Augmented Reality Maintenance Aid Systems Selection. In Proceedings of the 2022 2nd International Conference on Advanced Electrical Engineering (ICAEE), Constantine, Algeria, 29–31 October 2022; pp. 1–6. [Google Scholar]
  49. Touami, O.; Djekoune, O.; Benbelkacem, S.; Mellah, R.; Guerroudji, M.A.; Zenati-Henda, N. Evaluation of Augmented Reality Maintenance Assistance Systems: An Integrated AHP and Fuzzy-Topsis Model. In Proceedings of the 2022 International Conference on Advanced Aspects of Software Engineering (ICAASE), Constantine, Algeria, 17–18 September 2022; pp. 1–8. [Google Scholar]
  50. Kitchenham, B.; Charters, S. Guidelines for Performing Systematic Literature Reviews in Software Engineering; Technical Report EBSE-2007-01; Keele University: Keele, UK, 2007. [Google Scholar]
  51. Brereton, P.; Kitchenham, B.A.; Budgen, D.; Turner, M.; Khalil, M. Lessons from Applying the Systematic Literature Review Process within the Software Engineering Domain. J. Syst. Softw. 2007, 80, 571–583. [Google Scholar] [CrossRef] [Green Version]
  52. Bidel, M.J.; Safari, H.; Amoozad Mahdiraji, H.; Zavadskas, E.K.; Antucheviciene, J. A Framework for Project Delivery Systems via Hybrid Fuzzy Risk Analysis: Application and Extension in ICT. Mathematics 2022, 10, 3185. [Google Scholar] [CrossRef]
  53. Huang, S.-F. Using Linguistic VIKOR and Fuzzy Cognitive Maps to Select Virtual Reality Games Development Project. Mathematics 2021, 9, 1253. [Google Scholar] [CrossRef]
  54. Poczeta, K.; Papageorgiou, E.I.; Gerogiannis, V.C. Fuzzy Cognitive Maps Optimization for Decision Making and Prediction. Mathematics 2020, 8, 2059. [Google Scholar] [CrossRef]
  55. Jeong, J.S.; Ramírez-Gómez, Á. Optimizing the Location of a Biomass Plant with a Fuzzy-DEcision-MAking Trial and Evaluation Laboratory (F-DEMATEL) and Multi-Criteria Spatial Decision Assessment for Renewable Energy Management and Long-Term Sustainability. J. Clean. Prod. 2018, 182, 509–520. [Google Scholar] [CrossRef]
  56. Alzahrani, A.I.; Al-Samarraie, H.; Eldenfria, A.; Alalwan, N. A DEMATEL Method in Identifying Design Requirements for Mobile Environments: Students’ Perspectives. J. Comput. High. Educ. 2018, 30, 466–488. [Google Scholar] [CrossRef]
  57. Aldowah, H.; Al-Samarraie, H.; Alzahrani, A.I.; Alalwan, N. Factors Affecting Student Dropout in MOOCs: A Cause and Effect Decision-making Model. J. Comput. High. Educ. 2020, 32, 429–454. [Google Scholar] [CrossRef]
  58. Tseng, S.H.; Chen, H.C.; Nguyen, T.S. Key Success Factors of Sustainable Organization for Traditional Manufacturing Industries: A Case Study in Taiwan. Mathematics 2022, 10, 4389. [Google Scholar] [CrossRef]
  59. Kao, Y.C.; Shen, K.Y.; Lee, S.T.; Shieh, J.C.P. Selecting the Fintech Strategy for Supply Chain Finance: A Hybrid Decision Approach for Banks. Mathematics 2022, 10, 2393. [Google Scholar] [CrossRef]
  60. Nguyen, P. A Fully Completed Spherical Fuzzy Data-Driven Model for Analyzing Employee Satisfaction in Logistics Service Industry. Mathematics 2023, 11, 2235. [Google Scholar] [CrossRef]
  61. Chang, J.J.; Lin, C.L. Determining the Sustainable Development Strategies and Adoption Paths for Public Bike-Sharing Service Systems (PBSSSs) under Various Users’ Considerations. Mathematics 2023, 11, 1196. [Google Scholar] [CrossRef]
  62. Matas, A. Diseño Del Formato de Escalas Tipo Likert: Un Estado de La Cuestión. Rev. Electrónica de Investig. Educ. 2018, 20, 38–47. [Google Scholar] [CrossRef] [Green Version]
  63. Li, Q. A Novel Likert Scale Based on Fuzzy Sets Theory. Expert Syst. Appl. 2013, 40, 1609–1618. [Google Scholar] [CrossRef]
  64. Gutierrez, L.E.; Betts, M.M.; Wightman, P.; Salazar, A.; Jabba, D.; Nieto, W. Characterization of Quality Attributes to Evaluate the User Experience in Augmented Reality. IEEE Access 2022, 10, 112639–112656. [Google Scholar] [CrossRef]
  65. IBM Software IBM SPSS. Available online: https://www.ibm.com/co-es/analytics/spss-statistics-software (accessed on 14 April 2022).
  66. ANSI/NISO. Guidelines for the Construction, Format and Management of Monolingual Controlled Vocabularies; National Information Standards Organization Location: Baltimore, MD, USA, 2010; Volume 2003. [Google Scholar]
  67. Hietanen, A.; Pieters, R.; Lanz, M.; Latokartano, J.; Kämäräinen, J.-K. AR-Based Interaction for Human-Robot Collaborative Manufacturing. Robot. Comput. -Integr. Manuf. 2020, 63, 101891. [Google Scholar] [CrossRef]
  68. Lee, J.G.; Seo, J.O.; Abbas, A.; Choi, M. End-Users’ Augmented Reality Utilization for Architectural Design Review. Appl. Sci. 2020, 10, 5363. [Google Scholar] [CrossRef]
  69. Düwel, T.; Herbig, N.; Kahl, D.; Krüger, A. Combining Embedded Computation and Image Tracking for Composing Tangible Augmented Reality. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; ACM: New York, NY, USA, 2020; pp. 1–7. [Google Scholar]
  70. Keskinen, T.; Makela, V.; Kallionierni, P.; Hakulinen, J.; Karhu, J.; Ronkainen, K.; Makela, J.; Turunen, M. The Effect of Camera Height, Actor Behavior, and Viewer Position on the User Experience of 360° Videos. In Proceedings of the 26th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2019—Proceedings 2019, Osaka, Japan, 23–27 March 2019; pp. 423–430. [Google Scholar] [CrossRef] [Green Version]
  71. Lee, G.A.; Park, H.S.; Billinghurst, M. Optical-Reflection Type 3D Augmented Reality Mirrors. In Proceedings of the 25th ACM Symposium on Virtual Reality Software and Technology, Parramatta, NSW, Australia, 12–15 November 2019; pp. 2–3. [Google Scholar]
  72. Seeling, P. Visual User Experience Difference: Image Compression Impacts on the Quality of Experience in Augmented Binocular Vision. In Proceedings of the 2016 13th IEEE Annual Consumer Communications & Networking Conference (CCNC), Las Vegas, NV, USA, 9–12 January 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 924–929. [Google Scholar]
  73. Zhang, J. Emotions Detection of User EXperience (UX) for Mobile Augmented Reality (MAR) Applications. Int. J. Adv. Trends Comput. Sci. Eng. 2019, 8, 63–67. [Google Scholar] [CrossRef]
  74. Grzegorczyk, T.; Sliwinski, R.; Kaczmarek, J. Attractiveness of Augmented Reality to Consumers. Technol. Anal. Strateg. Manag. 2019, 31, 1257–1269. [Google Scholar] [CrossRef]
  75. Chen, Y.-P.; Ko, J.-C. CryptoAR Wallet: A Blockchain Cryptocurrency Wallet Application That Uses Augmented Reality for On-Chain User Data Display. In Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services, Taipei, Taiwan, 1–4 October 2019; ACM: New York, NY, USA, 2019; pp. 1–5. [Google Scholar]
  76. Kang, S.; Choi, H.; Park, S.; Park, C.; Lee, J.; Lee, U.; Lee, S.-J. Fire in Your Hands. In Proceedings of the 25th Annual International Conference on Mobile Computing and Networking, Los Cabos, Mexico, 21–25 October 2019; ACM: New York, NY, USA, 2019; pp. 1–16. [Google Scholar]
  77. Forte, J.L.B.; Vela, F.L.G.; Rodríguez, P.P. User Experience Problems in Immersive Virtual Environments. ACM Int. Conf. Proc. Ser. 2019, 1–4. [Google Scholar] [CrossRef]
  78. Aromaa, S.; Väätänen, A.; Hakkarainen, M.; Kaasinen, E. User Experience and User Acceptance of an Augmented Reality Based Knowledge-Sharing Solution in Industrial Maintenance Work. In Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2018; Volume 607, pp. 145–156. [Google Scholar]
  79. Dirin, A.; Laine, T.H. User Experience in Mobile Augmented Reality: Emotions, Challenges, Opportunities and Best Practices. Computers 2018, 7, 33. [Google Scholar] [CrossRef] [Green Version]
  80. Chakravorty, A.; Rowe, A. UX Design Principles for Mobile Augmented Reality Applications. In Proceedings of the MCCSIS 2018—Multi Conference on Computer Science and Information Systems and Proceedings of the International Conferences on Interfaces and Human Computer Interaction 2018, Game and Entertainment Technologies 2018 and Computer Graphics, Visualization, Comp, Madrid, Spain, 17–20 July 2018; pp. 319–323. [Google Scholar]
  81. Irshad, S.; Awang Rambli, D.R. Multi-Layered Mobile Augmented Reality Framework for Positive User Experience. In Proceedings of the 2nd International Conference in HCI and UX Indonesia 2016, Jakarta, Indonesia, 13–15 April 2016; ACM: New York, NY, USA, 2016; pp. 21–26. [Google Scholar]
  82. Nivedha, S.; Hemalatha, S. Enhancing User Experience through Physical Interaction in Handheld Augmented Reality. In Proceedings of the 2015 International Conference on Computer Communication and Informatics (ICCCI), Coimbatore, India, 8–10 January 2015; pp. 1–7. [Google Scholar]
  83. Singh, M.; Singh, M.P. Augmented Reality Interfaces. IEEE Internet Comput. 2013, 17, 66–70. [Google Scholar] [CrossRef]
  84. Dhir, A.; Al-Kahtani, M. A Case Study on User Experience (UX) Evaluation of Mobile Augmented Reality Prototypes. J. Univ. Comput. Sci. 2013, 19, 1175–1196. [Google Scholar]
  85. Stutzman, B.; Nilsen, D.; Broderick, T.; Neubert, J. MARTI: Mobile Augmented Reality Tool for Industry. In Proceedings of the 2009 WRI World Congress on Computer Science and Information Engineering, Washington, DC, USA, 31 March–2 April 2009; Volume 5, pp. 425–429. [Google Scholar]
  86. Neumann, A.; Strenge, B.; Uhlich, J.C.; Schlicher, K.D.; Maier, G.W.; Schalkwijk, L.; Waßmuth, J.; Essig, K.; Schack, T. AVIKOM—Towards a Mobile Audiovisual Cognitive Assistance System for Modern Manufacturing and Logistics. In Proceedings of the 13th ACM International Conference on PErvasive Technologies Related to Assistive Environments, New York, NY, USA, 30 June–3 July 2020; ACM: New York, NY, USA, 2020; pp. 1–8. [Google Scholar]
  87. Faust, F.G.; Catecati, T.; de Souza Sierra, I.; Araujo, F.S.; Ramírez, A.R.G.; Nickel, E.M.; Gomes Ferreira, M.G. Mixed Prototypes for the Evaluation of Usability and User Experience: Simulating an Interactive Electronic Device. Virtual Real. 2019, 23, 197–211. [Google Scholar] [CrossRef]
  88. Fuste, A.; Reynolds, B.; Hobin, J.; Heun, V. Kinetic AR: A Framework for Robotic Motion Systems in Spatial Computing. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; ACM: New York, NY, USA, 2020; pp. 1–8. [Google Scholar]
  89. Materna, Z.; Kapinus, M.; Beran, V.; Smrž, P.; Zemčík, P. Interactive Spatial Augmented Reality in Collaborative Robot Programming: User Experience Evaluation. In Proceedings of the RO-MAN 2018—27th IEEE International Symposium on Robot and Human Interactive Communication, Tai’an, China, 27–31 August 2018; pp. 80–87. [Google Scholar] [CrossRef]
  90. Siriborvornratanakul, T. Enhancing User Experiences of Mobile-Based Augmented Reality via Spatial Augmented Reality: Designs and Architectures of Projector-Camera Devices. Adv. Multimed. 2018, 2018, 8194726. [Google Scholar] [CrossRef] [Green Version]
  91. Braitmaier, M.; Kyriazis, D. Virtual and Augmented Reality: Improved User Experience through a Service Oriented Infrastructure. In Proceedings of the 2011 Third International Conference on Games and Virtual Worlds for Serious Applications, Athens, Greece, 4–6 May 2011; pp. 40–46. [Google Scholar]
  92. Marques, M.; Elvas, F.; Nunes, I.L.; Lobo, V.; Correia, A. Augmented Reality in the Context of Naval Operations. In International Conference on Human Systems Engineering and Design; Ahram, T., Karwowski, W., Taiar, R., Eds.; Advances in Intelligent Systems and Computing; Springer International Publishing: Cham, Switzerland, 2019; Volume 876, pp. 307–313. ISBN 978-3-030-02052-1. [Google Scholar]
  93. Helin, K.; Kuula, T.; Vizzi, C.; Karjalainen, J.; Vovk, A. User Experience of Augmented Reality System for Astronaut’s Manual Work Support. Front. Robot. AI 2018, 5, 1–10. [Google Scholar] [CrossRef] [Green Version]
  94. Ramli, R.; Duriraju, N.; Rozzani, N. Augmented Reality for Improved User Experience: Himalayan Wildlife Tour Book. In Proceedings of the 2019 IEEE 9th International Conference on System Engineering and Technology (ICSET), Shah Alam, Malaysia, 7 October 2019; Volume 6, pp. 56–61. [Google Scholar]
  95. Kim, K.; Norouzi, N.; Losekamp, T.; Bruder, G.; Anderson, M.; Welch, G. Effects of Patient Care Assistant Embodiment and Computer Mediation on User Experience. In Proceedings of the 2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), San Diego, CA, USA, 9–11 December 2019; pp. 17–177. [Google Scholar]
  96. Hammady, R.; Ma, M.; Powell, A. User Experience of Markerless Augmented Reality Applications in Cultural Heritage Museums: ‘MuseumEye’ as a Case Study. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2018; Volume 10851, LNCS; pp. 349–369. ISBN 9783319952819. [Google Scholar]
  97. Jakobsen, C.L.; Larsen, J.B.; Nørlem, M.L.; Kraus, M. Improving User Experience for Lost Heritage Sites with a User-Centered Indirect Augmented Reality Application. In Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering, LNICST; Springer: Berlin/Heidelberg, Germany, 2018; Volume 229, pp. 54–63. [Google Scholar]
  98. Seppälä, K.; Heimo, O.I.; Korkalainen, T.; Pääkylä, J.; Latvala, J.; Helle, S.; Härkänen, L.; Jokela, S.; Järvenpää, L.; Saukko, F.; et al. Examining User Experience in an Augmented Reality Adventure Game: Case Luostarinmäki Handicrafts Museum. In IFIP Advances in Information and Communication Technology; Springer: Berlin/Heidelberg, Germany, 2016; Volume 474, pp. 257–276. ISBN 9783319448046. [Google Scholar]
  99. Kerr, S.J.; Rice, M.D.; Teo, Y.; Wan, M.; Cheong, Y.L.; Ng, J.; Ng-Thamrin, L.; Thura-Myo, T.; Wren, D. Wearable Mobile Augmented Reality. In Proceedings of the 10th International Conference on Virtual Reality Continuum and Its Applications in Industry, Hong Kong, China, 11–12 December 2011; ACM: New York, NY, USA, 2011; pp. 209–216. [Google Scholar]
  100. Romli, R.; Razali, A.F.; Ghazali, N.H.; Hanin, N.A.; Ibrahim, S.Z. Mobile Augmented Reality (AR) Marker-Based for Indoor Library Navigation. IOP Conf. Ser. Mater. Sci. Eng. 2020, 767, 012062. [Google Scholar] [CrossRef]
  101. Nunes, I.L.; Lucas, R.; Simões-Marques, M.; Correia, N. An Augmented Reality Application to Support Deployed Emergency Teams. In Proceedings of the 20th Congress of the International Ergonomics Association (IEA 2018), Florence, Italy, 26–30 August 2019; pp. 195–204. [Google Scholar]
  102. Kohler, C.; Weidner, F.; Broll, W. AR Training for Paragliding Pilots: An Investigation of User Experience and Requirements. In Proceedings of the 2019 21st Symposium on Virtual and Augmented Reality (SVR), Rio de Janeiro, Brazil, 28–31 October 2019; pp. 92–101. [Google Scholar]
  103. Sánchez, A.; Redondo, E.; Fonseca, D. Developing an Augmented Reality Application in the Framework of Architecture Degree. In Proceedings of the 2012 ACM Workshop on User Experience in e-Learning and Augmented Technologies in Education, Nara, Japan, 2 November 2012; ACM: New York, NY, USA, 2012; pp. 37–42. [Google Scholar]
  104. Smaragdina, A.A.; Ningrum, G.D.K.; Nidhom, A.M.; Rahmawati, N.S.Y.; Rusdiansyah, M.R.; Putra, A.B.N.R. The User Experience Analysis of Computer Graphics Educational Comics (GRAFMIC) Based on Markerless Augmented Reality. In Proceedings of the 2019 International Conference on Electrical, Electronics and Information Engineering (ICEEIE), Denpasar, Indonesia, 3–4 October 2019; pp. 220–225. [Google Scholar]
  105. Lindemann, P.; Eisl, D.; Rigoll, G. Acceptance and User Experience of Driving with a See-Through Cockpit in a Narrow-Space Overtaking Scenario. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1040–1041. [Google Scholar]
  106. Brata, K.C.; Liang, D. Comparative Study of User Experience on Mobile Pedestrian Navigation between Digital Map Interface and Location-Based Augmented Reality. Int. J. Electr. Comput. Eng. (IJECE) 2020, 10, 2037. [Google Scholar] [CrossRef] [Green Version]
  107. Rehrl, K.; Häusler, E.; Leitinger, S.; Bell, D. Pedestrian Navigation with Augmented Reality, Voice and Digital Map: Final Results from an in Situ Field Study Assessing Performance and User Experience. J. Locat. Based Serv. 2014, 8, 75–96. [Google Scholar] [CrossRef]
  108. Davidavičienė, V.; Raudeliūnienė, J.; Viršilaitė, R. User Experience Evaluation and Creativity Stimulation with Augmented Reality Mobile Applications. Creat. Stud. 2019, 12, 34–48. [Google Scholar] [CrossRef]
  109. Andri, C.; Alkawaz, M.H.; Waheed, S.R. Examining Effectiveness and User Experiences in 3D Mobile Based Augmented Reality for MSU Virtual Tour. In Proceedings of the 2019 IEEE International Conference on Automatic Control and Intelligent Systems (I2CACIS), Selangor, Malaysia, 29 June 2019; pp. 161–167. [Google Scholar]
  110. Contreras, P.; Chimbo, D.; Tello, A.; Espinoza, M. Semantic Web and Augmented Reality for Searching People, Events and Points of Interest within of a University Campus. In Proceedings of the 2017 XLIII Latin American Computer Conference (CLEI), Córdoba, Argentina, 4–8 September 2017; pp. 1–10. [Google Scholar]
  111. Skinner, P.; Ventura, J.; Zollmann, S. Indirect Augmented Reality Browser for GIS Data. In Proceedings of the 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Munich, Germany, 16–20 October 2018; pp. 145–150. [Google Scholar]
  112. Patkar, N.; Merino, L.; Nierstrasz, O. Towards Requirements Engineering with Immersive Augmented Reality. In Proceedings of the Conference Companion of the 4th International Conference on Art, Science, and Engineering of Programming, Porto, Portugal, 23–26 March 2020; ACM: New York, NY, USA, 2020; pp. 55–60. [Google Scholar]
  113. Giloth, C.F.; Tanant, J. User Experiences in Three Approaches to a Visit to a 3D Labyrinthe of Versailles. In Proceedings of the 2015 Digital Heritage, Granada, Spain, 28 September–2 October 2015; pp. 403–404. [Google Scholar]
  114. Unal, M.; Bostanci, E.; Sertalp, E. Distant Augmented Reality: Bringing a New Dimension to User Experience Using Drones. Digit. Appl. Archaeol. Cult. Herit. 2020, 17, e00140. [Google Scholar] [CrossRef]
  115. Ocampo, A.J.T. TourMAR: Designing Tourism Mobile Augmented Reality Architecture with Data Integration to Improve User Experience. In Proceedings of the 2019 4th International Conference on Multimedia Systems and Signal Processing, Guangzhou, China, 10–12 May 2019; ACM: New York, NY, USA, 2019; pp. 79–83. [Google Scholar]
  116. Thi Minh Tran, T.; Parker, C. Designing Exocentric Pedestrian Navigation for AR Head Mounted Displays. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; ACM: New York, NY, USA, 2020; pp. 1–8. [Google Scholar]
  117. Ader, L.G.M.; McManus, K.; Greene, B.R.; Caulfield, B. How Many Steps to Represent Individual Gait? In Proceedings of the Companion Proceedings of the 12th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, Sophia Antipolis, France, 23–26 June 2020; ACM: New York, NY, USA, 2020; pp. 1–4. [Google Scholar]
  118. Kim, M.J.; Wang, X.; Han, S.; Wang, Y. Implementing an Augmented Reality-Enabled Wayfinding System through Studying User Experience and Requirements in Complex Environments. Vis. Eng. 2015, 3, 14. [Google Scholar] [CrossRef] [Green Version]
  119. Arifin, Y.; Sastria, T.G.; Barlian, E. User Experience Metric for Augmented Reality Application: A Review. Procedia Comput. Sci. 2018, 135, 648–656. [Google Scholar] [CrossRef]
  120. Müller, J.; Zagermann, J.; Wieland, J.; Pfeil, U.; Reiterer, H. A Qualitative Comparison Between Augmented and Virtual Reality Collaboration with Handheld Devices. In Proceedings of Mensch und Computer 2019, Hamburg, Germany, 8–11 September 2019; ACM: New York, NY, USA, 2019; pp. 399–410. [Google Scholar]
  121. Střelák, D.; Škola, F.; Liarokapis, F. Examining User Experiences in a Mobile Augmented Reality Tourist Guide. In Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments, Corfu Island, Greece, 29 June–1 July 2016; ACM: New York, NY, USA, 2016; pp. 1–8. [Google Scholar]
  122. Seo, D.W.; Lee, J.Y. Direct Hand Touchable Interactions in Augmented Reality Environments for Natural and Intuitive User Experiences. Expert Syst. Appl. 2013, 40, 3784–3793. [Google Scholar] [CrossRef]
  123. Brancati, N.; Caggianese, G.; Frucci, M.; Gallo, L.; Neroni, P. Experiencing Touchless Interaction with Augmented Content on Wearable Head-Mounted Displays in Cultural Heritage Applications. Pers. Ubiquitous Comput. 2017, 21, 203–217. [Google Scholar] [CrossRef]
  124. Lamberti, F.; Manuri, F.; Paravati, G.; Piumatti, G.; Sanna, A. Using Semantics to Automatically Generate Speech Interfaces for Wearable Virtual and Augmented Reality Applications. IEEE Trans. Hum. Mach. Syst. 2017, 47, 152–164. [Google Scholar] [CrossRef]
  125. Olsson, T.; Lagerstam, E.; Kärkkäinen, T.; Väänänen-Vainio-Mattila, K. Expected User Experience of Mobile Augmented Reality Services: A User Study in the Context of Shopping Centres. Pers. Ubiquitous Comput. 2013, 17, 287–304. [Google Scholar] [CrossRef]
  126. Lyons, N.; Smith, M.; McCabe, H. Sensory Seduction & Narrative Pull. In Proceedings of the 2018 IEEE Games, Entertainment, Media Conference (GEM), Galway, Ireland, 15–17 August 2018; pp. 1–56. [Google Scholar]
  127. Grubert, J.; Langlotz, T.; Zollmann, S.; Regenbrecht, H. Towards Pervasive Augmented Reality: Context-Awareness in Augmented Reality. IEEE Trans. Vis. Comput. Graph. 2017, 23, 1706–1724. [Google Scholar] [CrossRef]
  128. Okimoto, M.L.L.R.; Okimoto, P.C.; Goldbach, C.E. User Experience in Augmented Reality Applied to the Welding Education. Procedia Manuf. 2015, 3, 6223–6227. [Google Scholar] [CrossRef] [Green Version]
  129. Rajappa, S.; Raj, G. Application and Scope Analysis of Augmented Reality in Marketing Using Image Processing Technique. In Proceedings of the 2016 6th International Conference—Cloud System and Big Data Engineering, Confluence, Noida, India, 14–15 January 2016; pp. 435–440. [Google Scholar] [CrossRef]
  130. Bauman, B.; Seeling, P. Evaluation of EEG-Based Predictions of Image QoE in Augmented Reality Scenarios. In Proceedings of the IEEE Vehicular Technology Conference, Chicago, IL, USA, 27–30 August 2018; pp. 1–5. [Google Scholar] [CrossRef]
  131. Bauman, B.; Seeling, P. Spherical Image QoE Approximations for Vision Augmentation Scenarios. Multimed. Tools Appl. 2019, 78, 18113–18135. [Google Scholar] [CrossRef]
  132. Riegler, A.; Wintersberger, P.; Riener, A.; Holzmann, C. Augmented Reality Windshield Displays and Their Potential to Enhance User Experience in Automated Driving. I-Com 2019, 18, 127–149. [Google Scholar] [CrossRef] [Green Version]
  133. Xue, H.; Sharma, P.; Wild, F. User Satisfaction in Augmented Reality-Based Training Using Microsoft HoloLens. Computers 2019, 8, 9. [Google Scholar] [CrossRef] [Green Version]
  134. Irshad, S.; Awang, D.R.B. A UX Oriented Evaluation Approach for Mobile Augmented Reality Applications. In Proceedings of the 16th International Conference on Advances in Mobile Computing and Multimedia, Singapore, 28–30 November 2016; pp. 108–112. [Google Scholar] [CrossRef]
  135. Kim, H.C.; Jin, S.; Jo, S.; Lee, J.H. A Naturalistic Viewing Paradigm Using 360° Panoramic Video Clips and Real-Time Field-of-View Changes with Eye-Gaze Tracking: Naturalistic Viewing Paradigm Based on 360° Panoramic Video and Real-Time Eye Gaze. NeuroImage 2020, 216, 116617. [Google Scholar] [CrossRef]
  136. Pittarello, F. Designing AR Enhanced Art Exhibitions: A Methodology and a Case Study. In Proceedings of the 13th Biannual Conference of the Italian SIGCHI Chapter: Designing the Next Interaction, Padova, Italy, 23–25 September 2019. [Google Scholar] [CrossRef] [Green Version]
  137. Sánchez-Francisco, M.; Díaz, P.; Fabiano, F.; Aedo, I. Engaging Users with an AR Pervasive Game for Personal Urban Awareness. In Proceedings of the ACM International Conference Proceeding Series, Gipuzkoa, Spain, 25–28 June 2019. [Google Scholar] [CrossRef] [Green Version]
  138. Savela, N.; Oksanen, A.; Kaakinen, M.; Noreikis, M.; Xiao, Y. Does Augmented Reality Affect Sociability, Entertainment, and Learning? A Field Experiment. Appl. Sci. 2020, 10, 1392. [Google Scholar] [CrossRef] [Green Version]
  139. Bellei, E.A.; Biduski, D.; Brock, L.A.; Patricio, D.I.; Souza, J.D.L.; De Marchi, A.C.B.; De Rieder, R. Prior Experience as an Influencer in the Momentary User Experience: An Assessment in Immersive Virtual Reality Game Context. In Proceedings of the 2018 20th Symposium on Virtual and Augmented Reality (SVR), Foz do Iguaçu, Brazil, 29 October–1 November 2018; pp. 1–9. [Google Scholar]
  140. Kusumaningsih, A.; Kurniawati, A.; Angkoso, C.V.; Yuniarno, E.M.; Hariadi, M. User Experience Measurement on Virtual Dressing Room of Madura Batik Clothes. In Proceedings of the 2017 International Conference on Sustainable Information Engineering and Technology, SIET 2017, Batu, Indonesia, 24–25 November 2017; pp. 203–208. [Google Scholar] [CrossRef]
  141. Irshad, S.; Rambli, D.R.A. Preliminary User Experience Framework for Designing Mobile Augmented Reality Technologies. In Proceedings of the 2015 4th International Conference on Interactive Digital Media (ICIDM), Bandung, Indonesia, 1–5 December 2015; pp. 1–4. [Google Scholar] [CrossRef]
  142. Ghazwani, Y.; Smith, S. Interaction in Augmented Reality: Challenges to Enhance User Experience. In Proceedings of the 2020 4th International Conference on Virtual and Augmented Reality Simulations, Sydney, NSW, Australia, 14–16 February 2020; ACM: New York, NY, USA, 2020; pp. 39–44. [Google Scholar]
  143. Kim, S.J.; Dey, A.K. Augmenting Human Senses to Improve the User Experience in Cars: Applying Augmented Reality and Haptics Approaches to Reduce Cognitive Distances. Multimed. Tools Appl. 2016, 75, 9587–9607. [Google Scholar] [CrossRef]
  144. Parmaxi, A.; Demetriou, A.A. Augmented Reality in Language Learning: A State-of-the-art Review of 2014–2019. J. Comput. Assist. Learn. 2020, 36, 861–875. [Google Scholar] [CrossRef]
  145. Trista, S.; Rusli, A. Historiar: Experience Indonesian History through Interactive Game and Augmented Reality. Bull. Electr. Eng. Inform. 2020, 9, 1518–1524. [Google Scholar] [CrossRef]
  146. Mulloni, A.; Seichter, H.; Schmalstieg, D. User Experiences with Augmented Reality Aided Navigation on Phones. In Proceedings of the 2011 10th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2011, Basel, Switzerland, 26–29 October 2011; pp. 229–230. [Google Scholar] [CrossRef]
  147. Li, X.; Xu, B.; Teng, Y.; Ren, Y.; Hu, Z. Comparative Research of AR and VR Technology Based on User Experience. In Proceedings of the 2014 International Conference on Management Science & Engineering 21th Annual Conference Proceedings, Helsinki, Finland, 17–19 August 2014; pp. 1820–1827. [Google Scholar]
  148. Nielsen, J.; Budiu, R. Mobile Usability; New Riders: Berkeley, CA, USA, 2013; ISBN 9780321884480. [Google Scholar]
  149. Krug, S. Don’t Make Me Think, Revisited: A Common Sense Approach to Web Usability. Choice Rev. Online 2014, 51, 51–6218. [Google Scholar] [CrossRef]
  150. Lóbach, B. Diseño Industrial Bases Para La Configuración De Los Productos Industriales; Gustavo Gili, S.A.: Barcelona, Spain, 1981; ISBN 8425210321. [Google Scholar]
  151. Kramer, L.J.; Bailey, R.E.; Prinzel, L.J. Commercial Flight Crew Decision Making During Low-Visibility Approach Operations Using Fused Synthetic and Enhanced Vision Systems. Int. J. Aviat. Psychol. 2009, 19, 131–157. [Google Scholar] [CrossRef]
  152. Bonin-Font, F.; Massot Campos, M.; Burguera, A.B. ARSEA: A Virtual Reality Subsea Exploration Assistant. IFAC-PapersOnLine 2018, 51, 26–31. [Google Scholar] [CrossRef]
  153. Merenda, C.; Kim, H.; Tanous, K.; Gabbard, J.L.; Feichtl, B.; Misu, T.; Suga, C. Augmented Reality Interface Design Approaches for Goal-Directed and Stimulus-Driven Driving Tasks. IEEE Trans. Vis. Comput. Graph. 2018, 24, 2875–2885. [Google Scholar] [CrossRef] [PubMed]
  154. Ho, C.; Spence, C. The Multisensory Driver: Implications for Ergonomic Car Interface Design; Taylor & Francis Group: London, UK, 2017; ISBN 9780754670681. [Google Scholar]
  155. Haas, M.W. Virtually-Augmented Interfaces for Tactical Aircraft. Biol. Psychol. 1995, 40, 229–238. [Google Scholar] [CrossRef]
  156. Sanchez Riera, A. Evaluación de La Tecnología de Realidad Aumentada Móvil En Entornos Educativos Del Ámbito de La Arquitectura y La Edificación; Universidad Politécnica de Catalunya: Barcelona, Spain, 2013. [Google Scholar]
  157. Stübl, G.; Heindl, C.; Ebenhofer, G.; Bauer, H.; Pichler, A. Lessons Learned from Human Pose Interaction in an Industrial Spatial Augmented Reality Application. Procedia Comput. Sci. 2023, 217, 912–917. [Google Scholar] [CrossRef]
  158. van Lopik, K.; Schnieder, M.; Sharpe, R.; Sinclair, M.; Hinde, C.; Conway, P.; West, A.; Maguire, M. Comparison of In-Sight and Handheld Navigation Devices toward Supporting Industry 4.0 Supply Chains: First and Last Mile Deliveries at the Human Level. Appl. Ergon. 2020, 82, 102928. [Google Scholar] [CrossRef] [PubMed]
  159. Liu, Y.; Sathishkumar, V.; Manickam, A. Augmented Reality Technology Based on School Physical Education Training. Comput. Electr. Eng. 2022, 99, 107807. [Google Scholar] [CrossRef]
  160. Rusli, R.; Nalanda, D.A.; Tarmidi, A.D.V.; Suryaningrum, K.M.; Yunanda, R. Augmented Reality for Studying Hands on the Human Body for Elementary School Students. Procedia Comput. Sci. 2023, 216, 237–244. [Google Scholar] [CrossRef]
  161. Khoong, Y.M.; Luo, S.; Huang, X.; Li, M.; Gu, S.; Jiang, T.; Liang, H.; Liu, Y.; Zan, T. The Application of Augmented Reality in Plastic Surgery Training and Education: A Narrative Review. J. Plast. Reconstr. Aesthetic Surg. 2023, 82, 255–263. [Google Scholar] [CrossRef]
  162. Schlick, C.; Daude, R.; Luczak, H.; Weck, M.; Springer, J. Head-Mounted Display for Supervisory Control in Autonomous Production Cells. Displays 1997, 17, 199–206. [Google Scholar] [CrossRef]
  163. Rohling, A.J.; Neto, V.V.G.; Ferreira, M.G.V.; Dos Santos, W.A.; Nakagawa, E.Y. A Reference Architecture for Satellite Control Systems. Innov. Syst. Softw. Eng. 2019, 15, 139–153. [Google Scholar] [CrossRef]
  164. Garcés, L.; Martínez-Fernández, S.; Oliveira, L.; Valle, P.; Ayala, C.; Franch, X.; Nakagawa, E.Y. Three Decades of Software Reference Architectures: A Systematic Mapping Study. J. Syst. Softw. 2021, 179, 111004. [Google Scholar] [CrossRef]
  165. Nakagawa, E.Y.; Guessi, M.; Maldonado, J.C.; Feitosa, D.; Oquendo, F. Consolidating a Process for the Design, Representation, and Evaluation of Reference Architectures. In Proceedings of the 2014 IEEE/IFIP Conference on Software Architecture, Sydney, Australia, 7–11 April 2014; pp. 143–152. [Google Scholar]
  166. Rossi, M.; Papetti, A.; Germani, M.; Marconi, M. An Augmented Reality System for Operator Training in the Footwear Sector. Comput.-Aided Des. Appl. 2020, 18, 692–703. [Google Scholar] [CrossRef]
  167. Guest, W.; Wild, F.; Di Mitri, D.; Klemke, R.; Karjalainen, J.; Helin, K. Architecture and Design Patterns for Distributed, Scalable Augmented Reality and Wearable Technology Systems. In Proceedings of the 2019 IEEE International Conference on Engineering, Technology and Education (TALE), Yogyakarta, Indonesia, 10–13 December 2019; pp. 1–8. [Google Scholar]
  168. Tkachuk, M.; Vekshyn, O.; Gamzayev, R. Architecting for Adaptive Resource Management in Mobile Augmented Reality Systems: Models, Metrics and Prototype Software Solutions; Springer: Berlin/Heidelberg, Germany, 2017; pp. 17–35. ISBN 9783319699646. [Google Scholar]
  169. Villanueva, I.; Clemente, O. Arquitectura de Software Para El Desarrollo de Aplicaciones Sensibles Al Contexto; Universidad Nacional Federico Villarreal, 2017. Available online: http://repositorio.unfv.edu.pe/handle/20.500.13084/1669 (accessed on 14 August 2021).
Figure 1. Conceptual flow and results of the research framework.
Figure 2. Search string executed in this study.
Figure 3. Search protocol and selection of the primary studies of this research.
Figure 4. Types of AR identified in the studies.
Figure 5. FCM analysis indicators: (a) Outdegree indicator vs. the deviation of the 87 attributes; (b) Box plot of the Centrality indicator of the 87 attributes of this study.
Figure 6. Best weighted attributes according to the Centrality criterion of the FCM method.
Figure 7. Cluster data with k = 2: (a) Distances between centers of final clusters; (b) Number of cases per cluster.
Figure 8. Cluster data with k = 9: (a) Distances between centers of final clusters; (b) Number of cases per cluster.
Figure 9. Attributes of cluster 1.
Figure 10. Attributes of cluster 2.
Figure 11. Attributes of cluster 3.
Figure 12. Attributes of cluster 4.
Figure 13. Attributes of cluster 5.
Figure 14. Attributes of cluster 6.
Figure 15. Attributes of cluster 7.
Figure 16. Attributes of cluster 8.
Figure 17. Attributes of cluster 9.
Figure 18. Ranking obtained through the DEMATEL method.
Figure 19. Map of influence relationship among attributes according to DEMATEL.
Figure 20. Influential network relation map A01 vs. cluster 2.
Figure 21. Influential network relation map A01 vs. cluster 5.
Figure 22. Reference architecture, base usage scenario for quality attributes of user experience.
Table 1. Comparative scale used by the experts applying the FCM method.
Scale | Abbreviation | Value
Very high | VH | 9
High | H | 7
Medium | M | 5
Low | L | 3
Very low | VL | 1
Table 2. Comparative scale used by experts applying the DEMATEL method.
Scale | Value
No influence | 0
Low influence | 1
Medium influence | 2
High influence | 3
Very high influence | 4
Table 3. Studies of the industrial application area.
Study | Identified Attributes | Type of AR
AR-based interaction for human-robot collaborative manufacturing [67] | Safety, Information Processing, Ergonomics, Autonomy, Competence, Relatedness | HMD
End-Users’ augmented reality utilization for architectural design review [68] | Satisfaction, Physical Demand, Perceived usefulness, Perceived ease of use | HMD
Combining Embedded Computation and Image Tracking for Composing Tangible Augmented Reality [69] | Usability | HMD
The Effect of Camera Height, Actor Behavior, and Viewer Position on the User Experience of 360° Videos [70] | Camera height, Actor behavior, Viewer position | HMD
Augmented reality user interface evaluation performance measurement of HoloLens, Moverio and mouse input [20] | Performance, Satisfaction, Cognitive demand | HMD
Optical-Reflection Type 3D Augmented Reality Mirrors [71] | Space perception | HMD
Ethnographic study of a commercially available augmented reality HMD app for industry work instruction [11] | Task Performance: Accuracy, Task completion, Consistency, Time taken | HMD
DMove: Directional Motion-based Interaction for Augmented Reality Head-Mounted Displays [19] | Correctness | HMD
Comparing HMD-Based and Paper-Based Training [15] | Performance, Task completion, Cognitive demand | HMD
Comparative Reality: Measuring User Experience and Emotion in Immersive Virtual Environments [12] | Mental effort, Engagement, Task Difficulty, Comfortability, Biofeedback | HMD
Overcoming Issues of 3D Software Visualization through Immersive Augmented Reality [22] | Navigability, Task completion, Correctness, Space Perception, Engagement | HMD
A study on user experience evaluation of glasses-type wearable device with built-in bone conduction speaker: Focus on the Zungle Panther [23] | Usability, Aesthetic | HMD
Visual User Experience Difference: Image compression impacts on the quality of experience in augmented binocular vision [72] | Image quality metrics | HMD
Study on assessing user experience of augmented reality applications [7] | Performance | Mobile
Holistic User eXperience in Mobile Augmented Reality Using User eXperience Measurement Index [9] | Attractiveness, Efficiency, Perspicuity, Dependability, Stimulation, Novelty | Mobile
Emotions detection of user experience (UX) for mobile augmented reality (mar) applications [73] | Engagement | Mobile
Attractiveness of augmented reality to consumers [74] | Engagement | Mobile
CryptoAR Wallet: A Blockchain Cryptocurrency Wallet Application that Uses Augmented Reality for On-chain User Data Display [75] | Usability | Mobile
Fire in Your Hands: Understanding Thermal Behavior of Smartphones [76] | Coefficient of thermal spreading | Mobile
User experience problems in immersive virtual environments [77] | Immersion, Satisfaction, Credibility, Naturalness | Mobile
User experience and user acceptance of an augmented reality based knowledge-sharing solution in industrial maintenance work [78] | Satisfaction, Usability, Usefulness | Mobile
Measuring user experience of mobile augmented reality systems through non-instrumental quality attributes [13] | Aesthetics | Mobile
User experience in mobile augmented reality: Emotions, challenges, opportunities and best practices [79] | Engagement, Cognitive demand | Mobile
UX design principles for mobile augmented reality applications [80] | Space Perception | Mobile
Multi-layered mobile augmented reality framework for positive user experience [81] | Ergonomics, Usability | Mobile
Enhancing user experience through physical interaction in handheld Augmented Reality [82] | Task completion | Mobile
Augmented Reality Interfaces [83] | Usability | Mobile
A case study on user experience (UX) evaluation of mobile augmented reality prototypes [84] | Ease of use, Naturalness, Novelty | Mobile
MARTI: Mobile Augmented Reality Tool for Industry [85] | Space Perception, Aesthetics, Ergonomics | Mobile
AVIKOM: towards a mobile audiovisual cognitive assistance system for modern manufacturing and logistics [86] | Usability, Ergonomics | Mobile
Mixed prototypes for the evaluation of usability and user experience: simulating an interactive electronic device [87] | Usability, Performance, Task completion, Satisfaction, Ergonomics | PAR
Kinetic AR: A Framework for Robotic Motion Systems in Spatial Computing [88] | Usability | SAR
Interactive Spatial Augmented Reality in Collaborative Robot Programming: User Experience Evaluation [89] | Space Perception | SAR
Enhancing user experiences of mobile-based augmented reality via spatial augmented reality: Designs and architectures of projector-camera devices [90] | Novelty, Practicality | SAR
Table 4. Experts panel profiles.
Academic Degrees (Eng. / Master / PhD) | Software Development with AR (Average) | Experience in the Productive Sector with AR (Average) | Academic Experience (Average)
20% / 20% / 60% | 13.4 years | 12.2 years | 13.4 years
Table 5. Analysis indicators of the 87 attributes resulting from the IMS and SRMS matrix.
ID | Attribute | IMS Average | IMS Score Max | IMS Score Min | SRMS Outdegree | SRMS Indegree | SRMS Centrality
A01 | Usability [14,16,23,69,75,78,81,83,87,91,92,93,94,95,96,97,98,99] | 9.0 | 9 | 9 | 53.40 | 53.40 | 106.80
A02 | Usefulness [78,100] | 7.0 | 9 | 3 | 67.52 | 67.52 | 135.04
A03 | Efficiency [9,101,102,103,104,105,106,107,108] | 7.8 | 9 | 5 | 52.46 | 52.46 | 104.92
A04 | Effectiveness [103,107,109] | 8.6 | 9 | 7 | 44.96 | 44.96 | 89.92
A05 | Navigability [16,21,102,107,110,111,112,113,114,115,116,117] | 8.2 | 9 | 7 | 39.64 | 39.64 | 79.28
A06 | Accessibility [118] | 6.2 | 9 | 1 | 66.72 | 66.72 | 133.44
A07 | Perspicuity [9,101,102,104,105,106] | 6.6 | 9 | 1 | 67.64 | 67.64 | 135.28
A08 | Dependability [9,101,102,104,105,106] | 7.4 | 9 | 5 | 46.12 | 46.12 | 92.24
A09 | Coefficient of thermal spreading [76] | 3.4 | 5 | 1 | 66.40 | 66.40 | 132.80
A10 | Interactivity [118] | 7.8 | 9 | 5 | 63.10 | 63.10 | 126.20
A11 | Safety [67] | 7.0 | 9 | 3 | 67.52 | 67.52 | 135.04
A12 | Practicality [90] | 5.0 | 9 | 1 | 64.88 | 64.88 | 129.76
A13 | Performance [7,11,15,16,20,87,107,119,120,121] | 7.4 | 9 | 5 | 49.20 | 49.20 | 98.40
A14 | Accuracy [11,122] | 7.8 | 9 | 7 | 56.56 | 56.56 | 113.12
A15 | Task completion [11,12,15,16,22,82,87,107,116,119,120] | 5.8 | 9 | 1 | 68.14 | 68.14 | 136.28
A16 | Consistency [11] | 6.6 | 9 | 1 | 67.64 | 67.64 | 135.28
A17 | Time taken/task completion time [11,120] | 6.6 | 9 | 3 | 56.22 | 56.22 | 112.44
A18 | Accomplish the task [93,120] | 6.6 | 9 | 1 | 67.82 | 67.82 | 135.64
A19 | Correctness [16,19,21,22,103,111,123,124] | 7.4 | 9 | 3 | 67.22 | 67.22 | 134.44
A20 | Reliability [108,118,125] | 6.2 | 9 | 1 | 51.54 | 51.54 | 103.08
A21 | Space Perception [12,14,22,71,80,85,89,126,127,128] | 5.8 | 9 | 1 | 66.46 | 66.46 | 132.92
A22 | Image quality [72,129,130,131] | 7.0 | 9 | 3 | 52.56 | 52.56 | 105.12
A23 | Visual clarity [118] | 7.4 | 9 | 5 | 66.40 | 66.40 | 132.80
A24 | Camera height [70] | 7.8 | 9 | 7 | 50.76 | 50.76 | 101.52
A25 | Actor behavior [70] | 8.2 | 9 | 7 | 65.00 | 65.00 | 130.00
A26 | Viewer position [70] | 6.6 | 9 | 1 | 67.64 | 67.64 | 135.28
A27 | Identifiability [17,118] | 7.4 | 9 | 7 | 45.84 | 45.84 | 91.68
A28 | Representation [100] | 5.4 | 9 | 1 | 64.08 | 64.08 | 128.16
A29 | Transparency [132] | 4.2 | 7 | 1 | 65.90 | 65.90 | 131.80
A30 | Comprehensibility [108] | 8.6 | 9 | 7 | 53.88 | 53.88 | 107.76
A31 | Comprehensivity [118] | 7.4 | 9 | 5 | 58.48 | 58.48 | 116.96
A32 | Memorability [118] | 5.8 | 9 | 3 | 59.46 | 59.46 | 118.92
A33 | Information Processing [67] | 5.8 | 9 | 1 | 68.14 | 68.14 | 136.28
A34 | Engagement [12,22,73,74,79,95,103,119,127,133,134,135,136,137] | 7.0 | 9 | 1 | 66.72 | 66.72 | 133.44
A35 | Satisfaction [20,68,77,78,87,95,100,103,107,109,119,133,134] | 7.0 | 9 | 1 | 66.72 | 66.72 | 133.44
A36 | Perceived satisfaction [138] | 6.6 | 9 | 1 | 67.64 | 67.64 | 135.28
A37 | Enjoyment [8,138,139] | 7.4 | 9 | 1 | 64.52 | 64.52 | 129.04
A38 | Enjoy [140] | 7.0 | 9 | 1 | 66.72 | 66.72 | 133.44
A39 | Stimulation [9,17,101,102,104,105,106,108] | 5.8 | 9 | 1 | 64.56 | 64.56 | 129.12
A40 | Pleasant [98] | 8.6 | 9 | 7 | 64.52 | 64.52 | 129.04
A41 | Frustration [107,121,139] | 6.2 | 9 | 1 | 43.38 | 43.38 | 86.76
A42 | Valence [135,140] | 5.4 | 9 | 1 | 62.40 | 62.40 | 124.80
A43 | Importance [132,140] | 6.2 | 9 | 1 | 63.46 | 63.46 | 126.92
A44 | Arousal [135,140] | 5.4 | 7 | 1 | 60.94 | 60.94 | 121.88
A45 | Impressed [140] | 6.6 | 9 | 1 | 64.52 | 64.52 | 129.04
A46 | Connectedness [125] | 5.0 | 9 | 1 | 64.88 | 64.88 | 129.76
A47 | Comfortability [12] | 7.8 | 9 | 5 | 59.90 | 59.90 | 119.80
A48 | Biofeedback [12,18,119,127] | 5.0 | 9 | 1 | 64.88 | 64.88 | 129.76
A49 | Competence [67] | 5.4 | 9 | 1 | 65.48 | 65.48 | 130.96
A50 | Continuance intention [8] | 4.6 | 9 | 1 | 62.54 | 62.54 | 125.08
A51 | Aesthetics [13,141] | 7.4 | 9 | 3 | 67.22 | 67.22 | 134.44
A52 | Attractiveness [9,101,102,104,105,106,108] | 5.0 | 9 | 1 | 65.00 | 65.00 | 130.00
A53 | Ergonomics [67,81,85,87] | 7.4 | 9 | 1 | 64.52 | 64.52 | 129.04
A54 | Ease-of-Interaction [142] | 7.8 | 9 | 5 | 67.82 | 67.82 | 135.64
A55 | Cognitive demand [12,15,20,79,107,121,124,139] | 6.6 | 9 | 1 | 67.64 | 67.64 | 135.28
A56 | Cognitive load [143] | 6.6 | 9 | 1 | 67.64 | 67.64 | 135.28
A57 | Physical Demand [68,107,121] | 7.8 | 9 | 7 | 56.56 | 56.56 | 113.12
A58 | Temporal demand [107] | 6.2 | 9 | 5 | 52.86 | 52.86 | 105.72
A59 | Effort [107,121] | 5.8 | 9 | 1 | 66.46 | 66.46 | 132.92
A60 | Learning performance [92,138,144] | 7.8 | 9 | 7 | 56.56 | 56.56 | 113.12
A61 | Novelty [9,90,101,102,104,105,106] | 7.0 | 9 | 5 | 55.22 | 55.22 | 110.44
A62 | Innovation [84,108] | 7.8 | 9 | 7 | 54.28 | 54.28 | 108.56
A63 | Perceived usefulness [68,145] | 8.2 | 9 | 7 | 45.44 | 45.44 | 90.88
A64 | Perceived ease of use [68,145] | 9.0 | 9 | 9 | 53.40 | 53.40 | 106.80
A65 | Easy to use [84,100] | 8.2 | 9 | 7 | 65.00 | 65.00 | 130.00
A66 | Willingness to Use [95] | 6.2 | 9 | 1 | 63.46 | 63.46 | 126.92
A67 | Affordance [146] | 7.4 | 9 | 5 | 58.48 | 58.48 | 116.96
A68 | Price Value [16,147] | 5.0 | 9 | 1 | 64.88 | 64.88 | 129.76
A69 | Social interaction [95] | 5.0 | 9 | 1 | 64.88 | 64.88 | 129.76
A70 | Social presence [8,95] | 5.4 | 9 | 1 | 65.48 | 65.48 | 130.96
A71 | Social behavior [138] | 5.4 | 9 | 1 | 65.48 | 65.48 | 130.96
A72 | Social Richness [95] | 6.2 | 9 | 3 | 61.42 | 61.42 | 122.84
A73 | Social Realism [95] | 5.8 | 9 | 1 | 64.38 | 64.38 | 128.76
A74 | Achievement [8] | 6.6 | 9 | 5 | 45.36 | 45.36 | 90.72
A75 | Self-presentation [8] | 5.4 | 9 | 1 | 65.48 | 65.48 | 130.96
A76 | Captivation [125] | 7.8 | 9 | 5 | 58.18 | 58.18 | 116.36
A77 | Immersion [77] | 6.2 | 9 | 1 | 58.42 | 58.42 | 116.84
A78 | Immersion aspect [96] | 5.8 | 9 | 1 | 60.62 | 60.62 | 121.24
A79 | Fantasy [8] | 5.4 | 9 | 1 | 65.86 | 65.86 | 131.72
A80 | Escapism [8] | 5.8 | 9 | 1 | 66.46 | 66.46 | 132.92
A81 | Attention [140,143] | 6.6 | 9 | 5 | 53.04 | 53.04 | 106.08
A82 | Intuitiveness [121] | 8.2 | 9 | 7 | 50.28 | 50.28 | 100.56
A83 | Naturalness [77,84] | 5.8 | 7 | 1 | 64.52 | 64.52 | 129.04
A84 | Sense of realism [122] | 7.8 | 9 | 5 | 58.18 | 58.18 | 116.36
A85 | Autonomy [67] | 6.2 | 7 | 3 | 42.68 | 42.68 | 85.36
A86 | Relatedness [67] | 6.6 | 9 | 1 | 67.64 | 67.64 | 135.28
A87 | Credibility [77] | 8.2 | 9 | 7 | 65.00 | 65.00 | 130.00
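The SRMS indicators in Table 5 follow the usual fuzzy-cognitive-map conventions: the Outdegree of a concept is the sum of the absolute weights of its outgoing connections, the Indegree is the corresponding sum over incoming connections, and Centrality is the sum of both. The minimal sketch below illustrates these computations on a hypothetical 3 × 3 weight matrix; the study's actual matrix covers all 87 attributes and is not reproduced here.

```python
import numpy as np

# Hypothetical 3x3 FCM weight matrix: rows are source concepts, columns are target concepts.
# The values are illustrative only; the study's matrix is 87x87 and built from the Table 1 scale.
W = np.array([
    [0.0, 0.7, 0.5],
    [0.9, 0.0, 0.3],
    [0.1, 0.5, 0.0],
])

outdegree = np.abs(W).sum(axis=1)   # influence each concept exerts (row sums)
indegree = np.abs(W).sum(axis=0)    # influence each concept receives (column sums)
centrality = outdegree + indegree   # overall relevance, the ranking criterion of Figure 6

for i, (od, ind, cen) in enumerate(zip(outdegree, indegree, centrality), start=1):
    print(f"C{i}: outdegree={od:.2f}, indegree={ind:.2f}, centrality={cen:.2f}")
```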
Table 6. Importance values obtained from the FZMS.
Importance: 1.0 | Importance: 0.8 | Importance: 0.7 | Importance: 0.6 | Importance: 0.5 | Importance: 0.4 | Importance: 0.3 | Importance: 0.2
A01 A64 A04 A30 A34 A35 A37 A38 A40 A53 A83 A85 A02 A03 A06 A07 A10 A11 A16 A18 A19 A20 A22 A26 A36 A41 A43 A44 A45 A47 A51 A54 A55 A56 A66 A76 A77 A84 A86 A05 A08 A09 A13 A15 A17 A21 A23 A25 A28 A31 A33 A39 A42 A49 A59 A63 A65 A67 A70 A71 A73 A75 A78 A79 A80 A82 A87 A12 A29 A32 A46 A48 A50 A52 A61 A68 A69 A72 A14 A24 A57 A60 A62 A74 A81 A58 A27
Table 7. Attributes that entered phase 4 with DEMATEL.
ID | Attribute
A01 | Usability
A03 | Efficiency
A05 | Navigability
A07 | Perspicuity
A08 | Dependability
A13 | Performance
A15 | Task completion
A19 | Correctness
A21 | Space Perception
A22 | Image quality
A34 | Engagement
A35 | Satisfaction
A39 | Stimulation
A48 | Biofeedback
A51 | Aesthetics
A52 | Attractiveness
A55 | Cognitive demand
A53 | Ergonomics
A61 | Novelty
Table 8. Initial direct relationships matrix resulting from the evaluation average by the panel of experts.
ID | A01 | A03 | A05 | A07 | A08 | A13 | A15 | A19 | A21 | A22 | A34 | A35 | A39 | A48 | A51 | A52 | A53 | A55 | A61
A01 | 0.0 | 3.6 | 3.8 | 3.0 | 3.2 | 2.8 | 3.2 | 3.6 | 2.6 | 1.6 | 3.4 | 3.2 | 3.6 | 2.4 | 3.2 | 3.0 | 3.0 | 3.6 | 2.0
A03 | 3.6 | 0.0 | 2.6 | 3.2 | 3.4 | 3.0 | 3.2 | 2.8 | 1.8 | 1.4 | 2.6 | 3.0 | 2.6 | 3.2 | 2.4 | 1.8 | 2.2 | 4.0 | 1.6
A05 | 3.8 | 3.6 | 0.0 | 3.2 | 2.4 | 3.2 | 3.0 | 3.4 | 2.2 | 2.6 | 3.6 | 3.4 | 3.4 | 2.0 | 3.2 | 2.8 | 3.2 | 3.6 | 1.6
A07 | 3.6 | 3.0 | 3.2 | 0.0 | 3.0 | 1.6 | 3.2 | 3.2 | 2.2 | 1.2 | 2.8 | 3.4 | 3.2 | 2.4 | 2.8 | 2.8 | 2.6 | 3.2 | 2.0
A08 | 3.8 | 3.2 | 2.6 | 3.2 | 0.0 | 2.4 | 2.8 | 3.0 | 2.0 | 1.0 | 3.2 | 3.2 | 2.6 | 2.0 | 2.8 | 2.8 | 2.4 | 2.6 | 2.2
A13 | 3.2 | 4.0 | 3.0 | 1.8 | 3.0 | 0.0 | 2.6 | 3.0 | 1.6 | 2.0 | 3.6 | 3.4 | 2.6 | 2.0 | 1.8 | 1.8 | 2.2 | 3.2 | 1.4
A15 | 3.0 | 3.0 | 2.6 | 2.0 | 2.4 | 2.4 | 0.0 | 2.2 | 1.6 | 1.0 | 2.2 | 2.8 | 3.0 | 2.6 | 1.2 | 1.6 | 2.6 | 2.8 | 1.4
A19 | 3.6 | 3.2 | 2.8 | 3.2 | 3.2 | 2.0 | 2.8 | 0.0 | 1.8 | 1.2 | 3.4 | 3.4 | 3.0 | 2.6 | 3.4 | 3.2 | 2.4 | 3.0 | 1.8
A21 | 3.0 | 2.6 | 2.8 | 1.8 | 2.6 | 1.6 | 1.8 | 2.6 | 0.0 | 1.2 | 2.4 | 2.4 | 3.2 | 2.4 | 3.2 | 2.4 | 2.2 | 3.0 | 2.0
A22 | 3.4 | 2.4 | 2.6 | 2.6 | 1.6 | 2.6 | 1.6 | 2.8 | 1.4 | 0.0 | 4.0 | 2.8 | 3.8 | 2.6 | 4.0 | 4.0 | 1.6 | 3.4 | 1.4
A34 | 3.0 | 1.8 | 1.8 | 2.2 | 3.0 | 2.0 | 2.4 | 3.2 | 1.6 | 2.4 | 0.0 | 3.6 | 3.2 | 2.4 | 2.6 | 3.4 | 2.8 | 3.0 | 2.2
A35 | 3.4 | 3.0 | 3.0 | 2.4 | 2.8 | 1.8 | 2.8 | 2.8 | 2.2 | 1.8 | 4.0 | 0.0 | 3.8 | 2.8 | 3.0 | 2.2 | 2.8 | 3.2 | 2.0
A39 | 3.6 | 2.0 | 2.2 | 2.4 | 2.6 | 1.4 | 2.8 | 2.8 | 2.2 | 2.4 | 3.4 | 3.4 | 0.0 | 3.0 | 2.4 | 2.4 | 2.6 | 3.8 | 2.2
A48 | 2.4 | 2.0 | 1.2 | 2.2 | 1.4 | 1.4 | 2.2 | 2.2 | 1.8 | 1.6 | 1.8 | 2.6 | 2.6 | 0.0 | 1.6 | 2.4 | 2.0 | 2.6 | 2.0
A51 | 3.2 | 2.6 | 1.8 | 3.0 | 2.4 | 2.0 | 2.2 | 2.8 | 3.4 | 3.2 | 3.8 | 3.2 | 4.0 | 2.4 | 0.0 | 4.0 | 3.0 | 2.6 | 1.6
A52 | 3.0 | 2.4 | 2.2 | 3.6 | 2.8 | 1.8 | 3.0 | 2.2 | 3.0 | 2.4 | 4.0 | 4.0 | 4.0 | 2.8 | 4.0 | 0.0 | 3.0 | 2.8 | 2.0
A53 | 3.2 | 2.8 | 3.2 | 2.6 | 2.0 | 3.2 | 2.6 | 2.6 | 2.6 | 1.6 | 3.0 | 3.6 | 3.4 | 3.2 | 2.4 | 2.2 | 0.0 | 3.0 | 1.6
A55 | 2.8 | 2.8 | 2.0 | 2.6 | 1.6 | 1.6 | 2.2 | 2.8 | 2.2 | 1.4 | 2.6 | 2.6 | 2.8 | 2.6 | 3.2 | 2.4 | 2.6 | 0.0 | 2.2
A61 | 2.4 | 2.2 | 2.4 | 1.8 | 3.4 | 1.6 | 1.8 | 2.0 | 2.0 | 2.4 | 3.4 | 2.8 | 3.0 | 2.6 | 2.6 | 2.6 | 2.0 | 2.6 | 0.0
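Every entry of Table 8 is a multiple of 0.2, which is consistent with averaging the 0–4 influence scores of Table 2 over a five-member panel, as the 20% increments in Table 4 also suggest. The sketch below shows that aggregation step for a single cell; the individual expert ratings are hypothetical.

```python
# Minimal sketch of how one cell of Table 8 can be obtained: the mean of the panel's 0-4
# influence scores from Table 2. The individual ratings below are hypothetical; only the
# averaging step reflects the table caption ("evaluation average by the panel of experts").
expert_scores = [4, 4, 3, 4, 3]                       # hypothetical ratings of "A01 influences A03"
cell_value = sum(expert_scores) / len(expert_scores)
print(cell_value)                                     # 3.6, matching the A01 -> A03 entry above
```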
Table 9. Normalized direct relationships matrix fragment resulting from the transformation of matrix A.
ID | A01 | A15 | A34 | A35 | A39 | A51 | A53
A01 | 0.00 | 0.06 | 0.06 | 0.06 | 0.07 | 0.06 | 0.05
A15 | 0.05 | 0.00 | 0.04 | 0.05 | 0.05 | 0.02 | 0.05
A34 | 0.05 | 0.04 | 0.00 | 0.07 | 0.06 | 0.05 | 0.05
A35 | 0.06 | 0.05 | 0.07 | 0.00 | 0.07 | 0.05 | 0.05
A39 | 0.07 | 0.05 | 0.06 | 0.06 | 0.00 | 0.04 | 0.05
A51 | 0.06 | 0.04 | 0.07 | 0.06 | 0.07 | 0.00 | 0.05
A53 | 0.06 | 0.05 | 0.05 | 0.07 | 0.06 | 0.04 | 0.00
Table 10. Total relationships matrix fragment resulting from the transformation of matrix M.
ID | A01 | A15 | A34 | A35 | A39 | A51 | A53
A01 | 0.43 | 0.40 | 0.48 | 0.47 | 0.49 | 0.42 | 0.39
A15 | 0.37 | 0.26 | 0.35 | 0.36 | 0.37 | 0.29 | 0.30
A34 | 0.42 | 0.34 | 0.36 | 0.42 | 0.42 | 0.36 | 0.34
A35 | 0.45 | 0.37 | 0.45 | 0.38 | 0.45 | 0.39 | 0.36
A39 | 0.43 | 0.35 | 0.42 | 0.42 | 0.37 | 0.36 | 0.34
A51 | 0.46 | 0.36 | 0.46 | 0.45 | 0.47 | 0.35 | 0.37
A53 | 0.44 | 0.36 | 0.43 | 0.44 | 0.44 | 0.37 | 0.30
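Tables 9 and 10 correspond to the two standard DEMATEL transformations: the direct relationships matrix A is normalized, apparently by dividing by its largest row sum (about 54.8, the A01 row of Table 8), to obtain M, and the total relationships matrix is then T = M(I − M)⁻¹. The sketch below applies the second step to the 7 × 7 fragment of Table 9; because the published Table 10 was computed from the full 19 × 19 matrix, the resulting numbers differ from those shown above.

```python
import numpy as np

# DEMATEL transformation sketch using the 7x7 fragment of the normalized matrix M (Table 9).
# Normalization (Table 8 -> Table 9): M = A / max_i(sum_j a_ij), i.e., A divided by its largest row sum.
M = np.array([
    [0.00, 0.06, 0.06, 0.06, 0.07, 0.06, 0.05],
    [0.05, 0.00, 0.04, 0.05, 0.05, 0.02, 0.05],
    [0.05, 0.04, 0.00, 0.07, 0.06, 0.05, 0.05],
    [0.06, 0.05, 0.07, 0.00, 0.07, 0.05, 0.05],
    [0.07, 0.05, 0.06, 0.06, 0.00, 0.04, 0.05],
    [0.06, 0.04, 0.07, 0.06, 0.07, 0.00, 0.05],
    [0.06, 0.05, 0.05, 0.07, 0.06, 0.04, 0.00],
])

# Total relationships matrix (Table 9 -> Table 10): T = M (I - M)^(-1).
T = M @ np.linalg.inv(np.eye(M.shape[0]) - M)
print(np.round(T, 2))
```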
Table 11. Attributes classified by cause or effect according to DEMATEL.
ID | Attribute | D + R | D − R | Group
A22 | Image quality | 11.48 | 2.26 | CAUSE
A61 | Novelty | 10.90 | 1.37 | CAUSE
A05 | Navigability | 14.01 | 1.18 | CAUSE
A13 | Performance | 11.92 | 1.10 | CAUSE
A52 | Attractiveness | 14.08 | 0.71 | CAUSE
A21 | Space Perception | 11.55 | 0.66 | CAUSE
A53 | Ergonomics | 13.26 | 0.39 | CAUSE
A07 | Perspicuity | 13.56 | 0.32 | CAUSE
A51 | Aesthetics | 14.13 | 0.23 | CAUSE
A08 | Dependability | 13.32 | 0.16 | CAUSE
A19 | Correctness | 14.07 | −0.01 | EFFECT
A03 | Efficiency | 13.76 | −0.25 | EFFECT
A01 | Usability | 15.72 | −0.41 | EFFECT
A15 | Task completion | 12.26 | −0.88 | EFFECT
A35 | Satisfaction | 14.89 | −0.95 | EFFECT
A34 | Engagement | 14.49 | −1.37 | EFFECT
A39 | Stimulation | 14.68 | −1.37 | EFFECT
A48 | Biofeedback | 11.57 | −1.39 | EFFECT
A55 | Cognitive demand | 13.87 | −1.76 | EFFECT
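In the DEMATEL ranking, D is the row sum and R is the column sum of the total relationships matrix T: D + R measures the overall prominence of an attribute, while the sign of D − R separates causes (positive) from effects (negative), which is the grouping reported in Table 11. A minimal sketch of that classification step, using a small 3 × 3 excerpt of Table 10 purely as illustration (the published values come from the full 19 × 19 matrix):

```python
import numpy as np

# Cause/effect grouping from a DEMATEL total relationships matrix T.
# Illustrative only: the D + R and D - R values in Table 11 were computed from the
# full 19x19 matrix, not from this 3x3 excerpt of Table 10.
T = np.array([
    [0.43, 0.40, 0.48],
    [0.37, 0.26, 0.35],
    [0.42, 0.34, 0.36],
])
labels = ["A01", "A15", "A34"]

D = T.sum(axis=1)   # influence dispatched (row sums)
R = T.sum(axis=0)   # influence received (column sums)

for label, d, r in zip(labels, D, R):
    group = "CAUSE" if d - r > 0 else "EFFECT"
    print(f"{label}: D+R = {d + r:.2f}, D-R = {d - r:.2f} -> {group}")
```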
Table 12. Weighting coefficient of the 19 attributes.
A01 | A03 | A05 | A07 | A08 | A13 | A15 | A19 | A21 | A22 | A34 | A35 | A39 | A48 | A51 | A52 | A53 | A55 | A61
15.73 | 13.76 | 14.06 | 13.56 | 13.32 | 11.97 | 12.30 | 14.07 | 11.57 | 11.70 | 14.55 | 14.92 | 14.75 | 11.65 | 14.14 | 14.09 | 13.26 | 13.98 | 10.99
Table 13. Standardized coefficient of the 19 attributes.
A01 | A03 | A05 | A07 | A08 | A13 | A15 | A19 | A21 | A22 | A34 | A35 | A39 | A48 | A51 | A52 | A53 | A55 | A61
0.0618 | 0.0541 | 0.0553 | 0.0533 | 0.0524 | 0.0471 | 0.0483 | 0.0553 | 0.0455 | 0.0460 | 0.0572 | 0.0587 | 0.0580 | 0.0458 | 0.0556 | 0.0554 | 0.0521 | 0.0550 | 0.0432
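The standardized coefficients of Table 13 can be reproduced by dividing each weighting coefficient of Table 12 by the sum of all nineteen coefficients (approximately 254.37), so that the weights add up to one; for example, 15.73/254.37 ≈ 0.0618 for A01. A minimal sketch of that normalization, assuming exactly this rule:

```python
# Standardizing the Table 12 weighting coefficients so that they sum to 1,
# which reproduces the Table 13 values (e.g., 15.73 / 254.37 ≈ 0.0618 for A01).
weights = {
    "A01": 15.73, "A03": 13.76, "A05": 14.06, "A07": 13.56, "A08": 13.32,
    "A13": 11.97, "A15": 12.30, "A19": 14.07, "A21": 11.57, "A22": 11.70,
    "A34": 14.55, "A35": 14.92, "A39": 14.75, "A48": 11.65, "A51": 14.14,
    "A52": 14.09, "A53": 13.26, "A55": 13.98, "A61": 10.99,
}

total = sum(weights.values())
standardized = {attr: round(value / total, 4) for attr, value in weights.items()}
print(standardized)   # {'A01': 0.0618, 'A03': 0.0541, ...}
```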
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
