Article

Construction and Ranking of Usability Indicators for Medical Websites Based on Website User Experience

Department of Industrial Design, Hanyang University, ERICA Campus, Ansan 15588, Republic of Korea
*
Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(13), 5465; https://doi.org/10.3390/app14135465
Submission received: 30 May 2024 / Revised: 15 June 2024 / Accepted: 20 June 2024 / Published: 24 June 2024

Abstract:
In the era of digitalization, medical websites have rapidly expanded their healthcare market share due to their convenience. However, with this user-base expansion, issues with poor user experience have surfaced. To address this, we developed and ranked usability indicators for medical websites, aiming to improve their design and development from a user experience perspective and thereby increase user satisfaction and website usability. Initially, we reviewed the relevant literature and summarized 30 usability indicators. Subsequently, we formed a Delphi panel of 20 experts and preliminarily identified 24 usability indicators through the Delphi survey method. Using data from 300 valid user surveys, we applied the Exploratory Factor Analysis (EFA) method to categorize these 24 indicators into four groups. Finally, we assessed the relative importance and priorities of these indicators using the Analytic Hierarchy Process (AHP) method. The results showed that, in terms of criterion layer weight priorities, Trust and Security (0.5494), Basic Performance (0.2710), and Features and Technology (0.1355) exhibited higher proportions. For the solution layer, Property Protection (0.1894), Credibility (0.1852), Privacy Protection (0.1194), Effectiveness (0.0932), and Findability (0.0579) exhibited higher weight proportions. The findings of this study will assist in future usability assessments and enhancements of medical websites. By optimizing usability, we can advance the digitalization of medical websites, improve their usability, and enhance users' service experience and satisfaction.

1. Introduction

In the digital age, the Internet has become an indispensable dimension of global infrastructure [1]. In the field of healthcare, the Internet has demonstrated unique value and potential. For example, with the widespread adoption of mobile internet technology, users can access medical information and a variety of services anytime and anywhere through mobile devices, significantly enhancing the accessibility and convenience of medical websites [2]. According to a report by GlobalMed, nearly three-quarters of millennials prefer the convenience and immediacy of teleconsultations over in-person appointments [3], indicating a substantial user market for telemedicine. Furthermore, telemedicine can provide better services for patients in regions with poor medical conditions, significantly alleviating the imbalance in medical resource allocation and improving public health levels [4]. Research by Gao, J., Fan, C., et al., highlights that telemedicine offers a feasible solution to the unequal distribution of healthcare resources, making it an increasingly popular option for bridging the gap in healthcare service capacity and quality between urban and rural areas [5]. During the COVID-19 pandemic, the public’s demand for online remote medical services surged dramatically. Governments and relevant departments launched policies to support the development of online healthcare services, leading to rapid market recognition and the acceptance of this emerging service model [6]. According to McKinsey & Company, the use of online medical services has stabilized at a level 38 times higher than before the pandemic [7]. In addition, the online medical market is expected to grow to USD 225 billion by 2030 [8].
Despite the number of medical website users rapidly increasing [9] and the market showing strong growth momentum, a series of existing issues have been exposed. These include disorganized medical web pages [10] on which users struggle to find the required information [11], insufficient user-friendliness [12], complex content information [13], and the lack of secure payment capabilities on medical webpages [14]. All these issues severely affect user experience and satisfaction.
According to research by Gale, J.J., and Black, K.C., the usability of online healthcare directly impacts user engagement and satisfaction, as well as the ability of the service to achieve its goals [11,15]. Therefore, to address these issues and enhance user experience, it is crucial to improve the usability of these medical websites. However, current research on medical websites mainly focuses on usability testing [16,17,18], their acceptance [19,20,21], and telemedicine services [19,22,23]. Although the aforementioned studies positively promote the development of medical websites, research on the development and prioritization of usability metrics for online medical websites from a user experience perspective is quite limited.
In light of this, the aim of this study is to develop a set of user experience-based usability metrics for medical websites. This involves systematically identifying and prioritizing key usability issues to address those that most significantly impact user experience. By doing so, we seek to enhance website utilization and enable a broader user base to access online medical services. The results of this study are anticipated to provide practical references and guidance for the design of future medical websites. Furthermore, by improving user experience, the study aims to increase user engagement and retention with online medical websites, thereby promoting the healthy development of digital healthcare.

2. Research Methodology

In this study, we compiled usability indicators for medical websites by analyzing the literature on website interaction, usability, web usability, and online medical services. The final indicators were then determined using the Delphi method, after which dimensionality reduction was performed using Exploratory Factor Analysis (EFA). Finally, weights were assigned to these indicators using the Analytic Hierarchy Process (AHP). A flow chart of our methodology is shown in Figure 1.

2.1. Literature Review

2.1.1. Website User Experience

With the technological revolution and the rise of the internet, websites designed with user experience in mind have been shown to significantly enhance user satisfaction and loyalty [24]. Consequently, the concept of user experience (UX) has become increasingly crucial [25].
According to ISO 9241-210 [26], user experience encompasses a user’s perceptions and responses before, during, and after using a product, system, or service. These experiences can be direct, as in the operation of device interfaces, or indirect, such as the feelings, thoughts, and perceptions elicited by interacting with a website [25]. Hussain et al. define user experience in terms of the emotions and behaviors people exhibit when interacting with a page [27]. For websites, user experience refers to whether the website is easy to navigate, whether the information is clearly presented in easy-to-understand language, and whether the design effectively supports users in completing tasks [28]. The research of Zlokazova, T., Blinnikova, I., et al. shows that the structure and format of the web page also affect user experience [29]. The studies by Casalo, L., and Flavian, C., suggest that, with user experience in mind, website design should be simple, direct, and easy to use [30]. Alben’s research suggests that a website’s user experience is influenced not only by technical and objective factors but also by the user’s emotional state. Accordingly, based on this literature review, user experience encompasses Effectiveness, Efficiency, Readability, Screen Design and Layout, and Satisfaction.

2.1.2. Usability

Usability was first introduced in the 1970s, with the concept varying among researchers and target groups [31].
IEEE Std. 610.12 (1990) defines usability as an attribute that facilitates system input and output and makes it easy to learn how to operate the system [32]. Nielsen (1993) defines usability as how easily users can utilize system functionalities, setting five criteria for usability evaluation: Learnability, Efficiency, Memorability, Errors, and Satisfaction [33]. Dumas and Redish (1993) define usability as the degree to which a user of a product can quickly and easily complete tasks, considering good usability as user-centered, effective, efficient, and, above all, easy to use [34]. ISO 9241-11 (1998) [35] describes it as the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use. Iwarsson et al. (2003) view usability as the ability of users to instinctively and effectively use a product, interface, or system [36]. Hu (2006) [37] considers usability as the degree to which specific users can efficiently, effectively, and satisfactorily implement a system, product, or service in a specific environment.
Numerous studies measure product usability based on these attributes, which serve as benchmarks in the evaluation process, such as those by Man Lee and Maeng Ho Kim on the usability of smart home apps [38] and Zhang Chi and Chung Gunjang on the factors influencing user experience of smartphone travel apps [39]. Thus, in this paper, we also use these standards as benchmarks for measuring the usability indicators of medical websites.
Moreover, the literature indicates that the concept of usability has evolved with time. Where it was initially mainly considered to be a product or system’s ease of use and learning, usability has now expanded to include personal evaluations and users’ subjective feelings. In particular, modern usability indicators not only focus on technical operability but also emphasize user experience, making the concept more comprehensive [40]. Therefore, in this paper, we also consider the characteristics of user experience in constructing usability indicators for medical websites.

2.1.3. Web Page Usability

Web page usability refers to the functionality of a website stemming from a design approach that focuses on user needs. In this area, a user-centered design process is employed to ensure that websites are efficient and easy to use for users [41].
Lindgaard (1994) defines the field of website usability evaluation and proposes usability assessment criteria for different website developmental stages. These criteria include Navigation, Screen Design and Layout, Terminology, Feedback, Consistency, Sensory Forms, Redundancy, User Control, and Task Conformity [42]. Richardson, B. and Campbell-Yeo, M., et al. describe the usability of a page through a framework of attributes, including usability, usefulness, desirability, findability, accessibility, credibility, and value [43]. Lee and Kozar propose 10 dimensions for assessing the usability and user experience of websites, i.e., Consistency, Navigability, Supportability, Learnability, Simplicity, Interactivity, Emotional Engagement, Credibility, Content Relevance, and Readability [44]. Through a series of three studies, Palmer, J.W. demonstrates that website usability and success are commonly associated with download speed, layout organization, information ordering, the type and amount of content, interactivity, customization, and responsiveness [45]. These indicators are widely used in web usability research. Therefore, in this study, we also adopted these assessment methods as the evaluation benchmarks for our research methodology.

2.1.4. Online Medical Services

Online medical services are a form of telemedicine that provides healthcare services via the Internet. Consultations and treatments are provided through online consultations for individuals who are unable or unwilling to visit medical facilities due to time or location constraints. These services can help treat various conditions and improve access to high-quality medical care in remote areas [46,47].
Medical websites are considered convenient and efficient platforms [48], similar to primary care providers [49]. This is because they enable patients to access required medical services from anywhere at any time. Moreover, medical websites can reduce costs, enhance patient engagement, and facilitate easier access to information resources [50].
However, as noted in a study by Meszaros, J. and Buchalcevova, A. [51], there are also downsides to Internet-based services, such as the risk of information leakage, financial risks, performance risks, and ineffective information. Christensen et al. believe that the credibility of medical websites determines patient choices [52]. Similarly, studies by Eysenbach G, Powell J, Bernstam EV, and Walji MF highlight consumer concerns about the quality of online health information [53,54]. In summary, we considered convenience, efficiency, privacy protection, property protection, effectiveness, credibility, and user engagement as the usability benchmarks for online medical services.

2.2. The Delphi Survey Method

The Delphi method was developed in the 1950s to obtain reliable consensus opinions from a group of experts through a series of questionnaires [55,56,57]. It has been applied in many health-related fields, including clinical medicine and public health research [58,59].
In the Delphi method, questionnaires are scored anonymously, and participants are encouraged to add new ideas, amend existing responses, or suggest the removal of redundant answers across several iterations until a consensus is reached [52,60]. Typically, a carefully selected anonymous panel of experts undergoes two to three rounds of structured surveys, concluding when consensus is achieved [61,62].
The Delphi method is a commonly used research technique, widely regarded as a part of survey research [63,64]. This paper chooses the Delphi method not only because it is extensively applied in health science research to determine priorities and reach consensus on important issues, addressing fundamental problems in healthcare [65], but also due to its features of anonymity, controlled feedback, flexible statistical analysis options, and the ability to gather participants from different geographical regions [66]. These characteristics enable experts from diverse fields to conduct a comprehensive and multi-faceted evaluation of the research subject.
The number of members in a Delphi panel is generally between 8 and 20 [65]. In the study by Taylor, H., and Reyes, H., the number of Delphi experts was 12 [67]. In the study by Zhang, Y., Hamzah, H., and Adam, M., the number of Delphi experts was 15 [68]. To maximize the diversity of the sample, in this Delphi survey, twenty experts were invited to participate, including eight industrial designers with over 5 years of experience, two physicians with over 10 years of experience, four web designers with over 5 years of experience, and six interaction designers with over 5 years of experience. The survey was conducted anonymously using the Questionnaire Star (wenjuanxing: https://www.wjx.cn/ accessed on 15 April 2024) software over two rounds on 20 April 2024 and 23 April 2024, with consensus among the experts being reached by the end of the second round.

2.3. Exploratory Factor Analysis

Exploratory Factor Analysis (EFA) was initially proposed by Charles Spearman in 1904 [69]. It is a multivariate technique that addresses questions related to the possibility that several underlying variables explain many individual variables [70].
This study employs Exploratory Factor Analysis (EFA) because it allows researchers to identify underlying dimensions or factors within a dataset and decompose items into discrete dimensions that can be summed or aggregated [71]. This aligns with the needs of the research, making EFA the chosen method.
In this study, we utilized the usability indicators derived from the Delphi method to create an online survey on the Questionnaire Star platform, collecting data from users with actual experience for the Exploratory Factor Analysis (EFA). This online survey was conducted from 1 May 2024 to 3 May 2024, during which 327 questionnaires were distributed. Subsequently, 300 were judged to be valid and 27 invalid. The survey explored users’ opinions on the importance of design indicators for medical websites. A Likert 5-point scale was used for measurement.

2.4. Analytic Hierarchy Process

The Analytic Hierarchy Process (AHP) is a popular group decision-making method that has been applied in various fields [72], including healthcare [73,74,75], education [76,77], and business [78,79]. The AHP is used to evaluate options, allocate resources, compare benefits and costs, and perform system management [80].
In the product development process, making the right decisions is crucial, as inaccurate decisions can lead to product redesigns. An effective tool for determining the most suitable decision-making scheme is the Analytic Hierarchy Process (AHP) [81]. According to the research by Nukman, Y., Ariff, H., et al. AHP has been applied in nearly all decision-related applications [81]. Therefore, this study employs the AHP method to continue decision-making regarding the importance of various indicators.
In this study, we first constructed a three-level framework of indicators, namely, the overall goal level, the criteria level, and the alternatives level. Subsequently, a pairwise comparison matrix questionnaire was developed based on the structural hierarchy. This questionnaire was administered to 15 industry experts with more than five years of experience [82], including two web designers, eight industrial designers, and five interaction designers. The Saaty 1–9 scale method [83] was employed to score the indices in the matrix. The questionnaire was distributed from 5 May 2024 to 7 May 2024, achieving a 100% return and efficiency rate.

3. Research Execution and Analysis

3.1. Derivation of Usability Indicators

Following the literature review on user experience, usability, web page usability, and online medical services, redundant and semantically similar indicators were removed. Ultimately, a set of 30 usability indicators was compiled, as shown in Table 1.
Based on these indicators, we conducted a survey, the results of which are presented in the following table. Data analysis for this survey was performed using SPSS 27.0 (https://www.ibm.com/support/pages/downloading-ibm-spss-statistics-27, accessed on 30 April 2024). According to the research by Preece, J., Rogers, Y., and others, indicators were considered to have high consensus among experts when the p-value (p) was less than 0.05, the mean (M) was greater than 3.5, and the coefficient of variation (CV) was less than 0.3 [84]. Therefore, these criteria were used to evaluate the data from this survey. It is important to note that when calculating the coefficient of variation (CV), the standard deviation (SD) must be divided by the mean (M). Therefore, the SD values have been included in the table. After the first round of the Delphi survey, while the scores for Terminology met the standards, Readability was found to encompass the meaning of Terminology; therefore, Terminology was removed and not included in the second round of voting. Following the completion of the second round, a consensus was reached among the experts, and no third round of survey testing was conducted. Both rounds of the survey achieved a 100% response rate. The Delphi survey data are shown in Table 2.
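The screening rule above (p < 0.05, M > 3.5, and CV = SD/M < 0.3) can be sketched in a few lines. The ratings below are hypothetical stand-ins for two indicators, not the panel's actual survey data:

```python
import numpy as np

# Hypothetical ratings from 20 Delphi panelists for two indicators
# (illustrative numbers only, not the study's actual survey data).
ratings = {
    "Navigation":  np.array([5, 4, 5, 4, 4, 5, 3, 4, 5, 4, 4, 5, 4, 3, 5, 4, 4, 5, 4, 4]),
    "Terminology": np.array([3, 2, 4, 3, 2, 5, 1, 4, 3, 2, 5, 1, 3, 4, 2, 5, 3, 2, 4, 3]),
}

def delphi_screen(scores, mean_min=3.5, cv_max=0.3):
    """Apply the consensus rule used in the paper: M > 3.5 and CV = SD/M < 0.3."""
    m = scores.mean()
    sd = scores.std(ddof=1)  # sample standard deviation
    cv = sd / m
    return m, cv, (m > mean_min) and (cv < cv_max)

for name, scores in ratings.items():
    m, cv, keep = delphi_screen(scores)
    print(f"{name}: M={m:.2f}, CV={cv:.2f}, retained={keep}")
```

With these illustrative numbers, "Navigation" passes both thresholds while "Terminology" fails the mean criterion, mirroring how weak indicators were dropped between rounds.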
Ultimately, after two rounds of the Delphi survey and subsequent analysis, the indicators obtained are as shown in Table 3.

3.2. Dimension Reduction and Naming of Usability Indicators

Before conducting an Exploratory Factor Analysis (EFA), it is essential to assess the validity and reliability of the data obtained from the online survey to ensure their suitability. A Kaiser–Meyer–Olkin (KMO) value greater than 0.7 and Bartlett’s test of sphericity significance value less than 0.05 indicate good validity [82]. A Cronbach’s alpha coefficient between 0.7 and 0.95 suggests good reliability of the scale [85,86].
Consequently, in this study, we calculated the KMO value, Bartlett’s test of sphericity, and Cronbach’s alpha coefficient for the survey indicators using SPSS 27.0, as shown in Table 4. The results yielded a KMO of 0.923, Bartlett’s test of sphericity significance < 0.05, and a Cronbach’s alpha of 0.923. These results confirmed that the survey data were both reliable and valid, making them suitable for EFA analysis.
In the Exploratory Factor Analysis (EFA), the criteria for factor extraction were set such that only factors with eigenvalues (λ) greater than 1 were considered, and any factor composed solely of a single item was excluded. If the loading difference between two items on the same factor was less than 0.05, one of the items was removed and the analysis was re-run. Items associated with a factor with a communality of less than 0.4 or a maximum loading of less than 0.35 were also excluded [87]. After multiple rounds of selection and rotation, four common factors with eigenvalues greater than 1 were ultimately identified, and 24 indicators related to the usability of medical websites were retained.
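The eigenvalue (λ > 1) extraction rule above can be illustrated with a small numpy sketch. The study ran EFA in SPSS on the 300 real responses; the synthetic dataset, seed, and four-factor structure below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for the survey data: 24 Likert-style items driven by
# 4 latent factors (illustrative only, not the study's actual responses).
n_respondents, n_items, n_factors = 300, 24, 4
loadings = rng.normal(0, 1, (n_items, n_factors))
factors = rng.normal(0, 1, (n_respondents, n_factors))
items = factors @ loadings.T + rng.normal(0, 0.5, (n_respondents, n_items))

# Kaiser criterion: retain factors whose eigenvalue of the item
# correlation matrix exceeds 1.
corr = np.corrcoef(items, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)[::-1]  # sorted descending
retained = int((eigenvalues > 1).sum())
print("top eigenvalues:", np.round(eigenvalues[:6], 2))
print("factors retained:", retained)
```

A useful sanity check is that the eigenvalues of a correlation matrix always sum to the number of items (the matrix trace), so each eigenvalue can be read directly as a share of total variance explained.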
The aforementioned methodology determined the proportion of each factor in the total variance, as shown in Table 5. The scree plot illustrated in Figure 2 further elucidates the contribution of each factor to the total variance. The rotated component matrix in Table 6 provides a detailed depiction of the correlations between the extracted factors (such as B1, B2, etc.), and the indicators (such as Effectiveness, Convenience, etc.).
Based on the results of the rotated component matrix and the characteristics of the indicators in each dimension, we denoted the reduced common factors as follows:
The first group of common factors, B1, includes the following indicators: Effectiveness, Convenience, Usability, Efficiency, Errors, Satisfaction, and Learnability. These indicators primarily measure the basic experience and efficacy of users when using products or services. They relate to the fundamental functions of the product and are, thus, named Basic Performance.
The second group of common factors, B2, comprises the following indicators: Feedback, Navigation, Screen Design and Layout, Consistency, User-Centered, and Interactivity. These indicators focus on the quality of product design and aspects of user interaction and are, therefore, named Design and Interface.
The third group of common factors, B3, includes the following indicators: Accessibility, Remote Presentation, Findability, Unnecessariness, Controllability, and Responsiveness. These indicators focus on the technical and functional aspects of the product, ensuring that technological support meets user needs; hence, they are named Features and Technology.
The fourth group of common factors, B4, contains the following indicators: Readability, Credibility, Privacy Protection, Content Relevance, and Property Protection. These indicators relate to the trust and security users feel towards the product or service, including the protection of user information, the relevance, and the accuracy of content, and are, thus, named Trust and Security.

3.3. Calculation of Usability Metric Weights

3.3.1. Building the Hierarchical Model

The entire hierarchical model is divided into three levels: the overall objective layer, the criterion layer, and the solution layer, as shown in Table 7.

3.3.2. Constructing the Judgment Matrix

The hierarchical model was input into the YAAHP 10.1 (https://www.metadecsn.com/yaahp/, accessed on 30 April 2024) software to conduct verification based on the AHP hierarchical model. Following the verification of the hierarchical model, a judgment questionnaire was developed. The scoring criteria for the questionnaire are shown in Table 8.
After scoring, the results of the questionnaire, which was completed by 15 experts, were converted into a judgment matrix. This matrix was used to conduct pairwise comparisons of the various indicators at each level. The method for constructing the judgment matrix is as follows:
A = \left( a_{ij} \right)_{n \times n} =
\begin{pmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{n1} & a_{n2} & \cdots & a_{nn}
\end{pmatrix}
In the matrix, a_{ij} denotes the outcome of comparing the significance of indicators i and j within the same subgroup, where a_{ij} > 0, a_{ij} = 1/a_{ji}, and a_{ii} = 1 for i, j = 1, 2, 3, …, n, with n representing the total number of indicators compared within A.
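These matrix properties can be sketched by building a judgment matrix from its upper-triangle comparisons; the helper below is our own illustrative function (not part of the study's toolchain), fed with the criteria-layer comparisons reported in Table 10:

```python
import numpy as np

def judgment_matrix(upper, n):
    """Build an n×n AHP judgment matrix from the upper-triangle comparisons
    a_ij (i < j), enforcing a_ii = 1 and the reciprocal rule a_ji = 1/a_ij."""
    A = np.eye(n)
    k = 0
    for i in range(n):
        for j in range(i + 1, n):
            A[i, j] = upper[k]
            A[j, i] = 1.0 / upper[k]
            k += 1
    return A

# Upper triangle of the criteria-layer matrix (Table 10):
# B1/B2 = 7, B1/B3 = 3, B1/B4 = 1/3, B2/B3 = 1/5, B2/B4 = 1/8, B3/B4 = 1/5
A = judgment_matrix([7, 3, 1/3, 1/5, 1/8, 1/5], 4)
assert np.allclose(A * A.T, np.ones((4, 4)))  # reciprocity: a_ij * a_ji = 1
print(A)
```

Constructing only the upper triangle and deriving the rest guarantees the reciprocity and unit-diagonal conditions by construction, which is how AHP tools typically validate expert questionnaires.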

3.3.3. Calculation of Weights

By standardizing the judgment matrix through its eigenvector computation, we progressively determined the relative importance of each component. The weight values accumulated at each hierarchical level were measured in relation to the overarching objective and were calculated progressively from the upper to lower tiers. Within the framework of a layered decision-making process, the weight designated to each level underpins the assessment of the elements’ relative significance within that level [88].
The procedure for computing the weights at each level is outlined as follows:
(1)
Calculate the product M_i of the indicators in each row of the judgment matrix, where n is the order of the judgment matrix (the total number of indicators it compares):
M_i = \prod_{j=1}^{n} a_{ij} \quad (i = 1, 2, \ldots, n)
(2)
Calculate the nth root of M_i:
W_i = \sqrt[n]{M_i} \quad (i = 1, 2, \ldots, n)
(3)
Normalize W_i to obtain the eigenvector component \omega_i:
\omega_i = \frac{W_i}{\sum_{j=1}^{n} W_j} \quad (i = 1, 2, \ldots, n)
(4)
The formula for the maximum eigenvalue of the judgment matrix is as follows:
\lambda_{\max} = \sum_{i=1}^{n} \frac{(A\omega)_i}{n\,\omega_i}
(5)
Consistency Test
The Consistency Ratio (CR) is defined as the ratio of the Consistency Index (CI) of the judgment matrix to the Random Consistency Index (RI). The judgment matrix is deemed consistent if the CR is less than 0.1. If the CR exceeds 0.1, the judgment matrix needs to be reconstructed [89].
The formula for calculating the CR is as follows:
\mathrm{CI} = \frac{\lambda_{\max} - n}{n - 1}
\mathrm{CR} = \frac{\mathrm{CI}}{\mathrm{RI}}
where the RI value in the equation is based on Table 9.
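The weight and consistency computations can be sketched with the geometric-mean (root) method applied to the criteria-layer matrix from Table 10. Because the study computed weights in YAAHP, the values below only approximate the published ones, and we assume Saaty's standard RI table (RI = 0.90 for n = 4):

```python
import numpy as np

def ahp_weights(A):
    """Root (geometric-mean) method: M_i = prod_j a_ij, W_i = M_i^(1/n),
    then normalize W to obtain the weight vector."""
    n = A.shape[0]
    W = np.prod(A, axis=1) ** (1.0 / n)
    return W / W.sum()

def consistency(A, w):
    """Compute lambda_max, CI, and CR; RI values assumed from Saaty's table."""
    n = A.shape[0]
    lam_max = float(np.mean((A @ w) / w))
    CI = (lam_max - n) / (n - 1)
    RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}[n]
    return lam_max, CI, CI / RI

# Criteria-layer judgment matrix from Table 10 (rows/cols: B1, B2, B3, B4)
A = np.array([
    [1,   7, 3,   1/3],
    [1/7, 1, 1/5, 1/8],
    [1/3, 5, 1,   1/5],
    [3,   8, 5,   1  ],
])
w = ahp_weights(A)
lam, CI, CR = consistency(A, w)
print("weights:", np.round(w, 4))  # close to the published 0.2710, 0.0442, 0.1355, 0.5494
print("lambda_max:", round(lam, 4), "CR:", round(CR, 4))  # CR < 0.1 -> acceptable
```

Running this yields weights within about 0.01 of Table 10 and a CR below the 0.1 threshold, matching the paper's conclusion that the matrix is acceptably consistent.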
The Analytic Hierarchy Process (AHP) utilizes Saaty’s 1–9 scale for pairwise comparisons. Accordingly, we first organize the data scored by 15 experts using Saaty’s rating scale into Table 10, Table 11, Table 12, Table 13, Table 14 and Table 15. The evaluation method can be illustrated using the primary indicators B1 and B2 from Table 10. The ratio of the B1 indicator on the vertical axis to the B1 indicator on the horizontal axis is 1 since they are the same indicator. The ratio of the B1 indicator on the vertical axis to the B2 indicator on the horizontal axis is 7, indicating that B1 is significantly more important when compared to B2. Conversely, the ratio of the B2 indicator on the vertical axis to the B1 indicator on the horizontal axis is 1/7, which also indicates that B2 is significantly less important. Similarly, in pairwise comparisons of B2 and B3 indicators, the ratio of the B2 indicator on the vertical axis to the B3 indicator on the horizontal axis is 1/5, showing that B3 is more important than B2. By analogy, all pairwise comparison data in the table are derived from experts’ evaluations of the importance of each pair of indicators using Saaty’s rating scale. Finally, based on the experts’ scores, Formulas (2)–(7) are applied to determine the weights at each level, which are then annotated in Table 10, Table 11, Table 12, Table 13, Table 14 and Table 15.
According to the above rules, the criterion layer indicator weights are as follows:
Table 10. Weights of the criteria layer indicators data.

A  | B1  | B2 | B3  | B4  | Wi     | CR     | λmax
B1 | 1   | 7  | 3   | 1/3 | 0.2710 | 0.0772 | 4.2063
B2 | 1/7 | 1  | 1/5 | 1/8 | 0.0442 |        |
B3 | 1/3 | 5  | 1   | 1/5 | 0.1355 |        |
B4 | 3   | 8  | 5   | 1   | 0.5494 |        |
According to the above rules, the solution layer indicator weights are determined as follows:
Table 11. Weights of the solution layer under the B1 indicator.

B1 | C1  | C2  | C3  | C4  | C5 | C6 | C7  | Wi     | CR     | λmax
C1 | 1   | 5   | 4   | 4   | 4  | 3  | 3   | 0.3440 | 0.0892 | 7.7278
C2 | 1/5 | 1   | 1/5 | 1/2 | 3  | 2  | 1/3 | 0.0760 |        |
C3 | 1/4 | 5   | 1   | 2   | 3  | 5  | 1/2 | 0.1785 |        |
C4 | 1/4 | 2   | 1/2 | 1   | 4  | 3  | 2   | 0.1479 |        |
C5 | 1/4 | 1/3 | 1/3 | 1/4 | 1  | 1  | 1/3 | 0.0482 |        |
C6 | 1/3 | 1/2 | 1/5 | 1/3 | 1  | 1  | 1/3 | 0.0533 |        |
C7 | 1/3 | 3   | 2   | 1/2 | 3  | 3  | 1   | 0.1521 |        |
Table 12. Weights of the solution layer under the B2 indicator.

B2  | C8  | C9  | C10 | C11 | C12 | C13 | Wi     | CR     | λmax
C8  | 1   | 1   | 3   | 3   | 1/7 | 1   | 0.1172 | 0.0657 | 6.4142
C9  | 1   | 1   | 3   | 2   | 1/8 | 3   | 0.1295 |        |
C10 | 1/3 | 1/3 | 1   | 1/3 | 1/8 | 1/2 | 0.0410 |        |
C11 | 1/3 | 1/2 | 3   | 1   | 1/6 | 1/2 | 0.0720 |        |
C12 | 7   | 8   | 8   | 6   | 1   | 7   | 0.5529 |        |
C13 | 1   | 1/3 | 2   | 2   | 1/7 | 1   | 0.0873 |        |
Table 13. Weights of the solution layer under the B3 indicator.

B3  | C14 | C15 | C16 | C17 | C18 | C19 | Wi     | CR     | λmax
C14 | 1   | 1/2 | 1/5 | 1/3 | 1/4 | 1/2 | 0.0503 | 0.0931 | 6.5866
C15 | 2   | 1   | 1/3 | 1/3 | 1/2 | 2   | 0.0970 |        |
C16 | 5   | 3   | 1   | 5   | 5   | 6   | 0.4272 |        |
C17 | 3   | 3   | 1/5 | 1   | 4   | 6   | 0.2352 |        |
C18 | 4   | 2   | 1/5 | 1/4 | 1   | 2   | 0.1282 |        |
C19 | 2   | 1/2 | 1/6 | 1/6 | 1/2 | 1   | 0.0620 |        |
Table 14. Weights of the solution layer under the B4 indicator.

B4  | C20 | C21 | C22 | C23 | C24 | Wi     | CR     | λmax
C20 | 1   | 1/7 | 1/7 | 1/3 | 1/8 | 0.0358 | 0.0323 | 5.1445
C21 | 7   | 1   | 2   | 6   | 1   | 0.3370 |        |
C22 | 7   | 1/2 | 1   | 5   | 1/2 | 0.2172 |        |
C23 | 3   | 1/6 | 1/5 | 1   | 1/6 | 0.0653 |        |
C24 | 8   | 1   | 2   | 6   | 1   | 0.3447 |        |
Considering that each indicator at a lower level operates within a framework set by a higher level, assessing the relative values of weights within a single layer alone is insufficient. To calculate the overall weights, we adopted a method from the literature [90], which involves multiplying the weights of lower-level indicators by the weights of their respective higher-level indicators. This approach determines the relative importance of each factor in the overall decision-making process. The formula for calculating the overall weights is as follows:
Let Wi represent the weight of a primary indicator, and wij represent the weight of the jth secondary indicator under the ith primary indicator. The composite weight OWij for the jth secondary indicator can be calculated using the formula below:
OW_{ij} = W_i \times w_{ij}
where i is the index for the primary indicators, and j is the index for the secondary indicators given i.
Based on Formula (8), the overall weights for all criteria on the solution level were calculated. The single-layer weight values and the total weight values for each are compiled in Table 15.
Table 15. Weight summary.

Criterion Layer | Weights | Rank | Solution Layer | Weights | Rank | Overall Weights | Rank
B1 | 0.2710 | 2 | C1  | 0.3440 | 1 | 0.0932 | 4
   |        |   | C2  | 0.0760 | 5 | 0.0206 | 12
   |        |   | C3  | 0.1785 | 2 | 0.0484 | 6
   |        |   | C4  | 0.1479 | 4 | 0.0401 | 8
   |        |   | C5  | 0.0482 | 7 | 0.0131 | 17
   |        |   | C6  | 0.0533 | 6 | 0.0144 | 15
   |        |   | C7  | 0.1521 | 3 | 0.0412 | 7
B2 | 0.0442 | 4 | C8  | 0.1172 | 3 | 0.0052 | 21
   |        |   | C9  | 0.1295 | 2 | 0.0057 | 20
   |        |   | C10 | 0.0410 | 6 | 0.0018 | 24
   |        |   | C11 | 0.0720 | 5 | 0.0032 | 23
   |        |   | C12 | 0.5529 | 1 | 0.0244 | 11
   |        |   | C13 | 0.0873 | 4 | 0.0039 | 22
B3 | 0.1355 | 3 | C14 | 0.0503 | 6 | 0.0068 | 19
   |        |   | C15 | 0.0970 | 4 | 0.0131 | 16
   |        |   | C16 | 0.4272 | 1 | 0.0579 | 5
   |        |   | C17 | 0.2352 | 2 | 0.0319 | 10
   |        |   | C18 | 0.1282 | 3 | 0.0174 | 14
   |        |   | C19 | 0.0620 | 5 | 0.0084 | 18
B4 | 0.5494 | 1 | C20 | 0.0358 | 5 | 0.0197 | 13
   |        |   | C21 | 0.3370 | 2 | 0.1852 | 2
   |        |   | C22 | 0.2172 | 3 | 0.1194 | 3
   |        |   | C23 | 0.0653 | 4 | 0.0359 | 9
   |        |   | C24 | 0.3447 | 1 | 0.1894 | 1
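Formula (8) can be verified directly: multiplying each B4 solution-layer weight (Table 14) by the B4 criterion weight (Table 10) reproduces the overall weights in Table 15 up to rounding. The indicator names attached to C20–C24 follow the Trust and Security grouping in Section 3.2:

```python
# Composite weights OW_ij = W_i * w_ij for the Trust and Security (B4) branch.
W_B4 = 0.5494  # criterion-layer weight of B4 (Table 10)

# Solution-layer weights under B4 (Table 14); names mapped per Section 3.2.
solution_weights = {
    "C20 Readability":         0.0358,
    "C21 Credibility":         0.3370,
    "C22 Privacy Protection":  0.2172,
    "C23 Content Relevance":   0.0653,
    "C24 Property Protection": 0.3447,
}

overall = {name: round(W_B4 * w, 4) for name, w in solution_weights.items()}
for name, ow in overall.items():
    print(f"{name}: {ow}")
```

The products land within one rounding unit of Table 15 (e.g., 0.5494 × 0.3447 ≈ 0.1894 for Property Protection, the top-ranked indicator overall), confirming how the layer-wise weights compose into the global ranking.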

4. Results and Discussion

4.1. Criterion Layer Weights

As shown in Figure 3, Trust and Security (B4, 0.5494) emerges as the most critical indicator within the criterion layer, followed by Basic Performance (B1, 0.2710), Features and Technology (B3, 0.1355), and Design and Interface (B2, 0.0442). This reflects the high level of concern users have regarding the protection of personal information and health data when using medical websites. Additionally, Basic Performance and Features and Technology ensure that the website both meets the basic operational needs of users and offers technological functionality and services. Although Design and Interface carries a lower weight, it still plays a significant role in enhancing user experience, boosting user satisfaction, and establishing brand identity. Therefore, in the design and development of medical websites, there should be a greater emphasis on protecting user health information and providing an efficient and convenient service experience, rather than solely focusing on visual appeal.

4.2. Solution Layer Weights

As illustrated in Figure 4, within the Basic Performance (B1) level, Effectiveness (C1, 0.3440) holds the highest weight, followed by Easy to use (C3, 0.1785), Learnability (C7, 0.1521), and Efficiency (C4, 0.1479). These are followed by Convenience (C2, 0.0760), Satisfaction (C6, 0.0533), and Error (C5, 0.0482). These rankings indicate that, within Basic Performance, users primarily focus on whether the platform can accurately and effectively perform its intended functions. Users also care about how easily the platform can be used and learned, and about the efficiency of the service. Therefore, when optimizing the basic functionality of a medical website, priority should be given to enhancing the platform's effectiveness, ease of use, and learnability.
As shown in Figure 5, within the Design and Interface (B2) level, User-Centered (C12, 0.5529) carries the highest weight, followed by Navigation (C9, 0.1295), Feedback (C8, 0.1172), Interactivity (C13, 0.0873), Consistency (C11, 0.0720), and finally, Screen Design and Layout (C10, 0.0410). These rankings highlight that, in Design and Interface, users are most concerned with whether the interface is user-centered, i.e., whether the website provides an interactive experience that meets user needs and expectations. Good navigation and timely feedback are also crucial to usability, playing a significant role in ensuring a smooth and intuitive user experience. Therefore, when optimizing the design and interface of a medical website, user-centered design principles should be prioritized, along with good navigation and effective interactivity, to enhance the overall user experience.
As depicted in Figure 6, within the Features and Technology (B3) level, Findability (C16, 0.4272) holds the highest weight, followed by Unnecessariness (C17, 0.2352), Controllability (C18, 0.1282), Remote Presentation (C15, 0.0970), Responsiveness (C19, 0.0620), and finally, Accessibility (C14, 0.0503). These rankings indicate that, in Features and Technology, users are most concerned about their ability to quickly and accurately find the information and services they need. Users also care about the absence of unnecessary functions and about the degree of control they have over platform operations, such as adjusting font size or selecting specific operational processes. Therefore, when optimizing the features and technology of a medical website, priority should be given to enhancing the findability of information and reducing interface redundancy.
As shown in Figure 7, within the Trust and Security B4 level, Property Protection (C24, 0.3447) has the highest weight, followed by Credibility (C21, 0.3370), Privacy Protection (C22, 0.2172), Content Relevance (C23, 0.0653), and finally, Readability (C20, 0.0358). These rankings demonstrate that, in Trust and Security, users are most concerned about the protection of their assets. Additionally, the credibility of the information provided by the website and privacy protection are key factors that users consider, which are directly related to the site’s reputation, user trust, and privacy. Therefore, when optimizing trust and security on a medical website, priority should be given to strengthening measures for property and privacy protection. It is also essential to ensure the credibility and relevance of the medical information provided, to enhance the overall trust and security felt by users.

4.3. Overall Weights

As shown in Figure 8, in the total weight analysis, Property Protection (C24, 0.1894) is the most important factor, indicating that ensuring the security of users' financial assets is the most significant concern for medical websites. It is followed by Credibility (C21, 0.1852), which underscores the critical importance of establishing and maintaining users' trust in medical websites. Privacy Protection (C22, 0.1194), ranking third, highlights the importance of safeguarding users' personal data, a component that medical websites cannot overlook. Effectiveness (C1, 0.0932) and Findability (C16, 0.0579), ranking fourth and fifth, stress the importance of delivering medical information and services efficiently and making them easy for users to reach: Effectiveness concerns whether services meet users' actual needs, while Findability concerns whether users can easily locate the information and features they need. Lower-ranking indicators, such as Screen Design and Layout (C10, 0.0018), Consistency (C11, 0.0032), and Interactivity (C13, 0.0039), received the lowest weights, suggesting that, although they have some impact on the overall user experience, they are not the primary factors users consider when assessing the usability of medical websites.
These findings provide valuable insights for the future design and improvement of online medical platforms. To enhance user satisfaction and overall website usability, developers and designers should focus on these high-weight indicators. Balancing and optimizing these factors will help create safer, more trustworthy, and user-friendly medical websites.
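For readers who wish to reproduce this kind of weighting, the sketch below illustrates the core AHP computation in Python with NumPy. The 4 × 4 judgment matrix is hypothetical (the experts' actual pairwise matrices are not reproduced in this section); it merely mimics the published ordering B4 > B1 > B3 > B2. The priority vector is the normalized principal eigenvector, and Saaty's consistency ratio (CR) should stay below 0.10.

```python
import numpy as np

# Hypothetical 4x4 reciprocal judgment matrix for B1..B4 on
# Saaty's 1-9 scale (illustrative, not the paper's data).
A = np.array([
    [1.0, 5.0, 2.0, 1 / 3],
    [1 / 5, 1.0, 1 / 3, 1 / 9],
    [1 / 2, 3.0, 1.0, 1 / 5],
    [3.0, 9.0, 5.0, 1.0],
])

# Priority vector = normalized principal eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Saaty's consistency check: CI = (lambda_max - n) / (n - 1),
# CR = CI / RI, acceptable when CR < 0.10.
n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)
RI = 0.90                      # Saaty's random index for n = 4
CR = CI / RI

print(np.round(w, 3), round(CR, 3))
```

With a perfectly consistent matrix, lambda_max equals n and CR is zero; real expert judgments typically yield small positive CR values.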

5. Conclusions

In this study, we aimed to develop and rank a framework of usability indicators for medical websites, grounded in user experience. Initially, 30 potential usability indicators related to user experience, usability, and medical websites were selected from the literature. Through two rounds of Delphi expert panel surveys, 24 key indicators were selected. These indicators were then reduced and categorized into four primary categories using Exploratory Factor Analysis and were subsequently named. The Analytic Hierarchy Process (AHP) was applied to hierarchically organize these indicators and calculate their weights, reflecting the level of user attention to each indicator during usage. The results of this study serve as a framework for evaluating the usability of medical websites, which is crucial for assessing and improving their usability.
To address the rapidly changing usability trends and user needs in the digital healthcare field, we also considered relevant dynamic demands while developing the usability evaluation criteria. Future research can examine the current literature in the online healthcare domain to identify and summarize the latest indicators that align with the present needs of the healthcare field and its users. On this basis, reasonable additions and adjustments can be made to the usability evaluation system summarized in this study. This approach keeps the evaluation criteria practical and authoritative, allowing the system to adapt to the ever-changing digital healthcare environment and user expectations.
Nevertheless, our study has certain limitations. The number of experts involved in the Delphi surveys was limited, which may have yielded an insufficiently diverse range of opinions. Moreover, the usability assessment indicators for medical websites were selected solely from a review of the literature. Future research should therefore involve a larger expert panel and expand the sources from which indicators are drawn, in order to integrate a wider spectrum of opinions and indicators more comprehensively and systematically.

Author Contributions

Conceptualization, X.L.; methodology, X.L.; software, X.L.; validation, X.L.; formal analysis, X.L.; investigation, X.L.; resources, X.L.; data curation, X.L.; writing—original draft preparation, X.L.; writing—review and editing, X.L.; visualization, X.L.; supervision, K.P.; project administration, K.P.; funding acquisition, X.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The raw data used during the current study are available from the first author upon reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest.

Figure 1. Flow chart of our methodology.
Figure 2. Scree plot.
Figure 3. Criterion layer weights.
Figure 4. B1 level weights.
Figure 5. B2 level weights.
Figure 6. B3 level weights.
Figure 7. B4 level weights.
Figure 8. Overall weights.
Table 1. Indicators after Delphi questionnaire.

Compilation of Usability Indicators for Medical Websites

| Indicator | Description | Indicator | Description |
| Effectiveness | Whether the website helps users successfully achieve their goals | Content Relevance | Whether the content is relevant to users' needs and searches |
| Learnability | How quickly a new user can learn to use the website's functions | Screen Design and Layout | How information is displayed on the screen |
| Efficiency | The time and resources required to complete tasks | Readability | Ease of understanding text, appropriate formatting |
| Controllability | The degree of control users have over website operations | Terminology | Understandability of professional medical terms used on the website |
| Memorability | Whether users can easily remember how to use the website | Responsiveness | Response speed and layout adaptability of the website on different devices |
| Task Consistency | Whether different parts of the website maintain task consistency | Feedback | Quality of system feedback after user actions |
| Error | Frequency and severity of errors encountered while using the website | Convenience | Convenience and ease of use of the website |
| Findability | Whether users can easily find the information they need | Consistency | Consistency of interfaces and operations across different pages |
| Satisfaction | Users' satisfaction with using the website | Privacy Protection | How the website protects user information |
| Accessibility | Addressing the needs of people with disabilities and patients when accessing the webpage | Remote Presentation | Ability of the website to support remote services |
| User-Centered | Whether the website design considers users' needs and experiences | Property Protection | How the website safeguards users' personal information and medical conditions |
| Interactivity | Facilitates effective communication between users and the platform | Customization | Providing different experiences based on individual needs and preferences |
| Easy to use | Ease of use of the website for users | Download Speed | Speed at which website content is downloaded to the user's device |
| Credibility | Reliability and reputation of the website and its medical information | Unnecessariness | Whether the website has unnecessary content or features |
| Navigation | How smoothly users can move between systems and modules | Sensory Forms | Whether the web page design is visually appealing |
Table 2. Data from two rounds of the Delphi survey.

N = 20; R1/R2 = Delphi rounds one and two; "-" marks indicators eliminated after round one.

| Indicator | M (R1) | SD (R1) | CV (R1) | M (R2) | SD (R2) | CV (R2) |
| Effectiveness | 4.700 | 0.470 | 0.100 | 4.250 | 0.444 | 0.104 |
| Learnability | 4.750 | 0.444 | 0.093 | 4.250 | 0.444 | 0.104 |
| Efficiency | 4.600 | 0.503 | 0.109 | 4.150 | 0.366 | 0.088 |
| Memorability | 2.600 | 0.995 | 0.383 | - | - | - |
| Error | 3.650 | 0.813 | 0.223 | 4.100 | 0.447 | 0.109 |
| Satisfaction | 4.400 | 0.503 | 0.114 | 4.350 | 0.489 | 0.112 |
| User-Centered | 4.900 | 0.308 | 0.063 | 4.600 | 0.503 | 0.109 |
| Easy to use | 4.200 | 0.410 | 0.098 | 4.100 | 0.447 | 0.109 |
| Navigation | 4.250 | 0.550 | 0.129 | 4.000 | 0.324 | 0.081 |
| Screen Design and Layout | 4.150 | 0.745 | 0.180 | 3.750 | 0.550 | 0.147 |
| Terminology | 4.100 | 0.447 | 0.109 | - | - | - |
| Feedback | 4.150 | 0.745 | 0.180 | 3.800 | 0.410 | 0.108 |
| Consistency | 4.250 | 0.550 | 0.129 | 4.000 | 0.459 | 0.115 |
| Remote Presentation | 3.200 | 0.616 | 0.193 | 3.650 | 0.489 | 0.134 |
| Customization | 1.750 | 0.851 | 0.486 | - | - | - |
| Sensory Forms | 3.200 | 0.768 | 0.240 | 2.350 | 0.671 | 0.286 |
| Unnecessariness | 3.900 | 0.308 | 0.079 | 3.950 | 0.224 | 0.057 |
| Controllability | 4.050 | 0.394 | 0.097 | 3.900 | 0.447 | 0.115 |
| Task Consistency | 2.300 | 1.031 | 0.448 | - | - | - |
| Findability | 4.650 | 0.489 | 0.105 | 4.250 | 0.550 | 0.129 |
| Accessibility | 4.650 | 0.489 | 0.105 | 4.100 | 0.553 | 0.135 |
| Interactivity | 3.850 | 0.875 | 0.227 | 4.000 | 0.324 | 0.081 |
| Credibility | 4.900 | 0.308 | 0.063 | 4.900 | 0.308 | 0.063 |
| Content Relevance | 4.350 | 0.489 | 0.112 | 4.050 | 0.224 | 0.055 |
| Readability | 4.650 | 0.489 | 0.105 | 4.100 | 0.308 | 0.075 |
| Responsiveness | 3.750 | 0.910 | 0.243 | 3.500 | 0.607 | 0.173 |
| Convenience | 3.950 | 0.394 | 0.100 | 3.950 | 0.224 | 0.057 |
| Privacy Protection | 4.950 | 0.224 | 0.045 | 4.850 | 0.366 | 0.075 |
| Property Protection | 4.950 | 0.224 | 0.045 | 4.900 | 0.308 | 0.063 |
| Download Speed | 1.750 | 1.020 | 0.583 | - | - | - |
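The screening decisions behind Table 2 rest on each indicator's mean (M), standard deviation (SD), and coefficient of variation (CV = SD / M). The sketch below recomputes these statistics for a hypothetical set of 20 five-point expert ratings chosen to reproduce the round-one Effectiveness row (M = 4.700, SD = 0.470, CV = 0.100); the retention thresholds shown are illustrative assumptions, not the paper's stated cutoffs.

```python
import statistics

# Hypothetical ratings: 14 experts scored 5 and 6 scored 4, which
# reproduces M = 4.700, SD = 0.470, CV = 0.100 from Table 2.
ratings = [5] * 14 + [4] * 6

M = statistics.mean(ratings)
SD = statistics.stdev(ratings)   # sample SD (n - 1 denominator)
CV = SD / M

# Illustrative retention rule (assumed, not from the paper):
# keep indicators with a high mean and low expert dispersion.
retained = M >= 3.5 and CV <= 0.25
print(round(M, 3), round(SD, 3), round(CV, 3), retained)
```

Note that the published SD values match the sample (n − 1) formula, which is why `statistics.stdev` rather than `statistics.pstdev` is used here.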
Table 3. Usability indicators for medical websites.

| Effectiveness | Learnability | Efficiency | Error | Satisfaction | User-Centered |
| Easy to use | Navigation | Screen Design and Layout | Feedback | Consistency | Remote Presentation |
| Unnecessariness | Controllability | Findability | Accessibility | Interactivity | Credibility |
| Content Relevance | Readability | Responsiveness | Convenience | Privacy Protection | Property Protection |
Table 4. Results of the KMO and Bartlett's tests.

| Test | Test Value |
| KMO | 0.924 |
| Bartlett's Approximate Chi-Square | 4255.067 |
| Degrees of Freedom (df) | 276 |
| Significance | 0.00 |
| Cronbach's Alpha | 0.921 |
Table 5. Total variance explained.
Table 5. Total variance explained.
| Element | Initial Total | Initial % of Variance | Initial Cumulative % | Extraction Total | Extraction % of Variance | Extraction Cumulative % | Rotation Total | Rotation % of Variance | Rotation Cumulative % |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 8.547 | 35.612 | 35.612 | 8.547 | 35.612 | 35.612 | 4.472 | 18.632 | 18.632 |
| 2 | 2.683 | 11.177 | 46.789 | 2.683 | 11.177 | 46.789 | 4.075 | 16.978 | 35.611 |
| 3 | 2.655 | 11.063 | 57.852 | 2.655 | 11.063 | 57.852 | 4.072 | 16.968 | 52.579 |
| 4 | 2.264 | 9.433 | 67.285 | 2.264 | 9.433 | 67.285 | 3.529 | 14.706 | 67.285 |
| 5 | 0.614 | 2.560 | 69.845 | | | | | | |
| 6 | 0.589 | 2.453 | 72.297 | | | | | | |
| 7 | 0.565 | 2.356 | 74.653 | | | | | | |
| 8 | 0.508 | 2.117 | 76.770 | | | | | | |
| 9 | 0.490 | 2.044 | 78.813 | | | | | | |
| 10 | 0.460 | 1.917 | 80.731 | | | | | | |
| 11 | 0.435 | 1.814 | 82.545 | | | | | | |
| 12 | 0.419 | 1.748 | 84.293 | | | | | | |
| 13 | 0.408 | 1.699 | 85.992 | | | | | | |
| 14 | 0.391 | 1.629 | 87.621 | | | | | | |
| 15 | 0.384 | 1.601 | 89.222 | | | | | | |
| 16 | 0.362 | 1.508 | 90.731 | | | | | | |
| 17 | 0.340 | 1.415 | 92.145 | | | | | | |
| 18 | 0.316 | 1.318 | 93.463 | | | | | | |
| 19 | 0.310 | 1.290 | 94.754 | | | | | | |
| 20 | 0.286 | 1.192 | 95.946 | | | | | | |
| 21 | 0.270 | 1.125 | 97.070 | | | | | | |
| 22 | 0.263 | 1.096 | 98.166 | | | | | | |
| 23 | 0.225 | 0.936 | 99.102 | | | | | | |
| 24 | 0.216 | 0.898 | 100.000 | | | | | | |
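The four retained factors correspond to eigenvalues of the correlation matrix greater than 1 (the Kaiser criterion), and each "percentage of variance explained" is the eigenvalue divided by the number of variables. A sketch of that extraction step on a toy correlation matrix (not the study's 24-variable matrix):

```python
import numpy as np

def kaiser_extraction(R):
    """Return eigenvalues sorted descending, the per-component
    percentage of variance, and the count passing the Kaiser criterion."""
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]  # R is symmetric
    pct_variance = 100 * eigvals / R.shape[0]
    n_retained = int(np.sum(eigvals > 1.0))
    return eigvals, pct_variance, n_retained

# Toy 4-variable example with one strong common factor
R = np.array([[1.0, 0.7, 0.6, 0.1],
              [0.7, 1.0, 0.5, 0.1],
              [0.6, 0.5, 1.0, 0.1],
              [0.1, 0.1, 0.1, 1.0]])
eigvals, pct, k = kaiser_extraction(R)
print(k, np.round(pct.cumsum(), 1))  # cumulative percentages end at 100
```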
Table 6. Rotated component matrix.
| Element | B1 | B2 | B3 | B4 |
|---|---|---|---|---|
| Effectiveness: C1 | 0.812 | | | |
| Convenience: C2 | 0.770 | | | |
| Easy to use: C3 | 0.768 | | | |
| Efficiency: C4 | 0.763 | | | |
| Error: C5 | 0.757 | | | |
| Satisfaction: C6 | 0.745 | | | |
| Learnability: C7 | 0.739 | | | |
| Feedback: C8 | | 0.797 | | |
| Navigation: C9 | | 0.785 | | |
| Screen Design and Layout: C10 | | 0.782 | | |
| Consistency: C11 | | 0.781 | | |
| User-Centered: C12 | | 0.780 | | |
| Interactivity: C13 | | 0.776 | | |
| Accessibility: C14 | | | 0.821 | |
| Remote Presentation: C15 | | | 0.820 | |
| Findability: C16 | | | 0.801 | |
| Unnecessariness: C17 | | | 0.783 | |
| Controllability: C18 | | | 0.782 | |
| Responsiveness: C19 | | | 0.726 | |
| Readability: C20 | | | | 0.816 |
| Credibility: C21 | | | | 0.807 |
| Privacy Protection: C22 | | | | 0.797 |
| Content Relevance: C23 | | | | 0.789 |
| Property Protection: C24 | | | | 0.787 |
Extraction Method: Principal Component Analysis. Rotation Method: Varimax with Kaiser Normalization.
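The loadings above come from a varimax rotation, which orthogonally rotates the extracted loadings so that each variable loads strongly on a single factor. A compact numpy sketch of the standard SVD-based varimax iteration (the toy loading matrix is hypothetical; `gamma=1` is the usual varimax setting):

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Varimax rotation of a p x k factor-loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    criterion = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # Gradient of the varimax criterion, solved via SVD
        col_ss = np.diag(np.sum(rotated ** 2, axis=0))
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3 - (gamma / p) * rotated @ col_ss))
        rotation = u @ vt  # nearest orthogonal rotation
        new_criterion = np.sum(s)
        if new_criterion - criterion < tol:
            break
        criterion = new_criterion
    return loadings @ rotation

# Hypothetical 4-variable, 2-factor loading matrix
L = np.array([[0.8, 0.2],
              [0.7, 0.3],
              [0.2, 0.9],
              [0.1, 0.8]])
print(np.round(varimax(L), 3))
```

Because the rotation matrix is orthogonal, each variable's communality (row sum of squared loadings) is unchanged by the rotation.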
Table 7. AHP hierarchical decision-making framework.
| Overall Objective Layer | Criterion Layer | Solution Layer |
|---|---|---|
| Constructing Usability Indicators for Medical Websites (A) | B1 | C1, C2, C3, C4, C5, C6, C7 |
| | B2 | C8, C9, C10, C11, C12, C13 |
| | B3 | C14, C15, C16, C17, C18, C19 |
| | B4 | C20, C21, C22, C23, C24 |
Table 8. Judgment matrix scoring criteria.
Saaty’s 1–9 Scale Assignment Method
| Scale | Meaning |
|---|---|
| 1 | Indicators i and j are of equal importance. |
| 3 | Indicator i is slightly more important than indicator j. |
| 5 | Indicator i is moderately more important than indicator j. |
| 7 | Indicator i is strongly more important than indicator j. |
| 9 | Indicator i is absolutely more important than indicator j. |
| 2, 4, 6, 8 | The importance of the indicators falls between the adjacent scales above. |
| Reciprocal | If the comparison of factors i and j yields the judgment matrix entry Cij, then the comparison of factor j with i is Cji = 1/Cij. |
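In practice an expert fills in only the upper triangle of the judgment matrix with Saaty scores; the diagonal is 1 and the lower triangle follows from the reciprocal property. A sketch that builds a complete judgment matrix from upper-triangle entries (the example scores are hypothetical, not the study's actual judgments):

```python
import numpy as np

def judgment_matrix(upper):
    """Build an n x n reciprocal judgment matrix from a dict of
    upper-triangle Saaty scores keyed by (i, j) index pairs."""
    n = max(max(pair) for pair in upper) + 1
    A = np.eye(n)  # diagonal entries are 1 (equal importance)
    for (i, j), score in upper.items():
        A[i, j] = score
        A[j, i] = 1.0 / score  # reciprocal property: a_ji = 1 / a_ij
    return A

# Hypothetical pairwise comparisons among three criteria
A = judgment_matrix({(0, 1): 3, (0, 2): 5, (1, 2): 2})
print(A)
```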
Table 9. Random index (RI) value.
| n | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|
| RI Value | 0.52 | 0.89 | 1.12 | 1.26 | 1.36 | 1.41 | 1.46 | 1.49 |
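The RI values feed the standard AHP consistency check: λmax is the principal eigenvalue of the judgment matrix, CI = (λmax - n)/(n - 1), and CR = CI/RI, with CR < 0.10 taken as acceptable. A sketch that also derives the priority weights as the normalized principal eigenvector (the example matrix is consistent by construction, not taken from the study):

```python
import numpy as np

RI = {3: 0.52, 4: 0.89, 5: 1.12, 6: 1.26, 7: 1.36, 8: 1.41, 9: 1.46, 10: 1.49}

def ahp_weights_and_cr(A):
    """Priority weights (normalized principal eigenvector) and
    consistency ratio CR = CI / RI for a reciprocal judgment matrix."""
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    idx = np.argmax(eigvals.real)
    lam_max = eigvals.real[idx]
    w = np.abs(eigvecs[:, idx].real)
    w = w / w.sum()                  # normalize weights to sum to 1
    ci = (lam_max - n) / (n - 1)     # consistency index
    cr = ci / RI[n]                  # consistency ratio
    return w, cr

# Hypothetical 3 x 3 judgment matrix, fully consistent (6 = 3 * 2)
A = np.array([[1.0, 3.0, 6.0],
              [1 / 3, 1.0, 2.0],
              [1 / 6, 1 / 2, 1.0]])
w, cr = ahp_weights_and_cr(A)
print(np.round(w, 3), round(cr, 4))  # CR is near 0 for a consistent matrix
```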

Liu, X.; Park, K. Construction and Ranking of Usability Indicators for Medical Websites Based on Website User Experience. Appl. Sci. 2024, 14, 5465. https://doi.org/10.3390/app14135465

