Article

Emotional Value in Online Education: A Framework for Service Touchpoint Assessment

College of Design and Art, Shaanxi University of Science and Technology, Xi’an 710016, China
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(6), 4772; https://doi.org/10.3390/su15064772
Submission received: 22 February 2023 / Revised: 3 March 2023 / Accepted: 6 March 2023 / Published: 8 March 2023

Abstract

To enhance the online education service experience, this study treats user emotional valence as an evaluation variable and combines qualitative and quantitative research to establish how online education service touchpoints should be evaluated. First, the service interface of the system is deconstructed with an interactive touchpoint matrix, service evaluation indicators are defined across four aspects (visual guidance, learning resources, after-class evaluation, and interactive feedback), and an online education service touchpoint evaluation system is constructed. Second, taking Tencent Classroom as the research object, an online education service rating experiment is designed around the two dimensions of emotional valence and perceptual cognition. With the aid of a questionnaire survey and the analytic hierarchy process (AHP), a multidimensional evaluation of online education service touchpoints is carried out using learners' emotional pleasure, arousal, dominance, touchpoint satisfaction, and touchpoint importance as measurement indicators. Finally, the evaluation data are integrated, the results are presented through visual design, and conclusions on the assessment and optimization of online education service touchpoints are drawn. The study offers effective strategies and practical recommendations for increasing learning interest, user initiative, and enthusiasm in e-learning services.

1. Introduction

As the new norm for learning in the post-epidemic era, online education refers to individuals acquiring knowledge, enhancing skills, and expanding their perspectives over the Internet. It is an opportunity and a direction for transforming established education models [1,2,3]. Evaluation of online education services is the process of deconstructing, analyzing, and optimizing the online education system, which plays a crucial role in enhancing users’ learning efficiency and learning interest [4]. Since the online education service process involves multiple elements, such as platform interface, learning resources, and operating users, the key and prerequisite for improving online education services is to effectively sort and evaluate service elements, lock key opportunities, and clarify optimization directions [5,6].
Presently, domestic and international research on the evaluation of online education services can be divided into subjective and objective approaches [7,8,9]. Subjective evaluation relies mainly on the experience of education industry experts or designers for platform structuring and optimization and is prone to large errors. Objective evaluation methods are therefore frequently employed in research, most evidently in two areas: system analysis and user research.
  • System analysis is largely concerned with the interface design and functional module analysis of the online education platform, including the evaluation of the platform’s interaction experience and research on the platform’s virtual space. Nian et al. [10] designed a new mobile education platform by analyzing the emotional states of students and investigating the effect of mobile online education platforms in fostering students’ independent learning. Using questionnaires, Hu et al. [11] investigated the elements that influence students’ engagement in peer-assisted English learning (PAEL) based on online education platforms and evaluated students’ online learning process from two perspectives, surface participation and deep participation, concluding that online education platforms have issues including a low degree of knowledge structuring, poor interaction quality, weak interaction intensity, and an inactive collaboration atmosphere. Amara et al. [12] combined augmented reality and virtual reality technologies to construct a 3D interactive e-learning platform, offering a dynamic interactive experience that merges real and virtual worlds and helps boost students’ involvement in and satisfaction with online learning. Using questionnaire surveys, Liu et al. [13] examined potential user needs for the online scene experience, combined textual and image information, and focused on the differences between the online education platform and the offline education space to improve user satisfaction with the online education platform, with industrial design students serving as the sample population for the study.
  • User research focuses mostly on analyzing user behavior and needs and is more adaptable than system analysis. Typical research approaches include flow theory, distributed cognition, gamification design, and service co-creation. Based on the DeLone and McLean information systems (IS) success model and process theory, Li et al. [14] used MOOCs as the research object, constructed a model for optimizing system characteristics and process experience, and confirmed that flow experience and learning effects have a positive impact on continuance intention. Han et al. [15] contributed to enhancing the concentration and immersion of online learners by utilizing flow theory: they extracted nine factors from flow theory, including challenge, reaction speed, sense of control, and clear objectives, and used a structural equation model to examine the impact of these nine factors on user experience in the online education process. Yan et al. [16] introduced gamification theory to online education, focusing on the potential impact of differences in learning styles on learners’ behavioral intentions, and added “perceived learning tasks” as an external variable in the theoretical framework. Taking the CAID experimental course as the research object, Zhang et al. [17] optimized the online education user experience based on service co-creation theory and built an online flipped classroom model according to the peak-end rule, suggesting ways to enhance the online education service experience.
The preceding studies on online education services have partially compensated for the inadequacies of subjective evaluation methodologies. However, prior studies have centered on user behavior or explicitly stated needs, and there is a dearth of research on users’ emotional and psychological needs. Consequently, this paper proposes an online education service touchpoint evaluation method based on user emotional valence and attempts to combine psychological research with service evaluation to investigate the relationship between online education services and user emotional valence across multiple dimensions. Based on an analysis of the two dimensions of emotional valence and perceptual cognition, a five-dimensional evaluation model is established by integrating the user’s emotional pleasure, arousal, and dominance with touchpoint satisfaction and touchpoint importance. Then, using the model as a guide, an evaluation strategy and procedure are developed, and the evaluation of an online education service platform is completed.

2. Research Ideas and Framework

2.1. Theoretical Basis

As a direct outcome of user perception and experience of a service, emotion is a significant factor influencing user requirements and behavioral choices. Emotional valence is the evaluation of emotional attributes, representing the user’s interest in or rejection of a certain type of product or service, and can be divided into two categories: positive emotional experience and negative emotional experience [18,19]. Because multiple influencing factors are involved, such as user synesthesia, the external environment, and psychological effects, researchers have attempted to study user emotional valence through multiple approaches, including semantic scales, facial recognition, and physiological signal measurement [20,21,22]. Among these, the PAD emotion theory is one of the most widely discussed in academia.
The PAD emotion theory is a multidimensional emotion measurement model proposed by Mehrabian and Russell in 1974 [23], which holds that an individual’s emotional experience is not a single physiological response but rather a diverse emotional phenomenon. The theory divides human emotion into three basic dimensions: pleasure, arousal, and dominance. P stands for pleasure, representing the emotional ups and downs an individual experiences when receiving information; A stands for arousal, referring to the level of physiological activation triggered by information stimulation; and D stands for dominance, indicating the individual’s ability to control the information received.
The association between emotional valence and online education services is depicted in Figure 1. On the one hand, experience with online education services shows that interest guidance, thinking stimulation, and activity performance are significant aspects influencing users’ learning emotions [24]; these form a correlative link with the three aspects of the PAD emotion theory, being shaped by the user’s emotional ups and downs, level of cognitive activity, and capacity for information control. On the other hand, system drainage, touchpoint effect, and learning efficiency are the primary indicators of online education service evaluation, and they are interrelated with interest guidance, thinking stimulation, and activity performance via interface perception, content cognition, and knowledge acquisition. This demonstrates that user emotional valence can be incorporated into the evaluation of online education services.

2.2. PADSI Five-Dimensional Evaluation Model Construction

Based on the correlation between user emotional valence and online education service evaluation, user emotional pleasure, arousal, and dominance are included in the online education service evaluation index system as the emotional valence dimension. At the same time, by analogy with traditional service evaluation methods, touchpoint satisfaction and importance indicators are selected as the perceptual cognition dimension to jointly construct the PADSI five-dimensional evaluation model, as shown in Figure 2. The emotional valence dimension can be interpreted as follows: the user’s level of emotional pleasure represents the extent to which students are engaged in and benefit from their online learning experiences; the level of arousal represents the effectiveness with which students apply their imagination and initiative to their online coursework; and the level of dominance represents the extent to which students accept and internalize what they learn. In the perceptual cognition dimension, touchpoint importance indicates the logic and necessity of a given function, while user satisfaction indicates the degree to which students’ requirements are met by the online education platform.
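To make the structure of the model concrete, the following minimal Python sketch (our own illustration, not part of the original study) encodes one PADSI record per service touchpoint; the field names mirror the five dimensions described above, and the example values are those later reported for T5 Course Videos in Table 5.

```python
from dataclasses import dataclass

@dataclass
class PADSIRecord:
    """One service touchpoint scored on the five PADSI dimensions."""
    touchpoint: str      # e.g., "T5 Course Videos"
    pleasure: float      # P: emotional pleasure, 1-9 SAM rating
    arousal: float       # A: emotional arousal (activation), 1-9 SAM rating
    dominance: float     # D: emotional dominance, 1-9 SAM rating
    satisfaction: float  # S: touchpoint satisfaction, 1-5 Likert rating
    importance: float    # I: touchpoint importance, AHP weight in [0, 1]

# Example record using the values reported for T5 Course Videos (Table 5).
t5 = PADSIRecord("T5 Course Videos", 7.24, 6.86, 7.05, 3.43, 0.2966)
print(t5)
```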
In accordance with the model in Figure 2, we deconstruct the online education service system as follows: the interactive touchpoint matrix is used to convert service touchpoints into service evaluation indicators, and the SAM scale is used to evaluate the user’s emotional pleasure, arousal, and dominance during the online learning process. Simultaneously, user satisfaction surveys and the analytic hierarchy process are used to evaluate the importance of the touchpoints and the satisfaction of user groups with the online education service system.
The advantage of the PADSI five-dimensional evaluation model over existing online education service evaluation methods is that it incorporates user emotional valence into service evaluation research and expands the diversity and intersectionality of service evaluation criteria and assessment methods, offering new suggestions for enhancing the learning efficacy, motivation, and independence of online education platform users.

2.3. Online Education Service Evaluation Process Design

The research process of online education service evaluation based on user emotional valence is shown in Figure 3.
  • Construction of the evaluation index system. We decompose user behavior by dividing online education service phases through system deconstruction and user behavior tracking techniques. We gather service touchpoints in the target system based on user behavior and then generate an interactive touchpoint matrix. The service touchpoints collected through cluster analysis are converted into relevant service evaluation indicators, and a reasonable evaluation index system is generated via expert analysis.
  • Process of the service evaluation experiment. A group of service recipients of an online education platform is chosen to undergo an emotional valence analysis and a perceptual cognition study of the online education system. On the one hand, an emotional assessment questionnaire is constructed by combining the SAM scale with the evaluation index system, and the emotional valence of service users is determined through the questionnaire survey. On the other hand, with the help of the scale method and the analytic hierarchy process, we evaluate the satisfaction and importance ratings of the same group of people to complete a deeper perceptual cognition analysis. The perceptual cognition outcomes of the online education system services are derived from the two factors of user demands and user expectations.
  • Results of evaluation and optimization strategies. We summarize and evaluate the evaluation data to visualize the evaluation outcomes. Meanwhile, utilizing the collected data, a detailed analysis of the corresponding service touchpoints is conducted. We explore the relationships among user emotional valence, satisfaction, and importance using single-factor analysis, comparative evaluation, and inductive analysis to propose effective iterative strategies and optimization directions for the current online education system to enhance the learning experience and efficiency.
Both qualitative and quantitative research are used in the above evaluation process. Based on traditional service evaluation methods (statistical questionnaire method, observation method, test method, etc.), it integrates service design research instruments to satisfy the demand for diverse evaluation methods and indicators. In the construction phase of the evaluation index system, the interactive touchpoint matrix is used to capture user behavior, which is then converted into service touchpoints, thereby enhancing the evaluation index’s rationality. In addition, the incorporation of the SAM scale as an evaluation factor in the process of service evaluation experiment design and analysis increases the diversity of service indicators. Regarding evaluation results and optimization strategies, data visualization analysis charts are more descriptive and recognizable than statistical charts, such as data tables and line charts, thereby enhancing the evaluation system’s reliability.

3. Online Education Service Evaluation System Construction

3.1. Interactive Touchpoint Matrix Output and Cluster Analysis

In service design research, the interactive touchpoint matrix, often known as the service touchpoint matrix, is a typical instrument, frequently used to map the user experience process and provide a visual framework [25]. To understand the online education service experience process more clearly and comprehensively, this study introduces the interactive touchpoint matrix to decompose the user experience process.
The research team selected Tencent Classroom as the subject of the evaluation experiment after comprehensively analyzing the cumulative downloads, registered users, click-through rates, and functional completeness of current online education platforms. As illustrated in Figure 4, the interactive touchpoint matrix of the online education platform is formed, from the perspective of service recipients, based on the user’s behavior trajectory and learning process on the Tencent Classroom platform, and the touchpoint cluster analysis is completed.
Figure 4 displays the interactive touchpoint matrix of the online education platform, which consists of four modules: stage division, behavior analysis, touchpoint collection, and touchpoint clustering. First, based on the overall service process and user behavior experience process, the online education service stage is divided into platform drainage, knowledge learning, and service expansion. Secondly, the research object is transitioned from user behavior to service touchpoints, and the corresponding service touchpoints are obtained according to the user experience process. Lastly, based on the interactive characteristics of the online education system, the service touchpoints are divided into four categories: visual guidance, learning resources, after-class evaluation, and interactive feedback, providing a basis for the construction of the service evaluation index system.

3.2. Service Evaluation Index Transformation and System Establishment

We carry out qualitative research on the obtained service touchpoints through expert consultation and group discussion, merge similar touchpoints, and transform them into service evaluation indicators: (1) color style, icon design, and transition animation touchpoints are merged into interface design indicators; (2) navigation design and interface partition touchpoints are merged into navigation setting indicators; (3) course appointment and class reminder touchpoints are merged into course reminder indicators; (4) course room and screen control touchpoints are merged into course video indicators; (5) course forums, postings, and interactive discussion touchpoints are merged into course interaction indicators; (6) on-site Q&A, question help, and content inquiry touchpoints are merged into online Q&A indicators.
Thus, the 28 service touchpoints in the interactive touchpoint matrix are converted into 19 service touchpoint evaluation indicators, the online education service touchpoint evaluation index system is constructed, and the indicator coding is completed, as shown in Table 1.
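As a minimal sketch of this consolidation step, the dictionary below encodes merge rules (1)–(6) and collapses raw touchpoints into indicator names; the touchpoint spellings follow the wording above, the helper function is hypothetical, and the mapping is not an exhaustive encoding of Table 1 (touchpoints not covered by a rule simply become their own indicator).

```python
# Merge rules (1)-(6) from Section 3.2: raw service touchpoint -> evaluation indicator.
MERGE_RULES = {
    "Color Style": "Interface Design", "Icon Design": "Interface Design",
    "Transition Animation": "Interface Design",
    "Navigation Design": "Navigation Settings", "Interface Partition": "Navigation Settings",
    "Course Appointment": "Course Reminder", "Class Reminder": "Course Reminder",
    "Course Room": "Course Videos", "Screen Control": "Course Videos",
    "Course Forum": "Course Interaction", "Posting": "Course Interaction",
    "Interactive Discussion": "Course Interaction",
    "On-site Q&A": "Online Q&A", "Question Help": "Online Q&A",
    "Content Inquiry": "Online Q&A",
}

def to_indicators(touchpoints):
    """Collapse raw touchpoints into evaluation indicators, preserving first-seen order."""
    seen, indicators = set(), []
    for tp in touchpoints:
        name = MERGE_RULES.get(tp, tp)  # unmatched touchpoints stay as their own indicator
        if name not in seen:
            seen.add(name)
            indicators.append(name)
    return indicators

# Three visual-guidance touchpoints collapse into a single indicator.
print(to_indicators(["Color Style", "Icon Design", "Transition Animation", "Ad Recommendation"]))
# -> ['Interface Design', 'Ad Recommendation']
```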

4. Online Education Service Evaluation Experiment

4.1. Evaluation of Experimental Design

Convenience sampling was used to select all first-year students majoring in design at the university where the research team is based, a total of 52 students (21 male, 31 female) aged between 18 and 22. All participated voluntarily and were informed of, and agreed to, the content of the experiment. To reduce the impact of the small sample size on the research results, the selected participants all possess a basic knowledge of aesthetics and interaction design, are experienced users of Tencent Classroom, and understand and operate the platform well, thereby enhancing the objectivity and rationality of the evaluation process.
The experimental materials include “New Concept English (Volume 1)”, “Design Introductory Classroom”, and “Aesthetic Principles” on the Tencent Classroom platform. The length of a single course ranges from 15 to 30 min; the instructional content is comprehensive, the categories are varied, and the levels of interest differ. The experimental protocol is detailed in Table 2.

4.2. Dimensions of Emotional Valence

We analyze the emotional valence of the online education service platform’s users; in essence, this means identifying relevant variables for measuring and quantifying user emotions. The steps are as follows:
  • Design of the emotional evaluation questionnaire. To enhance the readability and effectiveness of the questionnaire, the SAM image emotion scale was introduced to evaluate user emotions. The SAM scale presents the pleasure, arousal, and dominance of the user’s emotions in the form of images and focuses on the subject’s feeling and perception of emotional reactions, thereby helping subjects evaluate their own emotions more rapidly and intuitively, as depicted in Figure 5.
  • Output of the emotional evaluation results. Questions corresponding to the evaluation indicators of the online education platform were combined with the SAM scale to generate a 1–9 emotional evaluation questionnaire, in which higher scores correspond to higher emotional valence. The task method was used to test the emotion associated with each indicator; a sample task scale is shown in Figure 6, and the results are shown in Table 3 (a short aggregation sketch follows this list).
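The aggregation behind Table 3 is an average of the 1–9 SAM ratings for each indicator and PAD dimension. The sketch below shows one possible implementation, assuming the questionnaire responses are stored as a long-format table with one row per subject, indicator, and dimension; the column names and sample values are illustrative, not the study’s raw data.

```python
import pandas as pd

# Hypothetical long-format SAM responses: one row per subject x indicator x PAD dimension.
responses = pd.DataFrame({
    "subject":   [1, 1, 1, 2, 2, 2],
    "indicator": ["T1", "T1", "T1", "T1", "T1", "T1"],
    "dimension": ["pleasure", "arousal", "dominance"] * 2,
    "rating":    [6, 5, 7, 7, 5, 7],  # 1-9 SAM scale
})

# Mean rating per indicator and dimension, i.e., the quantities reported in Table 3.
pad_means = (responses
             .groupby(["indicator", "dimension"])["rating"]
             .mean()
             .unstack("dimension")
             .round(2))
print(pad_means)
```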

4.3. Dimensions of Perceptual Cognition

  1. Evaluation of service touchpoint satisfaction
A satisfaction questionnaire was developed for the 19 touchpoint indicators using a five-point Likert scale, and the experimental subjects described above were invited to complete it, yielding the satisfaction index set S = (S1, S2, ⋯, Sn), where Sn denotes the nth experimental subject. The mean user satisfaction for each indicator is then calculated by the mean value method and taken as the service touchpoint satisfaction.
  2. Evaluation of service touchpoint importance
We construct a hierarchical model, as shown in Figure 7, according to the target layer, standard layer, and index layer in Table 1.
Construct a judgment matrix. We compare indicators at the same level of the online education platform pairwise to determine their relative importance. The 1–9 scale method is used to quantify importance, and the geometric mean of the scores for each element is taken as the final score to construct the online education service touchpoint judgment matrix $U = [U_{ij}]_{m \times m}$. Taking the learning resource indicators in the criterion layer as an example, the aggregated judgment matrix is:
$$U = \begin{bmatrix} 1 & 3.3098 & 5.8560 & 7.9373 & 7.5446 \\ 0.3021 & 1 & 4.1618 & 5.8857 & 3.0274 \\ 0.1708 & 0.2403 & 1 & 3.6628 & 0.9306 \\ 0.1260 & 0.1699 & 0.2730 & 1 & 0.3433 \\ 0.1325 & 0.3303 & 1.0746 & 2.9130 & 1 \end{bmatrix} \tag{1}$$
Next, the weight vector is solved. We multiply the entries of each row of the judgment matrix $U$ to obtain the row product $U_j$ and then calculate the corresponding weight coefficient, namely:
$$U_j = \prod_{c=1}^{m} U_{jc}, \qquad j = 1, 2, \ldots, m \tag{2}$$
$$a_j = \sqrt[m]{U_j}, \qquad j = 1, 2, \ldots, m \tag{3}$$
$$\omega_j = \frac{a_j}{\sum_{j=1}^{m} a_j}, \qquad j = 1, 2, \ldots, m \tag{4}$$
In these formulas, $U_{jc}$ is the value in row $j$ and column $c$ of the judgment matrix $U$; $m$ is the number of evaluation indicators in each row of the judgment matrix; $a_j$ is the geometric mean of row $j$ of the judgment matrix; and $\omega_j$ is the weight obtained after normalizing the geometric means.
The evaluation results are calculated with the help of the yaahp software, and the weight vector $\omega = (\omega_1, \omega_2, \ldots, \omega_m)$ is obtained. We then perform a consistency check on it by calculating the consistency ratio $CR$:
$$CR = \frac{CI}{RI} \tag{5}$$
In the formula, $CR$ is the consistency ratio; $CI = \frac{\lambda_{max} - m}{m - 1}$, where $\lambda_{max}$ is the maximum eigenvalue of the judgment matrix; and $RI$ is the random consistency index.
From the calculation result of Formula (5), $CR = 0.0216 < 0.1$, which satisfies the consistency test, so the obtained data meet the requirements. Accordingly, based on the results for the perceptual cognition dimensions, the satisfaction and importance of the online education service touchpoints are shown in Table 4.
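As a computational sketch of Formulas (2)–(5), the NumPy snippet below applies the geometric-mean (root) method and the consistency check to the learning-resources judgment matrix given above. The paper obtained its results with the yaahp software; this is an independent re-implementation under our own assumptions, so minor numerical differences from the reported CR value are possible (here $\lambda_{max}$ is estimated from $U\omega$ rather than by a full eigenvalue solve).

```python
import numpy as np

# Judgment matrix U for the learning-resources criterion, Formula (1).
U = np.array([
    [1.0000, 3.3098, 5.8560, 7.9373, 7.5446],
    [0.3021, 1.0000, 4.1618, 5.8857, 3.0274],
    [0.1708, 0.2403, 1.0000, 3.6628, 0.9306],
    [0.1260, 0.1699, 0.2730, 1.0000, 0.3433],
    [0.1325, 0.3303, 1.0746, 2.9130, 1.0000],
])
m = U.shape[0]

# Formulas (2)-(4): row products U_j, geometric means a_j, normalized weights w_j.
row_products = U.prod(axis=1)
geo_means = row_products ** (1.0 / m)
weights = geo_means / geo_means.sum()

# Formula (5): CR = CI / RI, with CI = (lambda_max - m) / (m - 1).
lambda_max = float(np.mean((U @ weights) / weights))
CI = (lambda_max - m) / (m - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[m]  # Saaty's random consistency index
CR = CI / RI

print("weights:", np.round(weights, 4))
print(f"lambda_max = {lambda_max:.4f}, CI = {CI:.4f}, CR = {CR:.4f} (consistent if CR < 0.1)")
```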

5. Evaluation Results and Optimization Strategies

5.1. Service Evaluation Data Integration

We integrate the results of the emotion evaluation experiments, comprehensively compare users’ emotional pleasure, arousal, and dominance with touchpoint satisfaction and touchpoint importance, and rank the touchpoints by importance, as shown in Table 5.
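A possible sketch of this integration step: the PAD means (Table 3) are joined with satisfaction and importance (Table 4), and touchpoints are ranked by AHP weight as in Table 5. Only three touchpoints are included here for brevity; the values are transcribed from the tables above, and the pandas layout is our own choice.

```python
import pandas as pd

# Subset of Table 3 (PAD means) and Table 4 (satisfaction, importance), for illustration.
pad = pd.DataFrame({
    "index": ["T5 Course Videos", "T6 Courseware", "T18 Online Q&A"],
    "pleasure": [7.24, 3.52, 7.85],
    "arousal": [6.86, 4.62, 6.78],
    "dominance": [7.05, 2.95, 7.54],
}).set_index("index")

cognition = pd.DataFrame({
    "index": ["T5 Course Videos", "T6 Courseware", "T18 Online Q&A"],
    "satisfaction": [3.43, 1.86, 2.16],
    "importance": [0.2966, 0.1358, 0.0890],
}).set_index("index")

# Join the emotional and perceptual dimensions, then rank by importance (1 = most important).
table5 = pad.join(cognition)
table5["importance_ranking"] = table5["importance"].rank(ascending=False, method="min").astype(int)
print(table5.sort_values("importance_ranking"))
```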

5.2. Visual Output of Evaluation Results

As illustrated in Figure 8, a visualization approach is used to process the obtained data and output the assessment findings for the online education service touchpoints, enabling observation of the interaction between the various elements. To enhance the legibility of the model, the importance of the service touchpoints is classified into five levels according to the weight order. At the same time, different colors are used to distinguish the various indicators, making it easy for readers to observe the correlations between indicators and providing a basis for comprehensive analysis and optimization of the service touchpoints.
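Figure 8 itself is a bespoke visual design; as a rough stand-in, the matplotlib sketch below plots the five PADSI dimensions per touchpoint on a shared normalized axis so that the evaluation curves of different touchpoints can be compared, in the spirit of the curves discussed in Section 5.3. The three example touchpoints and their raw values come from Table 5; the normalization and chart style are our own assumptions.

```python
import matplotlib.pyplot as plt

dimensions = ["Pleasure", "Arousal", "Dominance", "Satisfaction", "Importance"]
scores = {  # raw values from Table 5 (scales: 1-9, 1-9, 1-9, 1-5, 0-1)
    "T5 Course Videos": [7.24, 6.86, 7.05, 3.43, 0.2966],
    "T6 Courseware":    [3.52, 4.62, 2.95, 1.86, 0.1358],
    "T18 Online Q&A":   [7.85, 6.78, 7.54, 2.16, 0.0890],
}
scale_max = [9, 9, 9, 5, 1]  # normalize each dimension to [0, 1] for a shared axis

fig, ax = plt.subplots(figsize=(7, 4))
for touchpoint, values in scores.items():
    normalized = [v / s for v, s in zip(values, scale_max)]
    ax.plot(dimensions, normalized, marker="o", label=touchpoint)

ax.set_ylabel("Normalized score (0-1)")
ax.set_title("PADSI evaluation curves per touchpoint (illustrative)")
ax.legend()
plt.tight_layout()
plt.show()
```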

5.3. Results Analysis and Optimization Suggestions

Figure 8 depicts a service evaluation system for online education consisting of six modules: service touchpoints, importance, pleasure, arousal, dominance, and satisfaction. It shows the current state of the service touchpoints of the Tencent Classroom online education platform and the impact of various factors on those touchpoints. It can be used as follows: (1) For a single touchpoint, the user can obtain the PADSI evaluation result of the desired touchpoint from its number in the chart and decide whether to optimize the service or maintain it. (2) From the chart, users can observe the evaluation curves and shifting trends of all touchpoints across the evaluation modules; in addition, the touchpoints with the lowest scores are singled out for analysis so that the touchpoint evaluation curve can be optimized. (3) Users can macroscopically analyze the relationships among the changes in the various touchpoint curves and combine them with the importance levels to formulate systematic online education service strategies.
We use the Nielsen Norman model to assess the evaluation results at the interaction layer, journey layer, and relationship layer with the help of five experts (teachers with design expertise and experience using online education platforms). The assessment produced positive findings, and the resulting conclusions can provide useful guidance for improving the quality of online education services. Accordingly, the following conclusions can be drawn from examining the trends of the different indicators in the online education system service evaluation results:
  • In the process of online education services, the average values of user pleasure, arousal, and dominance are largely consistent with the trend in satisfaction, showing a positive correlation (a rank-correlation sketch illustrating this point follows this list). This suggests that user satisfaction is closely associated with emotional valence: high satisfaction with a service touchpoint is accompanied by high user pleasure, arousal, and dominance for that touchpoint, and vice versa. Therefore, designers might begin with user emotions and optimize the touchpoints with lower emotional valence, such as T4 Ad Recommendation, T8 Extended Reading, and T17 Homework Correction, to increase user pleasure.
  • The importance of learning resource touchpoints is generally high. Among them, T5 Course Videos and T6 Courseware Materials are level-5 important touchpoints, and T7 Bibliography and T9 After-school Materials are level-4 important touchpoints, indicating that course quality and the in-class experience are the primary factors affecting the service quality of online education platforms and the core touchpoints that draw users into the system. Judging from the emotional valence and satisfaction values, only T5 Course Videos and T9 After-school Materials among the current system service touchpoints show both high emotional valence and high satisfaction, while the remaining three touchpoints are below the median. Hence, subsequent iterations of the service system should prioritize optimizing the learning resource touchpoints to meet user needs. For instance, courseware materials could be provided for students to review after class, consolidate knowledge, and supplement their notes; electronic versions of course-related bibliographies could be offered for convenient download and sharing; and in-class practice materials could be published on the course platform to support after-class study.
  • Visual guidance touchpoints are the least important. Among them, the emotional valence and user satisfaction of T1 Interface Design and T2 Navigation Settings are high, indicating that most users are satisfied with the current interface design and visual navigation, which can be maintained. Although users’ emotional pleasure and arousal for T3 Course Reminder are above average, their emotional dominance and satisfaction are low, indicating that the current course reminder function gives users a strong sense of being controlled, resulting in low satisfaction. In subsequent designs, attention should be paid to reducing the compulsory nature of course reminders and adopting a more personalized, customizable way of reminding users to complete online learning tasks.
  • Touchpoints with divergent evaluation patterns also deserve attention. For example, the user satisfaction and emotional pleasure of T14 Student Notes and T17 Homework Correction are low, while their emotional arousal is above average. This shows that the note-taking and homework-comment functions readily stimulate users’ learning initiative and activity; however, the current system does not implement these two functions well, and the user experience is subpar, so the subsequent design must optimize and improve them. Moreover, the dominance of T17 Homework Correction is low, indicating that it is difficult for users to actively communicate about homework comments during the learning process and that this function offers poor control; corresponding channels could be added in platform iterations to meet user needs in a timely and effective manner. T18 Online Q&A gives users high emotional pleasure, arousal, and dominance, but its satisfaction is comparatively low. This shows that the teachers’ question-answering modules on the current online education platform are not well constructed and mostly operate in a one-way message-reply mode; users are dissatisfied because they cannot resolve doubts and difficulties encountered in course learning in a timely and effective manner. In terms of importance, T17 ≈ T18 > T14. In the subsequent design, therefore, online Q&A and homework correction should be addressed first, and student notes should be introduced as a new function to differentiate the platform.
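To make the positive correlation claimed in the first point concrete, the sketch below computes Spearman rank correlations between touchpoint satisfaction and the PAD scores over the 19 touchpoints transcribed from Table 5. This is a re-analysis under our own assumptions, not a calculation reported in the paper.

```python
import numpy as np
from scipy.stats import spearmanr

# Values for touchpoints T1-T19, transcribed from Table 5.
pleasure     = [6.14, 5.05, 5.29, 2.05, 7.24, 3.52, 3.81, 3.10, 5.67, 4.54,
                6.23, 6.87, 3.00, 3.00, 3.42, 7.23, 2.43, 7.85, 5.03]
arousal      = [5.00, 5.26, 5.57, 3.00, 6.86, 4.62, 3.52, 2.82, 6.70, 6.83,
                6.01, 5.45, 2.99, 5.67, 2.65, 6.87, 6.71, 6.78, 3.56]
dominance    = [7.00, 6.57, 2.81, 2.10, 7.05, 2.95, 6.71, 5.76, 7.00, 3.51,
                4.67, 2.10, 2.05, 7.43, 4.51, 7.50, 3.05, 7.54, 2.13]
satisfaction = [3.24, 3.19, 2.52, 1.76, 3.43, 1.86, 1.92, 1.83, 3.65, 3.54,
                3.38, 2.45, 2.14, 1.98, 2.11, 4.10, 1.57, 2.16, 2.65]

# Rank correlation between mean emotional valence and satisfaction across touchpoints.
pad_mean = np.mean([pleasure, arousal, dominance], axis=0)
rho, p = spearmanr(pad_mean, satisfaction)
print(f"Spearman rho(PAD mean, satisfaction) = {rho:.2f} (p = {p:.3f})")

# Per-dimension correlations for comparison.
for name, values in [("pleasure", pleasure), ("arousal", arousal), ("dominance", dominance)]:
    r, pv = spearmanr(values, satisfaction)
    print(f"  {name:>9s} vs satisfaction: rho = {r:.2f} (p = {pv:.3f})")
```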

6. Conclusions

This study extends existing service evaluation methods in multiple dimensions by analyzing users’ emotional valence and behavior. In addition, a visualization research method is incorporated into the data analysis, and an evaluation method for online education service touchpoints based on user emotional valence is proposed. On the one hand, service design thinking is integrated into the construction of the service evaluation index system, and an online education service evaluation index system is established. On the other hand, through the service evaluation experiment, a multidimensional evaluation is carried out on users’ emotional pleasure, arousal, dominance, touchpoint satisfaction, and touchpoint importance to obtain a more instructive touchpoint optimization strategy.
The proposed PADSI five-dimensional evaluation model meets current user-centered design objectives, takes user emotions and user needs as its research objects, and provides feasible ideas for online education service evaluation research. In addition, given the versatility of the research results and strategies, this research can effectively support Tencent Classroom and similar platforms in improving online education service experience satisfaction, user enthusiasm, and interaction fluency.
Due to time and resource limitations, this article only shows that there is a positive association between changes in user emotion and user satisfaction. In follow-up research, more refined experiments and scales should be designed to further explore the factors that affect these changes and to identify design opportunities.

Author Contributions

X.H. provided the research idea and the purpose of this research; N.S. designed the study, analyzed the data, and wrote the initial draft preparation; X.H. supervised, corrected, and revised this paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Social Science Fund of China Project (22BSH122), the Shaanxi Provincial Department of Science and Technology Project (2022GY-329), and the Shaanxi Art and Science Planning Project (2022HZ1642).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Ethics Committee of the College of Design and Art, Shaanxi University of Science and Technology (2022-06-01).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Butnaru, G.I.; Nita, V.; Anichiti, A.; Brinza, G. The Effectiveness of Online Education during COVID-19 Pandemic-A Comparative Analysis between the Perceptions of Academic Students and High School Students from Romania. Sustainability 2021, 13, 5311.
  2. Fernandez-Batanero, J.M.; Montenegro-Rueda, M.; Fernandez-Cerero, J.; Tadeu, P. Online education in higher education: Emerging solutions in crisis times. Heliyon 2022, 8, e10139.
  3. Kim, G.-C.; Gurvitch, R. Online Education Research Adopting the Community of Inquiry Framework: A Systematic Review. Quest 2020, 72, 395–409.
  4. Sun, A.; Chen, X. Online education and its effective practice: A research review. J. Inf. Technol. Educ. 2016, 15, 157–190.
  5. Park, S.-Y.; Pan, Y.-H. A preliminary study on the evaluation method of online design education. Des. Manuf. 2022, 16, 7–12.
  6. Wong, M.S.; Jackson, S. User Satisfaction Evaluation of Malaysian e-Government Education Services. In Proceedings of the 2017 International Conference on Engineering, Technology and Innovation (ICE/ITMC), Madeira, Portugal, 27–29 June 2017; IEEE: New York, NY, USA, 2017; pp. 531–537.
  7. Granić, A.; Ćukušić, M. Usability testing and expert inspections complemented by educational evaluation: A case study of an e-learning platform. J. Educ. Technol. Soc. 2011, 14, 107–123.
  8. Sangrà, A.; Vlachopoulos, D.; Cabrera, N. Building an inclusive definition of e-learning: An approach to the conceptual framework. Int. Rev. Res. Open Distrib. Learn. 2012, 13, 145–159.
  9. Bozkurt, A.; Bozkaya, M. Evaluation criteria for interactive e-books for open and distance learning. Int. Rev. Res. Open Distrib. Learn. 2015, 16, 58–82.
  10. Nian, L.-H.; Wei, J.; Yin, C.-B. The promotion role of mobile online education platform in students’ self-learning. Int. J. Contin. Eng. Educ. Life Long Learn. 2019, 29, 56–71.
  11. Hu, H.; Wang, X.; Zhai, Y.; Hu, J. Evaluation of factors affecting student participation in peer-assisted English learning based on online education platform. Int. J. Emerg. Technol. Learn. 2021, 16, 72–87.
  12. Amara, K.; Zenati, N.; Djekoune, O.; Anane, M.; Aissaoui, I.K.; Bedla, H.R. i-DERASSA: E-learning Platform based on Augmented and Virtual Reality interaction for Education and Training. In Proceedings of the 2021 International Conference on Artificial Intelligence for Cyber Security Systems and Privacy (AI-CSP), El Oued, Algeria, 20–21 November 2021; IEEE: New York, NY, USA, 2021; pp. 1–9.
  13. Liu, Z.; Han, Z. Exploring Trends of Potential User Experience of Online Classroom on Virtual Platform for Higher Education during COVID-19 Epidemic: A Case in China. In Proceedings of the IEEE International Conference on Teaching, Assessment, and Learning for Engineering (IEEE TALE), Takamatsu, Japan, 8–11 December 2020; IEEE: New York, NY, USA, 2020; pp. 742–747.
  14. Li, M.F.; Wang, T.; Lu, W.; Wang, M.K. Optimizing the Systematic Characteristics of Online Learning Systems to Enhance the Continuance Intention of Chinese College Students. Sustainability 2022, 14, 11774.
  15. Han, J.D.; Wang, Y. User Experience Design of Online Education Based on Flow Theory. In Proceedings of the 12th International Conference on Applied Human Factors and Ergonomics (AHFE 2021), Virtual Conference, 25–29 July 2021; pp. 219–227.
  16. Yan, H.Q.; Zhang, H.F.; Su, S.D.; Lam, J.F.I.; Wei, X.Y. Exploring the Online Gamified Learning Intentions of College Students: A Technology-Learning Behavior Acceptance Model. Appl. Sci. 2022, 12, 12966.
  17. Zhang, C.; Xiao, C. Research on Course Experience Optimization of Online Education Based on Service Encounter. In Design, User Experience, and Usability: UX Research and Design, Proceedings of the 10th International Conference, DUXU 2021, Held as Part of the 23rd HCI International Conference, HCII 2021, Virtual Event, 24–29 July 2021, Part I; Springer: Berlin/Heidelberg, Germany, 2021; pp. 652–665.
  18. Rasmussen, A.S.; Berntsen, D. Emotional valence and the functions. Mem. Cogn. 2009, 37, 477–492.
  19. Meiselman, H.L. A review of the current state of emotion research in product development. Food Res. Int. 2015, 76, 192–199.
  20. Chamberlain, L.; Broderick, A.J. The application of physiological observation methods to emotion research. Qual. Mark. Res. Int. J. 2007, 10, 199–216.
  21. Tian, M.; Yanjun, X. The study of emotion in tourist experience: Current research progress. Tour. Hosp. Prospect. 2019, 3, 82.
  22. Zhao, Y.; Xie, D.; Zhou, R.; Wang, N.; Yang, B. Evaluating Users’ Emotional Experience in Mobile Libraries: An Emotional Model Based on the Pleasure-Arousal-Dominance Emotion Model and the Five Factor Model. Front. Psychol. 2022, 13, 942198.
  23. Zhang, H.; Yin, J.; Zhang, X. The study of a five-dimensional emotional model for facial emotion recognition. Mob. Inf. Syst. 2020, 2020, 8860608.
  24. Chen, T.; Peng, L.; Jing, B.; Wu, C.; Yang, J.; Cong, G. The impact of the COVID-19 pandemic on user experience with online education platforms in China. Sustainability 2020, 12, 7329.
  25. Clatworthy, S. Service Innovation Through Touch-points: Development of an Innovation Toolkit for the First Stages of New Service Development. Int. J. Des. 2011, 5, 15–28.
Figure 1. The association between emotional valence and online education services.
Figure 2. The PADSI five-dimensional evaluation model.
Figure 3. Online education service evaluation process.
Figure 4. Clustering of online education touchpoints.
Figure 5. SAM image emotional scale.
Figure 6. Sample online education emotional evaluation questionnaire.
Figure 7. Hierarchical structure model of online education services.
Figure 8. Evaluation results of online education system service touchpoints.
Table 1. Evaluation index system of online education platform service touchpoints. (Target layer: evaluation elements of online education platform services.)
Guideline | Touchpoint | Index | Indicator Interpretation | Number
Visual Guidance | Color Style; Icon Design; Transition Animation | Interface Design | Interface color style, icon design, transition animation, and other visual factors | T1
Visual Guidance | Navigation Settings; Functional Area | Navigation Settings | Navigation bar location and content classification | T2
Visual Guidance | Course Appointment; Class Reminder | Course Reminder | Schedule design, class reminder, course appointment, etc. | T3
Visual Guidance | Ad Recommendation | Ad Recommendation | Platform course recommendation and carousel advertisement | T4
Learning Resources | Course Room; Interface Control | Course Videos | Course quality and type | T5
Learning Resources | Courseware | Courseware | Materials such as courseware text or slides | T6
Learning Resources | Bibliography | Bibliography | Course-related books | T7
Learning Resources | Extended Reading | Extended Reading | Course-related materials or similar courses | T8
Learning Resources | After-school Materials | After-school Materials | Exercises or cases involved in the course | T9
After-school Assessment | Unit Test | Unit Test | Test taken after each unit | T10
After-school Assessment | Homework | Homework | Single-session post-test or practice questions | T11
After-school Assessment | Final Examination | Final Examination | Post-course assessment or final assignment | T12
After-school Assessment | Answer Collection | Answer Collection | Answers to after-school exercises | T13
After-school Assessment | Student Notes | Student Notes | In-class or after-class key notes | T14
Interactive Feedback | Course Forum; Post Published; Interactive Discussion | Course Interaction | Course-related forums or online social groups | T15
Interactive Feedback | Share Lessons | Share Lessons | Course link, poster, or trial sharing | T16
Interactive Feedback | Homework Correction | Homework Correction | Homework accuracy and completion analysis | T17
Interactive Feedback | Q&A; Questions for Help; Content Query | Online Q&A | Teachers answer course-related questions online | T18
Interactive Feedback | Questionnaire | Questionnaire | Course experience and satisfaction survey | T19
Table 2. Experimental design of online education service evaluation.
Evaluation System | Emotional Valence Dimension | Perceptual Cognitive Dimension
Experimental Materials | “New Concept English (Volume 1)”, “Design Introductory Classroom”, and “Principles of Aesthetics” on Tencent Classroom | “New Concept English (Volume 1)”, “Design Introductory Classroom”, and “Principles of Aesthetics” on Tencent Classroom
Sample Selection | Undergraduate students majoring in design | Undergraduate students majoring in design
Evaluation Method | PAD emotion theory, SAM scale | Satisfaction questionnaire; analytic hierarchy matrix
Evaluation Content | Pleasure, arousal, dominance | Satisfaction; importance
Data Range | 1–9 | 1–5 (satisfaction); 0–1 (importance)
Table 3. Emotional evaluation results.
Number | Index | Pleasure | Arousal | Dominance
T1 | Interface Design | 6.14 | 5.00 | 7.00
T2 | Navigation Settings | 5.05 | 5.26 | 6.57
T3 | Course Reminder | 5.29 | 5.57 | 2.81
T4 | Ad Recommendation | 2.05 | 3.00 | 2.10
T5 | Course Videos | 7.24 | 6.86 | 7.05
T6 | Courseware | 3.52 | 4.62 | 2.95
T7 | Bibliography | 3.81 | 3.52 | 6.71
T8 | Extended Reading | 3.10 | 2.82 | 5.76
T9 | After-school Materials | 5.67 | 6.70 | 7.00
T10 | Unit Test | 4.54 | 6.83 | 3.51
T11 | Homework | 6.23 | 6.01 | 4.67
T12 | Final Examination | 6.87 | 5.45 | 2.10
T13 | Answer Collection | 3.00 | 2.99 | 2.05
T14 | Student Notes | 3.00 | 5.67 | 7.43
T15 | Course Interaction | 3.42 | 2.65 | 4.51
T16 | Share Lessons | 7.23 | 6.87 | 7.50
T17 | Homework Correction | 2.43 | 6.71 | 3.05
T18 | Online Q&A | 7.85 | 6.78 | 7.54
T19 | Questionnaire | 5.03 | 3.56 | 2.13
Table 4. Satisfaction and importance of online education service touchpoints.
No. | Satisfaction | Importance
T1 | 3.24 | 0.013
T2 | 3.19 | 0.0243
T3 | 2.52 | 0.0203
T4 | 1.76 | 0.0031
T5 | 3.43 | 0.2966
T6 | 1.86 | 0.1358
T7 | 1.92 | 0.0524
T8 | 1.83 | 0.0225
T9 | 3.65 | 0.0493
T10 | 3.54 | 0.0316
T11 | 3.38 | 0.0709
T12 | 2.45 | 0.019
T13 | 2.14 | 0.0334
T14 | 1.98 | 0.0091
T15 | 2.11 | 0.0243
T16 | 4.10 | 0.0212
T17 | 1.57 | 0.0768
T18 | 2.16 | 0.089
T19 | 2.65 | 0.0084
Table 5. Integration of online education service touchpoint evaluation data.
Index | Pleasure (1–9) | Arousal (1–9) | Dominance (1–9) | Satisfaction (1–5) | Importance (0–1) | Importance Ranking
T1 Interface Design | 6.14 | 5.00 | 7.00 | 3.24 | 0.013 | 15
T2 Navigation Settings | 5.05 | 5.26 | 6.57 | 3.19 | 0.0243 | 10
T3 Course Reminder | 5.29 | 5.57 | 2.81 | 2.52 | 0.0203 | 13
T4 Ad Recommendation | 2.05 | 3.00 | 2.10 | 1.76 | 0.0031 | 18
T5 Course Videos | 7.24 | 6.86 | 7.05 | 3.43 | 0.2966 | 1
T6 Courseware | 3.52 | 4.62 | 2.95 | 1.86 | 0.1358 | 2
T7 Bibliography | 3.81 | 3.52 | 6.71 | 1.92 | 0.0524 | 6
T8 Extended Reading | 3.10 | 2.82 | 5.76 | 1.83 | 0.0225 | 11
T9 After-school Materials | 5.67 | 6.70 | 7.00 | 3.65 | 0.0493 | 7
T10 Unit Test | 4.54 | 6.83 | 3.51 | 3.54 | 0.0316 | 9
T11 Homework | 6.23 | 6.01 | 4.67 | 3.38 | 0.0709 | 5
T12 Final Examination | 6.87 | 5.45 | 2.10 | 2.45 | 0.019 | 14
T13 Answer Collection | 3.00 | 2.99 | 2.05 | 2.14 | 0.0334 | 8
T14 Student Notes | 3.00 | 5.67 | 7.43 | 1.98 | 0.0091 | 16
T15 Course Interaction | 3.42 | 2.65 | 4.51 | 2.11 | 0.0243 | 10
T16 Share Lessons | 7.23 | 6.87 | 7.50 | 4.10 | 0.0212 | 12
T17 Homework Correction | 2.43 | 6.71 | 3.05 | 1.57 | 0.0768 | 4
T18 Online Q&A | 7.85 | 6.78 | 7.54 | 2.16 | 0.089 | 3
T19 Questionnaire | 5.03 | 3.56 | 2.13 | 2.65 | 0.0084 | 17