*Article* **Construction of a Tangible VR-Based Interactive System for Intergenerational Learning**

**Chao-Ming Wang 1,\*, Cheng-Hao Shao 2 and Cheng-En Han 1**


**Abstract:** Recent years have witnessed striking global demographic shifts. Retired elderly people often stay home, seldom communicate with their grandchildren, and fail to acquire new knowledge or pass on their experiences. In this study, digital technologies based on virtual reality (VR) with tangible user interfaces (TUIs) were introduced into the design of a novel interactive system for intergenerational learning, aimed at promoting elderly people's interactions with younger generations. Initially, the literature was reviewed and experts were interviewed to derive the relevant design principles. The system was constructed accordingly using gesture detection, sound sensing, and VR techniques, and was used to play animation games that simulated traditional puppetry. The system was evaluated statistically with SPSS and AMOS, using questionnaire surveys based on the scales of global perceptions of intergenerational communication and the elderly's attitude, as well as interviews with participants who had experienced the system. Based on the evaluation results and discussions of the participants' comments, the following conclusions about the system's effectiveness were drawn: (1) intergenerational learning activities based on digital technology can attract younger generations; (2) selecting game topics familiar to the elderly in the learning process encourages them to experience technology; and (3) both generations are more likely to understand each other as a result of joint learning.

**Keywords:** e-learning; interactive system; intergenerational learning; virtual reality; tangible user interfacing; traditional puppetry

### **1. Introduction**

### *1.1. Background*

According to the World Health Organization (WHO), people aged 65 or above are considered elderly. A country in which the percentage of the population aged 65 and over reaches 7% is called an "aging society", and when the percentage reaches 14% and 20%, the country is considered an "aged society" and a "super-aged society", respectively [1]. Based on governmental statistics published in 2014, the number of people aged over 65 in Taiwan reached 1.49 million in 1993, accounting for more than 7% of the total population at that time and making Taiwan an aging society as defined by the WHO. In addition, a 2018 report published by the government revealed that the elderly accounted for 14.1% of Taiwan's total population, making the country an "aged society". This figure is projected to reach 20.6% by 2026, meaning that Taiwan will become a "super-aged society" at that time. Furthermore, in recent years Taiwan has been affected by a declining fertility rate, leading to even faster demographic shifts. Similar situations can be found in a great number of other countries around the world. With the advent of aging populations in these countries, exploring methods that can help educate the elderly has become a constant global issue.

Another issue in aging societies is the serious inadequacy of intergenerational interactions. In countries such as the United States, the United Kingdom, the Netherlands, and Sweden, the elderly and young people tend to have fewer interactions and varying opinions [2] due to their different growth backgrounds and inadequate contact. In addition, the young people in today's societies tend to stereotype the elderly. For instance, Hall and Batey [3] revealed that most children think ill of the elderly due to the latter's declining physical function.

In response to the increasing number of senior citizens, many developed countries and international organizations have begun to formulate policies that emphasize the great importance of education for the elderly. Many countries have put forward learning methods to promote lifelong learning for the elderly, among which intergenerational learning is considered one of the best ways to bridge the generational gap. Ames and Youatt [4] believed that in the face of a larger number of older adults, the age gap has a greater impact on the older and younger generations. Hence, intergenerational programs have been found to be an effective method in providing education and service planning. Kaplan [5] also mentioned that intergenerational learning should be designed to use the advantages of one generation to meet the needs of the other to achieve meaningful and sustained resource exchange and learning between the two generations. Furthermore, Souza and Grundy [6] found that employing intergenerational interactions allowed the elderly to be more confident and positive, and so improved their sense of self-worth and happiness. From the perspective of the young, it has also been suggested that young people should be given effective channels to better exchange and cooperate with the elderly [7].

### *1.2. Research Motivation and Issues*

In recent years, in many conventional learning activities, digital technologies have been introduced as auxiliary tools, among which virtual reality (VR) and augmented reality (AR) have been widely used. In particular, Ainge [8] suggested that students who have used VR are better at recognizing geometric shapes than those who have not, and that students in general had a positive attitude toward a VR-based learning environment. Virvou and Katsionis [9] also found that VR educational games could effectively arouse students' interest in learning and had better effects on education than other types of educational software. Meanwhile, with the developments in science and technology, VR has also been particularly adopted to help educate the elderly. For instance, MyndVR [10], a company that integrates VR with elderly health care, is assisting the elderly with diseases such as Alzheimer's and dementia by enabling them to create and experience a fulfilling and meaningful life in their later years. According to many participants, their aging diseases were relieved, and their physical health was improved. Dennis Lally, the cofounder of Rendever [11], developed a VR-based memory therapy system that brings back fond memories virtually and recreates meaningful scenes or places that older adults want to visit. Benoit et al. [12] combined VR and memory therapy by presenting scenes familiar to participants in a virtual environment with image rendering. Moreover, Manera et al. [13] argued that VR allowed patients with cognitive impairment and dementia to have more fun when performing tasks.

Notably, technology-based intergenerational activities can arouse more interest among the younger generations than conventional ones [14,15]. Cases related to conventional intergenerational activities indicated that, due to familiarity, the elderly are more likely to have a sense of achievement in conventional activities, such as woodchopping, farming, and cooking [16]. However, only a few intergenerational activities feature combinations of technology and conventional activities, let alone ones based on VR and tangible user interfaces (TUIs). In this study, TUIs and VR were integrated into conventional intergenerational activities to make the connections between both generations more interactive and recreational. Under the above premises, the following questions were put forward in this study: (1) Does the application of VR and TUIs to intergenerational learning facilitate emotion exchange and experience-sharing between the two generations? (2) What are the steps in setting up a VR-based system for interactive intergenerational learning for different learning tasks?

### *1.3. Literature Review*

The literature on the elderly's physical and mental health, intergenerational learning, VR, and TUI design is reviewed in this section.

### 1.3.1. Physical and Mental Health of the Elderly

According to a report by the European Union, more than half of the world's population will be aged 48 or above by 2060, indicating that aging will keep accelerating in the next few decades [17]. Meanwhile, Taiwan has an aging population and a declining birth rate, giving the older and younger generations fewer opportunities to interact than they had in the past. Many studies showed that college students generally had a slightly negative view of the elderly, especially concerning the latter's physical and mental health [18–20]. It is also worth noting that the two generations now have fewer interactions. As the elderly age with time, they continue to suffer from physical and mental functional decline and become subject to increasing pain, discomfort, and inconvenience in life, giving rise to psychological changes. Furthermore, aging is both physiological and psychological. In terms of physiology, Zajicek [21] held that the elderly cannot see clearly and easily become tired due to vision and memory degeneration; in addition, they easily forget how to operate a computer. With respect to their psychological conditions, the elderly are more likely to feel depressed in hard times, such as when they lose family members, friends, social roles, or physical functions. Thus, according to Shibata and Wada [22], some recreational activities may be adopted as an option when communicating with the elderly. Chatman [23] also found that some of the elderly in a community were so afraid of being sent to nursing institutions after retirement that they did not want to share their health conditions with others, even pretending to look healthy.

Research on the aging of seniors revealed that despite the physical and mental changes of elderly people, an aging society is blessed with a remarkable advantage—abundant older human resources. The most valuable asset of the elderly is their wealth of work and life experience, accumulated by living through diverse situations. If older people can participate in more meaningful activities, their experiences can be shared with younger people to promote social progress. Moreover, their physical and mental health will be improved, reducing medical expenditures.

### 1.3.2. Intergenerational Learning

Intergenerational learning programs were implemented in 1963 when the P.K. Yonge Developmental Research School in the United States developed the "Adopt a Grandparent Program". Thereafter, many colleges and universities have begun studying and implementing intergenerational learning programs. In response to generational estrangement caused by population aging, intergenerational learning, which refers to the establishment of mutual learning between the older and younger generations, has emerged globally [5]. Intergenerational learning has been regarded as an informal activity passed down from one generation to another for centuries—a culture bridging tradition and modernization [24]. In modern times, influenced by an increasingly complex society, intergenerational learning is no longer limited to the family, but its influence extends to society and improves the overall social value [25].

Ames and Youatt [4] proposed a comprehensive selection model for intergenerational learning and service activities in classifying conventional intergenerational learning, which can help planners design more diverse and richer learning activities. The model can be divided into three parts: the middle generation, program categories, and selection criteria. In addition, Ohsako [26] divided intergenerational learning into four models/profiles: (1) older adults serving/mentoring/tutoring children and the youth; (2) children and the youth serving/teaching older adults; (3) children, the youth, and older adults serving the community/learning together for a shared task; and (4) children, the youth, and older adults engaged in informal leisure/unintentional learning activities.

Over the past few years, the degree of interaction has been considered a classification criterion. Kaplan [5] thought that this criterion could more effectively explain the positive or negative results of intergenerational learning, and classified intergenerational programs and activities accordingly into the following seven different levels of intergenerational engagement, ranging from the least engaged initiatives (level 1 below) to those that promote intensive contact and ongoing opportunities for intimacy (level 7 below): (1) learning about other age groups; (2) seeing the other age group but at a distance; (3) meeting each other; (4) annual or periodic activities; (5) demonstration projects; (6) ongoing intergenerational programs; and (7) ongoing, natural intergenerational sharing, support, and communication. Ames and Youatt [4] put forward the most comprehensive selection model of intergenerational learning and service activities, while Ohsako [26], from a different perspective, enabled planners to design diverse and interesting activities. On the other hand, Kaplan [5] used the interaction degree as a classification criterion, and discussed how interactive methods are generated and how deep and sustainable interactions are conducted based on commonality.

Furthermore, Ames and Youatt [4] found that conventional intergenerational learning activities were mostly service-oriented. Thus, the value and significance of activities to participants must be taken into account when selecting and evaluating the topics involved in the intergenerational activities. A good intergenerational program should not only meet the expected goals, but also provide balanced and diverse activities to participants. Moreover, it is important to introduce digital technology to make intergenerational activities more interactive and recreational for the two generations.

Finally, it is worth noting that some scholars have investigated the applications of group learning or education from wider points of view. For example, Kyrpychenko et al. [27] studied the structure of communicative competence and its formation while teaching a foreign language to higher education students. The results of the questionnaire survey of the students' responses provided grounds for the development of experimental methods for such competence formations by future studies. Kuzminykh et al. [28] investigated the development of competence in teaching professional discourse in educational establishments, and showed that the best approach was to adopt a model consisting of two stages based on self-education and group education. The research results revealed that communicative competence may be achieved through a number of activities that may be grouped under four generic categories. Singh et al. [29] proposed an intelligent tutoring system named "Seis Tutor", which can offer a learning environment for face-to-face tutoring. The performance of the system was evaluated in terms of personalization and adaptation through a comparison with some existing tutoring systems, leading to a conclusion that 73.55% of learners were strongly satisfied with artificial intelligence features. To improve early childhood education for social sustainability in the future, Oropilla and Odegaard [30] suggested the inclusion of intentional intergenerational programs in kindergartens, and presented a framework that featured conflicts and opportunities within overlapping and congruent spaces to understand conditions for various intergenerational practices and activities in different places, and to promote intergenerational dialogues, collaborations, and shared knowledge.

### 1.3.3. Virtual Reality

The concept of virtual reality (VR) was first proposed in 1950, but was not materialized until 1957, when Heilig [31] developed Sensorama, the first VR-based system with sight, hearing, touch, and smell senses, as well as 3D images. In 1985, Lanier [32] stated that VR must be generated on a computer with a graphics system and various connecting devices in order to provide immersive interactive experiences. Burdea [33] proposed the concept of the 3I VR pyramid, and maintained that VR should have three elements: immersion, imagination, and interaction. Currently, VR can be classified into six categories according to design technology and user interfaces: (1) desktop VR; (2) immersion VR; (3) projection VR; (4) simulator VR; (5) telepresence VR; and (6) network VR. In recent years, many experts and scholars have therefore assumed that VR technology can improve participants' attitude toward and interest in learning, and that the interactions in learning tasks can be strengthened in an immersive environment to improve learning effects [9,34].

Many applications have been developed using VR technology in the past. A specific direction of VR applications for human welfare is the use of VR in the healthcare domain. In this research direction, Nasralla [35] studied the construction of sustainable patient-rehabilitation systems with IoT sensors for the development of virtual smart cities. The research results showed that the proposed approach could be useful in achieving sustainable rehabilitation services. In addition, Sobnath et al. [36] advocated the use of AI, big data, high-bandwidth networks, and multiple devices in a smart city to improve the life of visually impaired persons (VIPs) by providing them with more independence and safety. Specifically, the uses of strong ICT infrastructure with VR/AR and various wearable devices can provide VIPs with a better quality of life.

Table 1 summarizes the following key points in this study, based on relevant cases and the literature integrating both VR and the elderly: (1) VR is proven effective and has a positive effect on improving the body and cognition of the elderly; (2) VR-based learning activities can effectively enhance the learners' interest and help them make more progress; and (3) older people can adapt to VR, which can stimulate their memory according to their familiarity with a given scene, thereby achieving the effect of memory therapy [12]. As Davis mentioned regarding the technology-acceptance model (TAM) [37–39], users' acceptance of science and technology is affected by "external factors", such as their living environment, learning style, and personal characteristics. Thus, it is impossible to determine whether the above method can have the same effect on the elderly in a specific society or country. In order to determine the usefulness and acceptability of VR for older Taiwanese people, Syed Abdul et al. [40] invited 30 older people over 60 years old in Taiwan to experience nine different VR games within six weeks, with each experience lasting 15 min. Then, they analyzed the users' performances with a scale based on the TAM model by Davis. The results showed that the elderly enjoyed the experience, finding VR useful and easy to use, which indicated that the elderly held a positive attitude toward the new technology.


**Table 1.** Relevant cases in which VR was used for the elderly.

| Case | Description | User Interaction |
| --- | --- | --- |
| Rendever, 2018 [11] | A memory therapy that used 3D technology to recreate meaningful scenes or places the elderly want to visit was developed, providing immersive experiences through VR. | Users wore a head-mounted display and used a controller to move around, and located their hometowns in a 3D street view. |
| George and David, 2017 [41] | A VR application was developed to help the elderly to slow down their mental aging and treat relevant diseases. After performing practical tasks in a 3D VR-based environment, the elderly held a more positive attitude toward VR. | Users could see images by wearing a simple head-mounted VR display without any external operation. |

### 1.3.4. Tangible User Interfaces

The use of tangible user interfaces (TUIs) is a brand-new user-interfacing concept proposed by Ishii and Ullmer [42]. Unlike graphical user interfaces (GUIs), TUIs emphasize using common objects in daily life as the control interface, making the control action beyond screen manipulations. They allow users to operate the interface more intuitively by moving, grabbing, flipping, and knocking, and in other ways that people think are feasible to control the human–computer interface. Furthermore, TUIs grant a tangible form to utilize digital information or run programs [43].


More specifically, TUIs enable digital information to show in a tangible form. A digital interface consists of two important components: input and output, also known as control and representation. Control refers to how users manipulate information, while representation refers to how information is perceived. The tangible form shown in the use of TUIs may be regarded as the digital equivalent to control and representation, and the tangible artifacts operated in applying TUIs may be considered as devices for displaying representation and control. In other words, TUIs combine tangible representation (e.g., objects that can be operated by hand) with digital representation (e.g., images or sounds), as shown in Figure 1.

**Figure 1.** The conceptual framework of TUIs.

The concept of TUIs has been constantly discussed in man–machine interfacing seminars, and has been applied in various fields such as education and learning [44–46], music and entertainment [47–49], and professional solutions [50–52]. From the environmental psychology perspective, psychologists believed that TUIs had a tangible form and took advantage of the "affordance" of objects [53]. Some scholars, who employed the perceptual-motor theory as their research core, focused on user actions generated between them and TUIs, as well as on the dynamic presentation of TUIs [54–57].

Furthermore, TUIs provide a simpler and more intuitive way to help users accomplish goals. By combining physical manipulations with convenient digital technology, a TUI serves as a bridge connecting users and digital content. Moreover, TUIs have expanded the concept of interface design. Thus, scholars have conducted several discussions of the theoretical basis and scope of the conceptual framework of TUIs. In this study, it is proposed to apply TUIs in intergenerational learning activities, enabling users to intuitively operate interfaces and conduct their distinctive ways of presentation.

### *1.4. Brief Description of the Proposed Research and Paper Organization*

Figure 2 shows the framework of this study, which was mainly derived from the literature related to the elderly's learning and technology applications. The design concept and system development were determined through interviews with experts. After constructing the system, actual intergenerational learning activities were held, and a questionnaire survey and participant interviews were conducted, with the results being analyzed and the effectiveness of the system evaluated.


**Figure 2.** The framework of this study.

In this study, the research was mainly focused on intergenerational learning, and a novel tangible VR-based interactive system was proposed for such activities between the two generations of grandparents and grandchildren. Although previous studies, as mentioned in the previous literature survey, had various designs of intergenerational learning activities, the special case of adopting a traditional show performance, namely, glove puppetry, as the activity for intergenerational learning between the generations was not seen in the existing studies, which highlights the unique value of the proposed system and the resulting findings of this study.

The remainder of this manuscript is organized as follows. The proposed methods are described in Section 2. In Section 3, the results of our study are presented. The discussions of the experimental results are described in Section 4. Finally, the conclusions and discussions about the merits of the proposed system are included in Section 5.

### **2. Methods**


A literature review, interviews with experts, the construction of a prototype system, a questionnaire survey, and interviews with participants were carried out in this study. Then, the effects of the system were analyzed. In particular, this study can be divided into four phases, as shown in Figure 3. In Phase I, the research objectives, scope, and methods were determined. Then, the relevant literature was analyzed, with key information on intergenerational learning, VR, and TUIs being collected. In Phase II, several experts in different fields were invited and interviewed to provide their comments on relevant topics that were determined according to the results of the literature review. The design principles for a VR-based intergenerational learning system using TUIs were derived based on the interview results. According to the principles, a prototype system for intergenerational learning was constructed, with relevant learning activities designed. In Phase III, users of two generations were invited to participate in the interactive intergenerational learning activities. In addition, comment data were collected through interviews with the participants and questionnaire surveys on the two aspects of intergenerational communication and the elderly's attitude. In Phase IV, the collected data were used for statistical evaluations of the effectiveness of the developed system using the SPSS and AMOS software packages, and conclusions and suggestions were put forward finally.

It is noteworthy that the proposed system requires users to interact by actually experiencing the activities; thus, the users, especially the elderly, must have sound audio–visual ability and be able to perform hand movements and gestures and to beat gongs and drums.


**Figure 3.** The flowchart of this study.

### *2.1. Interviews with Experts*

Interviews with experts, as a form of interviews, allow interviewers to communicate with interviewees to collect research data [58]. Three experts, who were prominent in the fields of the elderly's learning, children's education, and interaction design, were invited (see Table 2 for the experts' respective backgrounds). The experts were interviewed in a semi-structured form, through which they expressed their opinions on four aspects: (1) the importance of intergenerational learning; (2) the design of intergenerational learning activities; (3) the introduction of digital technology into intergenerational learning; and (4) the introduction of VR into intergenerational activities. The results of the interviews were used as a reference for designing the desired system (see Section 3 for details).

**Table 2.** Backgrounds of experts accepting interviews in this study.


### *2.2. Prototype Development*

Boar's prototyping model [59] was utilized in this study to understand the users of a product in the shortest time. Through the constructed prototype, the users' responses were repeatedly evaluated, with their feedback serving as the basis for the designer to modify the product. In this way, the designer could have a clear idea of the users' needs and finally create products that met their expectations. Beaudouin-Lafon and Mackay [60] believed that a prototype could allow for a concrete presentation of an abstract system, provide information during the design process, and help the designer choose the best design scheme. Thus, based on Boar's prototyping model, in this study the development of the desired intergenerational learning system based on VR and TUIs was planned in five steps, as follows: (1) planning and analyzing requirements; (2) designing the VR-based intergenerational learning system; (3) establishing a prototype of the VR-based intergenerational learning system; (4) evaluating and modifying the prototype system; and (5) completing the system development.

### *2.3. Questionnaire*

The questionnaire for the system-effectiveness evaluation was modified several times after system development and before the experiment commenced. Fine adjustments were made later according to the evaluated effectiveness of the system to increase the stability of the system's devices. The formal experimental questionnaire adopted in this study was divided into two parts. The first part adopted the Global Perceptions of Intergenerational Communication (GPIC) scale of McCann and Giles [61], in which questions were designed to learn how the two generations thought of each other in the intergenerational learning activities. In the second part, the Elderly's Attitude Scale proposed by Lu and Kao [62] was employed to determine children's attitudes toward the elderly in the experience. The answers to the questions in the questionnaire were given on a five-point Likert scale, with 1 meaning "strongly disagree" and 5 meaning "strongly agree" (see Section 3 for the detailed analysis and description of the questionnaire).
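
The statistical evaluation itself was carried out with SPSS and AMOS; purely as an illustration of how such five-point Likert data can be scored and checked for internal consistency, the following Python sketch computes per-scale means and Cronbach's alpha. The file name and the item-column prefixes (`gpic_` and `att_`) are assumptions made for the example, not names used in the study.

```python
# Illustrative only: the study used SPSS/AMOS; this sketch merely shows how
# five-point Likert responses could be scored and checked for reliability.
# The file name and column prefixes ("gpic_", "att_") are assumptions.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items (rows = respondents)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

responses = pd.read_csv("questionnaire_responses.csv")  # 1 = strongly disagree ... 5 = strongly agree

gpic_items = responses.filter(like="gpic_")      # Global Perceptions of Intergenerational Communication
attitude_items = responses.filter(like="att_")   # Elderly's Attitude Scale

for name, items in [("GPIC", gpic_items), ("Attitude", attitude_items)]:
    print(f"{name}: mean = {items.mean(axis=1).mean():.2f}, "
          f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
```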

### *2.4. Interviews with Participants*

After conducting the questionnaire survey, five groups of 10 participants were randomly selected for interviews. This qualitative research method focused on the interviewees' self-perception and their descriptions of their life experiences. In particular, researchers could comprehend the respondents' cognition of facts based on their responses [63]. Semi-structured interviews are flexible, allowing interviewees to convey their most authentic cognition and feelings [64]. They were therefore adopted in this study to explore the emotional exchange between the two generations and understand their perceptions of the VR-based interactive intergenerational learning system.

In addition, based on the interaction design principles proposed by Verplank [65], the following topics were determined for use in the interview conducted in this study: (1) system operation; (2) experiences and feelings; and (3) emotions and experience transmission. As assumed by Verplank [65], the most important thing in interaction design was participants' physical and mental feelings; therefore, when designing an interaction device, designers must consider users' system operations (DO), experiences and feelings (FEEL), and emotions and experiences (KNOW). The detailed interviews with users are presented in Section 3 below.

### **3. Results**

### *3.1. Interviews with Experts*

The questions in the interviews with experts were based on the selection mode of intergenerational learning and service activities proposed by Ames and Youatt [4] and the technology-acceptance model by Davis [37–39] to ensure the topics and direction of this study. The questions mainly covered four dimensions. Dimension One was mainly concerned with the importance of intergenerational learning, including its significance and influence in today's society. Dimension Two involved the design of intergenerational learning activities, focusing on content presentation and other matters needing attention. Dimension Three involved the application of digital technology in intergenerational learning, focusing on the introduction method and views on the work. Dimension Four involved the introduction of VR to intergenerational activities, focusing on views of the interactive experiences, developments, and trends of VR-based intergenerational activities. Table 3 shows the outline and summary of the interviews with the experts.

**Table 3.** Outline and summary of the interviews with experts.


In the interviews with experts, four key points were summarized. First, intergenerational learning plays an important role for the elderly and the young, bridging the gap between them and allowing them to learn from each other. Second, owing to the changes in the times and backgrounds, the progress of technology has widened the gap between the two generations. Therefore, if the elderly can actively learn about technology products and have more conversations with their grandchildren, they will find it easier to develop closer relationships with the latter. Intergenerational learning activities based on digital technology are more appealing to the young. Third, when designing intergenerational activities, the backgrounds and interests of the two generations should also be taken into account, and their common points should be identified to ensure participants' full engagement in the interaction. Meanwhile, the difficulty of activities must be appropriate so that the two generations can have fun during the interaction. Fourth, concerning the interface design of intergenerational activities, the differences between the elderly and children regarding the font size, color, and visual and sound effects chosen must be considered.

In this study, the key factors of intergenerational learning, the introduction of technology, and the application of VR to system development and design were summarized through interviews with experts. According to the recording, analysis, and sorting of the interview information, the key points were used as a reference for the subsequent system design.

### *3.2. System Design*

In this study, a novel system for intergenerational learning activities was designed. By integrating conventional learning activities with the uses of VR and TUIs, the communication, learning, and interaction between the two generations were strengthened through multiple interaction techniques and immersive experiences.

### 3.2.1. The Design Concept of the Intergenerational Learning Activities

According to the literature review, it is known that the topics involved in intergenerational learning activities must be familiar to the elderly and interesting to the young in order to encourage them to have more interactions. Therefore, in this study the traditional *glove puppetry* was chosen as the topic for the intergenerational learning activities. Glove puppetry has a long history in Taiwan, reaching its peak between the 1950s and 1960s. Hence, the elderly are rather familiar with it. On the other hand, according to the interviews with experts conducted in this study, experts majoring in children's education deemed that children are easily attracted by cartoon characters and are generally interested in Muppets and dolls. Thus, they are likely to be curious about puppetry plays due to the cool sound and light effects created in such plays. Lastly, based on the selection model of intergenerational learning and service activities by Ames and Youatt [4], puppetry-based intergenerational activities suitable for this study were designed.

The theme of the designed activities was chosen to be "Recall the Play", in which the word "recall" in Chinese has the same pronunciation as the word "together", aiming at encouraging the elderly to perform glove puppetry together with the grandchildren. The elderly could recall the memories and feelings of watching puppet shows during their youth through interacting with their grandchildren. Meanwhile, sharing memories and stories of the past, as well as allusions to puppet shows, with grandchildren was a way of promoting cultural inheritance.

The activities were designed in such a way that they can be conducted through cooperation of the two generations, similar to the proscenium and backstage of a traditional open puppet-show theater, such as the example shown in Figure 4. In particular, while being assisted by instructions shown on a visible screen, a participant on the proscenium should make the corresponding gestures to manipulate the puppet in an animation. In the meantime, the other participant backstage must hit the corresponding musical instrument as instructed. If the participants follow the instructions correctly in time, the corresponding animation will be displayed correctly and smoothly.


**Figure 4.** Real proscenium and backstage performances of a traditional glove puppet show: (**a**) proscenium; (**b**) backstage.


More specifically, regarding "Recall the Play" as a two-person cooperative experience system, its design concept is as illustrated in Figure 5, and mainly consists of two parts: *puppetry* and *sound effects*. In the puppetry part, an operator who is assumed to be on the proscenium wears an HTC VIVE headset to watch the performance of the role of a character, named *Good Man*, fighting another character, named *Evil Person*, in a puppet play from the first-person perspective, totally immersed in an on-the-spot experience. In addition, a Leap Motion sensor was mounted for gesture detection, allowing the operator to simulate the puppet gestures more intuitively. The sound effects of the second part were created to represent the soul of the puppet play in the real performance. An *interactive installation* with TUIs composed of a real drum, a gong, and a sound-sensing device was used to simulate the backstage performance in the real puppet show. The overall interactive scenario can be seen both on the headset screen and on an open system screen, which simulates what occurs during the performance in a traditional open puppet theater.

**Figure 5.** Schematic diagram of the proposed "Recall the Play" system.

Regarding the gesture-recognition method adopted in this research, two common gestures used in traditional glove puppet manipulation, namely, "rotation" and "flipping", were simulated, using position vector and rotation angle matching techniques (please see Algorithm 1) provided by the Leap Motion sensor to realize the gesture-recognition task. This is a feature of the gesture-recognition task implemented in this study. Since only two kinds of gestures needed to be classified, the classification process is not complicated, and in the actual operations, almost all classification outcomes were correct, yielding good user performance results.

### 3.2.2. System Architecture

For system development to simulate the real performance in a traditional open theater as described previously, the hardware of the proposed system was designed to include: (1) a Leap Motion sensor for human gesture detection; (2) an HTC VIVE headset for immersive VR; (3) an Arduino chipset for sound-signal analysis; (4) a microphone module for sound input; (5) a drum and a gong for backstage environment simulation; (6) a PC for system control; and (7) a screen for system display, which is subsequently called the *system screen*. Note that there is another screen within the VIVE headset that is subsequently called the *headset screen*. The software used in the system includes Unity 3D, Blender, Adobe Illustrator, Adobe Photoshop, and Adobe After Effects. Figure 6 shows the architecture of the proposed "Recall the Play" system.


**Figure 6.** The framework of the proposed "Recall the Play" system.


Two types of interfacing—gesture detection and the use of the interactive installation—constitute the main input interaction part of the system. Specifically, in the gesture-detection process, the Leap Motion sensor detects the operator's hand movements, identifies the motion type, and sends the corresponding signals to the PC end. Meanwhile, the VIVE headset receives infrared signals emitted by two position-fixed infrared transmitters, and sends them to the PC end for relative spatial positioning of the operator. This scheme is needed for precise VR environment creation on the VIVE internal display.

Regarding the interface when using the interactive installation, after receiving sound signals emitted by hitting the drum or gong, the microphone module first transmits them to the Arduino chip for preliminary analysis. Then, useful signals are filtered out and sent to the PC end for further processing. It is noteworthy that all signals are of the one-way output style. Finally, the PC analyzes the received signals and generates corresponding animation effects on the VIVE headset screen and the system screen on an external monitor.
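
The paper does not specify the protocol between the Arduino chip and the PC; the following minimal Python sketch is only an illustration of how such one-way, pre-filtered signals could be received on the PC end, assuming the Arduino forwards a simple text tag for each valid hit (the port name and tags are hypothetical).

```python
# Minimal sketch, not the authors' implementation: assumes the Arduino sends a
# text tag ("DRUM" or "GONG") over a serial port each time a valid hit is filtered out.
import serial  # pyserial

PORT = "COM3"   # assumed port name; depends on the actual setup
BAUD = 9600

def on_instrument_hit(tag: str) -> None:
    """Placeholder for the PC-side logic that triggers the corresponding animation."""
    print(f"Trigger animation for: {tag}")

with serial.Serial(PORT, BAUD, timeout=1) as link:
    while True:
        line = link.readline().decode("ascii", errors="ignore").strip()
        if line in ("DRUM", "GONG"):   # one-way signals from the interactive installation
            on_instrument_hit(line)
```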

The hardware devices used in the construction of the proposed system, as described above, are all replicable, and so the system essentially can be built in a DIY manner. In addition, the interactive interface operations of gong or drum beating, as well as the hand gestures, were simple to perform. Therefore, the cost of implementing the system is low, and if the proposed system is to be used in future studies, there should be no technical problem with replications or applications.

### 3.2.3. Main Technologies

The main technologies used in the proposed "Recall the Play" system include two parts: gesture detection and sound sensing. Gesture detection mainly identifies the user's hand movements. The corresponding interactive script will take effect to play the correct animations if the movements and times are in line with the given instructions. On the other hand, the sound-sensing devices check whether the user hit the correct musical instrument in the interactive installation backstage. The corresponding interactive animation will start if the instrument and beating times are consistent with the given instructions. The development environment, as well as the hardware and software used, are shown in Figure 6. Gesture detection and sound-sensing techniques using the related devices are described respectively as follows.
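
The matching logic is not given in code form in the paper; the sketch below (with hypothetical fields and timing values) merely illustrates the kind of check described above, in which an animation is triggered only when the detected action matches the instructed one and occurs within its allotted time window.

```python
# Illustrative sketch of the instruction-matching idea described above; the
# Instruction fields and example times are assumptions, not the authors' code.
from dataclasses import dataclass

@dataclass
class Instruction:
    action: str          # e.g., "rotate", "flip", "drum", "gong"
    start_time: float    # earliest acceptable time, in seconds from the start of the play
    end_time: float      # latest acceptable time for the action

def matches(instruction: Instruction, detected_action: str, detected_time: float) -> bool:
    """Return True if the detected action should trigger the next animation."""
    return (detected_action == instruction.action
            and instruction.start_time <= detected_time <= instruction.end_time)

# Example: the script expects a gong hit between seconds 12 and 14 of the play.
step = Instruction("gong", 12.0, 14.0)
print(matches(step, "gong", 13.2))   # True -> play the corresponding animation
print(matches(step, "drum", 13.2))   # False -> no animation
```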

### (1) Gesture Detection


Leap Motion, as shown in Figure 7, is a gesture-sensing device developed by Ultraleap that was adopted in the proposed "Recall the Play" system. To capture the user's hand movements and identify the corresponding gestures in real time, the Leap Motion sensor internally uses two infrared cameras and three infrared LEDs, with the two cameras simulating the binocular stereo vision of human eyes. With Leap Motion, hand-movement data, especially those of the fingers, can be acquired at 200 frames per second (FPS) during the gesture-detection process and converted into the position coordinates of the hand and fingers. Specifically, in this study, when the user's hands are rotated and moved, Leap Motion captures the desired data and uploads them to the PC every two seconds. The PC then converts the data into position coordinates, from which the corresponding gestures are defined according to the changes along the coordinate axes. Lastly, the PC compares the uploaded and the predefined gesture values to check whether they match, yielding the result of gesture detection and recognition.

The proposed "Recall the Play" system mainly simulates the gestures of hand and finger manipulations of people using conventional puppets. The two major types of hand gestures selected for use in the system are *finger rotation* and *finger flipping*. These gestures are characterized by the rotations of palm joints. In order to more accurately distinguish the differences in finger rotations between the two gestures, the proposed system mainly use a rotation angle detection algorithm based on the quaternion and Euler's rotation The proposed "Recall the Play" system mainly simulates the gestures of hand and finger manipulations of people using conventional puppets. The two major types of hand gestures selected for use in the system are *finger rotation* and *finger flipping*. These gestures are characterized by the rotations of palm joints. In order to more accurately distinguish the differences in finger rotations between the two gestures, the proposed system mainly use a rotation angle detection algorithm based on the quaternion and Euler's rotation theorems. It is noted that these are commonly used principles, given that the manner in which they are expressed is simple.

Specifically, by Euler's rotation theorem, when an object rotates in an arbitrary 3D space with at least one point fixed, the motion can be interpreted as a rotation around a fixed axis. This concept, together with that of quaternion rotation, was used for gesture detection in this study. A quaternion represents a state of rotation in a 3D space with four numbers, namely, the three coordinates *x*, *y*, and *z* of the rotation axis, as well as the rotation angle *θ*. Figure 8 shows the schematic diagram of the quaternion rotation matrix. A quaternion can be expressed as *q* = ((*x*, *y*, *z*), *θ*) = (*u*, *θ*), where the vector *u* = (*x*, *y*, *z*) represents the coordinate values *x*, *y*, and *z* along the X, Y, and Z axes, respectively, and *θ* is a real number representing the rotation angle. In Table 4, a simple demonstration of the previously described gesture-recognition process of the proposed system is shown, which depicts how to recognize a *finger-rotation gesture* and a *finger-flipping gesture*.
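To make the (*u*, *θ*) notation above concrete, the following short Python sketch, given as an illustrative aid rather than as the authors' actual implementation, converts an axis-angle pair into a standard unit quaternion and computes the angular difference between two such rotations, which is the kind of quantity a quaternion-matching step can threshold; the 10-degree tolerance used in the example is an assumption.

```python
# Illustrative sketch: axis-angle (u, theta) -> unit quaternion, and the angular
# difference between two rotations (useful for quaternion matching with a tolerance).
# The tolerance below is an assumed example value, not the system's actual setting.
import math

def axis_angle_to_quaternion(u, theta):
    """u = (x, y, z) rotation axis, theta = rotation angle in radians."""
    x, y, z = u
    norm = math.sqrt(x * x + y * y + z * z) or 1.0   # avoid division by zero
    s = math.sin(theta / 2.0) / norm
    return (math.cos(theta / 2.0), x * s, y * s, z * s)   # (w, x, y, z)

def angular_difference(q1, q2):
    """Smallest rotation angle (in radians) taking q1 to q2."""
    dot = abs(sum(a * b for a, b in zip(q1, q2)))        # |<q1, q2>|
    return 2.0 * math.acos(min(1.0, dot))

# Example: are two detected hand rotations "the same" within 10 degrees?
q_ref = axis_angle_to_quaternion((0.0, 1.0, 0.0), math.radians(90))
q_obs = axis_angle_to_quaternion((0.0, 1.0, 0.0), math.radians(97))
print(angular_difference(q_ref, q_obs) <= math.radians(10))   # True
```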

**Figure 8.** The schematic diagram of the quaternion rotation matrix.

**Table 4.** An illustration of gesture recognition.

Algorithm 1 shows the procedure of gesture detection implemented in the proposed system. The algorithm mainly detects a hand gesture (finger rotation or finger flipping) by matching the corresponding position coordinates and rotation-angle values of the fingers at three checkpoints *A* through *C*. Note that at each checkpoint, the quaternion of the input hand movement is matched against the reference one, which is called quaternion matching.


### **Algorithm 1.** Gesture Detection Algorithm.

**Input:** image frames of the finger movements of the operator on the proscenium captured by the cameras of the Leap Motion.
**Output:** decision about whether a finger rotation or flipping gesture *G* has been shown by the operator on the proscenium, where the checkpoints of *G* are *A*, *B*, and *C* in order.
**Method:**
Step 1. Detect the finger movements in each image frame *F* by Leap Motion and compute the corresponding position vector *u* and rotation angle *θ* of the quaternion of the hand in *F*.
Step 2. Judge whether *u* and *θ* reach Checkpoint *A* by corresponding quaternion matching: if so, go to Step 3; else, repeat this step.
Step 3. Judge whether *u* and *θ* reach Checkpoint *B* by corresponding quaternion matching: if so, go to Step 4; else, repeat this step.
Step 4. Judge whether *u* and *θ* reach Checkpoint *C* by quaternion matching: if so, go to Step 5; else, repeat this step.
Step 5. Confirm that gesture *G* is completed and go back to Step 1 to detect another gesture.
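Putting Algorithm 1 together with the quaternion-matching idea described earlier, the following Python sketch illustrates how the three checkpoints could be matched in sequence against a stream of (*u*, *θ*) readings. The checkpoint values, the tolerance, and the `read_axis_angle()` callable are hypothetical stand-ins, since the actual reference values and the Leap Motion interface of the system are not reproduced here.

```python
# Illustrative sketch of Algorithm 1: sequentially match checkpoints A, B, C.
# Checkpoint values, the tolerance, and read_axis_angle() are hypothetical.
import math

def to_quat(u, theta):
    x, y, z = u
    n = math.sqrt(x * x + y * y + z * z) or 1.0
    s = math.sin(theta / 2.0) / n
    return (math.cos(theta / 2.0), x * s, y * s, z * s)

def angle_between(q1, q2):
    dot = min(1.0, abs(sum(a * b for a, b in zip(q1, q2))))
    return 2.0 * math.acos(dot)

TOLERANCE = math.radians(15)          # assumed matching tolerance

CHECKPOINTS = [                       # hypothetical reference (axis, angle) pairs for A, B, C
    ((0.0, 1.0, 0.0), math.radians(30)),
    ((0.0, 1.0, 0.0), math.radians(60)),
    ((0.0, 1.0, 0.0), math.radians(90)),
]

def detect_gesture(read_axis_angle):
    """read_axis_angle() is a hypothetical callable returning the current (u, theta)."""
    for ref_u, ref_theta in CHECKPOINTS:   # Steps 2-4: wait for each checkpoint in order
        while True:
            u, theta = read_axis_angle()
            if angle_between(to_quat(u, theta), to_quat(ref_u, ref_theta)) <= TOLERANCE:
                break                      # checkpoint reached, move on to the next one
    return True                            # Step 5: gesture G completed
```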


### (2) Sound Sensing

For the purpose of sound sensing, an Arduino chipset, as shown in Figure 9a, is connected to the microphone module, as shown in Figure 9b, to determine whether the user has hit a specified instrument (the drum or the gong). At the beginning of the interaction, the microphone module attached to the instrument begins to receive sound signals. Then, the signals are sent to the Arduino for noise filtering, where the real sound of the instrument being hit is examined further. If the volume of the real sound is found to be larger than a preset value, then a decision on drum or gong hitting is made, which is finally sent to the PC for further processing to generate the corresponding animation, as described subsequently.

**Figure 9.** Hardware used for sound processing: (**a**) the Arduino chipset; (**b**) the microphone module.

In more detail, the data of the analog sound waves collected by the microphone were converted in this study into digital signal values to show the range of the sound waves. The signal values stay at around 27 dB indoors. When an instrument (the drum or the gong) of the interactive installation of the proposed "Recall the Play" system, as shown in Figure 10, is hit, the signal value jumps significantly from about 27 dB to more than 60 dB, as shown in Figure 11. Since the volume of different instruments is affected by their loudness, timbre, and other factors, the preset values for sound detection can be adjusted according to the changes in the volume of each instrument to avoid mutual interference between instruments. Furthermore, regarding ambient noise from the surrounding environment, it can be seen in Figure 11 that the sound signals of the ambient and instrument waves are quite different, so the ambient noise is easy to filter out. Table 5 shows the average volume values generated when the drum and the gong are played, and such values may be adopted as the threshold values for noise filtering.
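As a small illustration of this thresholding idea, the sketch below classifies a measured volume value against an assumed ambient level and the per-instrument ranges reported in Table 5. The exact cut-offs used by the system are not published here, so these numbers are only the reported averages reused for demonstration.

```python
# Illustrative volume thresholding: distinguish ambient noise, drum hits, and gong hits.
# The ranges follow Table 5 (drum 30-37 dB, gong 40-48 dB); the ambient level is ~27 dB.
AMBIENT_DB = 27.0        # typical indoor reading reported in the text
DRUM_RANGE = (30.0, 37.0)
GONG_RANGE = (40.0, 48.0)

def classify_volume(db: float) -> str:
    """Return which sound source a measured volume value most likely belongs to."""
    if DRUM_RANGE[0] <= db <= DRUM_RANGE[1]:
        return "drum"
    if GONG_RANGE[0] <= db <= GONG_RANGE[1]:
        return "gong"
    if db <= AMBIENT_DB + 1.0:   # assumed small margin above the ambient level
        return "ambient"
    return "unknown"

for reading in (26.5, 33.0, 45.2):
    print(reading, "->", classify_volume(reading))
```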


**Figure 10.** The gong and drum of the interactive installation of the proposed "Recall the Play" system and the wiring for sound-signal detection and transmission: (**a**) the front view of the installation; (**b**) the wiring.

**Figure 11.** The data graphs of ambient and instrument waves: (**a**) ambient waves; (**b**) instrument waves.

**Table 5.** Volume values of the instruments of the interactive installation.

| Instrument Name | Device Volume Value (dB) |
| --- | --- |
| Drum | 30–37 |
| Gong | 40–48 |

Algorithm 2 shows the procedure of the sound-detection algorithm implemented in this study for the proposed system. The algorithm mainly examines whether the volume of the sound continuously collected by the microphone exceeds a preset value and whether the sound matches that of the instrument specified by the system, to ensure that the *correct* instrument (the drum or the gong) has been hit.

### **Algorithm 2.** Sound Detection and Instrument Beating Verification.

**Input:** (1) the instruction *I* of hitting an instrument (the drum or the gong) specified by the icon appearing on the screens; (2) a preset value *s* for determining whether the instrument is hit *with sufficient loudness*, and (3) the reference signal pattern *P* of the instrument specified by *I*.

**Output:** the decision *success* or *failure*, indicating whether the instrument has been hit correctly or not.

**Method:**

Step 1. Detect the sound *S* emitted around the interactive installation per second using the microphone and the Arduino chipset, and acquire its volume value *v*.
Step 2. Examine whether *v* ≥ *s*: if so, go to Step 3; else, repeat this step until 20 s have passed, and at that time make the decision *failure* and exit, meaning the instruction *I* is not followed in time.
Step 3. If the sound *S* matches the reference signal pattern *P*, then decide that the correct instrument (the drum or the gong) has been hit loudly enough, send the decision *success* to inform the PC for further processing, and jump to Step 1 to continue the next sound detection and instrument beating verification.

### (3) Types of Gestures Designed for Generating Animations

The proposed "Recall the Play" system designed for intergenerational learning combines VR and TUIs as the interface for interactive learning activities. It is noteworthy that the diverse interactions and immersive experiences provided by the system enhance the effects of learning and interaction between the two generations. Four types of interactive gestures, namely, *finger rotation*, *finger flipping*, *drum beating*, and *gong beating*, as shown in Table 6, were designed for use by the participants in the interactive learning activities. The total duration of a cycle of interaction in the animation shown on both the headset screen and the system screen is 120 s, and one of the icons representing the above-mentioned four types of gestures is shown *randomly* every 2 s on the screens for the two operators on the proscenium and backstage to see and follow.

Specifically, the *finger-rotation* and *finger-flipping* gestures are performed by the participant on the proscenium and are verified by gesture detection with the Leap Motion; for example, the flipping fingers represent exerting an invisible force toward the enemy's body to flip him/her up, and the corresponding *flip-causing attack* is a basic fighting style seen in the puppet show. The *drum-beating* and *gong-beating* gestures are performed by the participant backstage, who taps the drum or hits the gong with a stick when the corresponding icon appears on the system screen; the sensed sound incurs an animation of a *hot-palm-wind attack with yellow light* or a *cold-palm-wind attack with blue light*, respectively, pushed by the "Good Man" in white toward the "Evil Person" in brown to incur an internal injury in him, which are the third and fourth typical attacks in the puppet show. These gesture-attack correspondences, together with the icons shown to the operators, are listed in Table 6.

**Table 6.** Four types of interactive gestures used in the proposed system.

### (4) System Flow of Intergenerational Learning via Puppetry

The state diagram of the system flow for experiencing the proposed "Recall the Play" system is shown in Figure 12, and the detailed descriptions of the state transition flows are given in Algorithm 3.

### **Algorithm 3.** State Transitions of the System Flow of the Proposed System.

**Parameters:** (1) the game-playing score *s*, which shows the play skill level with a value ranging from 0 to 60; (2) the cycle time *t*<sub>c</sub> of interaction; (3) the elapsed time *t*<sub>e</sub> in a state.
**Input:** the volume value *v* of the signal of the drum or gong of the interactive installation backstage, collected continuously by the microphone module and filtered by the Arduino chipset, and a preset value *S* for determining whether the instrument is hit.
**Output:** an ending picture of the "Recall the Play" system, showing a stone tablet with an inscription saying that the participants are "awarded" the title of "The Worldwide Supremacy," "The Grandmaster," or "The Promising Talent," depending on the game-playing score *s* obtained by the participants.

A. In the START state:
(1) Set the cycle time *t*<sub>c</sub> = 120 s. *//Assume the time duration of a cycle of interaction to be t<sub>c</sub> = 120 s.*
(2) Set the game-playing score *s* = 0. *//Use s to decide the game-play skill level every 2 s, resulting in 60 decisions in a cycle.*
(3) In this state, show a "welcome screen" on the system screen; and
(i) **if** no sound signal from the interactive installation is detected, **then** follow flow T to stay in the START state;
(ii) **if** successful detection of drum beating is completed twice by Algorithm 2, **then** follow flow A to get into the GAME state.

B. In the GAME state:
(1) Start the users' experiencing of the system, and show one of the icons of the four gestures, gong beating, drum beating, finger rotation, and finger flipping (denoted as *G*, *D*, *T*, and *F*, respectively), randomly every 2 s on the headset screen.
(2) Check the following cases:
(i) **if** *G* is shown on the headset screen, **then** follow flow B to get into the Gong state;
(ii) **if** *D* is shown on the headset screen, **then** follow flow C to get into the Drum state;
(iii) **if** *T* or *F* is shown on the headset screen, **then** follow flow D to get into the Gesture state.

C. In the Gong state: detect the gong sound by Algorithm 2 using the input volume value *v*, and
**if** the sound detection is successful, **then**
(i) increment the game-playing score *s* by one, i.e., set *s* = *s* + 1;
(ii) let the "Good Man" in the animation push a cold palm wind with blue light toward the "Evil Person" to injure him internally; and
(iii) **if** the elapsed time *t*<sub>e</sub> in this state ≥ 2 s and the cycle time *t*<sub>c</sub> < 120 s, **then** follow flow E1 to get into the GAME state; **else** stay in the Gong state;
**else if** the elapsed time *t*<sub>e</sub> in this state ≥ 2 s and the cycle time *t*<sub>c</sub> < 120 s, **then** follow flow E1 to get into the GAME state; *//Keep playing the game.*
**else** follow flow R1 to get into the RESULT state. *//Go to end the game.*

D. In the Drum state: detect the drum sound by Algorithm 2 using the input volume value *v*, and
**if** the sound detection is successful, **then**
(i) increment the game-playing score *s* by one, i.e., set *s* = *s* + 1;
(ii) let the "Good Man" in the animation push a hot palm wind with yellow light toward the "Evil Person" to injure him internally; and
(iii) **if** the elapsed time *t*<sub>e</sub> in this state ≥ 2 s and the cycle time *t*<sub>c</sub> < 120 s, **then** follow flow E2 to get into the GAME state; **else** stay in the Drum state;
**else if** the elapsed time *t*<sub>e</sub> in this state ≥ 2 s and the cycle time *t*<sub>c</sub> < 120 s, **then** follow flow E2 to get into the GAME state; *//Keep playing the game.*
**else** follow flow R2 to get into the RESULT state. *//Go to end the game.*

E. In the Gesture state: detect the gesture of the operator on the proscenium by Algorithm 1, and
**if** the *finger-rotation* gesture is detected successfully, **then**
(i) increment the game-playing score *s* by one, i.e., set *s* = *s* + 1;
(ii) let the "Good Man" in the animation get close to the "Evil Person" to attack him, causing him to turn his body around; and
(iii) **if** the elapsed time *t*<sub>e</sub> in this state ≥ 2 s and the cycle time *t*<sub>c</sub> < 120 s, **then** follow flow E3 to get into the GAME state; **else** stay in the Gesture state;
**else if** the *finger-flip* gesture is detected successfully, **then**
(i) increment the game-playing score *s* by one, i.e., set *s* = *s* + 1;
(ii) let the "Good Man" in the animation get close to the "Evil Person" to attack him, causing him to flip his body up; and
(iii) **if** the elapsed time *t*<sub>e</sub> in this state ≥ 2 s and the cycle time *t*<sub>c</sub> < 120 s, **then** follow flow E3 to get into the GAME state; **else** stay in the Gesture state;
**else if** the elapsed time *t*<sub>e</sub> in this state ≥ 2 s and the cycle time *t*<sub>c</sub> < 120 s, **then** follow flow E3 to get into the GAME state; *//Keep playing the game.*
**else** follow flow R3 to get into the RESULT state. *//Go to end the game.*

F. In the RESULT state:
(1) Use the game-playing score *s* to decide the game-play result, and show the *result picture* on the screens by checking the following cases:
(i) **if** *s* > 48, **then** award the participants the title of "The Worldwide Supremacy";
(ii) **if** 24 < *s* ≤ 48, **then** award the participants the title of "The Grandmaster";
(iii) **if** *s* ≤ 24, **then** award the participants the title of "The Promising Talent."
(2) Show a stone tablet as the result picture with an inscription of the above-awarded title on the screen.

(3) End the game and exit.
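For readers who prefer code, the following compact Python sketch mirrors the overall control flow of Algorithm 3. The detector callbacks, the fixed two-second rounds, and the simplified timing are assumptions made so the flow fits in a few lines; the sketch is not the system's actual implementation.

```python
# Simplified sketch of Algorithm 3's game loop: 60 two-second rounds per cycle,
# a random icon each round, and a title awarded from the final score.
# The detector callbacks and the fixed per-round timing are illustrative assumptions.
import random

ICONS = ["gong", "drum", "finger_rotation", "finger_flipping"]  # G, D, T, F

def award(score: int) -> str:
    if score > 48:
        return "The Worldwide Supremacy"
    if score > 24:
        return "The Grandmaster"
    return "The Promising Talent"

def run_game(detectors, cycle_time_s: int = 120, step_s: int = 2) -> str:
    """detectors maps an icon name to a callable returning True when the
    corresponding gesture or instrument hit is detected within the round."""
    score = 0                                   # GAME state entered after START
    for _ in range(cycle_time_s // step_s):     # 60 rounds of 2 s each
        icon = random.choice(ICONS)             # icon shown on the headset screen
        if detectors[icon]():                   # Gong/Drum/Gesture state detection
            score += 1                          # s = s + 1 on a successful round
    return award(score)                         # RESULT state: show the stone tablet

# Example run with dummy detectors that "succeed" 70% of the time.
dummy = {name: (lambda: random.random() < 0.7) for name in ICONS}
print(run_game(dummy))
```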

**Figure 12.** The state diagram of participants experiencing the proposed system.

### 3.2.4. Visual Design

The visual designs of the two 3D characters, the Good Man and the Evil Person, and the scene of the environment where the war between them takes place in the puppet-show animations are shown in Figure 13. They were designed to simulate the images of the puppets and the environment in the real traditional show *Wind and Clouds of the Golden Glove-Puppet Play*, so as to arouse the memories of the elderly involved in using the proposed "Recall the Play" system.

**Figure 13.** The 3D modeling of the characters and environment scenes of the puppet-show game played on the proposed "Recall the Play" system: (**a**) 3D model of the Evil Person; (**b**) 3D model of the Good Man; (**c**) 3D model of the environment scene.

### 3.2.5. An Example of Results of Running the Proposed "Recall the Play" System

The intermediate interaction results of an example of running the proposed "Recall the Play" system to perform a puppet show are shown in Table 7, which includes the intermediate animations of the major steps in the show. In this example, a grandparent (called **A**) wore a VIVE headset to perform the gestures on the proscenium, and a grandchild (called **B**) hit the drum and gong backstage.

**Table 7.** List of intermediate interaction results of an example of running the proposed system.

| Stage No. | Interaction | Involved Interaction Device | Explanation | Corresponding State in Algorithm 3 |
| --- | --- | --- | --- | --- |
| 1 | **B**: Complete drum beating twice (detected by sound sensing). | Drum and gong | A "welcome screen" is shown on the screens; if successful detection of drum beating is completed twice, the system proceeds to stage 2 to engage the GAME state. | START state |
| 2 | **A**: View the icon appearing in the headset screen (one of the four icons). | VIVE headset | If the icon of the *finger-rotation* gesture is shown, the system proceeds to stage 3; the *finger-flip* icon leads to stage 4, the *drum-beating* icon to stage 5, and the *gong-beating* icon to stage 6. | GAME state |
| 3 | **A**: Complete the finger-rotation gesture (verified by gesture detection). | Leap Motion | Detailed explanations are given in the "Gesture state" in Algorithm 3. | Gesture state for the finger-rotation gesture |
| 4 | **A**: Complete the finger-flipping gesture (verified by gesture detection). | Leap Motion | Detailed explanations are given in the "Gesture state" in Algorithm 3. | Gesture state for the finger-flip gesture |
| 5 | **B**: Conduct drum beating repetitively (verified by sound detection). | Drum | Detailed explanations are given in the "Drum state" in Algorithm 3. | Drum state |
| 6 | **B**: Conduct gong beating repetitively (verified by sound detection). | Gong | Detailed explanations are given in the "Gong state" in Algorithm 3. | Gong state |
| 7 | **A and B**: View the gameplay result (one of the three awards). | VIVE headset and system screens | The game-play score *s* is used to decide the result of the game (i.e., one of three types of awards), and the result picture is shown on the screens; details are given in the "RESULT state" in Algorithm 3. | RESULT state (here, *s* > 48) |

(1) Start the users' experiencing of the system, and show one of the icons of the four gestures, gong beating, drum beating, finger rotation, and finger flipping (denoted as *G*, *D*, *T*, *F*, respectively) randomly every 2 sec. on the headset screen.

(i) **if** *G* is shown on the headset screen, **then** follow flow B to get into the Gong

1. In the puppet-show game, a

**Explanation**

**Corresponding State in Algorithm 3**

START state

(ii) **if** *D* is shown on the headset screen, **then** follow flow C to get into the Drum

grandparent (called **A**) wore a VIVE headset to perform gestures, and a grandchild (called **B**) hit the drum

a "welcome screen" is shown on the

beating is completed twice, then the system proceeds to stage 2 to engage

3. If successful detection of drum

(iii) **if** *T* or *F* is shown on the headset screen, **then** follow flow D to get into the

Detect the gong sound by Algorithm 2 using the input volume value *v*, and

and gong.

screens.

the GAME state.

(2) Check the following cases:

**Involved Interaction Device**

Gesture state.

Drum

state;

state;

C. In the Gong state:

(detected by sound sensing)

**B**: Complete drum beating twice


(3) Types of gestures designed for generating animations

**Stage No.** 

**Gesture 3:**  *Drum beating* 

operators on the proscenium and backstage to see and follow.

**Intermediate Result Interaction**

up.

the effects of learning and interaction between the two generations. Four types of

**Operator:** The participant

**Table 6.** Four types of interactive gestures used in the proposed system.


typical attack in the puppet show.

fourth typical attack in the puppet

show.

**B**: Conduct drum beating repetitively. (verified by sound detection)

**B**: Conduct drum beating repetitively. (verified by sound detection)

show.

1. During the system experiencing process, if the gesture icon of *gong beating* appears on the system screen, the participant backstage has to hit

1. Detailed explanations given in the "Drum state" in Algorithm 3.

1. Detailed explanations given in the "Drum state" in Algorithm 3.

**D** 

internal injury in him.

**Display:** 

*Hot-palm-wind attack with yellow light*

**C**

**Icon:** 

**Gesture 3:**  *Drum beating* 

**Operator:** The participant backstage **Attack 3:**  *Cold-palm-wind attack* 

internal injury in him.

**Display:** 

**B**: Conduct gone beating repetitively. (verified by sound detection)

**B**: Conduct gone beating repetitively. (verified by sound detection)

*with blue light*

**6** 

**6** 

Gong

Gong

Gong state

Gong state

show.

1. The game-play score *s* is used to decide the result of the game (i.e., three types of awards), and the result picture is shown on the screens. 2. Detail explanations are given in the "RESULT state" in Algorithm 3.

1. The game-play score *s* is used to decide the result of the game (i.e., three types of awards), and the result picture is shown on the screens. 2. Detail explanations are given in the "RESULT state" in Algorithm 3.

(4) System Flow of Intergenerational Learning via Puppetry

VIVE headset and system screens

VIVE headset and system screens

**A and B**: View the gameplay result (one of the three awards)

**A and B**: View the gameplay result (one of the three awards)

1. During the system experiencing process, if the gesture icon of *gong beating* appears on the system screen, the participant backstage has to hit

The state diagram of the system flow for experiencing the proposed "Recall the Play" system is shown in Figure 12, and the detailed descriptions of the state transition flows

RESULT state for *s* > 48

RESULT state for *s* > 48

are given in Algorithm 3.

2. The sound of the gong can be sensed by the system to incur an animation of a *cold-palm-wind attack with blue light* from the "Good Man" to the "Evil Person" on the screens. 3. Pushing a cold palm wind to incur an internal injury in the enemy is a fourth typical attack in the puppet

the gong with a stick.

**7** 

**7** 

**Scenario:** 

The "Good Man" in white pushes a *cold palm wind with blue light* toward the "Evil Person" in brown, to incur an

**D** 

*Cold-palm-wind attack* 

internal injury in him.

**Display:** 

*with blue light*

show.

(4) System Flow of Intergenerational Learning via Puppetry

The state diagram of the system flow for experiencing the proposed "Recall the Play" system is shown in Figure 12, and the detailed descriptions of the state transition flows

are given in Algorithm 3.

**Gesture 4:**  *Gong beating* **Icon:** 

**Operator:** The participant backstage **Attack 4:** 

**Gesture 4:**  *Gong beating* **Icon:** 

**5** 

**5** 

The "Good Man" in white pushes a *hot palm wind with yellow light* toward the "Evil Person" in brown, to incur an

**Scenario:** 

are given in Algorithm 3.

**Operator:** The participant backstage **Attack 4:** 

the drum with a stick.

2. The sound of the drum can be sensed by the system to incur an animation of a *hot-palm-wind attack with yellow light* from the "Good Man" to the "Evil Person" on the screens. 3. Pushing a hot palm wind to incur internal injury in the enemy is a third typical attack in the puppet show.

**Scenario:** 

1. During the system-experiencing process, if the gesture icon of *drum beating* appears on the system screen, the participant backstage has to tap

The "Good Man" in white pushes a *cold palm wind with blue light* toward the "Evil Person" in brown, to incur an

The state diagram of the system flow for experiencing the proposed "Recall the Play" system is shown in Figure 12, and the detailed descriptions of the state transition flows

Drum

Drum

(4) System Flow of Intergenerational Learning via Puppetry

the gong with a stick.

Drum state

Drum state

2. The sound of the gong can be sensed by the system to incur an animation of a *cold-palm-wind attack with blue light* from the "Good Man" to the "Evil Person" on the screens. 3. Pushing a cold palm wind to incur an internal injury in the enemy is a fourth typical attack in the puppet

1. Detailed explanations are given in the "Gong state" in Algorithm 3.

1. Detailed explanations are given in the "Gong state" in Algorithm 3.

**Display: Icon:**  backstage Person" in brown, to incur an 2. The sound of the drum can be sensed interactive gestures, namely, *finger rotation*, *finger flipping*, *drum beating*, and *gong beating*, *Sustainability* **2022**, *14*, x FOR PEER REVIEW 24 of 45

1. In the puppet-show game, a

shown on the screens.

*Sustainability* **2022**, *14*, x FOR PEER REVIEW 23 of 45

*Sustainability* **2022**, *14*, x FOR PEER REVIEW 23 of 45

(ii) **if** 24 < *s* ≤ 48, **then** award the participants the title of "The Grandmaster"; (iii) **if** *s* ≤ 24, **then** award the participants the title of "The Promising Talent." (2) Show a stone-tablet as the result picture with an inscription of the above-

(ii) **if** 24 < *s* ≤ 48, **then** award the participants the title of "The Grandmaster"; (iii) **if** *s* ≤ 24, **then** award the participants the title of "The Promising Talent." (2) Show a stone-tablet as the result picture with an inscription of the above-

awarded title on the screen.

*Sustainability* **2022**, *14*, x FOR PEER REVIEW 19 of 45

(3) End the game and exit.

awarded title on the screen.

(3) End the game and exit.

**Algorithm 2.** Sound Detection and Instrument Beating Verification.

3.2.4. Visual Design

The visual designs of the two 3D characters—the Good Man and the Evil Person and the scene of the environment where the war between them was held in the puppet show animations are shown in Figure 13. They were designed in such a way to simulate the images of the puppets and environment in the real traditional show *Wind and Clouds of the Golden Glove-Puppet Play* so as to arouse the memory of the elderly involved in using

the icon appearing on the screens; (2) a preset value *s* for determining whether the instrument is hit *with sufficient loudness*, and (3) the reference signal pattern *P* of the

The visual designs of the two 3D characters—the Good Man and the Evil Person and the scene of the environment where the war between them was held in the puppet show animations are shown in Figure 13. They were designed in such a way to simulate the images of the puppets and environment in the real traditional show *Wind and Clouds of the Golden Glove-Puppet Play* so as to arouse the memory of the elderly involved in using

3.2.4. Visual Design

**Input:** (1) the instruction *I* of hitting an instrument (the drum or the gong) specified by

the proposed "Recall the Play" system.

correctly or not.

**Method:** 

**Algorithm 2.** Sound Detection and Instrument Beating Verification.

**Input:** (1) the instruction *I* of hitting an instrument (the drum or the gong) specified by

the icon appearing on the screens; (2) a preset value *s* for determining whether the instrument is hit *with sufficient loudness*, and (3) the reference signal pattern *P* of the

instrument specified by *I.*

*Sustainability* **2022**, *14*, x FOR PEER REVIEW 19 of 45

**Output:** the decision *success* or *failure*, indicating whether the instrument has been hit

the proposed "Recall the Play" system.

Step 1. Detect the sound *S* emitted around the interactive installation per second using the microphone and the Arduino chipset, and acquire its volume value *v*. Step 2. Examine whether *v* ≥ *s*: if so, go to Step 3; else, repeat this step until 20 sec. has passed, and at that time make the decision *failure* and exit, meaning the instruction

*Sustainability* **2022**, *14*, x FOR PEER REVIEW 20 of 45

Step 3. If the sound *S* matches the reference signal pattern *P*, then decide that the correct

instrument (the drum or the gong) has been hit loudly enough, send the decision *success* to inform the PC for further processing, and jump to Step 1 to continue the

*I* is not followed in time.

**Output:** the decision *success* or *failure*, indicating whether the instrument has been hit

instrument specified by *I.*

correctly or not.

**Method:** 

**(a) (b) (c)** 

next sound detection and instrument beating verification.

up. **Display:** 

**Icon:** *Flip-causing attack* causing him to flip his body

Step 1. Detect the sound *S* emitted around the interactive installation per second using the microphone and the Arduino chipset, and acquire its volume value *v*. Step 2. Examine whether *v* ≥ *s*: if so, go to Step 3; else, repeat this step until 20 sec. has

**Figure 13.** The 3D modeling of the characters and environment scenes of the puppet-show game played on the proposed "Recall the Play" system: (**a**) 3D model of the Evil Person; (**b**) 3D model of

**(a) (b) (c)** 

shown on the screens.

proscenium has to make Gesture 2 to create a corresponding animation **Figure 13.** The 3D modeling of the characters and environment scenes of the puppet-show game played on the proposed "Recall the Play" system: (**a**) 3D model of the Evil Person; (**b**) 3D model of

2. The flipping fingers of this gesture represent exerting an invisible force toward the enemy's body to *flip*

the Good Man; (**c**) 3D model of the environment scene.

The proposed "Recall the Play" system designed for intergenerational learning combines VR and TUIs as the interface for interactive learning activities. It is noteworthy that the diverse interactions and immersive experiences provided by the system enhance the effects of learning and interaction between the two generations. Four types of interactive gestures, namely, *finger rotation*, *finger flipping*, *drum beating*, and *gong beating*, as shown in Table 6, were designed for use by the participants in the interactive learning activities. The total time duration of a cycle of interaction in the animation shown both on the headset screen and the system screen is 120 s. One of the icons representing the abovementioned four types of gestures is shown *randomly* every 2 s on the screens for the two

(3) Types of gestures designed for generating animations

passed, and at that time make the decision *failure* and exit, meaning the instruction

Step 3. If the sound *S* matches the reference signal pattern *P*, then decide that the correct

instrument (the drum or the gong) has been hit loudly enough, send the decision *success* to inform the PC for further processing, and jump to Step 1 to continue the

*I* is not followed in time.

3.2.5. An Example of Results of Running the Proposed "Recall the Play" System

the Good Man; (**c**) 3D model of the environment scene.

The intermediate interaction results of an example of running the proposed "Recall the Play" system to perform a puppet show is shown in Table 7, which includes the

3.2.5. An Example of Results of Running the Proposed "Recall the Play" System

3. The corresponding Attack 2, *flipcausing attack*, is another basic fighting style seen in the puppet

him/her up.

The intermediate interaction results of an example of running the proposed "Recall the Play" system to perform a puppet show is shown in Table 7, which includes the

intermediate animations of the major steps in the show.

next sound detection and instrument beating verification.

*Sustainability* **2022**, *14*, x FOR PEER REVIEW 20 of 45

**Table 7.** List of intermediate interaction results of an example of running the proposed system.

**Scenario:** 

intermediate animations of the major steps in the show.

show.

**Involved Interaction Device**

The proposed "Recall the Play" system designed for intergenerational learning

**Explanation**

The "Good Man" in white pushes a *hot palm wind with yellow light* toward the "Evil

**Involved Interaction Device**

proscenium has to make Gesture 2 to create a corresponding animation

**Corresponding State in Algorithm 3**

**Explanation**

the drum with a stick.

**Corresponding State in Algorithm 3**

**Table 7.** List of intermediate interaction results of an example of running the proposed system.

1. During the system-experiencing process, if the gesture icon of *drum beating* appears on the system screen, the participant backstage has to tap


*Hot-palm-wind attack* 

**4** 

**C**

**Icon:** 

**Gesture 3:**  *Drum beating* 

**Operator:** The participant backstage **Attack 3:**  **Scenario:** 

**Display:** 

**Stage No.** 

**4** 

**Illustration of** 

**Intermediate Result Interaction**

*Sustainability* **2022**, *14*, x FOR PEER REVIEW 20 of 45

*Sustainability* **2022**, *14*, x FOR PEER REVIEW 23 of 45

(ii) **if** 24 < *s* ≤ 48, **then** award the participants the title of "The Grandmaster"; (iii) **if** *s* ≤ 24, **then** award the participants the title of "The Promising Talent." (2) Show a stone-tablet as the result picture with an inscription of the above-

awarded title on the screen.

(3) End the game and exit.

*Sustainability* **2022**, *14*, x FOR PEER REVIEW 24 of 45

*Sustainability* **2022**, *14*, x FOR PEER REVIEW 24 of 45

3.2.4. Visual Design

The visual designs of the two 3D characters (the Good Man and the Evil Person) and of the environment scene where the war between them takes place in the puppet-show animations are shown in Figure 13. They were designed to simulate the images of the puppets and the environment of the real traditional show *Wind and Clouds of the Golden Glove-Puppet Play*, so as to arouse the memories of the elderly participants using the proposed "Recall the Play" system.

**Figure 13.** The 3D modeling of the characters and environment scenes of the puppet-show game played on the proposed "Recall the Play" system: (**a**) 3D model of the Evil Person; (**b**) 3D model of the Good Man; (**c**) 3D model of the environment scene.

3.2.5. An Example of Results of Running the Proposed "Recall the Play" System

The intermediate interaction results of an example of running the proposed "Recall the Play" system to perform a puppet show are shown in Table 7, which includes the intermediate animations of the major steps in the show.

**Table 7.** List of intermediate interaction results of an example of running the proposed system.

| No. | Illustration of Intermediate Result Interaction | Involved Interaction Device | Explanation | Corresponding State in Algorithm 3 |
|---|---|---|---|---|
| | **A**: View the icon appearing in the headset screen (one of the four icons **a**–**d** on the left). | VIVE headset | 1. In the puppet-show game, the grandparent (called **A**) wore a VIVE headset. 2. If the icon of the *finger-rotation* gesture **a** is shown on the headset screen, then the system proceeds to stage 3; if the icon of the *finger-flip* gesture **b** is shown, to stage 4; if the icon of the *drum-beating* gesture **c** is shown, to stage 5; and if the icon of the *gong-beating* gesture **d** is shown, to stage 6. | GAME state |
| | **A**: Complete the *finger-rotation* gesture (verified by gesture detection). | Leap Motion | Detailed explanations are given in the "Gesture state" in Algorithm 3. | Gesture state of the finger-rotation gesture |
| **2** | **A**: Complete the *finger-flip* gesture (verified by gesture detection). **Icon:** *Flip-causing attack*, causing him to flip his body up. | Leap Motion | 1. The participant at the proscenium has to make Gesture 2 to create a corresponding animation shown on the screens. 2. The flipping fingers of this gesture represent exerting an invisible force toward the enemy's body to *flip* him/her up. 3. The corresponding Attack 2, the *flip-causing attack*, is another basic fighting style seen in the puppet show. | Gesture state for the finger-flip gesture |
| **3** | **Gesture 3:** *Drum beating*; **B**: Conduct the drum beating. **Scenario:** The "Good Man" in white pushes a *hot palm wind with yellow light* toward the "Evil Person" in brown, to incur an internal injury in him. | Drum | 1. During the system-experiencing process, if the gesture icon of *drum beating* appears on the system screen, the participant backstage has to tap the drum with a stick. 2. The sound of the drum can be sensed by the system to incur an animation of a *hot-palm-wind attack with yellow light* from the "Good Man" to the "Evil Person" on the screens. 3. Pushing a hot palm wind to incur internal injury in the enemy is a third typical attack in the puppet show. | |
| | **Gesture 4:** *Gong beating* (icon **d**). | | | |
| **8** | Same as 7 | Same as 7 | Same as 7 | RESULT state for 24 < *s* ≤ 48 |
| **9** | Same as 7 | Same as 7 | Same as 7 | RESULT state for *s* ≤ 24 |
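To illustrate how the state flow summarized in Table 7 could be organized in code, the following Python sketch dispatches on the icon shown in the headset and then waits for the corresponding gesture or drum sound; all names here (`ICON_TO_STAGE`, `detect_gesture`, `drum_hit_detected`, and the stage handling itself) are illustrative assumptions, not the system's actual implementation, which used a VIVE headset for display and a Leap Motion controller plus sound sensing for input.

```python
# Sketch of the GAME-state dispatch and the follow-up checks described in Table 7.
# Icon a -> stage 3 (finger rotation), b -> stage 4 (finger flip),
# c -> stage 5 (drum beating), d -> stage 6 (gong beating).

ICON_TO_STAGE = {"a": 3, "b": 4, "c": 5, "d": 6}

def game_state(shown_icon: str) -> int:
    """GAME state: choose the next stage from the icon shown on the headset screen."""
    return ICON_TO_STAGE[shown_icon]

def gesture_state(expected: str, detect_gesture) -> bool:
    """Gesture state: succeed only when the expected hand gesture is verified
    by the gesture detector (the Leap Motion controller in the real system)."""
    return detect_gesture() == expected

def drum_state(drum_hit_detected) -> str:
    """Sound-sensing stage for drum beating: once the backstage participant's
    drum tap is sensed, trigger the hot-palm-wind attack animation."""
    return ("play hot-palm-wind attack with yellow light"
            if drum_hit_detected() else "keep waiting")

# Example: the drum-beating icon 'c' is shown, so the system moves to stage 5
# and waits for the drum sound before playing the corresponding animation.
stage = game_state("c")                 # -> 5
print(stage, drum_state(lambda: True))  # -> 5 play hot-palm-wind attack with yellow light
```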


### *3.3. Experimental Design*

In this study, both VR and TUI technologies were applied to intergenerational learning for the two generations: the grandparent and the grandchild. Additionally, activities on the "Recall the Play" system were displayed in several exhibition halls, including the Bald Pine Forest in Nantou, the Puppets' House in Dounan, Yunlin, and the Dali Community Care Base in Taichung, which are all located in Taiwan, as shown in Figures 14 and 15.

**Figure 14.** Scene setting of the proposed "Recall the Play" system.

**Figure 15.** Experiencing the intergenerational activity on the proposed "Recall the Play" system: (**a**) Case 1; (**b**) Case 2.

The participation of a pair including a grandparent and a grandchild was estimated to last approximately 30 min for each game play on the proposed system. When the grandparent and the grandchild entered the field, the first step taken by the staff of this study was to briefly explain the concepts and procedures of the system to them for five minutes. Then, they joined the intergenerational activity on the "Recall the Play" system for 10 min. After the experiencing process, a questionnaire survey of their opinions was conducted for five minutes. Lastly, five pairs of grandparents and grandchildren, comprising 10 participants, were randomly selected to be interviewed. The aforementioned experimental procedure is shown in Figure 16.

**Figure 16.** Design of experimental procedures.

The length of the total time used for the formal experimental procedure conducted in this study was the result of a deliberate decision for the test groups, which included young children and old people. In general, children cannot concentrate on an activity for very long during the system-experiencing process. On the other hand, the elderly's physical and mental conditions should be taken into consideration in the experiencing process; if the operation time is too long, they might experience slight dizziness and discomfort.

Accordingly, the activity script for the proposed mutual learning activity was designed to not be too complicated, so that the entire formal experimental procedure could be completed within 30 min; the specific time for experiencing the intergenerational activity was set to be 10 min, which was enough to complete the proposed activity.

Regarding the 10 min activity in which one user was conducting a VR experience by wearing a headset, although the user could relax his/her mind and enjoy the virtual immersive environment during this activity, if the audiovisual experience in the VR environment is too strong, it will cause adverse interference for the user. This interference is particularly serious for elderly users, because the visual and auditory senses of the elderly decline with age. Therefore, the VR-experiencing time must not be too long, and so was set to be 10 min, as stated above. In short, the original plan of time usage was considered appropriate. In addition, since the story script was short, a merit followed; namely, it did not require an extensive background environment for conducting the experiment.

In the future, extensions of this study can be designed to have richer story scripts and allow longer activity-experiencing times with more gorgeous background environments for use by a larger range of generations, instead of being limited to the two generations of grandparents and grandchildren.

### *3.4. Questionnaire Survey Results*

A total of 120 copies of questionnaires were collected from 60 pairs of grandparents and grandchildren. Each questionnaire included questions about the participant's basic data as well as the evaluations of the GPIC scale and the elderly's attitude scale. The basic data are shown in Table 8, in which it can be seen that 70% of the participants had never experienced VR before, while 30% of the participants had previously used a VR system.
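As a simple illustration of how the survey tallies reported in this subsection can be computed, the following Python sketch works through the arithmetic; the variable names and the per-question response lists are made-up placeholders, not the study's actual data.

```python
# Illustrative tally of the questionnaire data (the counts follow this subsection;
# the Likert responses below are made-up placeholders).

collected = 120                       # questionnaires collected (60 pairs)
valid = 114                           # valid copies (57 pairs)
print(f"valid rate: {valid / collected:.1%}")             # -> valid rate: 95.0%

never_used_vr = 84                    # 70% of the 120 participants (Table 8)
print(f"never used VR: {never_used_vr / collected:.0%}")  # -> never used VR: 70%

# Per-question means of 5-point Likert responses (e.g., questions S1-S9);
# each inner list is one participant's answers.
responses = [
    [5, 4, 4, 5, 3, 4, 5, 4, 4],
    [4, 4, 5, 5, 4, 3, 4, 5, 4],
]
means = [round(sum(col) / len(col), 2) for col in zip(*responses)]
print(means)
```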

In addition, among the 120 collected questionnaires, 114 copies from 57 pairs were valid, and 6 from 3 pairs were invalid. Each questionnaire also included the questions and evaluations of the two indicators of the GPIC scale and the elderly's attitude scale, with the former indicator involving both the elderly and the children, and the latter concerning only the children. The questions of the two indicators are shown in the second column of Table 9, with questions S1–S9 being related to the first indicator, and T1–T15 related to the second. Some statistics of the collected feedback data of the Likert 5-point
