The Next Generation of Edutainment Applications for Young Children—A Proposal
Abstract
1. Introduction
2. Edutainment Applications for Young Children
3. Next Generation of Edutainment Applications for Young Children
- How can a child’s emotions be automatically identified?
- How can emotion recognition be integrated within an edutainment application?
- How can the edutainment application’s interaction flow be adapted based on the identified emotions?
3.1. Young Children’s Emotions
3.2. Automatic Child Emotion Recognition
3.3. Integrating Emotion Recognition into an Edutainment Application
- the edutainment module, which is still responsible for presenting the learning content and the tasks that aid comprehension of the new knowledge;
- an emotion recognition module, which is responsible for identifying the emotional state of the user;
- a coordinator module, which is responsible for coordinating the other modules;
- a dataset building module, which is responsible for dataset creation and management (for example, adding images of young children and annotating them with the corresponding emotion).
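As an illustration only, the four modules above could be sketched as minimal Python interfaces; all class and method names below are our own assumptions and are not prescribed by the proposal:

```python
from abc import ABC, abstractmethod


class EmotionRecognitionModule(ABC):
    """Identifies the child's emotional state from input data (e.g., images)."""
    @abstractmethod
    def identify_emotion(self, data: bytes) -> str: ...


class EdutainmentModule(ABC):
    """Presents learning content and tasks; adapts the interaction flow."""
    @abstractmethod
    def run_task(self, task_id: str) -> None: ...

    @abstractmethod
    def adapt_interaction(self, emotion: str) -> None: ...


class DatasetBuildingModule(ABC):
    """Creates and manages datasets (adding images, annotating emotions)."""
    @abstractmethod
    def add_data(self, image: bytes) -> int: ...

    @abstractmethod
    def annotate(self, data_id: int, emotion: str) -> None: ...


class CoordinatorModule(ABC):
    """Coordinates the other modules (starts/stops monitoring, routes data)."""
    @abstractmethod
    def start_monitoring(self) -> None: ...

    @abstractmethod
    def end_monitoring(self) -> None: ...
```

Keeping each responsibility behind its own interface is what later allows the emotion recognition and adaptation parts to be plugged in and out.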
- The edutainment module, at predefined moments in the tasks’ execution flow (e.g., when a task is finished, when a long time has passed since the child started using the application, or when the child has difficulties completing a task), sends requests about the emotional state of the user and adapts the interaction based on the received response.
- The emotion recognition module continuously sends information about the identified emotions to the edutainment module. Based on this information, the edutainment module isolates the negative emotions and their context, and triggers its interaction adaptation.
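The two communication strategies amount to a pull (request/response) versus a push (continuous stream) design. A minimal Python sketch, with illustrative function names of our own:

```python
# Strategy 1 (pull): the edutainment module requests the emotional state
# only at predefined checkpoints of the task flow.
def pull_emotion(recognize, at_checkpoint):
    """recognize() -> emotion label; called only when a checkpoint is reached."""
    return recognize() if at_checkpoint else None


# Strategy 2 (push): the recognition module streams every identified emotion;
# the edutainment module keeps only the negative ones and reacts to them.
NEGATIVE = {"frustration", "boredom", "anger", "sadness", "disgust"}


def push_emotions(emotion_stream, on_negative):
    for emotion in emotion_stream:
        if emotion in NEGATIVE:
            on_negative(emotion)  # trigger interaction adaptation
```

The pull design keeps the recognition cost low (it runs only at checkpoints), while the push design reacts faster but classifies continuously.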
- Phase 1—Development of the edutainment module. In this step, together with the other stakeholders (educational experts, kindergarten teachers, etc.), the tasks to be included in the edutainment module are decided, along with their difficulty level, their type, and the normal interaction flow. The idea for each application may come from the early childhood educators. Each application should address one or multiple domains from the curricula and should be composed of a learning part, where new content is presented, and a practical part that contains tasks to support knowledge fixation. One possible approach for developing the edutainment module is described in [36].
- Phase 2—Development of the emotion recognition module and dataset building. After the edutainment module has been developed, a child emotion recognition approach must be selected and validated. If the accuracy of the selected approach is not the desired one, new datasets with data about children interacting with the edutainment module could be created in order to improve it. These datasets should be annotated by human experts. If sufficiently large datasets already exist, or if the accuracy of the chosen emotion recognition approach is already good enough for this type of application, then the dataset building step may be skipped.
- Phase 3—Integration of emotions awareness into the edutainment module. The edutainment module should be modified in order to also include the adaptation feature for the interaction flow.
- The IdentifyEmotion Service, part of the emotion recognition module, that is responsible for identifying the emotional state from the data sent to it (images, etc.).
- The AddData Service, part of the dataset building module, that is responsible for adding new data to the datasets used for training and validation of the selected emotion recognition approach.
- The Annotation Service, part of the dataset building module, that is used to annotate the data from the datasets.
- The StartMonitoring Service, part of the coordinator module that is responsible for initiating the monitoring activity of the emotional state of the child.
- The EndMonitoring Service, part of the coordinator module that is responsible for ending the monitoring activity, started by the StartMonitoring Service.
- The CurrentEmotionalState Service, part of the coordinator module that is responsible for obtaining the necessary input data for the emotion recognition approach and sending it to the IdentifyEmotion Service in order to obtain the current emotional state of the child.
- The AdaptInteraction Service, part of the edutainment module that is responsible for initiating adaptation of the interaction flow, in order to change the child’s emotional state.
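As a hypothetical sketch of how the coordinator’s services could tie the modules together, consider the following Python fragment; the class and method names are ours, and a real IdentifyEmotion Service would wrap a trained classifier rather than the placeholder rule used here:

```python
class IdentifyEmotionService:
    """Stand-in for the emotion recognition module's classifier."""

    def identify(self, frame):
        # Placeholder rule for illustration; a real implementation would
        # run a trained facial-expression model on the captured frame.
        return "happiness" if sum(frame) % 2 == 0 else "frustration"


class Coordinator:
    """Coordinator module: StartMonitoring, EndMonitoring, and
    CurrentEmotionalState, as listed above."""

    def __init__(self, recognizer):
        self.recognizer = recognizer
        self.monitoring = False

    def start_monitoring(self):
        self.monitoring = True

    def end_monitoring(self):
        self.monitoring = False

    def current_emotional_state(self, capture_frame):
        """Obtain the input data and forward it to IdentifyEmotion."""
        if not self.monitoring:
            return None
        return self.recognizer.identify(capture_frame())
```

The edutainment module never talks to the recognizer directly; it only asks the coordinator, which is what makes the recognition approach replaceable.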
3.4. Adapting the Interaction Flow
- the child successfully accomplishes the task;
- the child fails again to perform the task;
- the child does not perform any interaction action (maybe the child leaves the computer, abandoning the interaction altogether).
Algorithm 1: Normal interaction flow algorithm.
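The algorithm body is not reproduced in this excerpt; the following Python sketch reconstructs the normal flow from the three outcomes listed above (the names and the single-retry policy are our assumptions):

```python
from enum import Enum


class Outcome(Enum):
    SUCCESS = 1
    FAILURE = 2
    NO_INTERACTION = 3   # the child leaves the computer, abandoning the session


def normal_interaction_flow(tasks, attempt, on_second_failure):
    """attempt(task) -> Outcome. A failed task is retried once; a second
    failure triggers the emotion-aware reaction, and no interaction at all
    ends the session."""
    for task in tasks:
        outcome = attempt(task)
        if outcome is Outcome.FAILURE:          # offer one retry
            outcome = attempt(task)
        if outcome is Outcome.NO_INTERACTION:   # child abandoned the interaction
            return "abandoned"
        if outcome is Outcome.FAILURE:          # failed again
            on_second_failure(task)
    return "completed"
```

In the emotion-aware version, `on_second_failure` is where the monitoring of the child’s emotional state would be started.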
- If the emotional state is frustration, then the edutainment application will pause the execution of the interaction flow, stop the monitoring activity, increase the support given to the child, and propose relaxing physical tasks in order to change the child’s emotional state. If, after the execution of these tasks, the emotional state of the child becomes positive, then the monitoring activity will restart and the edutainment module will continue the interaction flow. Otherwise, the interaction is stopped.
- If the emotional state is boredom, then the edutainment application will increase the difficulty level or change the type of the next task to be executed.
Algorithm 2: Interaction adaptation algorithm.
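The two adaptation rules above could be sketched as follows; every method on the hypothetical `flow` object is illustrative, not part of the proposal:

```python
def adapt_interaction(emotion, flow):
    """Apply the adaptation rule matching the identified negative emotion."""
    if emotion == "frustration":
        flow.stop_monitoring()              # pause emotion monitoring
        flow.propose_relaxing_tasks()       # relaxing physical tasks
        if flow.emotional_state_is_positive():
            flow.restart_monitoring()       # continue the interaction flow
        else:
            flow.stop_interaction()         # give up for this session
    elif emotion == "boredom":
        flow.adjust_next_task()             # harder level or different task type
```

Encapsulating the rules in one function keeps the adaptation policy replaceable without touching the rest of the edutainment module.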
4. Prototype
- a bright effect when happiness is identified;
- a sepia effect when sadness is identified;
- a distorted effect when anger is identified;
- a blurred effect when disgust is identified;
- a black and white effect when surprise is identified.
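For illustration, three of these effects can be expressed as per-pixel RGB transforms; the coefficients are the standard sepia and luminance formulas, and the emotion-to-effect mapping below is only a sketch of the prototype’s idea:

```python
def bright(pixel, factor=1.3):
    """Brightness boost (used for happiness)."""
    return tuple(min(255, int(c * factor)) for c in pixel)


def sepia(pixel):
    """Classic sepia transform (used for sadness)."""
    r, g, b = pixel
    return (min(255, int(0.393 * r + 0.769 * g + 0.189 * b)),
            min(255, int(0.349 * r + 0.686 * g + 0.168 * b)),
            min(255, int(0.272 * r + 0.534 * g + 0.131 * b)))


def black_and_white(pixel):
    """Luminance-based grayscale (used for surprise)."""
    r, g, b = pixel
    y = int(0.299 * r + 0.587 * g + 0.114 * b)
    return (y, y, y)


EFFECT_FOR_EMOTION = {
    "happiness": bright,
    "sadness": sepia,
    "surprise": black_and_white,
    # The distorted (anger) and blurred (disgust) effects are neighbourhood
    # operations over the whole image, so they are omitted from this
    # per-pixel sketch.
}
```

Applying an effect is then a matter of mapping the chosen function over every pixel of the current frame.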
5. Discussions
- Children could safely use these applications outside the formal education system, especially when in-person interaction is not possible due to social distancing rules.
- The child’s learning process will be personalized and adapted to his/her own pace.
- By interacting with safe edutainment applications, young children will also develop basic digital competences.
- The proposed architecture allows the emotion recognition and adaptation modules to be easily plugged in and out.
- Such applications could have an increased response time, negatively affecting the interaction.
- Some researchers consider that the gathered information about children’s state of mind could be improperly used to influence subconscious processes. In our proposal, the identified emotion is used only to avoid amplifying negative emotions while interacting with the application. If negative emotions are identified repeatedly, the application should stop executing.
6. Conclusions and Further Work
- validate our proposal on real and more complex case studies (implementation of the proposed approach);
- use multiple channels (body posture, voice, sensors) to extract information for the automatic emotion recognition module;
- consider the situations in which negative emotions occur frequently for different children (in this case it may also mean that changes in the design of the edutainment module should be made); and
- use emotions awareness to also evaluate the satisfaction of the little users. Identifying frustration during learning activities with an edutainment application could also provide hints on interaction flow design.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Disney, W. Educational Values in Factual Nature Pictures. Educ. Horizons 1954, 33, 82–84. [Google Scholar]
- Rapeepisarn, K.; Wong, K.W.; Fung, C.C.; Depickere, A. Similarities and Differences between “Learn through Play” and “Edutainment”. In Proceedings of the 3rd Australasian Conference on Interactive Entertainment, Perth, Australia, 4–6 December 2006; Murdoch University: Murdoch, Australia, 2006; pp. 28–32. [Google Scholar]
- Nemec, J.; Trna, J. Edutainment or Entertainment? Education Possibilities of Didactic Games in Science Education. In Proceedings of the 24th ICCP World Play Conference: The Evolution of Children Play, Brno, Czech Republic, September 2007; pp. 55–64. [Google Scholar]
- Mat Zin, H.; Mohamed Zain, N.Z. The effects of edutainment towards students’ achievements. Reg. Conf. Knowl. Integr. ICT 2010, 129, 2865. [Google Scholar]
- Kara, Y.; Yeşilyurt, S. Comparing the Impacts of Tutorial and Edutainment Software Programs on Students’ Achievements, Misconceptions, and Attitudes towards Biology. J. Sci. Educ. Technol. 2008, 17, 32–41. [Google Scholar] [CrossRef]
- Denham, S.A.; Bassett, H.H.; Thayer, S.K.; Mincic, M.S.; Sirotkin, Y.S.; Zinsser, K. Observing preschoolers’ social-emotional behavior: Structure, foundations, and prediction of early school success. J. Genet. Psychol. 2012, 173, 246–278. [Google Scholar] [CrossRef]
- Hyson, M. The Emotional Development of Young Children: Building an Emotion-Centered Curriculum; Teachers College Press: New York, NY, USA, 2004. [Google Scholar]
- Kostelnik, M.; Soderman, A.; Whiren, A.; Rupiper, M.L. Guiding Children’s Social Development and Learning: Theory and Skills; Cengage Learning: Boston, MA, USA, 2016. [Google Scholar]
- Feidakis, M. Chapter 11—A Review of Emotion-Aware Systems for e-Learning in Virtual Environments. In Formative Assessment, Learning Data Analytics and Gamification; Caballé, S., Clarisó, R., Eds.; Intelligent Data-Centric Systems; Academic Press: Boston, MA, USA, 2016; pp. 217–242. [Google Scholar] [CrossRef]
- Ruiz, S.; Urretavizcaya, M.; Fernández-Castro, I.; López-Gil, J.M. Visualizing Students’ Performance in the Classroom: Towards Effective F2F Interaction Modelling. In Design for Teaching and Learning in a Networked World; Conole, G., Klobučar, T., Rensing, C., Konert, J., Lavoué, E., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 630–633. [Google Scholar]
- Druin, A.; Solomon, C. Designing Multimedia Environments for Children: Computers, Creativity, and Kids; Wiley: Hoboken, NJ, USA, 1996. [Google Scholar]
- Markopoulos, P.; Bekker, M. How to compare usability testing methods with children participants. In Interaction Design and Children; Now Publishers Inc.: Hanover, PA, USA, 2002; Volume 2, pp. 153–158. [Google Scholar]
- Frijda, N.H. Appraisal and Beyond: The Issue of Cognitive Determinants of Emotion; Lawrence Erlbaum Associates Ltd.: Hove, UK, 1993. [Google Scholar]
- Frijda, N.H. Varieties of affect: Emotions and episodes, moods and sentiments. In The Nature of Emotion: Fundamental Questions; Ekman, P., Davidson, R., Eds.; Oxford University Press: New York, NY, USA, 1994; pp. 59–64. [Google Scholar]
- Davou, B. Thought Processes in the Era of Information: Issues on Cognitive Psychology and Communication; Papazissis Publishers: Athens, Greece, 2000. [Google Scholar]
- Damasio, A.R. Descartes’ Error. Emotion, Reason and the Human Brain; Avon Books: New York, NY, USA, 1994. [Google Scholar]
- Ekman, P.; Friesen, W. Facial Action Coding System: A Technique for the Measurement of Facial Movement; Consulting Psychologists Press: Palo Alto, CA, USA, 1978. [Google Scholar]
- Ortony, A.; Clore, G.L.; Collins, A. The Cognitive Structure of Emotions; Cambridge University Press: Cambridge, UK, 1988. [Google Scholar] [CrossRef] [Green Version]
- Parrott, W.G. Emotions in Social Psychology: Key Readings; Psychology Press: Oxfordshire, UK, 2000. [Google Scholar]
- Pekrun, R. The Impact of Emotions on Learning and Achievement: Towards a Theory of Cognitive/Motivational Mediators. Appl. Psychol. 1992, 41, 359–376. [Google Scholar] [CrossRef]
- Pekrun, R.; Lichtenfeld, S.; Marsh, H.W.; Murayama, K.; Goetz, T. Achievement Emotions and Academic Performance: Longitudinal Models of Reciprocal Effects. Child Dev. 2017, 88, 1653–1670. [Google Scholar] [CrossRef]
- Rowe, A.D.; Fitness, J. Understanding the Role of Negative Emotions in Adult Learning and Achievement: A Social Functional Perspective. Behav. Sci. 2018, 8, 27. [Google Scholar] [CrossRef] [Green Version]
- Manwaring, K.C. Emotional and Cognitive Engagement in Higher Education Classrooms. Ph.D. Thesis, Brigham Young University, Provo, UT, USA, 2017. Available online: https://scholarsarchive.byu.edu/etd/6636 (accessed on 22 December 2021).
- Halberstadt, A.G.; Eaton, K.L. A meta-analysis of family expressiveness and children’s emotion expressiveness and understanding. Marriage Fam. Rev. 2002, 34, 35–62. [Google Scholar] [CrossRef]
- Taskiran, M.; Kahraman, N.; Erdem, C.E. Face recognition: Past, present and future (a review). Digit. Signal Process. 2020, 106, 102809. [Google Scholar] [CrossRef]
- Adjabi, I.; Ouahabi, A.; Benzaoui, A.; Taleb-Ahmed, A. Past, Present, and Future of Face Recognition: A Review. Electronics 2020, 9, 1188. [Google Scholar] [CrossRef]
- Deng, J.; Guo, J.; Zhang, D.; Deng, Y.; Lu, X.; Shi, S. Lightweight face recognition challenge. In Proceedings of the IEEE International Conference on Computer Vision Workshops, Seoul, Korea, 27–28 October 2019. [Google Scholar]
- Ko, B.C. A brief review of facial emotion recognition based on visual information. Sensors 2018, 18, 401. [Google Scholar] [CrossRef]
- Lopes, A.T.; de Aguiar, E.; De Souza, A.F.; Oliveira-Santos, T. Facial expression recognition with convolutional neural networks: Coping with few data and the training sample order. Pattern Recognit. 2017, 61, 610–628. [Google Scholar] [CrossRef]
- LoBue, V.; Thrasher, C. The Child Affective Facial Expression (CAFE) set: Validity and reliability from untrained adults. Front. Psychol. 2015, 5, 1532. [Google Scholar] [CrossRef]
- Lucey, P.; Cohn, J.F.; Kanade, T.; Saragih, J.; Ambadar, Z.; Matthews, I. The extended cohn-kanade dataset (ck+): A complete dataset for action unit and emotion-specified expression. In Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition-Workshops, San Francisco, CA, USA, 13–18 June 2010; IEEE: New York, NY, USA, 2010; pp. 94–101. [Google Scholar]
- Goodfellow, I.J.; Erhan, D.; Carrier, P.L.; Courville, A.; Mirza, M.; Hamner, B.; Cukierski, W.; Tang, Y.; Thaler, D.; Lee, D.H.; et al. Challenges in representation learning: A report on three machine learning contests. In International Conference on Neural Information Processing; Springer Publishing: New York, NY, USA, 2013; pp. 117–124. [Google Scholar]
- Lyons, M.; Akamatsu, S.; Kamachi, M.; Gyoba, J. Coding facial expressions with gabor wavelets. In Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition, Nara, Japan, 14–16 April 1998; IEEE: New York, NY, USA, 1998; pp. 200–205. [Google Scholar]
- Guran, A.M.; Cojocar, G.S.; Diosan, L. A Step Towards Preschoolers’ Satisfaction Assessment Support by Facial Expression Emotions Identification. In Proceedings of the 24th International Conference on Knowledge-Based and Intelligent Information & Engineering Systems, KES-2020, Virtual Event, Online, 16–18 September 2020; pp. 632–641. [Google Scholar] [CrossRef]
- Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016. [Google Scholar]
- Guran, A.M.; Cojocar, G.S.; Moldovan, A. Designing edutainment software for digital skills nurturing of preschoolers: A method proposal. In Proceedings of the ACM/IEEE 42nd International Conference on Software Engineering: Software Engineering in Society, ICSE-SEIS ’20, Seoul, Korea, 27 June–19 July 2020; pp. 63–70. [Google Scholar] [CrossRef]
- Guran, A.M.; Cojocar, G.S.; Moldovan, A. A User Centered Approach in Designing Computer Aided Assessment Applications for Preschoolers. In Proceedings of the 15th International Conference on Evaluation of Novel Approaches to Software Engineering, ENASE 2020, Prague, Czech Republic, 5–6 May 2020; pp. 506–513. [Google Scholar] [CrossRef]
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. In Proceedings of the 3rd International Conference for Learning Representations, San Diego, CA, USA, 7–9 May 2015. [Google Scholar]
- Glorot, X.; Bengio, Y. Understanding the difficulty of training deep feedforward neural networks. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, Chia Laguna Resort, Sardinia, Italy, 13–15 May 2010; Teh, Y.W., Titterington, M., Eds.; JMLR, Inc. and Microtome Publishing: Brookline, MA, USA, 2010; Volume 9, pp. 249–256. [Google Scholar]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Guran, A.-M.; Cojocar, G.-S.; Dioşan, L.-S. The Next Generation of Edutainment Applications for Young Children—A Proposal. Mathematics 2022, 10, 645. https://doi.org/10.3390/math10040645