Article

Reluctance to Authenticity-Imbued Social Robots as Child-Interaction Partners

1 Faculty of Education, University of Primorska, 6000 Koper, Slovenia
2 Faculty of Civil and Geodetic Engineering, University of Ljubljana, 1000 Ljubljana, Slovenia
3 The Institute of Management, Economics and Finances, Kazan Federal University, 420008 Kazan, Russia
4 State High School with Slovenian Teaching Language Ivan Cankar, 34170 Gorizia, Italy
5 The Institute of Psychology and Education, Kazan Federal University, 420008 Kazan, Russia
6 College of Education, Zhejiang University, Hangzhou 310058, China
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(4), 390; https://doi.org/10.3390/educsci14040390
Submission received: 14 February 2024 / Revised: 11 March 2024 / Accepted: 27 March 2024 / Published: 9 April 2024
(This article belongs to the Section Technology Enhanced Education)

Abstract: We are facing the rapid development of educational technology and social robots tested in classrooms. Research has identified teachers’ caution and concerns about these robots’ social skills. Pre-service education is critical for forming beliefs and preparing teachers for the future classroom and innovations in educational technology. In the present study, exploratory factor analysis is applied to examine pre-service teachers’ concerns about social robots’ instructional integration in the role of social agents interacting with children. We apply a concerns scale encompassing the instructional and socio-emotional concerns regarding robots’ instructional integration in the classroom environment. In this study, the scale, which was developed in Slovenia, is examined in the Russian cultural context. Based on the concerns scale, exploratory factor analysis identifies a one-factor solution with five statements (of a six-item factor) shared with the Slovene sample, adding three statements focusing on the importance of the teacher’s role. Russian pre-service teachers share concerns with Slovene pre-service teachers and further highlight the authenticity of unique human relationships and interactions. Slovenian pre-service teachers are more focused on children’s social skills and well-being, while Russian participants give special attention to the teacher’s role and value and believe that it would be wrong to place the robot in a classroom for such a purpose. They do not consider the robot’s human-like interaction skills sufficient for it to be assigned the role of a social agent and interaction partner in the classroom. The inappropriateness of the robot for pedagogical interactions and relationships is the basis of all their concerns. The Kruskal–Wallis test identified a moderate magnitude of difference between the groups (ε² = 0.07–0.12), with Russian pre-service teachers presenting the stronger reluctance towards authenticity-imbued social robots in pedagogical roles. The authors emphasize the need to clearly identify the stakeholders (roboticists, teachers, children, parents) in the research design and their roles in the evaluation of robot implementation.

1. Introduction

We are facing the rapid development of educational technology, and artificial intelligence is opening up important issues regarding the teacher’s role and technology’s influence on the learning process and on learners [1]. Robotic technology is being introduced as a teaching aid, to conduct instructional tasks, or to teach and interact on the teacher’s behalf. Testing robots in classrooms necessitates that teachers plan robot-based lessons with an appropriate level of robot interaction.
Robotic educational technology is being tested in classrooms, and scientists have invested considerable effort in developing robots with human-like interaction capabilities over the last two decades [2]. These robots’ human-like appearance and interactions make them appear familiar to their human user, helping the robot to enter more easily into interpersonal relationships, which have hitherto been the unconditional and undisputed domain of the human–human interaction scheme. If a robot conforms to human appearance and behaviour, human–robot communication will presumably be optimised in many contexts [3].
The entry of human-like machines with human-like interactions into human relationships affects those relationships, and the long-term impact on interpersonal relationships is unclear. As Kahn et al. [3] explain, human-like robots will affect people socially and morally. Current generations are the first in human history to grow up in an environment gradually becoming saturated with robots [4]; we have no prior experience of this. Šabanović [4] notes that robots will mould our lives, and our interactions with them will enable them to be socially upgraded. Gawdat [5] elaborates on how we educate and teach robots in our everyday interactions with them. When designing artificial intelligence, developers need to develop a deep understanding of human learning, and social robots help us to learn about human learning [6]. Artificial intelligence has been recognised as a tool for investigating and analysing the human learning process and making it more visible [7].
We define social robots as follows: “Social robots are physically embodied autonomous robotic technology, equipped with AI and social skills, developed to become a human-equivalent partner in social relationships, capable of interacting in a human-like and situation- and role-appropriate manner” [1] (p. 63).

2. Literature Review

2.1. The Use of Social Robots in the Educational Context

Robots have been used in education in the past, for example, in the teaching of programming and robotics to students [6,8]. No human image or human-like interaction was needed with these robots. Today, social robots are being introduced into classrooms and are intended to be used in all curriculum subjects [2,9]. They are not limited to the role of a tool [10] but are being developed to become social agents and vehicles for human-like interaction with humans [10,11], with robot autonomy ranging from teleoperation to fully autonomous systems, the latter being the focus of this article [12]. In the early stages of the development of social robot technology for education, scientists were initially more focused on the development and testing of robots [13] for the preschool level. Later, the focus shifted to elementary school [2].
For current use in a learning environment, robots require a great deal of preparation, support, and adaptation of the physical and interaction environment to their needs [12,14]. Social robots in current educational contexts are designed for individualised learning to support students in learning at their own pace [2], not for two-way group interaction with groups of students. Robot technology is developing very fast, and artificial intelligence has great potential for application in education [6,8,14,15]; however, social robots are not yet sufficiently developed for successful large-scale implementation in classrooms [16].
Most research on this topic focuses on the robot artefact rather than on the learning process and the student. Therefore, usability testing still adapts learning situations to social robot integration rather than testing robots in authentic classroom situations to improve learning processes and outcomes. The predominant question concerns determining where using robotics makes sense and providing support for the technological development of new robotics technologies. Little attention is given to in-depth analysis of the implications (both positive and negative) of using social robotic technology in education and training and to identifying the different impacts it may have on users. Teachers do not have a clear idea of what social robots could be used for [17]. Hrastinski et al. [18] and Istenič et al. [19] add that it is not clear to education professionals why social robots should be used in the classroom and why they should have the appearance of a human being.

2.2. Acceptance

Social robots are not just a radical technological innovation [20]; they are also a radical social innovation and as such are difficult for humans to accept and embrace in human social spaces. People find it difficult to understand highly developed technological artefacts that integrate human-like social capabilities. Therefore, development often occurs without adequate debate on the implications of technological innovations for target user groups (e.g., students) and without explicitly defined social choices [4]. Important robotics topics should not be left solely to roboticists, as integrating robots into society involves and refers to solving social problems of a non-technical nature.
Robot developers are well aware that robots will only be successful in the roles for which they are designed if humans accept them. This is even more the case for social robots, which often have to deal with the issue of their acceptance in the very personal or private sphere of human beings. For this reason, much of the research on social robots is concerned with the issue of acceptance: what the robot should look like and what the robot’s gestures, behaviour, voice, etc. should be like. In human–robot studies, the main focus has been on developing and refining a robot that is not rejected by humans. In the case of robots that solve a problem for a human, or relieve a human of dangerous and dirty work [21,22], we can assume that humans will be more neutral in judging the usefulness of the robot, but they may still have concerns. However, when it comes to social robots performing social roles, people are more reluctant to accept robot social behaviour [23]. Teacher shortages have also been identified among the purposes for implementing social robots [24,25]. Research studies examine a variety of functions that social robots could perform in instruction, supporting either teachers or students [2]. Current research does not provide a clear picture of what the long-term consequences of social robot integration may be, or of “How will man shape the robot and how will the robot shape man” [4,5].
In some cases, it may be clear that a robot is going to make our job easier. But in many cases, roboticists have yet to figure out where and how to place certain robots and create the need for social robots. The comparative advantages of a robot over a human have not yet been satisfactorily explained. Currently, roboticists are focusing on the vision and technical constraints involved when developing new social artefacts that can be integrated in diverse societal areas, which it seems will happen shortly. Do we have an overall idea of the functioning of these robots in the human social sphere and the implications of their integration into human society? Education professionals have to initiate an in-depth debate on the issues of robot integration.
Today, humans are confronted with this new technology and many solutions are offered commercially. It is predicted that in the future, man and robot will complement each other and coexist [4]. At a societal level, we need to have a broad debate about how robots and humans can co-exist and what the implications and influences of robots in society are. Šabanović [4] explains that many roboticists conceptualize the social change of integrating robots into human society in terms of linear technological determinism: advancements in the field of robotic technology are a step that will bring human society forward. Assuming that technological advances are a condition for change in the future course of human history, robots are destined to change our lifestyles, and researchers are trying to fit them into our existing lifestyles [4]. However, there is no clear picture of the society that is supposed to result from this process of integrating robots. In education, the role of educational technology is to equip younger generations with skills and follow the advancement of society through the progress of technology.
We are being pushed into the robot age, and this is also being brought about through education. Education has a formative objective in supporting the upbringing of members of human society. Education professionals have to focus primarily on the learner [26,27]. Education professionals perceive social robots as just another digital technology in the classroom, as a machine following a script rather than as an interaction partner [17,19,28]. It is important to establish whether students perceive these robots that way and whether they will perceive them that way in the future. Based on the literature, Flensborg Damholdt et al. [29] conclude that interacting with a physically embodied robot in a social space has a different effect on people than interacting with a screen application or video of the robot.
Future generations will be born and grow up in an environment saturated with robots [5,11,30]. In this future environment, robots will learn quickly, with the capacity to incorporate constant technological improvements, including the social skills required to interact with humans [5]. The efforts of robotics researchers aim to persuade people to accept social robots with social roles in society, in what Kahn et al. [3] call the “I–Thou” relationship, in which people treat the robot as if it were a person, rather than what they call the “I–It” relationship, in which people treat the robot like a machine. Turkle [30] pointed to the missing link in the I–Thou relationship: the authenticity of the relationship.

2.3. Elementary School Teachers’ Perception of Their Profession and the Role of Technology in Education

Educational technology forms an instructional environment with social robots tested in classrooms [2]. Teachers experience their role in a very different way from roboticists, who tend to prepare a social robot to perform teachers’ roles. The teacher is focused on the children, on their all-round growth, and on helping them find a place in society according to the child’s aspirations, abilities, and character. For the roboticist, the focus is on scientific progress that will lead to social development and prosperity. For the teacher, the focus is on a properly educated, nurtured, and socialised child.
The teacher focuses on the learner as a whole. This includes affective concern for the child’s well-being. The teacher supports the student in building self-confidence and a sense of identity. The teacher stimulates the child’s aspirations and motivates him or her to persevere in developing their personal and cognitive abilities.
Children spend much of their day with teachers in a world in which “most roles are affectively neutral, and the positive affectivity of the teacher’s role increases in importance” [26] (p. 26). In this context, the teacher builds an authentic relationship of trust with the child. The teacher becomes a role model for the child, motivating, inspiring, and encouraging him or her. The teacher’s role is much more than mechanically transmitting knowledge. The teacher tends to be involved as a whole person [26] with personal integrity, involving their own values and principles in the relationship with the student [27], especially at the pre-primary and primary level.
It is not possible to specify all the tasks a teacher performs and how to perform them. Therefore, the teacher has a great deal of discretion in the conduct of their work. Because each child is a world unto themselves, the teacher cannot rely on the certainty of, for example, certain technical procedures [26]. In the classroom, in addition to teaching the content, the teacher promotes the corresponding (1) relational virtues (related to one’s relationship with others, including aspects such as generosity, honesty, trust, and sincerity); (2) performance virtues (related to the performance of tasks, encompassing responsibility, perseverance, effort, etc.); and (3) intellectual virtues (values related to the understanding of reality, such as truth and prudence) [26,27,30]. In this context, the teacher sees their role more as a concern for the preservation of human identity and the virtue of being human.

2.4. Reflections on Child–Robot Interaction (CRI)

The educational goal is the cognitive and moral development of students [31], which is a complex process manifested on several levels of teaching and learning. So far, the main purpose of surveying teachers about robot-assisted lessons has been to find out whether they would be willing to accept a robot in the classroom and how robots could be used in the classroom [32]. These findings should be integrated into the further development of robots. The crux of the matter is to facilitate discussion with education professionals regarding whether they have a real need for the use of a robot and to specify such robots’ circumstances of use, purposes, benefits, and limitations. Discussions are needed to find out whether there is a need for authenticity-imbued social robots or whether authenticity is best reserved for teachers. Teachers and pre-service teachers need to be asked whether they envision a future with robot teachers as the only teachers in a face-to-face classroom environment.
A great deal of educational technology adoption depends on perceptions and training readiness [33], and robotics research shows similar patterns. Even if users of technology perceive the benefits of social robots, they may not be ready to leverage social robots for optimum benefit. The use of technology is not socially and interactively neutral. Robots’ performances in social interaction and relationships are an important issue [17,18,19,34]. Education professionals focus on the robots’ social and interactional capabilities when they consider the implementation of robots in the school environment and the interaction between students and robots. Teachers are much less worried about their own technical skills with regard to robot use [17,19].
As education professionals consider CRI with a focus on their students at the lower levels of education, the issue of relational authenticity in emotional exchanges with social robots is at the forefront [19,28,32]. There are profound doubts that a social robot could authentically perform human roles [19,28,32].

2.5. Authenticity Problem

Social robots are suggestive, but they are not relationally authentic [20]. The perception of the unauthenticity of their relationships can be a barrier to using social robots as relational partners with students [28]. Participants in a study by Diep et al. [28] stated that robots cannot be used to replace humans in tasks that require emotion and communication. They further explained that because social robots lack history, emotion, and sophistication, they cannot authentically assume the role of a human.
This problem of authenticity in robots’ relationships was emphasised by the participating pre-service teachers in the studies conducted by Istenič et al. [19,32], who presented a holistic view of pre-service teachers’ concerns for their students. The participants emphasized the importance of authentic human emotions, empathy, social bonds, facial expressions, and verbal and non-verbal communication. They stated the belief that authentic human contact, communication, and relationships in the classroom are irreplaceable. They pointed out that a robot is incapable of having genuine relationships with students, acknowledging that children need to be raised and educated. They do not consider it appropriate for a robot to perform teachers’ socially intelligent role in the classroom. Their views are in agreement with the study by Smakman et al. [35,36], which indicated that robots are not suitable for socialising and bonding. Furthermore, they believe that the social robot partner is not suitable for caring for, raising, and comforting a child; socialising the child into human society; and teaching the child about their culture, behaviour, life and emotions [19]. Diep et al. [28] also found that teachers were convinced that the robots did not provide enough comfort for the students. In addition, participants in a study by Istenič et al. [19] claimed that robots cannot help children understand what it means to be alive, to adapt, and to interact in a social context. In particular, participating pre-service teachers reported problems with the robots’ lack of social skills, which were highlighted by participants in several related studies. Kennedy et al. [17] identified a lack of social skills in robots, while Diep et al. [28] and Serholt et al. [34] identified a lack of emotions and an inability to recognize an interaction partner’s emotions. In addition to feeling that the social skills of the robot could not reach sufficiently high capacity, participants in Istenič et al.’s study [19] warned about the lack of authenticity of the robots’ emotions and children’s awareness of this authenticity problem. They believe that it would be wrong to place a robot in a classroom for such a purpose and agree with Turkle [30] that when we speak of the empathy of a robot, the robot only exhibits behaviours that would be considered empathic if performed by a human, whereas a robot is not capable of empathising as such.
The study by Istenič et al. [19,32] shows that participants’ reflections clearly express pre-service teachers’ belief in the uniqueness of human nature [37,38]. Human uniqueness is a socially learned and culturally specific sense of being human that is unique to humans and distinguishes humans from non-humans [39]. The study by Istenič et al. [19,32] indicates that participants’ belief in the uniqueness of human nature is based on the uniqueness of human emotions, bonds, and relationships [19], which accords with the view of the participants in a study by Smakman et al. [35,36] that robots are not suitable for socialisation and bonding.
The study by Istenič et al. [19,32] indicates that participants fear that a child’s interpersonal relationships will be substituted with a relationship with the robot, that the child may develop an unhealthy attachment to the robot, and that the robot will become the child’s role model. Current research shows that humans can develop emotional bonds and attachments to social robots, as a reasonable level of human–machine social interaction is now technically possible [18]. For example, Kanda et al. [40] report that children developed a friendly relationship with the robot known as Robovie during two months of interaction with it. However, it remains unclear whether human–robot attachment can reach levels similar to those of human–human attachment [41].
Serholt et al. [34] show that teachers believe children could begin to mimic robots, adopt new ways of speaking, and therefore struggle to understand human facial expressions, resulting in confusion and affecting their emotional intelligence. They fear that children will become dehumanised by interacting with robots. Additional concerns of education professionals include the robot’s lack of authentic emotional exchange with students, the lack of human interaction, the lack of the ability to perform face-to-face interactions with humans, and the lack of the ability to perform communication tasks [28]. Turkle [30] raises concerns about relational artefacts’ authenticity. Merriam-Webster Dictionary [42] defines authentic as “worthy of acceptance or belief as conforming to or based on fact” as well as “conforming to an original to reproduce essential features”. For the purpose of this study, a robot in an educational role can be considered authentic if it can perform its role comprehensively and at the same level as a teacher so that the child’s well-being and overall cognitive and personal development are not compromised by the child–robot interaction. At present, the focus is on student-centred concerns that were raised in the study by Istenič et al. [19] in a sample of pre-service teachers. Merriam-Webster Dictionary [43] defines concern as a “matter that causes feelings of unease, uncertainty, or apprehension”. For the purpose of this study, we define a concern as a limiting condition, situation, or relationship which represents an obstacle that causes a state of mental discomfort and uncertainty and calls for caution. More broadly, in our context, a concern is a situation in which participants feel that a robot is not appropriate in a school setting, meaning they are concerned about introducing a robot into this setting.
There is a lack of studies examining teachers’ and pre-service teachers’ views and concerns when focusing on the learner in robot-assisted instruction. The instrument designed within the Slovene sample of pre-service teachers was applied in this study to disentangle views and gradually extract the concerns that recur across cultural contexts and require attention in future research. The study aims to explore which concerns are present among Russian pre-service teachers and whether pre-service teachers in the Russian cultural context share concerns with their Slovenian counterparts. HRI studies have shown the potential for cross-cultural differences in human–robot perception and attitudes [44]. Based on the instrument designed in the study in Slovenia [19,45], the goal of the study presented in this article is to examine whether Russian pre-service teachers share the concerns of students from the Slovene cultural environment.
Our research question is as follows: What are Russian pre-service teachers’ concerns regarding social robot instruction integration and do they share concerns with Slovene pre-service teachers?

3. Methods

3.1. Research Design, Participants and Procedures

The survey was conducted in 2021 at Kazan Federal University in Kazan. The convenience sample consisted of 124 pre-service classroom teachers. About 20% of all students at Kazan Federal University are future teachers. Of the participants, 123 (99.2%) were female and 1 (0.8%) was male. The participants had a mean age of 19.38 years (SD = 1.84), ranging from 18 to 35 years. A total of 82 (66.1%) participants were in their first year, 21 (16.9%) in their second year, 16 (12.9%) in their third year, and 5 (4%) in their fourth year. There is a strong female bias in our sample, as in other research involving social robots and education professionals [17,34,46,47,48].
Only a small share of the participants, 10 (8.1%), had already seen social robots; 60 (48.4%) had never seen social robots in real life, stating, “I’ve only seen them in media like television and newspapers”; 50 (40.3%) had never seen them before; and 4 (3.2%) had used them before.
Before the intervention, participants were informed about the study, that their participation was voluntary, and that neither participation nor nonparticipation would affect their grades. They provided written consent.
The participants were shown a presentation of the characteristics of social robots. Afterwards, they viewed seven videos of social robots in pre-primary and elementary school settings. The benefits of an on-screen presentation of a robot in research studies have been discussed [32]. Studies about teachers’ attitudes, opinions, and views on the topic have established that a video intervention presenting social robots on the screen is appropriate [34,48,49]. After viewing the video material, the participants completed an online questionnaire. The data collection was individual and not guided.

3.2. Instrument and Data Analysis

The instrument was applied in the study to disentangle views and to gradually identify which concerns recur across cultural contexts and will require attention in future research. For this reason, even with small samples, as is the case in this paper, we took the position of capturing as much data as possible in the EFA so as not to lose valuable and scarce basic data for further processing. We applied the 27-item scale, and data processing identified a set of latent variables underlying the variables actually observed or measured in the sample of Russian participants.
The instrument applied in the study was the concerns scale, which comprises 27 items rated on a 5-point Likert scale. The instrument was designed in the Slovene context [19,45] in two stages. In the first stage, in 2019, study participants openly wrote about their reflections on the presence of social robots in a classroom. The open reflections were coded and categorised into a concerns scale composed of 27 items [19]. The scale (Appendix A) covers the following: the authenticity of the robot’s human-like appearance and identity, human contact and emotions, empathy, its understanding of the child’s feelings, and its ability to comfort the child and support their socio-emotional development; the authenticity of the robot’s communication and education; child socialisation, human and teacher substitution; teacher interaction; the robot being a role model for a child and the ability to evoke a genuine emotional attachment to the robot; the robot’s ability to attract students’ attention; the robot’s impact on children’s behaviour and communication; whether children focus on the robotic technology during learning activities instead of being focused on the learning activity; the robot’s inability to resolve interpersonal problems in the classroom; the disruptive impact of robotic technology on upbringing and education; technology attachment; the possibility that the robot may be intimidating to children; and participants’ opinion on whether or not robots should be banned in elementary schools.
In the second stage, in 2021, the testing of the scale was performed on a sample of 132 Slovene pre-service teachers [45]. Two factors with good reliability were identified: “Lack of social skills” (Cronbach’s α = 0.904), in line with Kennedy, Lemaignan, and Belpaeme [17], and “Inadequacy of robots to promote students’ development” (Cronbach’s α = 0.867). The reliability of the whole concerns scale (including all 27 items) was Cronbach’s α = 0.945, which is considered very good [50].
This paper presents the results of an exploratory factor analysis of the concerns scale in the Russian sample. The concerns scale was translated from the Slovene language into the Russian language by the authors and evaluated for clarity by two experts and seven students.
The data were analysed using the SPSS 28.0 statistical package. Principal axis factoring was performed to assess the construct validity of the concerns Likert-type scale and to determine the underlying factor structure in our set of variables. Reliability was established by calculating Cronbach’s alpha. Basic descriptive statistics are presented, including the mean scores, standard deviation, minimum, and maximum. To examine differences in concerns between the Russian and Slovene samples, the Kruskal–Wallis test was applied to determine pairwise group differences. The Slovene data used were from a study conducted in 2023 [45]. Epsilon squared (ε²) was applied as the effect size to determine how meaningful the differences are.

4. Results

4.1. Principal Axis Factoring and Reliability Analysis of the Instruments

Exploratory factor analysis using the principal axis factoring method was applied to examine the dimensionality of the 27 items from the concerns scale in a sample of 124 participants. First, the data were screened to determine their appropriateness for principal axis factoring. According to Field [51], the value of the Kaiser–Meyer–Olkin test of sampling adequacy in our dataset (KMO = 0.905) indicates high sampling adequacy. The p-value for Bartlett’s test of sphericity (χ² (28) = 579.196, p < 0.001) indicates that the correlations between items are sufficiently large for factor analysis. Our determinant (0.008) indicates that our data were suitable for the analysis. Based on the results of principal axis factoring with Oblimin rotation, we concluded that a one-factor solution containing 8 out of the 27 items from the original concerns scale, obtained from a sample of 124 participants, presented a sufficient measurement construct [52] (Table 1). The sample size and number of scale items are heavily discussed topics [51]. In educational research, a sample size of 150 has been mentioned for initial structure exploration [53].
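For readers who wish to reproduce this kind of screening and extraction outside SPSS, the snippet below is a minimal sketch in Python using the factor_analyzer package. It assumes the 27 item responses are stored in a hypothetical CSV file named concerns_ru.csv with one column per item, and it uses factor_analyzer’s principal-factor extraction as an approximation of the principal axis factoring reported above.

```python
# Minimal sketch of the adequacy checks and one-factor extraction (not the authors' SPSS procedure).
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Hypothetical file: one row per participant, one column per concerns-scale item (values 1-5).
items = pd.read_csv("concerns_ru.csv")

# Screening statistics analogous to those reported in the text: KMO, Bartlett's test, determinant.
_, kmo_total = calculate_kmo(items)
chi_square, p_value = calculate_bartlett_sphericity(items)
determinant = np.linalg.det(items.corr())
print(f"KMO = {kmo_total:.3f}, Bartlett chi2 = {chi_square:.3f} (p = {p_value:.4f}), det = {determinant:.4f}")

# Principal-factor extraction requesting a one-factor solution. Oblimin rotation was specified in
# the original analysis; with a single factor, rotation has no effect, so it is omitted here.
fa = FactorAnalyzer(n_factors=1, rotation=None, method="principal")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns, columns=["Factor 1"])
print(loadings.round(3).sort_values("Factor 1", ascending=False))

# Proportion of variance explained by the extracted factor.
_, proportion, _ = fa.get_factor_variance()
print(f"Variance explained: {proportion[0]:.2%}")
```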
The one-factor solution, indicating the robot’s inappropriateness for bonding, relationships, and educational functions, is named “Reluctance towards authenticity-imbued social robots” and explains 67.70% of the variance of the scale. Slovenian pre-service teachers are more focused on social skills and the child’s well-being, while the Russian factor solution highlights the teacher’s role and value. The one-factor solution integrated five items from the six-item Slovene first factor, “Lack of social skills” [45]. One item was omitted: compared to a robot, a teacher can act quickly when problems occur, such as fights between children. In addition, three items were included which were not present in the Slovene two-factor solution: since children often consider a person in their life as a role model, it would be wrong for them to form a similar attachment to a robot; a teacher’s word is valuable and robots cannot substitute it; and children spend too much time with electronic devices, so it is necessary to encourage other activities, such as spending time in nature.
Russian pre-service teachers are concerned about whether authenticity-imbued social robots are appropriate and claim that authenticity is best left to teachers. They are reluctant to envision a future with robot teachers. The reliability of the factor was established by calculating Cronbach’s alpha. Cronbach’s alpha for the factor “Reluctance towards authenticity-imbued social robots” with eight items (0.913) indicates high reliability [50]. The descriptive statistics for the extracted factor and for the single items of the extracted factor were calculated. On average, participants express agreement with the factor “Reluctance towards authenticity-imbued social robots” (mean = 3.78, SD = 0.73, Min = 1, Max = 5). To determine which concerns were particularly strong among our participants, a single-item analysis of the single-factor solution, i.e., “Reluctance towards authenticity-imbued social robots”, was used (Table 1).
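As a companion to the reliability and descriptive figures above, Cronbach’s alpha and the factor-level descriptives can be computed directly from the item responses. The sketch below is illustrative only: the column names are hypothetical, the selection of the eight retained items (numbered 8, 9, 10, 13, 15, 17, 18, and 22 in Appendix A) is inferred from the item wordings quoted in the text, and the factor score is taken here simply as the mean of the retained items.

```python
# Sketch: Cronbach's alpha and descriptives for the eight items of the extracted factor.
import pandas as pd

def cronbach_alpha(responses: pd.DataFrame) -> float:
    """Cronbach's alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)."""
    k = responses.shape[1]
    item_variances = responses.var(axis=0, ddof=1)
    total_variance = responses.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical column names; the item numbers follow the Appendix A wording quoted in the text.
retained = ["item08", "item09", "item10", "item13", "item15", "item17", "item18", "item22"]

items = pd.read_csv("concerns_ru.csv")          # same hypothetical file as in the previous sketch
factor_items = items[retained]

print(f"Cronbach's alpha = {cronbach_alpha(factor_items):.3f}")

# Factor-level descriptives, taking each participant's factor score as the mean of the retained items.
factor_score = factor_items.mean(axis=1)
print(factor_score.agg(["mean", "std", "min", "max"]).round(2))
```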

4.2. Kruskal–Wallis Test

For the items constituting the factor presenting Russian concerns, the Kruskal–Wallis test was performed to compare Russian and Slovene concerns and their levels of agreement. The results show statistically significant differences between the two groups. However, the magnitude of the difference between the groups is moderate, as shown by the effect size (ε²) values ranging from 0.07 to 0.12 (Table 2). Russian participants have a higher level of agreement with regard to the importance of the teacher.
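The group comparison itself can be sketched as follows. This is an illustrative computation rather than the authors’ SPSS procedure: it uses scipy’s Kruskal–Wallis test and the common rank-based effect size ε² = H / (n − 1), where n is the total number of observations, and the two response vectors below are random placeholders standing in for the Russian and Slovene ratings of a single item.

```python
# Sketch: Kruskal-Wallis comparison of one concerns item between two samples,
# with epsilon squared as the effect size (epsilon^2 = H / (n - 1), n = total sample size).
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(0)
# Placeholder 1-5 ratings; in the actual analysis these would be the item responses of each sample.
russian_ratings = rng.integers(1, 6, size=124)
slovene_ratings = rng.integers(1, 6, size=132)

h_statistic, p_value = kruskal(russian_ratings, slovene_ratings)
n_total = len(russian_ratings) + len(slovene_ratings)
epsilon_squared = h_statistic / (n_total - 1)

print(f"H = {h_statistic:.3f}, p = {p_value:.4f}, epsilon^2 = {epsilon_squared:.3f}")
```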

5. Discussion

Participants, on average, expressed a high level of agreement with all of the concerns about the robot’s lack of ability to perform social skills and form authentic attachments, which indicates that the authenticity problem is especially important in CRI. On average (mean = 4.03; SD = 0.87), participants assigned the highest importance to the statement that a robot cannot replace a human being. This was also the only item for which none of the participants chose value 1, i.e., strongly disagree. Those who disagreed only chose value 2, i.e., disagree. This item can be interpreted in different ways, for example: (a) it would not be appropriate in an educational context for a robot to replace a human; (b) the specificity of the educational context is such that a robot cannot adequately replace a human; (c) in general, it is not appropriate for robots to replace humans in human roles. Given that (a) our one-factor solution is already saturated by the item stating that robots should not replace teachers’ work and interaction with children, and (b) the main concerns expressed by those involved are rooted in their learner-centred approach, we interpret this to mean that the participants do not consider robots, in general and in the pedagogical process in particular, to be an adequate substitute for humans. Our findings indicate that they see the robot as inappropriate for performing typically human tasks. In the future, research needs to delineate exactly what teachers are concerned about when it comes to human replacement, or possibly teacher replacement, by robots [25]. Are they concerned about their jobs, sharing work with a robot, the quality of the pedagogical process, or the welfare of the learner?
All other items also have values which tend, on average, towards four (agree): robots should not replace teachers’ work and interaction with children; children need teachers for their socio-emotional development; the teacher understands the child’s feelings and can comfort them, a task which the robot is unable to perform; a robot cannot replace a human; robots cannot replace genuine teacher contact with children; and a child needs a person who will understand, help, and encourage them. In the Slovene sample [45], these five concerns, together with the item stating that compared to a robot, a teacher can act quickly when problems occur, such as fights between children, saturate the factor “Lack of social skills” with a very high mean M = 4.37 (SD = 0.66). The Russian one-factor solution expresses a mean value of agreement (M = 3.78, SD = 0.98).
The item stating that robots should not replace teachers’ work and interaction with children (mean = 4.42; SD = 0.80) is in line with previous research [28,45,54]. Respondents in the study conducted by Diep et al. [28] have a strong belief in the importance of human interaction, which a robot cannot fulfil in the same way as a human being.
Comparing the Russian pre-service teachers’ results with those of the Slovene pre-service teachers surveyed in the same year reveals three main issues:
First, while Rosanda and Istenič [45] extracted two factors (the lack of social skills and the inadequacy of robots for enhancing student development) for the sample of Slovene students, the study of Russian students showed that a one-factor solution is appropriate, with a factor named “Reluctance towards authenticity-imbued social robots”.
Second, the single-factor solution of the Russian sample, named “Reluctance towards authenticity-imbued social robots”, includes five out of six statements from the first Slovene factor, the robots’ lack of social skills [45]. Thus, statements indicating the perception that the robot does not have sufficient social skills to act as a social agent with an assigned pedagogical role in the classroom were also highlighted by the Russian participants. Such a position is thus consistent with (a) the positions of two generations of Slovene pre-service teachers from studies conducted in 2019 [19] and 2021 [45]; (b) the views of 35 British educational experts on using robots in schools [17]; (c) the views of 6 special educators, reported by Diep et al. [28], suggesting robots are unsuitable for tasks involving emotional and communication needs; and (d) the views of 18 Dutch primary teachers emphasising the importance of human contact for children [48].
The view that children need teachers for their socio-emotional development is strongly held by Slovene and Russian pre-service teachers. Similarly, Dutch teachers [48] believe that emotional development cannot be taught by a robot. Humans, not robots, should teach social skills, according to participants in a study by Smakman et al. [35,36].
Similar to the Slovene [45] and Russian pre-service teachers, teachers from England, Scotland, Portugal, and Sweden stated they would find it worrying if robots were to replace human–human interactions [54]. Participating teachers from Sweden, Portugal, and the United Kingdom in the study of Serholt et al. [34] believe that in educational settings, children interacting with robots would lead to the dehumanisation of children.
Third, the Russian participants additionally focused on three statements that were not extracted in the Slovene sample in 2021 [45]. Two of these emphasize the authenticity of the teacher’s role compared to the educational role of the robot, namely, the statement that, since children often see people in their lives as role models, it would be wrong for them to form a similar attachment to a robot, and the statement that a teacher’s word is valuable and a robot cannot replace it. This is in line with a study by Smakman et al. [35,36], which indicated that robots are not suitable for socialising and bonding. It highlights the essence of the teacher’s profession and the teacher as a role model for the child, motivating, inspiring, and encouraging them, especially at the pre-primary and primary level [26].
The third statement emphasises the digitalisation of childhood, namely that children spend too much time with electronic devices and that it is therefore necessary to encourage other activities, such as spending time in nature; it draws attention to the problem of technology addiction. This is in line with previous research, in which educators fear that robots could represent a source of distraction for students [17] and could take children away from the field of human relationships, as CRI could change their communication preferences [48]. Consequently, as children grow, they will seek constant contact with technology and may prioritize technology over interpersonal relationships [19].
Our results show that the Russian pre-service teachers, similar to the Slovene pre-service teachers, consider robots to be inadequate in terms of social skills, as an attachment figure, and when it comes to providing the human aspects of teaching. They believe that simply because a robot is not human, it is inherently unsuitable for working with children. Only a human has, according to our participants, the true socio-emotional skills and interactions necessary to work with children in the classroom. In addition, Russian pre-service teachers highlight the teacher’s role and value. Their concerns are in line with Sharkey’s [22] concerns in her conceptual learner-centred study. Based on her findings, the difference in focus between roboticists and teachers could be identified. Roboticists pay attention to the technological progress driving the development of society [4], while teachers focus on the well-being of children. The authors of this study hold a belief that improvements in robots’ technical and/or interactive capabilities cannot bridge such a gap between two completely different conceptions of the role of humans, technology, well-being, and society. On the contrary, such improvements could widen the gap.
The Kruskal–Wallis test identified a moderate magnitude of difference between the Russian and Slovene participants (ε² = 0.07–0.12), with Russian pre-service teachers presenting the stronger reluctance towards authenticity-imbued social robots in a pedagogical role.

Authenticity and Human Uniqueness

From the perspective of authenticity, the eight statements that saturate the extracted factor in the Russian sample of this study are related to the unauthenticity of the relationship between the robot and the child. In the opinion of the participants of this study, the integration of robots would cause dehumanisation, as a robot cannot be considered an adequate teacher substitute when it comes to meeting a child’s need for communication and interaction. The teacher understands the child’s emotions and can comfort them, which the robot cannot do. Therefore, children need their teachers for their socio-emotional development. Robots cannot replace a teacher’s human contact with children. A child requires a person who can truly understand them and provide help and encouragement. Therefore, a teacher can be a real role model for a child, and it would be wrong if this role model were a robot. Such an attachment to technology is inappropriate in the eyes of the participants of this study. Furthermore, all of the previous seven points are absorbed in the item stating that a robot cannot replace a human. Our participants express a strong conviction that the human being is unique and, therefore, cannot be imitated authentically by a robot.
In this study, we identified that the participants believe that education is based on human relationships, in which there is no place for a social robot as an interaction partner. When it comes to the relationship with a child in the classroom, they assign the robot only the place of a machine. A robot cannot be considered a social agent in pedagogical relationships. The participants believe that a genuine relationship with a child, which a robot cannot establish, is the basis for education. Improving the social skills of a robotic relational artefact is not, in their view, a solution.
We conclude that the Russian pre-service teachers in our sample do not consider authenticity-imbued social robots appropriate as child-interaction partners. In their view, a social robot in a relationship with a child is a kind of relational deception of the child and, as such, is not appropriate for an educational environment.

6. Conclusions, Limitations and Future Directions

Humans have traditionally reserved feelings such as trust, caring, empathy, nurturing, and love for relationships in which all parties are capable of feeling them, that is, for relationships where all parties are human and capable of reciprocation [30]. These areas are now being entered by social robots [11]. Robots’ integration depends on acceptance by the participants. Child acceptance studies suggest that children will accept them, at least initially [49,54,55,56,57]. Although not much research has been conducted among teachers, the results show that they have mixed feelings about robots interacting with children in the classroom [19]. The results of such studies are difficult to compare. In this regard, Bartneck et al. [58] point out the problem of the comparability of studies that use different measurement instruments. In addition, the results have been obtained from different types of samples, which does not allow for generalisation. The results of different studies indicate several issues that need to be studied in depth concerning the use of robots in children’s lives.
The concerns of our participants are largely in line with the previous studies. We find that some concerns tend to emerge repeatedly across different studies when researching teachers. To ensure the reliability and comparability of the results of the different analyses of perceptions, cognitions, considerations, and attitudes towards social robots, it will be necessary to define the target focus of the research. The authors of this paper identify some main areas of focus, which are presented below.
The robot artefact is the focus of several surveys. Teachers are asked to assess what a robot would need to be equipped with to function in the classroom. Teachers therefore tend to think about which features would allow the robot to achieve a given goal, e.g., performing a conversational style. The goal is to further the development of skills that will allow such robots to be suitable for working with students. From a pedagogical point of view, technical proficiency is not enough. For a robot supporting students’ learning, pedagogy and instructional models are needed.
The focus is on the teacher and where and how the robot could be used in the classroom. In these contexts, education professionals are usually thinking about where a robot with a certain set of characteristics could be used. Interestingly, the question of why they should use it at all, or whether they need it, is a less frequent topic of discussion. We think it would be useful to find out why it makes sense to use social robots when working with students (in a sense of a pedagogical approach) and not where and how to successfully use social robots when working with students (in a technical sense). Also, the fact that a teacher thinks that they will be successful in using a robot and that they could use it in the classroom, for example, to stimulate student motivation, is not sufficient from a pedagogical point of view when evaluating the robot’s performance in the classroom.
The important goal is to assess the readiness of the teacher to integrate the robot into the learning process. In this context, it is necessary to ask specifically what aspects of the learner the robotic technology is helping to develop: the cognitive, affective, or all-round progress of the learner, and also to assess any undesirable side effects.
These robots are placed in a society as a whole, where students will one day take part as adult citizens. As Šabanović [4] explains, exposing children to robots will have an impact not only on the robots, but also on the children. There are still many unanswered questions in this area. For example, what values, especially in the field of humanities and society, will the robot convey (directly and through subliminal messages and gestures)? We will have to decide what kind of members of society we expect to be raised and educated by robots.
We believe that only the results of studies with the same focus can be rigorously comparable. We will only be able to talk about the acceptance, efficiency, and concerns related to robots in education when we compare the results of both learner-centred and declared learner-centred studies.
It would also be important to find out whether pre-service teachers realistically expect robots to be introduced into schools during their working careers. We will have to see whether the results differ between the groups of teachers who have not yet reached the stage of expecting this possibility and the pre-service and in-service teachers who have a realistic expectation of using social robot technology in the future.
Similar to other studies on social robots, we applied a convenience sample; research in the field of social robots in classrooms typically relies on small convenience samples [19]. To generalise and confirm our findings, studies with random sampling need to be carried out. Perceptions of robots may be culturally conditioned. Therefore, following the research in Slovenia [19,45], this study conducted with a Russian sample was followed by a study of a Chinese sample carried out by the same research team. Cultural contexts influence teachers’ readiness and approaches to integrating educational technology. In future research, we aim to contribute to the realm of technological advancements in education, addressing the complex topic of integrating robotic educational agents into classroom settings.

Author Contributions

Conceptualization, A.I. and V.R.; methodology, A.I. and V.R.; formal analysis, A.I. and V.R.; investigation, A.I., V.R., L.L. and R.V.; data curation, V.R.; writing—original draft preparation, A.I. and V.R.; writing—review and editing, A.I., V.R., L.L., Ž.T., R.V. and X.Z.; supervision, A.I.; funding acquisition, A.I., Ž.T. All authors have read and agreed to the published version of the manuscript.

Funding

The study was financed by the Slovene Research Agency (No. P2-0210) and the University of Primorska research programme titled Post-digital Learning Environment and Educational Technology for Innovative Learning and Career Paths (ID 2990-5/2021).

Institutional Review Board Statement

All procedures performed in studies involving human participants followed the ethical standards of the institutional and/or national research committee, the Helsinki Declaration and its later amendments or comparable ethical standards, and the guidelines of the Code of Ethics of the University of Primorska (https://www.upr.si/si/univerza/eticni-kodeks accessed on 14 February 2024). The ethics committee at the University of Primorska was not in place at the time of study design; therefore, ethical approval was not required, as it is not needed for this type of study according to national regulations. All participants were fully informed that their anonymity was assured, why the research was being conducted, how their data would be used, and that there were no associated risks.

Informed Consent Statement

Written informed consent was obtained from participants.

Data Availability Statement

The data that support the findings from this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A. Concerns Scale

Items were rated on a 5-point scale: 5 = Totally Agree, 4 = Agree, 3 = Neutral, 2 = Disagree, 1 = Totally Disagree.
1. Robots will be very popular and effective at learning in the beginning, but eventually, like any new thing, they would become part of everyday life and no longer be interesting.
2. Children could misunderstand a robot, not as a person, but as a toy, and therefore not take it seriously.
3. Using a robot would contribute to poorer socialization, as people would get used to communicating with an inanimate being and lose touch with reality.
4. Genuine human contact is more important and teaches and educates children more than a robot could.
5. I don’t see a robot in an independent role (e.g., a teacher) because a robot has no empathy for people.
6. A robot cannot establish human contacts and emotions.
7. Emotions cannot be learned through a robot.
8. Since children often consider a person in their life as a role model, it would be wrong for them to form a similar attachment to a robot.
9. Children spend too much time with electronic devices, so it is necessary to encourage other activities, such as spending time in nature.
10. A robot cannot replace a human.
11. Children would be more motivated to learn with robots because they are interesting, but it is nevertheless better if children are taught by teachers and parents and robots are used for play in which children learn.
12. Children will not listen to a robot for a long time. They will be more interested in watching the robot’s structure and everything else, rather than listening to what it is saying.
13. Robots should not replace teachers’ work and interaction with children.
14. Compared to a robot, a teacher can act quickly when problems occur, such as fights between children.
15. A teacher’s word is valuable and robots cannot substitute it.
16. Robots can inhibit the development of empathy.
17. Robots cannot replace genuine teacher-child contact, as a child needs a person who will actually understand, help and encourage him or her.
18. The teacher understands the children’s emotions and can comfort them, which the robot is unable to do.
19. I don’t feel good if a robot replaces a human. Technology is already almost too present in our daily lives today.
20. The robot does not belong in primary schools because children of this age have to learn the basics of life, not encounter modern technology immediately.
21. I don’t see the need for a robot to be shaped as a human being.
22. Children need teachers for their socio-emotional development.
23. Robots are a socially disruptive technology that can change the entire current course of upbringing and education.
24. Children could become emotionally attached to robots after a while, which could become a problem.
25. I don’t believe it is possible for humans and machines to communicate with (and thereby educate) each other.
26. Children will be surprised by the robot’s presence, and some will be scared.
27. Children need human learning because if they listen to robots, they would behave and communicate like robots.

References

  1. Istenič Starčič, A. Human learning and learning analytics in the age of artificial intelligence. Br. J. Educ. Technol. 2019, 50, 2974–2976. [Google Scholar] [CrossRef]
  2. Rosanda, V.; Istenič Starčič, A. A review of social robots in classrooms: Emerging educational technology and teacher education. Educ. Self Dev. 2019, 14, 1–20. [Google Scholar] [CrossRef]
  3. Kahn, P.H.; Ishiguro, H.; Friedman, B.; Kanda, T.; Freier, N.G.; Severson, R.L.; Miller, J. What is a human?: Toward psychological benchmarks in the field of human–robot interaction. Interact. Stud. 2007, 8, 363–390. [Google Scholar] [CrossRef]
  4. Šabanović, S. Robots in society, society in robots: Mutual shaping of society and technology as a framework for social robot design. Int. J. Soc. Robot. 2010, 2, 439–450. [Google Scholar] [CrossRef]
  5. Gawdat, M. Super Intelligenti; Mondadori Libri S.P.A.: Milano/Udine, Italy, 2022. [Google Scholar]
  6. Mubin, O.; Stevens, C.J.; Shahid, S.; Mahmud, A.A.; Dong, J.J. A review of the applicability of robots in education. Technol. Educ. Learn. 2013, 1, 209–215. [Google Scholar] [CrossRef]
  7. Luckin, R.; Holmes, W.; Griffiths, M.; Forcier, L.B. Intelligence Unleashed. In An Argument for AI in Education; Pearson: London, UK, 2016. [Google Scholar]
  8. Benitti, F.B.V. Exploring the educational potential of robotics in schools: A systematic review. Comput. Educ. 2012, 58, 978–988. [Google Scholar] [CrossRef]
  9. Pachidis, T.; Vrochidou, E.; Kaburlasos, V.G.; Kostova, S.; Bonković, M.; Papić, V. Social Robotics in Education: State-of-the-Art and Directions. In Advances in Service and Industrial Robotics, Proceedings of the 27th International Conference on Robotics in Alpe-Adria Danube Region (RAAD 2018), Patras, Greece, 6–8 June 2018; Aspragathos, N., Koustoumpardis, P., Moulianitis, V., Eds.; Mechanisms and Machine Science; Springer: Cham, Switzerland, 2019; Volume 67, pp. 1–11. Available online: https://link.springer.com/chapter/10.1007/978-3-030-00232-9_72 (accessed on 6 November 2020).
  10. Ekström, S.; Pareto, L. The dual role of humanoid robots in education: As didactic tools and social actors. Educ. Inf. Technol. 2022, 27, 12609–12644. [Google Scholar] [CrossRef]
  11. Taipale, S.; de Luca, F.; Sarrica, M.; Fortunati, L. Robot Shift from Industrial Production to Social Reproduction. In Social Robots from a Human Perspective; Vincent, J., Taipale, S., Sapio, B., Lugano, G., Fortunati, L., Eds.; Springer: Berlin/Heidelberg, Germany, 2015; pp. 11–24. [Google Scholar] [CrossRef]
  12. Beer, J.M.; Fisk, A.D.; Rogers, W.A. Toward a Framework for Levels of Robot Autonomy in Human-Robot Interaction. J. Hum. Robot Interact. 2014, 3, 74–99. [Google Scholar] [CrossRef]
  13. Rosanda, V.; Istenič Starčič, A. The Robot in the Classroom: A Review of a Robot Role. In Emerging Technologies for Education, Proceedings of the 4th International Symposium SETE 2019, Magdeburg, Germany, 23–25 September 2019; Popescu, E., Hao, T., Hsu, T.C., Xie, H., Temperini, M., Chen, W., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2020; Volume 11984, pp. 347–357. [Google Scholar] [CrossRef]
  14. Nikolić, G. Robotska edukacija: Robotska pismenost “ante portas”? Andrag. Glas. Glas. Hrvat. Andrag. Druš. 2016, 1–2, 25–57. [Google Scholar]
  15. Crompton, H.; Gregory, K.; Burke, D. Humanoid robots supporting children’s learning in an early childhood setting. Br. J. Educ. Technol. 2018, 49, 911–927. [Google Scholar] [CrossRef]
  16. Serholt, S. Breakdowns in children’s interactions with a robotic tutor: A longitudinal study. Comput. Hum. Behav. 2018, 81, 250–264. [Google Scholar] [CrossRef]
  17. Kennedy, J.; Lemaignan, S.; Belpaeme, T. The cautious attitude of teachers towards social robots in schools. In Proceedings of the Robots 4 Learning Workshop at IEEE RO-MAN 2016, New York, NY, USA, 26–31 August 2016. [Google Scholar]
  18. Hrastinski, S.; Olofsson, A.D.; Arkenback, C.; Ekström, S.; Ericsson, E.; Fransson, G.; Jaldemark, J.; Ryberg, T.; Öberg, L.-M.; Fuentes, A.; et al. Critical imaginaries and reflections on artificial intelligence and robots in postdigital K-12 education. Postdigital Sci. Educ. 2019, 1, 427–445. [Google Scholar]
  19. Istenič, A.; Bratko, I.; Rosanda, V. Pre-Service Teachers’ Concerns about Social Robots in the Classroom: A Model for Development. Educ. Self Dev. 2021, 16, 60–87. [Google Scholar] [CrossRef]
  20. Beer, J.M.; Prakash, A.; Mitzner, T.L.; Rogers, W.A. Understanding Robot Acceptance Technical Report HFA-TR-1103; Georgia Institute of Technology: Atlanta, GA, USA, 2011; Volume 24, pp. 1–45. Available online: https://smartech.gatech.edu/bitstream/handle/1853/39672/HFA-TR-1103-RobotAcceptance.pdf (accessed on 6 November 2020).
  21. Heyns, C. Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, A/HRC/23/47. 2013. Available online: https://www.ohchr.org/en/special-procedures/sr-executions (accessed on 6 November 2020).
  22. Sharkey, A.J.C. Should we welcome robot teachers? Ethics Inf. Technol. 2016, 18, 283–297. [Google Scholar]
  23. de Graaf, M.M.A.; Ben Allouch, S.; van Dijk, J.A.G.M. Why Would I Use This in My Home? A Model of Domestic Social Robot Acceptance. Hum. Comput. Interact. 2019, 34, 115–173. [Google Scholar] [CrossRef]
  24. Edwards, B.I.; Cheok, A.D. Why not robot teachers: Artificial intelligence for addressing teacher shortage. Appl. Artif. Intell. 2018, 32, 345–360. [Google Scholar] [CrossRef]
  25. Zhai, X.; Chu, X.; Chai, C.S.; Jong, M.S.Y.; Istenič, A.; Spector, M.; Liu, J.-B.; Yuan, J.; Li, J. A review of artificial intelligence (AI) in education from 2010 to 2020. Complexity 2021, 2021, 8812542. [Google Scholar] [CrossRef]
  26. Wilson, B.R. The Teacher’s Role—A Sociological Analysis. Br. J. Sociol. 1962, 13, 15–32. [Google Scholar] [CrossRef]
  27. Brunetti, I. I valori personali e professionali degli insegnanti di scuola primaria: Un’indagine qualitativa. Form. Insegn. 2015, 13, 227–244. [Google Scholar]
  28. Diep, L.; Cabibihan, J.J.; Wolbring, G. Social Robots: Views of Special Education Teachers. In Proceedings of the REHAB ’15, 3rd 2015 Workshop on ICTs for Improving Patients Rehabilitation Research Techniques, Lisbon, Portugal, 1–2 October 2015; Fardoun, H.M., Gamito, P., Penichet, V.M.R., Alghazzawi, D.M., Eds.; Association for Computing Machinery: New York, NY, USA, 2015; pp. 160–163. [Google Scholar]
  29. Flensborg Damholdt, M.F.; Vestergaard, C.; Nørskov, M.; Hakli, R.; Larsen, S.; Seibt, J. Towards a new scale for assessing attitudes towards social robots: The attitudes towards social robots scale (ASOR). Interact. Stud. 2020, 21, 24–56. [Google Scholar] [CrossRef]
  30. Turkle, S. Authenticity in the age of digital companions. Interact. Stud. 2007, 8, 501–517. [Google Scholar] [CrossRef]
  31. Kohlberg, L.; Mayer, R. Development as the Aim of Education. Harv. Educ. Rev. 1972, 42, 449–496. [Google Scholar] [CrossRef]
  32. Istenič, A.; Bratko, I.; Rosanda, V. Are pre-service teachers disinclined to utilise embodied humanoid social robots in the classroom? Br. J. Educ. Technol. 2021, 52, 2340–2358. [Google Scholar] [CrossRef]
  33. Istenič, A. Educational Technology and the Construction of Authentic Learning Environments: [Scientific Monograph]; Fakulteta za Gradbeništvo in Geodezijo: Ljubljana, Slovenia, 2021. [Google Scholar] [CrossRef]
  34. Serholt, S.; Barendregt, W.; Vasalou, A.; Alves-Oliveira, P.; Jones, A.; Petisca, S.; Paiva, A. The case of classroom robots: Teachers’ deliberations on the ethical tensions. AI Soc. 2017, 32, 613–631. [Google Scholar] [CrossRef]
  35. Smakman, M.H.; Konijn, E.A.; Vogt, P.; Pankowska, P. Attitudes towards social robots in education: Enthusiast, practical, troubled, sceptic, and mindfully positive. Robotics 2021, 10, 24. [Google Scholar] [CrossRef]
  36. Smakman, M.; Vogt, P.; Konijn, E. A Moral considerations on social robots in education: A multi-stakeholder perspective. Comput. Educ. 2021, 174, 104317. [Google Scholar]
  37. Giger, J.C.; Piçarra, N.; Alves-Oliveira, P.; Oliveira, R.; Arriaga, P. Humanization of robots: Is it really such a good idea? Hum. Behav. Emerg. Technol. 2019, 1, 111–123. [Google Scholar] [CrossRef]
  38. Piçarra, N.J.G. Predicting Intention to Work with Social Robots; Universidade do Algarve: Faro, Portugal, 2014; pp. 1–271. [Google Scholar]
  39. Wilson, S.; Haslam, N. Reasoning about Human Enhancement: Towards a Folk Psychological Model of Humanness and Human Identity. In Handbook of Research on Technoself: Identity in a Technological Society; Luppicini, R., Ed.; IGI Global: Hershey, PA, USA, 2012; pp. 175–188. [Google Scholar] [CrossRef]
  40. Kanda, T.; Sato, R.; Saiwaki, N.; Ishiguro, H. A two-month field trial in an elementary school for long-term human-robot interaction. IEEE Trans. Rob. 2007, 23, 962–971. [Google Scholar] [CrossRef]
  41. Law, T.; Chita-Tegmark, M.; Rabb, N.; Scheutz, M. Examining attachment to robots: Benefits, challenges, and alternatives. ACM Trans. Hum. Robot Interact. 2022, 11, 36. [Google Scholar]
  42. Merriam-Webster. Authentic. In Merriam-Webster.com Dictionary. Available online: https://www.merriam-webster.com/dictionary/authentic (accessed on 6 November 2020).
  43. Merriam-Webster. Concerns. In Merriam-Webster.com Dictionary. Available online: https://www.merriam-webster.com/dictionary/concerns (accessed on 6 November 2020).
  44. Bartneck, C.; Suzuki, T.; Kanda, T.; Nomura, T. The influence of people’s culture and prior experiences with Aibo on their attitude towards robots. AI Soc. 2007, 21, 217–230. [Google Scholar] [CrossRef]
  45. Rosanda, V.; Istenič, A. A Robot-Supported Lesson. In Upbringing and Education between the Past and the Future; Založba Univerze na Primorskem: Koper, Slovenia, 2023; pp. 101–121. [Google Scholar]
  46. Ceha, J.; Law, E.; Kulić, D.; Oudeyer, P.Y.; Roy, D. Identifying functions and behaviours of social robots for in-class learning activities: Teachers’ perspective. Int. J. Soc. Robot. 2022, 14, 747–761. [Google Scholar] [CrossRef]
  47. Ahmad, M.I.; Mubin, O.; Orlando, J. Understanding behaviours and roles for social and adaptive robots in education: Teacher’s perspective. In Proceedings of the Fourth International Conference on Human Agent Interaction, New York, NY, USA, 4–6 October 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 297–304. [Google Scholar]
  48. van Ewijk, G.; Smakman, M.; Konijn, E.A. Teachers’ perspectives on social robots in education: An exploratory case study. In Proceedings of the Interaction Design and Children Conference, ACM, London, UK, 17–24 June 2020; pp. 273–280. [Google Scholar]
  49. Ahmad, M.I.; Mubin, O.; Orlando, J. Children views’ on social robot’s adaptations in education. In Proceedings of the 28th Australian Conference on Computer-Human Interaction, Launceston, Australia, 29 November–2 December 2016; pp. 145–149. [Google Scholar]
  50. Bland, J.M.; Altman, D.G. Statistics notes: Cronbach’s alpha. BMJ 1997, 314, 275. [Google Scholar] [CrossRef] [PubMed]
  51. Goretzko, D.; Pham, T.T.H.; Bühner, M. Exploratory factor analysis: Current use, methodological developments and recommendations for good practice. Curr. Psychol. 2021, 40, 3510–3521. [Google Scholar] [CrossRef]
  52. Field, A. Discovering Statistics Using IBM SPSS Statistics; Sage: Los Angeles, CA, USA, 2013. [Google Scholar]
  53. Beavers, A.S.; Lounsbury, J.W.; Richards, J.K.; Huck, S.W.; Skolits, G.J.; Esquivel, S.L. Practical Considerations for Using Exploratory Factor Analysis in Educational Research. Pract. Assess. Res. Eval. 2013, 18, 6. [Google Scholar] [CrossRef]
  54. Serholt, S.; Barendregt, W. Students’ attitudes towards the possible future of social robots in education. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication, Workshop on Philosophical Perspectives of HRI, Edinburgh, UK, 25–29 August 2014; IEEE Press: Piscataway, NJ, USA, 2014. [Google Scholar]
  55. Alemi, M.; Meghdari, A.; Ghazisaedy, M. Employing humanoid robots for teaching English language in Iranian junior high-schools. Int. J. Humanoid Robot. 2014, 11, 1450022. [Google Scholar] [CrossRef]
  56. Westlund, J.K.; Dickens, L.; Jeong, S.; Harris, P.; DeSteno, D.; Breazeal, C. A comparison of children learning new words from robots, tablets, & people. In Proceedings of the 1st International Conference on Social Robots in Therapy and Education, Almere, The Netherlands, 22–23 October 2015. [Google Scholar]
  57. Ivanov, S. Will Robots Substitute Teachers? In Yearbook of Varna University of Management, Proceedings of the 12th International Conference Modern Science, Business and Education, Bucharest, Romania, 21–22 April 2016; Varna University of Management: Varna, Bulgaria, 2016; Volume 9, pp. 42–47. [Google Scholar]
  58. Bartneck, C.; Kulić, D.; Croft, E.; Zoghbi, S. Measurement Instruments for the Anthropomorphism, Animacy, Likeability, Perceived Intelligence, and Perceived Safety of Robots. Int. J. Soc. Robot. 2009, 1, 71–81. [Google Scholar] [CrossRef]
Table 1. Principal axis factoring and descriptive statistics of the concerns saturating factor: “Reluctance towards authenticity-imbued social robots”.
| Variables | Factor Loading | Mean | Standard Deviation | Minimum–Maximum |
|---|---|---|---|---|
| Robots should not replace the teacher's work and interaction with children. | 0.845 | 3.80 | 0.988 | 1–5 |
| Children need teachers for their socio-emotional development. | 0.807 | 3.92 | 0.832 | 1–5 |
| The teacher understands the children's emotions and can comfort them, which the robot cannot do. | 0.801 | 3.69 | 0.921 | 1–5 |
| A robot cannot replace a human. | 0.772 | 4.03 | 0.874 | 2–5 |
| Robots cannot replace genuine teacher–child contact, as a child needs a person who will actually understand, help, and encourage him or her. | 0.754 | 3.87 | 0.883 | 1–5 |
| Since a child finds a person in their life as a role model, it would be wrong to attach to a robot in a similar way. | 0.609 | 3.61 | 0.908 | 1–5 |
| A teacher's word is valuable and robots cannot substitute it. | 0.681 | 3.65 | 0.973 | 1–5 |
| Children spend too much time with electronic devices, so it is necessary to encourage other activities, such as spending time in nature. | 0.702 | 3.70 | 0.971 | 1–5 |
| Total |  | 3.78 | 0.988 | 1–5 |
Note: N = 124; KMO = 0.905. Extraction method: principal axis factoring; rotation method: Oblimin rotation; Bartlett’s test of sphericity: χ2 (28) = 579.196, p < 0.001; determinant = 0.008. Cronbach’s alpha = 0.913.
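For readers who wish to run a comparable analysis on their own data, the sketch below shows how a one-factor principal axis solution with Oblimin rotation, the KMO measure, Bartlett's test of sphericity, and Cronbach's alpha could be obtained in Python with the factor_analyzer library. This is an illustration only, not the authors' actual analysis pipeline; the file name concerns.csv and the assumption that each column contains one item's 1–5 ratings are hypothetical.

```python
# Illustrative sketch of a principal axis factoring analysis like the one in Table 1.
# Assumes a hypothetical CSV file "concerns.csv" whose columns are the concern items
# (1-5 ratings) and whose rows are respondents.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

items = pd.read_csv("concerns.csv")  # N respondents x 8 items (hypothetical layout)

# Sampling adequacy (KMO) and Bartlett's test of sphericity.
chi_square, p_value = calculate_bartlett_sphericity(items)
_, kmo_total = calculate_kmo(items)

# One-factor principal axis solution; Oblimin rotation is requested as in Table 1
# (with a single factor the rotation leaves the loadings unchanged).
fa = FactorAnalyzer(n_factors=1, method="principal", rotation="oblimin")
fa.fit(items)
loadings = pd.Series(fa.loadings_[:, 0], index=items.columns)

# Cronbach's alpha for the retained items, computed directly from item variances.
k = items.shape[1]
alpha = k / (k - 1) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

print(f"KMO = {kmo_total:.3f}; Bartlett chi2 = {chi_square:.1f} (p = {p_value:.3g})")
print(f"Cronbach's alpha = {alpha:.3f}")
print(loadings.round(3))
```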
Table 2. Descriptives for factor solution items and Kruskal–Wallis test results.
| Concern | UNI | Mean | SD | Min | Max | χ² | df | p | ε² |
|---|---|---|---|---|---|---|---|---|---|
| Since a child finds a person in their life as a role model, it would be wrong to attach to a robot in a similar way. | 1 | 4.11 | 0.95 | 2 | 5 | 18.05 | 1 | <0.001 | 0.07 |
|  | 2 | 3.61 | 0.91 | 1 | 5 |  |  |  |  |
| Children spend too much time with electronic devices, so it is necessary to encourage other activities, such as spending time in nature. | 1 | 4.22 | 0.93 | 1 | 5 | 20.71 | 1 | <0.001 | 0.08 |
|  | 2 | 3.70 | 0.97 | 1 | 5 |  |  |  |  |
| A robot cannot replace a human. | 1 | 4.45 | 0.80 | 2 | 5 | 18.26 | 1 | <0.001 | 0.07 |
|  | 2 | 4.03 | 0.87 | 2 | 5 |  |  |  |  |
| Robots should not replace the teacher's work and interaction with children. | 1 | 4.42 | 0.80 | 2 | 5 | 29.06 | 1 | <0.001 | 0.11 |
|  | 2 | 3.80 | 0.99 | 1 | 5 |  |  |  |  |
| A teacher's word is valuable and robots cannot substitute it. | 1 | 4.26 | 0.83 | 2 | 5 | 28.03 | 1 | <0.001 | 0.11 |
|  | 2 | 3.65 | 0.97 | 1 | 5 |  |  |  |  |
| Robots cannot replace genuine teacher–child contact, as a child needs a person who will actually understand, help, and encourage him or her. | 1 | 4.33 | 0.86 | 2 | 5 | 20.68 | 1 | <0.001 | 0.08 |
|  | 2 | 3.87 | 0.88 | 1 | 5 |  |  |  |  |
| The teacher understands the children's emotions and can comfort them, which the robot cannot do. | 1 | 4.32 | 0.79 | 2 | 5 | 30.96 | 1 | <0.001 | 0.12 |
|  | 2 | 3.69 | 0.92 | 1 | 5 |  |  |  |  |
| Children need teachers for their socio-emotional development. | 1 | 4.36 | 0.79 | 2 | 5 | 21.10 | 1 | <0.001 | 0.08 |
|  | 2 | 3.92 | 0.83 | 1 | 5 |  |  |  |  |
UNI: 1 = Slovene participants; 2 = Russian participants. Kruskal–Wallis test significance threshold: p ≤ 0.05. Interpretation of epsilon squared (ε²): negligible, 0.00 to <0.01; weak, 0.01 to <0.04; moderate, 0.04 to <0.16; relatively strong, 0.16 to <0.36; strong, 0.36 to <0.64; very strong, 0.64 to <1.00.
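The per-item comparison summarized in Table 2 can be reproduced in outline with a short script: for each concern, a Kruskal–Wallis test compares the Slovene and Russian ratings, and the epsilon-squared effect size is derived from the H statistic as ε² = H / (n - 1), where n is the total number of respondents for that item. The sketch below is a minimal illustration under assumed conventions; the long-format layout and the column names item, country, and rating are hypothetical placeholders rather than the study's actual variable names.

```python
# Illustrative sketch of the per-item Kruskal-Wallis comparison reported in Table 2.
# Assumes a hypothetical long-format file "ratings_long.csv" with columns:
#   "item"    - the concern statement,
#   "country" - "SI" (Slovene) or "RU" (Russian) pre-service teachers,
#   "rating"  - the 1-5 agreement score.
import pandas as pd
from scipy.stats import kruskal

data = pd.read_csv("ratings_long.csv")

rows = []
for item, subset in data.groupby("item"):
    # Split the ratings for this item into the two national samples.
    groups = [grp["rating"].to_numpy() for _, grp in subset.groupby("country")]
    h_stat, p_value = kruskal(*groups)
    n = len(subset)
    epsilon_sq = h_stat / (n - 1)  # epsilon-squared effect size for Kruskal-Wallis
    rows.append({"item": item,
                 "H": round(h_stat, 2),
                 "p": p_value,
                 "epsilon_sq": round(epsilon_sq, 2)})

print(pd.DataFrame(rows).sort_values("epsilon_sq", ascending=False).to_string(index=False))
```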
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
