Article

Do You Care for Robots That Care? Exploring the Opinions of Vocational Care Students on the Use of Healthcare Robots

by Margo A. M. van Kemenade 1,*, Johan F. Hoorn 2,3 and Elly A. Konijn 2

1 Faculty of Engineering Design & Computing; Department of Engineering & Business, Inholland University of Applied Sciences, Bergerweg 200, 1817 MN Alkmaar, The Netherlands
2 VU University, Faculty of Social Sciences, Department of Communication Sciences, De Boelelaan 1105, 1081 HV Amsterdam, The Netherlands
3 Department of Computing and School of Design, The Hong Kong Polytechnic University, Hong Kong, China
* Author to whom correspondence should be addressed.
Robotics 2019, 8(1), 22; https://doi.org/10.3390/robotics8010022
Submission received: 7 February 2019 / Revised: 10 March 2019 / Accepted: 15 March 2019 / Published: 21 March 2019

Abstract

Background: There has been a rapid increase in the population of senior citizens in many countries, and the shortage of caregivers is becoming a pressing concern. Robots are being deployed in an attempt to fill this gap and reduce the workload of caregivers. This study explores how healthcare robots are perceived by trainee care professionals. Methods: A total of 2365 students at different vocational levels completed a questionnaire, rating ethical statements about beneficence, maleficence, justice, autonomy, utility, and use intentions for three different types of robots (assistive, monitoring, and companion), along with six control variables: gender, age, school year, technical skills, interest in technology, and enjoying working with computers. The scores were analyzed using MANOVA. Results: All students viewed monitoring robots as more beneficent than assistive and companion robots. Level of education made little difference in appraisal. Participants rated maleficence lowest and autonomy and utility highest, indicating a positive evaluation of the use of healthcare robots. Surprisingly, all students rated use intentions low, indicating a poor motivation to actually use a robot in the future, although they stated a firmer intention to use monitoring devices. Conclusion: Care students find robots useful and expect clients to benefit from them, but are still hesitant to use robots in their future practice. This study suggests that it would be wise to enrich the curriculum of intermediate care education with practical classes on the use and ethical implications of care robots, to ensure that this group of trainee care professionals fully understands the possibilities and potential downsides of this emerging kind of healthcare technology.

1. Introduction

The rapid increase in the number of older adults (75 years old and up) is a major concern in developed countries [1]. In the 1950s, the probability that an 80-year-old would survive to age 90 was 15–16% for women and 12% for men. By 2002, these figures had increased to 37% and 25%, respectively. Since 1840, female life expectancy has increased by approximately two years per decade, worldwide. This linear growth has yet to stagnate, which implies that ‘the human life span is not closing in on its limit’ [2]. Life expectancy at age 20 is predicted to increase by approximately one year per decade for both females and males between now and 2040 [3]. Compared to the working-age population, the proportion of older adults will eventually grow to a point where there are not enough people, let alone professional caregivers, to care for the elderly [4,5,6]. It seems impossible to continue organizing eldercare in the same way as before [7]. Thus, the search for innovative solutions has become crucial, and the deployment of robots in healthcare seems a promising way forward [8,9,10,11,12,13]; however, robot care is not without controversy.
Care robots can be divided into three types: companion, assistive, and monitoring robots [6]. Assistive robots help patients with daily tasks (e.g., washing and dressing), and monitoring robots keep an eye on the patient (e.g., medicine intake, fall detection). Companion robots are designed to counter social isolation and loneliness. Most people, however, feel uncomfortable with such robots [14], and if care professionals are hesitant from the start, a possible game changer may be lost without having been properly tried and tested.
A previous study elaborated on the effects of education on moral considerations regarding care technology in a quantitative examination among higher- and lower-educated care professionals [15]. That study found that the acceptance of robots in care was more strongly associated with the participants’ moral considerations than with utility. Hence, the current study aimed to further explore the educational context by conducting research among vocationally educated students. We wanted to collect and examine the opinions of students in care and welfare programmes at Dutch vocational education institutes, because they are the care professionals of the future: students in healthcare will form a large part of the group of care professionals that will work with this kind of technology [16].
Whereas the previous study [15] explored possible differences between intermediate and higher vocational care students, the current study examines educational differences within the group of intermediate vocational students. In the Netherlands, intermediate Vocational Education and Training (VET) is called Middelbaar Beroeps Onderwijs (MBO). VET is the main supplier to the labour market and is often regarded as the ‘foundation of the economy’ or the ‘backbone of society’. Approximately 40% of the Dutch working population has completed a vocational curriculum to at least an intermediate vocational training level [17]. Dutch VET offers programmes at four levels, ranging from entry level to middle-management level. It is therefore interesting to study whether educational differences within these VET levels lead to different appraisals of care robots.
Therefore, in the current study, we attempted to quantify the opinions of Dutch vocational care students on care robots from the perspective of the four principles of biomedical ethics [18]: beneficence, non-maleficence, justice, and autonomy. From a technology acceptance perspective, we added ‘expected utility of the robot’ and ‘intentions to use the robot’ to this list. Since age, skillfulness, and interest in technology also contribute to opinion formation [19,20,21], we studied these variables as well for their possible moderating effects.
Beauchamp and Childress [18] have proposed a system of moral principles in the practice of medicine. Maleficence is derived from the principle of “non-maleficence”, which states that technology should “first, do no harm” (primum non nocere). For many, this is the paramount principle of biomedical ethics. Beneficence holds that technology should promote the well-being of the patient. Justice is a concern for fairness and equality, while autonomy is the patient’s right and ability to freely make decisions about medical treatment. In this study, we aimed to identify which of these principles dominated the moral considerations regarding robots among trainee care professionals. Our research question, then, is:
How do care and welfare students at diverse levels of vocational education differentially evaluate healthcare robotics in relation to their daily routines?
We used the following sub-questions:
  • RQ-1: Do prospective care professionals perceive different types of robots (Assistive, Monitoring, and Companion) differently in terms of occupational ethics (Beneficence, Maleficence, Justice, Autonomy)?
  • RQ-2: Do perceptions of healthcare robots differ between vocational students at lower and middle levels? Relatedly, do they perceive the various robot types differently?
  • RQ-3: How do prospective care professionals evaluate care robots in terms of utility and possible use intentions? Relatedly, how do these evaluations differ per robot type?

2. Methods

Participants

Students of three different Dutch vocational care programmes (Helping Care and Cure, Helping Extramural Care, and Nursing) volunteered for a questionnaire study. To acquire as many participants as possible, managers of seven different vocational education and training institutes distributed our online Qualtrics questionnaire by forwarding an invitation and link to their students in care. A total of 2365 eligible students completed the questionnaire. This number represents 3.9% of the population of all registered (61,244) lower and middle vocational care students in the Netherlands [22]. On average, there are 5000 students per institute, of which 30% follow a care programme [23,24]. That would make a total possible response of 10,500 participants. Therefore, a rough and conservative estimate of our sample response rate is 23%. All participants remained anonymous. The incentive for participation was one of five gift vouchers of €50 that were raffled among the participants.
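The response-rate estimate follows from simple arithmetic on the figures reported above; a minimal sketch in Python:

```python
# Reproducing the response-rate estimate from the figures reported above.
institutes = 7                 # participating VET institutes
students_per_institute = 5000  # average enrolment per institute
care_share = 0.30              # share of students in a care programme

possible = institutes * students_per_institute * care_share  # 10,500
completed = 2365

print(f"possible respondents: {possible:.0f}")
print(f"response rate: {completed / possible:.1%}")  # ~22.5%, reported as 23%
```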

3. Data Collection

Participants received a link to an online questionnaire that was specifically developed for this study. The questionnaire had three versions, one for assistive, one for monitoring, and one for companion robots; each version consisted of 39 questionnaire items. The items per robot type were identical so they could be compared. Demographics were probed using seven additional questions.
After opening the link in Qualtrics (Version 24(892); Provo, UT, USA), a brief introduction and consent form were presented. Upon agreement, the participant could commence the questionnaire. Participants were randomly assigned to one of the three versions. The questionnaire opened with a picture of a healthcare robot (Figure 1), followed by a brief description of its capabilities and the tasks it could execute (e.g., reading aloud, exercise coaching, or reminding of medicine intake).

4. Measures

We measured six theoretical variables, which as a group we term appraisal domains (i.e., perceived beneficence, maleficence, justice, autonomy, utility, and use intentions), four educational levels (levels 1–4), and three robot types (assistive, monitoring, and companion), along with six control variables: gender, age, school year, technical skills, interest in technology, and enjoying working with computers. Education level was treated as an ordinal variable: lower vocational education (levels 1 and 2) and middle vocational education (levels 3 and 4). The theoretical variables were measured at quasi-interval level, initially with six Likert-type items each, rated on a six-point scale (1 = strongly disagree, 2 = disagree, 3 = slightly disagree, 4 = slightly agree, 5 = agree, and 6 = strongly agree). Principal component analyses and reliability analyses revealed that several items had to be discarded to form reliable scales. The questionnaire items included per appraisal domain can be found in Appendix A. Scale reliabilities were calculated with Cronbach’s alpha as reported in Table 1, using 0.7 as the cut-off point to decide whether a scale was sufficiently reliable [26]. The Justice scale failed the 0.7 criterion and had to be discarded entirely; apparently, the way we measured justice did not converge into a solid underlying concept. All other scales performed well (Table 1). After recoding the counter-indicative items, the items on each scale were summed and averaged to calculate a mean index (Table 1).
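Cronbach’s alpha can be computed directly from an item-score matrix; a minimal sketch in Python (numpy only; the random scores are illustrative, not the study’s data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative: six Likert items (1-6) for 100 hypothetical respondents.
rng = np.random.default_rng(0)
scores = rng.integers(1, 7, size=(100, 6))
alpha = cronbach_alpha(scores)
print(f"alpha = {alpha:.2f}", "reliable" if alpha >= 0.7 else "discard")
```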
Although 2365 participants completed the questionnaire, the cells were not filled equally. Unfortunately, only 2 participants studied at level 1, which made it necessary to drop that level from our analyses. We then had to exclude another 38 participants because they had completed the questionnaire in an unreasonably short period of time (as recorded by Qualtrics) and/or because they checked the same rating-scale answer for every item. That left us with N = 2325 participants in the final analysis. Their characteristics can be found in Table 1; Table 2 provides the means and standard deviations of the appraisal domains.
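The two exclusion rules translate into straightforward filters; a sketch in pandas, with hypothetical column names (`duration_sec` for the Qualtrics timing, `q1`..`q39` for the items) and an assumed time threshold, since the paper does not report the exact cut-off:

```python
import pandas as pd

# Hypothetical frame: one row per respondent, Qualtrics completion time
# plus the 39 rating items (columns q1..q39).
df = pd.read_csv("responses.csv")
items = [f"q{i}" for i in range(1, 40)]

MIN_DURATION = 120  # assumed threshold; the paper reports no exact cut-off

too_fast = df["duration_sec"] < MIN_DURATION
# Straight-lining: the same answer checked for every single item.
straight = df[items].nunique(axis=1) == 1

clean = df[~(too_fast | straight)]
print(f"excluded {int((too_fast | straight).sum())}, kept N = {len(clean)}")
```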
Table 1 shows that participants were evenly distributed over the three robot types (33% each). Table 1 also shows an overrepresentation of females, although in the care professions that is a valid ecological outcome [27]. Unfortunately, however, over 50% of the total sample studied at vocational level 4 and a mere 9.1% at level 2, which seriously hampered our ability to answer the questions on differences in perception between educational levels. Almost the entire sample self-reported being skilled computer users.

5. Results

We calculated the grand mean scores of our measurement scales (Table 2) and ran a GLM repeated measures analysis with the five-level within-subjects factor of appraisal domain (beneficence vs. maleficence vs. autonomy vs. utility vs. use intentions) and the between-subjects factors of robot type (assistive vs. monitoring vs. companion) and education level (2 vs. 3 vs. 4).
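For readers working outside SPSS, a comparable (though simplified) analysis can be sketched with pingouin’s mixed ANOVA. Note that `pg.mixed_anova` supports a single between-subjects factor, so robot type and education level would each need a separate run; the data file and column names here are hypothetical:

```python
import pandas as pd
import pingouin as pg

# Long format: one row per participant x appraisal domain.
# Hypothetical columns: participant, domain (beneficence, maleficence,
# autonomy, utility, use_intentions), score, robot_type.
df = pd.read_csv("appraisals_long.csv")

# Within factor: appraisal domain; between factor: robot type.
# (A simplification of the full GLM repeated-measures design above,
# which crossed two between-subjects factors.)
aov = pg.mixed_anova(data=df, dv="score", within="domain",
                     subject="participant", between="robot_type")
print(aov[["Source", "F", "p-unc", "np2"]])
```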

Rating of Appraisal Domains through Ethics

The interaction between robot type, education level, and appraisal domain was not significant (p = 0.296), nor was the interaction between robot type and education level (p = 0.672).
However, the main effect of appraisal domain was significant with an intermediate effect size (Wilks’ λ = 0.53, F(4,2313) = 518.67, p < 0.001, ηp² = 0.473), indicating that, independent of robot type or education level, participants scored lowest on maleficence (M = 2.81, SD = 0.95), followed by use intentions (M = 3.08, SD = 1.19) and beneficence (M = 3.64, SD = 1.13), higher on autonomy (M = 3.78, SD = 0.81), and highest on utility (M = 3.84, SD = 0.93).

Rating of Appraisal Domains by Robot Type

The interaction between robot type and appraisal domain was significant (λ = 0.99, F(8,4626) = 4.00, p < 0.001, ηp² = 0.007), supported by a significant main effect of robot type (F(2,2316) = 5.23, p = 0.005, ηp² = 0.004). To further scrutinize the interaction effect, we ran a series of independent-samples t-tests (two-tailed) with Bonferroni correction (α = 0.05/15 contrasts = 0.003) [28], indicating that, independent of education level, participants deemed monitoring robots more beneficent than assistive robots (t(1550) = 3.77, p < 0.001) and assistive robots more beneficent than companion ones (t(1549) = 3.59, p < 0.001).
Assistive robots were judged as more maleficent than monitoring robots (t(1549) = 4.24, p < 0.001) and than companion robots (t(1545) = 4.56, p < 0.001). Monitoring robots were perceived as having more utility than companion robots (t(1550) = 5.01, p < 0.001) and, as a trend, assistive robots were also judged as having more utility than companion machines (t(1545) = 2.79, p = 0.005).
Participants indicated higher intentions to use monitoring robots than companion robots (t(1550) = 3.46, p = 0.001). All other comparisons were not significant.
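Each of these contrasts is an independent-samples t-test evaluated against a Bonferroni-adjusted alpha; a sketch with scipy, using simulated group vectors (the means and SDs are taken from Table 3; group sizes and seed are hypothetical):

```python
import numpy as np
from scipy import stats

ALPHA = 0.05 / 15  # Bonferroni correction for 15 contrasts (~0.003)

def contrast(a, b, label):
    """Two-tailed independent-samples t-test against the corrected alpha."""
    t, p = stats.ttest_ind(a, b)
    print(f"{label}: t = {t:.2f}, p = {p:.4f}",
          "(significant)" if p < ALPHA else "(n.s.)")

# Simulated beneficence scores per robot-type group (Table 3 totals).
rng = np.random.default_rng(1)
monitoring = rng.normal(3.77, 1.06, 778)
assistive = rng.normal(3.58, 1.08, 773)
contrast(monitoring, assistive, "monitoring vs. assistive (beneficence)")
```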

Rating of Appraisal Domains by Educational Levels

The interaction between education level and appraisal domain was also significant (λ = 0.99, F(8,4626) = 4.47, p < 0.001, ηp² = 0.008), supported by a significant main effect of education level (F(2,2316) = 4.69, p = 0.009, ηp² = 0.004). We then ran a series of independent-samples t-tests (two-tailed) with Bonferroni correction (α = 0.05/15 contrasts = 0.003), indicating that, independent of robot type, level 4 students perceived more beneficence than level 3 students (t(2111) = 2.94, p = 0.003), whereas level 3 students saw, as a trend, more maleficence than level 4 students (t(2111) = 2.72, p = 0.007). Level 4 students also assigned more utility to robots than level 3 students (t(2111) = 4.47, p < 0.001) and, as a trend, thought more than level 2 students that robots increase the autonomy of patients (t(1417) = 2.81, p = 0.005). All other comparisons were not significant.

Overall Ratings of Appraisal Domains

In sum, independent of robot type or education level, maleficence and use intentions scored lowest, while autonomy and utility scored highest. Independent of education level, participants judged monitoring robots to be more beneficent than assistive robots and assistive robots more beneficent than companion robots. Assistive robots were perceived as more maleficent than both monitoring and companion robots. Monitoring robots had more utility than companion robots and, as a trend, assistive robots also had more utility than companion machines. Participants indicated a firmer intention to use monitoring robots than companion robots. All other comparisons were not significant.
With respect to education level, independent of robot type, level 4 students perceived more beneficence than level 3 students, whereas level 3 students saw, as a trend, more maleficence than level 4 students. Level 4 students also assigned more utility to robots than level 3 students and, as a trend, deemed more strongly than level 2 students that robots increase the autonomy of patients.

6. Conclusions

The purpose of the current study was to determine which of the four principles of Beauchamp and Childress [18] were most prominent in the estimations of lower and middle vocational care students with regard to working with robots in their future care practice.
Overall, students scored maleficence the lowest, meaning that care robots in general were not seen as pernicious. However, care students also rated use intentions low, indicating poor motivation to actually use a robot in the future. Possible beneficent effects of care robots were received with relative neutrality, whereas the potential increase in a client’s autonomy was deemed considerable. The highest scores were obtained for the utility of robots, which is surprising in view of the students’ reluctance to use them. On the whole, level 4 students were slightly more positive than level 3 or 2 students with respect to beneficence, utility, and the client’s autonomy. All students viewed monitoring robots as more beneficent than assistive robots and assistive robots as more beneficent than companion robots. Although assistive robots were regarded as the most maleficent, monitoring and assistive machines were also seen as more useful than companion robots.
In all, these care students saw little harm in robots, found them useful, and expected clients to become more independent because of them; on the other hand, they were quite hesitant about using robots in their future practice. It is possible that the perceived potential effectiveness of robots for their work practice was affected by fear of job loss.

7. Discussion

One might think that current students of care have already acquired so-called ‘21st-century skills’, that is, “information literacy, media literacy, and information, communication and technology literacy” [29]. If that is the case, so the theory goes, perceived usefulness and a positive attitude toward technology should increase the intention to use it. Indeed, computer self-efficacy can act as an antecedent of perceived usefulness and of a positive attitude towards computer use [30]. In our survey, 93.6% of participants stated that they were skilled computer users and 72.3% claimed to enjoy working with computers. Nevertheless, these 21st-century-skilled students were not very eager to employ robots in their future work practice.
Admittedly, the vocational care students in this survey did not see much harm in care robots, at least for the patient. However, when it came to their own future work practice, they were reluctant to envision employing care machines, in spite of their potential utility. A few possible explanations follow.

Possible Explanations for the Hesitation in Using Care Robots

Robots may not have been viewed as maleficent, but neither were they seen as being beneficial or helpful (beneficence was rated ‘neutral’). Although useful for the instrumental side of nursing and caring (utility high), it might be that these students feared that robots would weaken the relationship between caregiver and care receiver (autonomy high), commonly considered fundamental to ‘good care.’ Hence, they were hesitant to work with robots. Lewis and West [31] state that the care relationship is crucial to securing care quality. If ‘good care’ depends substantially on the quality of the care relationship, then more attention should be paid to the human care workforce. Perhaps this was a principal concern of our sample of care students. Moreover, in covering the instrumental aspects of care, robots may also have been perceived as an occupational threat, making the nurse seem less important (e.g., when the client becomes attached to the robot) or even redundant.
On a societal plane, 70% of Europeans have a positive attitude towards robots in general [32]. However, when it comes to healthcare robots, they are not so positive. Only 22% of Europeans think that robots should be introduced in healthcare [32]. Sixty percent are even resentful of the prospect of robots caring for children and older adults [32]. Perhaps our care students were echoing a broader public trend, vented in the media and discussed at the coffee table.
We did not find great differences between educational levels in the way care robots were perceived. It could be that the educational differences between lower and middle vocational care students are negligible. As an additional exploratory analysis, we combined levels 2 and 3 and compared them with level 4, performing chi-square analyses (Table 3). We found that level 4 students perceived more beneficence in care robots. However, contrary to our expectations, they did not perceive less maleficence. Higher vocational students also perceived greater utility in care robots than lower vocational students; however, this higher perceived utility did not translate into higher use intentions. All care students expressed low use intentions, the potential utility of robots notwithstanding.
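The paper does not detail how the chi-square statistics in Table 3 were computed; one plausible reading is a test of independence between education group and the distribution of the six response categories. A sketch with scipy under that assumption, on invented counts:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2 x 6 contingency table: counts of each rating category
# (1 = strongly disagree ... 6 = strongly agree) per education group.
counts = np.array([
    [40, 95, 210, 310, 280, 65],   # lower vocational (levels 2 + 3)
    [55, 120, 300, 420, 350, 80],  # middle vocational (level 4)
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```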
As advised by Holloway and Wheeler [33], it may be more worthwhile to turn to qualitative research approaches to resolve matters of change or conflict, particularly in care relations, and explore the behaviours, feelings, and experiences of care students in confrontation with robots on the work floor in more detail.
Trainee care professionals perceived particular types of robots differently in terms of occupational ethics and use. Our research challenge is to ascertain what these care professionals would want from an assistive or monitoring machine that is, in their eyes, potentially very useful and somewhat innocuous. Should a robot assistant have a moral reasoning system that tells it what ‘good care’ is? Should a monitoring device know which information is private and which should be disclosed to the nurses at the ward? Companion robots may be seen as beneficent, but how can they become more useful? And how should they behave so as not to undermine the quality of the care relationship between humans? These are not questions of occupational ethics and utility alone, as their answers will encourage better partnerships between human caregivers and the artificial systems of the future. More research is needed, with a more rigorous methodology, to truly capture the objections raised by (trainee) caregivers and thereby facilitate acceptance. The literature [34] states that healthcare robots can potentially enhance the well-being of older adults and decrease the workload of caregivers.

Education Could Make a Difference

This study suggests that it would be wise to enrich the curriculum of intermediate care education with practical classes on the use and ethical implications of new care technology, particularly care robots. This is especially of interest in the intermediate vocational domain because this group will be the main supplier of direct care for older adults in the near future. It is important to ensure that the care provided is in line with the wishes and needs of our vulnerable older adults, while at the same time keeping the job satisfaction of our caregivers as high as possible, given the circumstances.

Author Contributions

Conceptualization: M.A.M.v.K.; methodology: J.F.H. and E.A.K.; software: data management and statistical analyses were performed in SPSS, version 21; validation: J.F.H. and E.A.K.; formal analysis: M.A.M.v.K., J.F.H., and E.A.K.; data collection: M.A.M.v.K.; resources: VU University Amsterdam; data curation: E.A.K.; writing-original draft preparation: M.A.M.v.K.; writing-review and editing: M.A.M.v.K., J.F.H., and E.A.K.; visualization: E.A.K.; supervision: E.A.K. and J.F.H.; project administration: VU University Amsterdam; funding acquisition: J.F.H.

Funding

This study is part of the SELEMCA project (Services of Electro-Mechanical Care Agencies, grant NWO 646.000.003), which was funded within the Creative Industry Scientific Programme (CRISP) and supported by the Dutch Ministry of Education, Culture and Science. The contribution of the first author was funded by a personal grant from the Central Board of Inholland, University of Applied Sciences.

Acknowledgments

We wish to thank the researchers of the SELEMCA group for their incisive and constructive remarks and Kees Nijhoff for support with the data analysis. All authors are responsible for the reported research and all participated in the conception and design of the study, the analysis and interpretation of the data, and drafting and revising of the manuscript. The first author, MK, collected the data. All authors approved the manuscript as submitted.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Examples of question items. Note: these items are translated from Dutch.
This robot could be:
  • Handy (Utility 1, +)
  • Clumsy (Utility 2, −)
  • Unusable (Utility 3, −)
  • Suitable for his work (Utility 4, +)
  • Harmful (Maleficence 1, +)
  • Dangerous (Maleficence 2, +)
  • Caring (Beneficence 1, +)
This robot:
  • Does the patient good (Beneficence 2, +)
  • Makes someone more independent (Autonomy 1, +)
  • Disadvantages other patients (Justice 1, −)
  • Limits the freedom of the patient (Autonomy 2, −)
  • Ensures that a patient can take better care of himself (Autonomy 3, +)
  • Hurts the patient (Maleficence 3, +)
  • Makes the patient independent (Autonomy 4, −)
  • Makes life worse (Maleficence 4, +)
  • Favours some patients (Justice 2, −)
  • Splits attention evenly (Justice 3, +)
  • Increases the quality of life (Beneficence 3, +)
  • Lets the patient make his own choices (Justice 4, +)
I think that this robot:
  • Is not going to be used (Use Intention 1, −)
  • Makes life better (Beneficence 4, +)
  • Can help the patient (Beneficence 5, +)
  • Treats everybody equally (Justice 5, +)
  • Diminishes quality of life (Maleficence 5, +)
  • Diminishes self-reliance (Autonomy 5, −)
  • Makes no distinction between people (Justice 6, +)
  • Neglects the patient (Maleficence 6, +)
  • Returns freedom to the patient (Autonomy 6, +)
  • Takes good care of the patient (Beneficence 6, +)
  • Does things behind your back (Justice 7, −)
  • Limits the patient in his freedom of choice (Justice 8, −)
Try to put yourself in the shoes of a professional care provider:
  • With this robot I would like to work (Use Intention 2, +)
  • This robot seems suitable for the job (Use Intention 3, +)
  • I would like to use such a robot (Use Intention 4, +)
  • I would leave the robot in the closet (Use Intention 5, −)
  • I would rather do the work myself (Use Intention 6, −)
  • Working with a robot takes time (Utility 5, −)
  • Working with a robot saves time (Utility 6, +)
As an extra control question regarding use intention, participants were asked: “It is more than likely that I will use this robot in the near future” (yes/no).
The + and − signs indicate indicative and counter-indicative (reverse-scored) items, respectively; the number refers to the respective question item. Each domain had at least six items. Answers were scored on Likert-type items, each rated on a six-point scale (1 = strongly disagree, 2 = disagree, 3 = slightly disagree, 4 = slightly agree, 5 = agree, and 6 = strongly agree).
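Recoding a counter-indicative item on this six-point scale amounts to mirroring the score around the scale midpoint; a minimal sketch:

```python
import numpy as np

def reverse_code(score: np.ndarray, scale_max: int = 6) -> np.ndarray:
    """Mirror a Likert score: on a 1-6 scale, 1 <-> 6, 2 <-> 5, 3 <-> 4."""
    return (scale_max + 1) - score

# Example: a counter-indicative item such as 'Clumsy' (Utility 2, -).
raw = np.array([1, 2, 5, 6])
print(reverse_code(raw))  # [6 5 2 1]
```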

References

  1. Bloom, D.; Chatterji, S.; Kowal, P.; Lloyd-Sherlock, P.; McKee, M. Macroeconomic implications of population ageing and selected policy responses. Lancet 2015, 385, 649–657. [Google Scholar] [CrossRef]
  2. Christensen, K.; Doblhammer, K.; Rau, R.; Vaupel, J. Ageing populations: The challenges ahead. Lancet 2009, 374, 1196–1208. [Google Scholar] [CrossRef]
  3. Lindahl-Jacobsen, R.; Rau, R.; Canudas-Romo, V.; Jeune, V.; Lenart, A.; Christensen, K.; Vaupel, J. Rise, stagnation, and rise of Danish women’s life expectancy. Proc. Natl. Acad. Sci. USA 2016, 113, 4005–4020. [Google Scholar] [CrossRef]
  4. Schwiegelshohn, F.; Hubner, M.; Wehner, P.; Gohringer, D. Tackling the New Health-Care Paradigm Through Service Robotics: Unobtrusive, efficient, reliable and modular solutions for assisted-living environments. IEEE Consum. Electron. Mag. 2017, 6, 34–41. [Google Scholar] [CrossRef]
  5. Giesbers, H.; Verwey, A.; de Beer, J.D. Vergrijzing Samengevat; Volksgezondheid Toekomst Verkenning. Nationaal Kompas Volksgezondheid [Aging in Summary; Public Health Future Exploration. National Compass Public Health]. 2013. Available online: http://www.nationaalkompas.nl/bevolking/vergrijzing/vergrijzing-samengevat/ (accessed on 28 November 2018).
  6. Broadbent, E.; Stafford, R.; MacDonald, B. Acceptance of Healthcare Robots for the Older Population. Int. J. Soc. Robot. 2009, 1, 319–330. [Google Scholar] [CrossRef]
  7. Bemelmans, R.; Gelderblom, G.; Jonker, P.; de Witte, L. Socially Assistive Robots in Elderly Care: A Systematic Review into Effects and Effectiveness. J. Am. Med. Dir. Assoc. 2012, 13, 114–120. [Google Scholar] [CrossRef]
  8. Banks, M.; Willoughby, L.; Banks, W. Animal-Assisted Therapy and Loneliness in Nursing Homes: Use of Robotic versus Living Dogs. J. Am. Med. Dir. Assoc. 2008, 9, 173–177. [Google Scholar] [CrossRef] [PubMed]
  9. Broekens, J.; Heerink, M.; Rosendal, H. Assistive Social Robots in Elderly Care: A Review. Gerontechnology 2009, 8, 94–103. [Google Scholar] [CrossRef]
  10. Frennert, S.; Östlund, B.; Eftring, H. Would Granny let an assistive robot into her home? In International Conference on Social Robotics; Springer: Berlin/Heidelberg, Germany, 2012; pp. 128–137. [Google Scholar]
  11. Klein, B.; Cook, G. Emotional Robotics in Elder Care–A Comparison of Findings in the UK and Germany. In International Conference on Social Robotics; Springer: Berlin/Heidelberg, Germany, October 2012; pp. 108–117. [Google Scholar]
  12. Moyle, W.; Beattie, E.; Cooke, M.; Jones, C.; Klein, B.; Cook, C.; Gray, C. Exploring the Effect of Companion Robots on Emotional Expression in Older Adults with Dementia: A Pilot Randomized Controlled Trial. J. Gerontol. Nurs. 2016, 39, 46–53. [Google Scholar] [CrossRef]
  13. Tamura, T.; Yonemitsu, S.; Itoh, A.; Oikawa, D.; Kawakami, A.; Higashi, Y.; Fujimoto, T.; Nakajima, K. Is an Entertainment Robot Useful in the Care of Elderly People With Severe Dementia? J. Gerontol. Ser. A 2004, 59, M83–M85. [Google Scholar] [CrossRef]
  14. Lauckner, M.; Kobiela, F.; Manzey, D. ‘Hey robot, please step back!’—Exploration of a spatial threshold of comfort for human-mechanoid spatial interaction in a hallway scenario. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication: Human-Robot Co-Existence: Adaptive Interfaces and Systems for Daily Life, Therapy, Assistance and Socially Engaging Interactions (IEEE RO-MAN 2014), Edinburgh, UK, 25–29 August 2014; Institute of Electrical and Electronics Engineers: New York, NY, USA, 2014; pp. 780–787. [Google Scholar]
  15. Van Kemenade, M.; Hoorn, J.; Konijn, E. Healthcare Students’ Ethical Considerations of Care Robots in The Netherlands. Appl. Sci. 2018, 8, 1712. [Google Scholar] [CrossRef]
  16. Ienca, M.; Jotterand, F.; Vica, C. Social and Assistive Robotics in Dementia Care: Ethical Recommendations for Research and Practice. Int. J. Soc. Robot. 2016, 8, 565–573. [Google Scholar] [CrossRef]
  17. MBO Raad [MBO Council]. Dutch VET. Available online: https://www.mboraad.nl/english (accessed on 27 September 2017).
  18. Beauchamp, T.L.; Childress, J.F. Principles of Biomedical Ethics; Oxford University Press: Oxford, UK, 2013. [Google Scholar]
  19. Yu, T.; Lin, M.; Liao, Y. Understanding factors influencing information communication technology adoption behavior: The moderators of information literacy and digital skills. Comput. Hum. Behav. 2017, 71, 196–208. [Google Scholar] [CrossRef]
  20. Cook, N.; Winkler, S. Acceptance, Usability and Health Applications of Virtual Worlds by Older Adults: A Feasibility Study. JMIR Res. Protoc. 2016, 5, e81. [Google Scholar] [CrossRef] [Green Version]
  21. Scopelliti, M.; Giuliani, M.V.; Fornara, F. Robots in a Domestic Setting: A Psychological Approach. Univ. Access Inf. Soc. 2005, 4, 146–155. [Google Scholar]
  22. CBS Statline. MBO; Doorstroom en Uitstroom, Migratieachtergrond, Generatie, Regiokenmerken [VET; Flow and Outflow, Migration Background, Generation, Regional Characteristics]. 19 November 2018. Available online: https://statline.cbs.nl/StatWeb/publication/?VW=T&DM=SLNL&PA=71895NED&D1=a&D2=2,5,8,11&D3=2,8,10&D4=a&D5=0-1,3-4&D6=0&D7=0&D8=l&HD=101029-1031&HDR=G6,G5,G7,G4,T&STB=G3,G1,G2 (accessed on 14 January 2019).
  23. MBO Raad [Vocational Council]. Het MBO Feiten en Cijfers [The VET, Facts and Figures]. 2019. Available online: https://www.mboraad.nl/het-mbo/feiten-en-cijfers/mbo-scholen (accessed on 28 February 2019).
  24. Rijksoverheid–[Dutch Government]. Onderwijs in Cijfers [Education in Figures]. 2018. Available online: https://www.onderwijsincijfers.nl/kengetallen/mbo (accessed on 28 February 2019).
  25. “Alice Cares” [“Ik ben Alice”]. Documentary, Directed by S. Burger. Keydocs/Doxy/NCRV. 2015. Available online: http://www.ikbenalice.nl/ (accessed on 14 January 2019).
  26. Taber, K. The use of Cronbach’s alpha when developing and reporting research instruments in science education. Res. Sci. Educ. 2018, 48, 1273–1296. [Google Scholar] [CrossRef]
  27. Adams, T. Gender and feminization in healthcare professions. Sociol. Compass 2010, 4, 454–465. [Google Scholar] [CrossRef]
  28. Armstrong, R.A. When to use the Bonferroni correction. Ophthalmic Physiol. Opt. 2014, 34, 502–508. [Google Scholar] [CrossRef] [Green Version]
  29. Kurshan, B. Teaching 21st Century Skills For 21st Century Success Requires an Ecosystem Approach. Forbes, 2017. Available online: https://www.forbes.com/sites/barbarakurshan/2017/07/18/teaching-21st-century-skills-for-21st-century-success-requires-an-ecosystem-approach/#1c5790f3fe64 (accessed on 28 November 2018).
  30. Teo, T.; Zhou, M. Explaining the intention to use technology among university students: A structural equation modeling approach. J. Comput. High. Educ. 2014, 26, 124–142. [Google Scholar] [CrossRef]
  31. Lewis, J.; West, A. Re-Shaping Social Care Services for Older People in England: Policy Development and the Problem of Achieving ‘Good Care’. J. Soc. Policy 2014, 43, 1–18. [Google Scholar] [CrossRef]
  32. Public Attitudes towards Robots; Special Eurobarometer 382; European Commission: Brussels, Belgium, 2012.
  33. Holloway, I.; Wheeler, S. Qualitative Research in Nursing and Healthcare; Wiley-Blackwell: Oxford, UK, 2015. [Google Scholar]
  34. Kachouie, R.; Sedighadeli, S.; Khosla, R.; Chu, M.T. Socially Assistive Robots in Elderly Care: A Mixed-Method Systematic Literature Review. Int. J. Hum. Comput. Interact. 2014, 30, 369–393. [Google Scholar] [CrossRef]
Figure 1. Picture of a companion robot, courtesy of ‘Alice Cares’ [25].
Table 1. Descriptive statistics and reliability analysis, N = 2325.

Appraisal Domains              Mean     SD       Cronbach’s Alpha
Beneficence                    3.640    1.088    0.88
Maleficence                    2.788    1.088    0.82
Autonomy                       3.818    0.832    0.78
Utility                        3.868    0.929    0.79
Use Intention                  3.080    1.190    0.90
Justice                        -        -        0.57

                               Percentage
Gender
  Female                       92.9
  Male                         7.1
Level of Education
  Vocational Level 2           9.1
  Vocational Level 3           39.0
  Vocational Level 4           51.9
Robot Type
  Assisting                    33.25
  Monitoring                   33.46
  Companion                    33.29
Skilled with Computer
  Yes                          93.6
  No                           6.4
Skilled in Technology
  Yes                          45.8
  No                           54.2
Enjoys Working with Computers
  Yes                          72.3
  No                           27.7
Age
  ≤16                          6.5
  17                           18.3
  18                           19.5
  19                           15.1
  20                           10.4
  21                           5.3
  22                           3.5
  23                           2.9
  ≥24                          18.5
Table 2. Mean scores and SDs on the appraisal domains: B (beneficence), M (maleficence), A (autonomy), U (utility), and UI (use intentions), per robot type: A (assistive), M (monitoring), and C (companion) in the top panel, and per education level (2, 3, and 4) in the bottom panel. N-total = 2325. (Rendered as a figure in the original.)
Table 3. Exploratory analysis.

                 Lower Vocational   Middle Vocational   Total          Chi-Square
                 Mean (SD)          Mean (SD)           Mean (SD)
Assistive
  Beneficence    3.54 (1.14)        3.62 (1.02)         3.58 (1.08)    35.80
  Maleficence    2.97 (0.99)        2.89 (0.92)         2.93 (0.95)    29.52
  Autonomy       3.72 (0.88)        3.83 (0.84)         3.78 (0.86)    43.20
  Utility        3.77 (1.00)        3.98 (0.85)         3.88 (0.93)    42.93
  Use Intention  3.07 (1.24)        3.13 (1.13)         3.10 (1.18)    27.97
Monitoring
  Beneficence    3.67 (1.10)        3.87 (1.02)         3.77 (1.06)    46.56 *
  Maleficence    2.80 (0.96)        2.66 (0.91)         2.72 (0.93)    34.24
  Autonomy       3.77 (0.84)        3.86 (0.85)         3.82 (0.84)    40.55
  Utility        3.88 (0.92)        4.08 (0.89)         3.98 (0.91)    44.19 *
  Use Intention  3.07 (1.20)        3.27 (1.16)         3.17 (1.18)    39.62
Companion
  Beneficence    3.53 (1.19)        3.60 (1.04)         3.57 (1.11)    33.73
  Maleficence    2.77 (0.96)        2.66 (0.85)         2.71 (0.91)    23.87
  Autonomy       3.81 (0.79)        3.90 (0.78)         3.86 (0.79)    42.67
  Utility        3.68 (1.00)        3.80 (0.87)         3.75 (0.94)    42.70
  Use Intention  2.97 (1.23)        2.96 (1.17)         2.97 (1.20)    42.16
* p < 0.05.
