Article

Outsourcing Love, Companionship, and Sex: Robot Acceptance and Concerns

1 Human Development and Family Science Program, Department of Human Services, University of Central Missouri, Warrensburg, MO 64093, USA
2 Department of Sociology, East Carolina University, Greenville, NC 27858, USA
3 Independent Researcher, Boston, MA 02213, USA
* Author to whom correspondence should be addressed.
Sexes 2025, 6(2), 17; https://doi.org/10.3390/sexes6020017
Submission received: 17 March 2025 / Revised: 7 April 2025 / Accepted: 9 April 2025 / Published: 15 April 2025

Abstract
Due to constantly evolving technology, a new challenge has entered the relationship landscape: the inclusion of robots as emotional and intimate partners. This article raises the question of the degree to which companionship and intimacy may be fulfilled by robots. Three hundred and fourteen undergraduates, the majority of whom were first- or second-year college students, responded to an online survey on robot acceptance. Factor analysis identified two constructs, which the authors labeled as simulated companionship (e.g., robots as companions/helpful assistants) and simulated intimacy (e.g., robots as intimate partners, both emotional and sexual). Data analysis revealed a difference between companionship and intimacy regarding student robot acceptance for home use. Overall, there was greater acceptance of robots as companions than as intimate partners. Group differences for simulated companionship were found for gender, sexual values, commitment to religion, and sexual orientation. While robots may enhance various elements of human life, the data revealed the limits of outsourcing emotional intimacy, companionship, and sex to machines.

1. Introduction

From revolutionizing industries to assisting in everyday household tasks, robots have impacted the way we live. However, concomitant with the enthusiasm for progress are apprehensions about the integration of robots into our lives. According to the Pew Research Center, almost three-fourths (72%) of Americans reported concern about robots taking over the duties that individuals normally perform [1]. Popular culture has also sometimes demonized robots as killers and presented them as intimidating to humans [2,3,4,5].
Although robots have long been used as labor-saving devices, their newer, more human-centered uses include companionship and intimacy. Research is virtually nonexistent on the degree to which robots are being used to meet these needs and on the impacts of outsourcing these intimate experiences to machines. To what degree can robots meet our needs for companionship and intimacy, and is there cause for concern? This exploratory study examined factors associated with robot acceptance in relationships and identified implications and concerns.

1.1. Overview of the Status of Robot Technology

Robots are machines that can sense the environment and process/interpret the data to make decisions and respond [6]. The sense-think-act sequence is essential for robotic function and has implications for working with humans safely, effectively, and harmoniously. The technology that allows cars, such as Teslas, to drive themselves has also been implemented and advanced in robots. Robotic sensing, such as audio and visual sensing, enables robots to perceive and interact with their environment. Vision sensing, which relies on technologies such as cameras, lasers, and radio frequencies, is crucial for object identification, depth perception, and environmental navigation [7]. Auditory sensors allow robots to detect sound and human speech. Haptic technology facilitates the feeling of touch via various devices and provides tactile feedback to users [8]. The tactile sensors of a robot can provide a sense of touch by measuring pressure and vibration [9]. Some robots have olfactory sensors that can “smell” and identify specific odors [10]. The invention of the electronic tongue has made “taste” possible for robots [11].
Artificial intelligence (AI) has been the focus of research and development by scientists and has been incorporated into robots [12,13]. Indeed, the recent use of generative (or conversational) AI has enabled robots to engage in more human-like communication. Making an airline reservation by “speaking” to a robot is now routine. Robots can process enormous amounts of data and learn from the data gained through interaction with humans.
Robots may not only look and talk like humans, but their physical encasement may also “feel” like humans (e.g., cyberskin). However, some robots are made of hard plastic, wires, and receptors, and a robot accidentally stepping on a child or bumping into an elderly adult is dangerous. Scientific advancement and engineering in materials are propelling the development of robots with softer, more flexible, safer bodies [13,14].

1.2. Social Robots with Human-like Features

The primary function of social robots is to interact or communicate with human users or other systems. There are numerous ways to categorize robots, such as function (e.g., industrial or social), mobility (e.g., stationary or mobile), application (e.g., medical or outer space), or appearance (e.g., humanoid or animal). These classifications are not mutually exclusive. For example, a social robot that can interact and converse with people in social settings may also be able to provide therapeutic assistance for mental or physical disorders (therapy), promote the autonomy of older persons (services), and offer emotional support to those living alone or in nursing homes (companion) [15].
The appearance of social robots is an important design consideration since humans are less inclined to interact with a cold metal machine (in appearance or voice). Human users of social robots typically feel more at ease with anthropomorphized machines, those designed to resemble human features [16,17]. Robots are also viewed as friendlier if they have a human-like voice (rather than a synthesized voice) [18]. When robots are presented and perceived as feminine, people tend to react to them more courteously [19]. Future innovations in robot development might produce robots with more diverse characteristics [20].
Although a robot is a machine that does not have a gender, a robot’s “gender” cue impacts the user’s perception and influences the user’s behavior. Since gender is a key factor in human social interaction, gender is frequently assigned to robots to make them feel more familiar to users. Gender expression is achieved by altering audio or visual cues such as voice, hair, clothing, and physique. As a result, the gendering of robots is frequently based on stereotypical feminine and masculine characteristics. Hence, most social robots have human-like features (e.g., eyes and voice) and are regarded as humanoid robots. However, some people have reported becoming uncomfortable when interacting with robots that looked and behaved too much like humans. This phenomenon is known as the “uncanny valley effect” and has hindered the acceptance of robots [21]. Compared to virtual chatbots, people have more difficulty embracing humanoid robots [22]. Additionally, one’s religion/faith may adversely affect their acceptance of humanoid robots. For some, the human form is sacred based on the Biblical assertion that it originated in the image of God.

1.3. Robot Use in the Home

Robots have proven highly effective for well-defined, specific tasks, such as surgery, owing to their great precision and accuracy. Robots have also demonstrated their usefulness in the home, including domestic chores, entertainment/games, education/tutoring, healthcare/therapy, elder care, and companionship [23]. The use of robots for repetitive and simple chores has become widely popular in homes. However, having a social robot is still uncommon for most families. The rise of artificial intelligence has enabled robots to better interact with human users.
Human communication, which involves both verbal and nonverbal cues, is complex and difficult to interpret accurately, even for humans. Facial expression plays a crucial role in conveying emotions and messages. Accurately detecting human emotions and responding to them remains a challenge [15]. Family communication is personal and relational, and it is also context-specific. The goal of family communication is not always task- and problem-solving-oriented; providing care and support may be more important at times. Most people find communication challenging at times, and social robots encounter additional barriers. However, some robots are capable of responding emotionally and generating “facial expressions” to mimic various reactions/emotions. For example, Sophia is a humanoid robot equipped with “facial muscles” (wires and receptors underneath) that can generate various human facial expressions such as joy and confusion. In addition, Moxi, a hospital robot, has an LED screen as a face and can digitally display various emotions. In spite of these innovations, robots struggle to recognize human emotion, accurately interpret verbal and nonverbal cues, and engage in appropriate responses [24].
Safety is the most salient concern for human–robot interaction. The risks of having a robot include physical and psychological concerns. Machine malfunction can result in injuries, accidents, and psychological harm. Robots used in psychological therapy must be monitored carefully as they can make mistakes (e.g., failing to assess a suicidal patient) and become an object of overattachment for the client [25]. Other problems with robots involve slow response times and high mistake rates [26]. While most robots are useful, some require practice to operate. For example, robotic prosthetic arms, also known as bionic arms, may take time for the user to gain full control, whereas a hook arm is easy to use. Future robots may offer simpler operation and more intuitive designs.

1.4. Robot Use for Companionship

An increasing number of people are seeking non-human companions [27]. Artificial companions can learn from users’ feedback and become one’s best “friend”. Companion robots are available for both children and adults. For instance, Moxie is a social robot designed to be a friend and developmental assistant for children aged 5 to 10 years old [28]. Moxie is a relatively small robot (16 inches tall, 8.5 pounds) with a cartoon-like face (screen) and child-like behaviors. Moxie can tell jokes and provide interactive educational activities.
Robots have been particularly helpful with older adults in terms of companionship, assistance in daily living, cognitive training, and physical training [26]. Humanoid and pet robots have been utilized as emotional companions and therapeutic tools, especially for individuals with cognitive impairments. PARO is a life-like robot seal that has artificial intelligence (AI) programming. It can acquire new behaviors and interact with humans in a way that resembles a pet [29]. The acceptance of robots by the elderly is influenced by the behavior of the robot, with older adults being less receptive to robots that ask them to share private and sensitive information or to be “monitored” by robots [30]. However, robot help is more accepted when transactional services like scheduling and reminders are provided.
Artificial companion robots for adults are capable of providing companionship, conversation, and romance. ElliQ, powered by artificial intelligence, possesses its own “personality” and can engage in conversations [31]. Replika, a generative AI chatbot, was developed to serve as an interesting companion and attentive listener, accessible around the clock. Additionally, users have the flexibility to define their relationship with Replika as either platonic or romantic [32].
Sometimes, individuals prefer interacting with a machine rather than a human. A study comparing robots with humans as reading coaches for elementary school children revealed that robots were not only effective reading companions but were also preferred over human co-readers. However, this preference for robots could be a result of novelty, curiosity, or feeling unjudged [33].
In a study of the acquaintance process, Drouin and colleagues [34] randomly assigned 417 participants into three groups: face-to-face with a human, online conversation with a human, or an online chat using Replika. Participants who spoke with a human face-to-face experienced considerably more negative emotions than those who spoke with Replika. Fewer conversational issues also occurred when interacting with Replika. However, participants who conversed with humans reported more positive affect (e.g., liking) toward people.
While many cross-sectional studies on patients and caregivers have found significant positive effects on robot acceptance and robot use for older adults [35], some researchers have questioned the long-term benefits of robots. A study by Lee and colleagues [36] examined the experience of 180 low-income and socially isolated older adults who interacted with humanoid robots. Measures of depressive symptoms and health-related quality of life were collected before the trial (baseline) and at three and six months. The study’s findings revealed that participants thought humanoid robots were useful companions, particularly during home confinement. At three months, the data showed robots improved the quality of life and lessened the depressive symptoms of the participants. However, these notable benefits did not last at the six-month assessment. Hence, longitudinal studies are needed to more fully understand the complexity and consequences of human interaction with robots. In addition, since robots vary in appearance and function, an assessment of attitude toward a specific humanoid robot may be insightful [35].
While intentional solitude may be positive, unwanted loneliness can be devastating. Might robots as companions be the answer to loneliness? Skeptics are concerned that robots/new technologies may only exacerbate one’s loneliness [37]. Intimacy and companionship are two related yet different concepts [38]. Intimacy goes beyond having someone (person or robot) be with you/keep you company; it is a profound and intense interpersonal bond that is encouraged by reciprocal disclosure of one’s deepest thoughts/feelings. An emotional bond may also be related to physical, social, intellectual, and spiritual phenomena. The experience of feeling connected is a basic human need and is only sustainable between equals.
While a robot can ask you questions, tell you jokes, and provide you with information, its capacity for intimate emotional interaction/bonding is nonexistent. A robot does not have the ability to engage in meaningful conversations about past shared experiences (“Remember when we both had a crush on Terry our sophomore year?”), respond with genuine feeling to statements like “I am lonely and miss my friends”, or initiate shared activities (“Honey, you had a hard day, let me take you out to your favorite restaurant tonight”). A person in a room with a robot is a person with a soulless machine designed to simulate human interaction but with no capacity to connect emotionally.

1.5. Robot Use for Sexual Pleasure

Humans often engage in sexual conduct for reasons that extend beyond procreation, such as sexual pleasure or gratification. Just as objects have been used for sexual pleasure, so too can robots. Using objects as a substitute for a sexual partner is not new. Archeologists and historians have discovered objects constructed of various materials, such as wood or stone, for sexual pleasure [39,40]. The early sex dolls were largely blown-up dolls made of plastic or rubber [41,42]. The new sex robots are made of soft cyberskin and can interact with users. Unlike sex toys, which are devices mainly for solo sex, modern-day sophisticated sex robots not only “look” like humans but also respond like them. Interactive sex robots provide a “quasi-partner” experience [43], and some users perceive robots as “real partners”. The term erobot refers to any virtual or non-human agent, such as chatbots and sex robots, used for sexual enjoyment [44]. In 2022, the market for erobots was valued globally at USD 411 million, and the global sex doll market is projected to increase [45].
Sex robots can facilitate sexual pleasure, provide experimentation, and promote sexual independence. The specific eye color, hair color, skin color, body shape, and gender can be ordered from the manufacturer [46]. Sex robots play a positive role in therapeutic settings [46,47,48,49] and provide sexual experiences for persons with physical limitations, such as long-stay nursing home patients and individuals with disabilities [50]. Sex work is the transactional exchange of sexual service for money and typically does not involve emotional intimacy. Sex robots have also been used for commercial sex and have been featured in brothels [47,51].
Attitudes toward the use of sex robots are more negative than toward the use of robots for other purposes. Women are also more likely to disapprove of the use of sex robots and feel jealous, particularly if they feel their partner is using the robot more for sex than platonic companionship [52]. People who are politically conservative and religious are also more likely to have negative views of sex robots [53].
Men tend to have a more positive view of sex robots than women [54] and are less likely than women to view sex with robots as cheating—42% versus 53% [55]. While there are potential benefits to individuals and couples who use sex robots, some researchers have suggested that engaging with artificial erotic agents may provide the context for eroding real-life relationships and increasing sexual deviation and dysfunction [56]. Sex robots, it is argued, are customized to the user’s preferences and intended to satisfy the user’s needs, a one-sided dynamic that precludes any concern for an egalitarian relationship. Others counter that since sex robots are typically used for solo sex, egalitarian concerns are irrelevant.
Human–machine erotic interactions may also include issues such as robot fetishism and porn-induced erectile dysfunction [44,57]. Experience with these human-like machines that satisfy and obey users’ wishes is different from interaction with a real human partner and may make the transition to real partner sex more challenging. However, we await the data on this issue.
Nevertheless, concerns have been raised that robot users do not need to obtain active consent, and the user’s pleasure and preference are always granted [58]. Tschopp et al. [59] noted that users may develop strong attachments to machines, perceiving them as “intimate” partners despite their programmed servitude, thereby fostering a dynamic more akin to master and servant. Is such a dynamic appropriate in solo sex, or is it legitimate to ask the question about concerns for equality of interaction?
In 2015, the Campaign Against Sex Robots (CASR) in the United Kingdom claimed that sex robots contributed to sexism and sexual violence and that, as customizable commercial products whose body parts (such as faces and genitals) users could select, they reinforced the objectification of women’s bodies. Research has shown a significant positive correlation between the use of sex robots and hostility towards women [60]. Dudek and Young [61] advocated creating gender-fluid sex robots to break down traditional gender and sex barriers.
The sexualization of children and child pornography are serious concerns associated with the personalized customization of robots. Child-like sex robots are now on the market for purchase [62]. Whether selling or using child-like sex robots should be illegal has generated heated debate. Some have argued that sex robots are fantasy creations; hence, child-like sex robots should not be considered real children, and thereby their sale or use should not constitute committing the crime of child pornography. Researchers have questioned whether users must obtain the sex robot’s consent, or whether sexual assault against AI-powered robots is a crime [63]. The interaction of sexuality, technology, and morality is complex.
Sherry Turkle, a Massachusetts Institute of Technology professor and writer, noted that even though robots may mimic emotional and empathic responses, these technology-simulated emotions are not real [64]. Robots are machines designed to serve users and should not have “rights”. Developing love and intimacy with robots can be dangerous since they are devoid of an egalitarian context and not capable of genuine love [65]. Are the concerns of Turkle legitimate?

1.6. Theoretical Framework for Robot Use at Home

Although many theories have been applied to understand the impact of technology (e.g., mobile phones) on individuals, an optimal theory for robot–family interaction has yet to be established. The best theoretical framework to comprehend the impact of robot use in a home setting will need to recognize the unique characteristics of each robot, the characteristics of each user, and the relationship and family context of the interactions.
The initial application of robots in the industrial sector happened decades ago. Earlier theoretical frameworks, such as the Technology Acceptance Model (TAM), originated in engineering and prioritized safety, productivity, and interaction fluency in a factory setting [66].
Subsequently, symbolic interactionism may be helpful in that it emphasizes the symbolic world in which individuals give meaning to social interaction. When interacting with a humanoid robot, the individual may interpret the robot as a real person or partner and interact with it accordingly. Recently, uses and gratifications theory has been used to explain why individuals choose a certain technology, such as a mobile phone, to satisfy their various needs [67,68].
Social exchange theory can also be used to examine the use of robots from a cost–benefit perspective [69,70]. Individuals can enjoy robots for companionship, entertainment, and sexual pleasure since these artificial entities can be programmed to have the desired physical characteristics (eye color, hair color, body type), to always be available, and to be 100% compliant with whatever behaviors are desired by the user. Similarly, AI-enhanced robots may be programmed to “learn” the user’s preferences and cater to the user’s needs. From the social exchange theory perspective, the benefits (availability, compliance) of using a robot as a sexual companion may outweigh the costs (expense, stigma). Most of these theories, however, have been applied to mobile phones or video games, with the focus on the individual and the device.
The most comprehensive theory for considering the impact of technology on the family is the sociotechnological model [71]. This theory incorporates Bronfenbrenner’s ecological perspective and identifies individual characteristics (e.g., intention and vulnerability), technological components (e.g., type of technology, functions, physical and psychological effects of the technology), and family factors (e.g., family stage, member usage, family process, and cohesion) when examining the impact of technology on families. The sociotechnological model examines the bidirectional communication between humans and humanoid robots, robot functions, and effects on different family members (users and non-users). This model considers not only the social and cultural background of the user but also the variations in individuals, families, and technology. For example, playing a video game on one’s smartphone while sitting on the sofa in one’s living room can be unobtrusive and unnoticed by other family members. In contrast, conversing with a humanoid robot in the same context is open and observable. Similarly, charging one’s sexbot by the bed, even when not in use, may impact the partner of the robot owner.

1.7. The Current Study

Robots are here to stay. They have moved out of the factory and into our homes; they assist with chores, read to our children, provide care/companionship for older family members, and some may even provide sexual pleasure. Despite their involvement in our lives, research on robots has been limited. The current study focused on humanoid robots: machines designed to resemble humans and mimic human capabilities. New technology brings new challenges, and it is important to learn more about the implications of robot use by individuals and in the home.
Previous studies have not distinguished between companionship and intimacy. The authors contend that the concepts related to robots (simulated companionship, simulated intimacy) are distinct and should be examined differently. We were also interested in group differences (gender identity, race, and religion) in the acceptance of robots for companionship and intimacy.

2. Materials and Methods

2.1. Sample

Data for this study were obtained from a sample of 345 undergraduate students at a large southeastern university in the United States who were taking a course in intimate relationships. The majority of the participants were first- or second-year college students (43% first-year and 30% sophomores). Most respondents self-identified as female (81.4%). The majority of the participants were heterosexual (90.4%). The sample was predominantly white (70.0%), with the remainder black (14.5%), Hispanic (7.7%), Asian (2.9%), biracial (2.9%), and other (2.3%).

2.2. Procedure

Following approval by the University Institutional Review Board, recruitment emails were sent to students enrolled in several sections of an introductory sociology course at a public university in the country’s southeastern region. This course was part of the general education curriculum and available to all majors. The student volunteers completed an anonymous 34-item questionnaire, “Survey on Acceptance of Robots” (developed by the authors), to assess attitudes toward humanoid robot use. The term humanoid robot was defined as a robot with humanlike features. Preliminary items measuring attitudes toward the acceptance of robots for in-home use were developed from the literature. Other items included demographic characteristics, technology use, and several theoretically relevant predictors of acceptance regarding the use of humanoid robots.

2.3. Measures

Acceptance of Humanoid Robots. Eleven items were developed to assess attitudes toward the acceptance of robots. The items used a 5-point Likert scale (1 = “strongly disagree” to 5 = “strongly agree”). We used both exploratory and confirmatory factor analysis to assess and determine the acceptability of the items for use in a scale.
Demographic Predictors. Participants self-reported their biological sex, race, and sexual orientation. Biological sex was measured with a single item indicating male or female. Race was measured with a single-item categorical variable indicating whether the participant self-identified as white, black, Hispanic/Latino, Asian, biracial, or other. Sexual orientation was measured with a single categorical variable with six categories (heterosexual, bisexual, gay male, lesbian, asexual, or other). Religiosity was measured with a single five-point Likert-type item (1 = strongly disagree to 5 = strongly agree): “I am very religious”.
Self-Concept. A single five-point Likert-type item (1 = strongly disagree to 5 = strongly agree) was used to measure positive self-concept: “I have a very positive self-concept”. Sexual Value. A single three-category item was used to measure sexual values: 1 = “I think sexual intercourse before marriage is wrong”; 2 = “I think if you’re in a loving relationship, intercourse is ok even if you’re not married”; 3 = “I think if it feels good, do it; being in love or being married does not matter”.

2.4. Analytic Strategy

To test the factor structure of the items that assess attitudes toward acceptance of robots, we randomly split the data in half to perform an exploratory factor analysis (EFA) of the items that measured acceptance of humanoid robots. We then ran a confirmatory factor analysis (CFA) on the second half of the sample. We used Mplus (Muthén and Muthén) version 8.2 for both analyses. For the EFA, we used full information maximum likelihood estimation with the geomin rotation, which is the default oblique rotation in Mplus. We then fit a series of models ranging from one to three factors and used multiple points of evidence to identify the number of factors to extract: Kaiser’s rule of an eigenvalue greater than 1.0, significant and high factor loadings (>0.50), the absence of any cross-loadings, and the interpretability of the items as a scale [72]. After identifying the best-fitting solution to the data, we ran a CFA on the second half of the data. Lastly, using the full data set, we used structural equation modeling to identify significant predictors of the acceptance of robots in households.
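Kaiser’s eigenvalue-greater-than-one criterion mentioned above can be illustrated with a small numerical sketch. The correlation matrix below is hypothetical (two blocks of three items, not the study’s data), and the sketch uses Python/NumPy rather than the Mplus workflow the authors employed; it shows only the extraction criterion, not the full EFA with geomin rotation:

```python
import numpy as np

# Hypothetical 6-item correlation matrix with a two-factor structure:
# r = 0.6 within each 3-item block, r = 0.1 across blocks.
# (Illustrative values only; not taken from the study.)
R = np.full((6, 6), 0.1)
R[:3, :3] = 0.6
R[3:, 3:] = 0.6
np.fill_diagonal(R, 1.0)

# Kaiser's rule: retain as many factors as there are eigenvalues > 1.0
# in the item correlation matrix.
eigenvalues = np.linalg.eigvalsh(R)  # eigvalsh: for symmetric matrices
n_factors = int(np.sum(eigenvalues > 1.0))

print(sorted(eigenvalues, reverse=True))  # 2.5 and 1.9 exceed 1.0
print(n_factors)  # 2 -> a two-factor solution is suggested
```

With this block-structured matrix, exactly two eigenvalues (2.5 and 1.9) exceed 1.0, so the rule suggests extracting two factors, mirroring the two-factor (companionship/intimacy) solution reported below. In practice, Kaiser’s rule is combined with loading size, cross-loadings, and interpretability, as the authors did.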

3. Results

The present study examined college students’ acceptability of the use of humanoid robots at home. Additionally, we explored how companionship and intimacy were perceived and evaluated.

3.1. Is Companionship Different from Intimacy?

The exploratory factor analysis supported a two-factor solution. Two factors had an eigenvalue greater than one, had significant and large loadings on multiple items, and were psychologically interpretable. Table 1 presents the rotated factor loadings of the items used, with bold items indicating items retained in the CFA. The first factor corresponded to items related to using a humanoid robot for companionship in the home. This factor consisted of four items. However, one item, “I do not want any humanoid robot in my home or living space”, had the smallest loading (λ = −0.44) and loaded negatively on the factor. This item was removed for analysis in the CFA. We termed this factor simulated companionship.
The second factor consisted of seven items that corresponded to the acceptance of using humanoid robots for intimate and sexual companionship. Although nine items from the EFA loaded onto this factor, we removed two items that had loadings beneath the cut-off of 0.50 (“The use of humanoid robots for having sex is stigmatized” and “I think humanoid robots could become family members in the sense that parents would depend on the robot”). This resulted in a factor that consisted of seven items that we termed simulated intimacy. Table 1 depicts the rotated factor loadings for the EFA for all 11 items.
Next, we ran a CFA on the second half of the data to verify the factor structure identified in the EFA. The model fit was excellent, with χ2(19) = 27.98, p > 0.05, RMSEA = 0.06, and CFI = 0.97, which indicated acceptable fit levels according to Kline [73]. Table 2 presents the standardized factor loadings, which were all large and statistically significant, providing further evidence of the acceptability of the items for use on a scale.

3.2. Overall Acceptance of Humanoid Robots

Table 3 presents the overall mean endorsement for each of the items on the two scales. Only one item, “I think an elderly person who is lonely could benefit from having a humanoid robot as a companion”, had a mean greater than 3.0 (“neither agree nor disagree”). Two items had a mean score below 2.0: “I think people should be able to marry their humanoid robot” and “I could develop feelings of intimacy for a humanoid robot”, indicating stronger disagreement. The other items had means ranging from 2.0 to 3.0.

3.3. Predictors of Acceptance of Using Humanoid Robots for Simulated Companionship and Simulated Intimacy

We used structural equation modeling (SEM) to determine which factors predict the acceptance of humanoid robots for simulated companionship and simulated intimacy. We included multiple theoretically relevant predictors: biological sex, positive self-concept, degree of religiosity, relationship status, sexual orientation, race, and permissiveness of sexual values.
Figure 1 illustrates the structural equation model. The model fit the data well, χ² = 91.56, p < 0.01, RMSEA = 0.04, CFI = 0.96. Three variables were significant predictors of acceptance of humanoid robots for simulated companionship. Sex was a significant negative predictor, b = −0.16 (p = 0.02), indicating women were less accepting of humanoid robots. Religiosity was also a significant negative predictor, b = −0.14 (p = 0.05), indicating that less religious respondents reported higher acceptance. Lastly, sexual orientation was a significant positive predictor, b = 0.16 (p = 0.01), indicating non-heterosexual respondents reported higher acceptance of humanoid robots.
Two variables predicted the acceptance of humanoid robots for simulated intimacy. Race significantly predicted higher acceptance, b = 0.18 (p = 0.008). A post hoc examination of the means indicated that Asian respondents reported the highest acceptance (M = 10.22) while White respondents reported the lowest (M = 8.14); however, due to the small subgroup sizes and low power, no formal statistical comparison was run. Sexual values were also significantly associated with acceptance, b = 0.20 (p = 0.004): individuals with less-traditional sexual values reported higher acceptance of robots for simulated intimacy.
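The reported p-values can be approximated from the standardized coefficients and standard errors in Table 3 with a Wald z-test (B/SE referred to a standard normal); small discrepancies reflect rounding of B and SE. A minimal sketch:

```python
import math

def wald_p(b: float, se: float) -> float:
    """Two-tailed p-value for z = b / se under a standard normal."""
    z = abs(b) / se
    # Normal CDF via the error function: Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

# Standardized coefficients (B, SE) as reported in Table 3.
print(round(wald_p(-0.16, 0.07), 2))  # sex -> companionship,       → 0.02
print(round(wald_p(0.20, 0.07), 3))   # sexual values -> intimacy,  → 0.004
```

Both values match the p-values reported in the text to the stated precision.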

4. Discussion

This research explored undergraduate attitudes toward the use of humanoid robots and suggested areas of concern that researchers, family scholars, and therapists might address. Our analysis focused on two factors that the creators of humanoid robots attempt to provide for individuals: companionship and intimacy. But since robots are machines whose capabilities are not equivalent to those of humans, we have renamed these factors “simulated companionship” and “simulated intimacy”. We encourage future researchers to use these terms when studying human–robot interaction to emphasize the distinction between humans and robots.
Our findings revealed that respondents held generally ambivalent attitudes toward the use of humanoid robots in the home. Fear and apprehension are normal reactions to new phenomena. However, of the two factors (simulated companionship and simulated intimacy), the undergraduates were more approving of the former, such as providing companionship for the elderly. The thinking is that simulated intimacy necessitates a greater degree of human emotional engagement and that there are limits to what a machine may provide. Even though sex robots may provide pleasure, there is an acknowledgment that this interaction is not “emotionally intimate”. Similarly, most respondents rejected the idea of marriage to a robot, viewing marriage to a machine as unrealistic. We also found that while men were more accepting than women of simulated companionship, there were no gender differences in the acceptance of simulated intimacy (both sexes held low acceptance).
Men were more accepting than women of robots for simulated companionship; previous researchers have similarly found men to be more open than women to sexual novelty [54,55]. The non-religious, compared to the religiously devout, were also more accepting of robot use for simulated companionship. While the link between religion and robot attitudes is complex, the religiously devout may be influenced by the belief that humans, not robots, are made “in the image of God”. Finally, regarding the impact of racial background on robot acceptance, our sample was not large or diverse enough to permit formal comparisons across racial groups. However, Asian American respondents scored higher on the acceptance of both simulated companionship and simulated intimacy. Future research may provide an explanation.

4.1. Implications

Whether the car, cellphone, or robot, technology is neither inherently evil nor virtuous. The acceptance of these technologies is influenced by who uses them and for what purpose. Our focus has been on robots in the home, and we found general hesitation, with greater acceptance among men of robots for simulated companionship. This exploratory study may spark further research and discussion among researchers, family scholars, and therapists about robot use and acceptance.
In 2023, U.S. Surgeon General Vivek Murthy reported increased feelings of loneliness among the American population. Robots could potentially assist in meeting the emotional needs of our aging population, and one object of exploration has been the use of robots as companions and intimates to help quell such feelings and improve mental health and well-being. Various terms such as virtual companion and artificial intimacy have been used to describe the new human–robot experience. We proposed the phrases “simulated companionship” and “simulated intimacy” because such experiences are with machines and will never be the same as those with humans. While some individuals view simulated companionship and simulated intimacy via robots as “real” (and we acknowledge their conceptions as valid from their experience), we suggest that human companionship and human intimacy are unique, genuine, and beyond what robots can provide. Since robots can only provide simulated companionship (they are designed to be your “friends”) and simulated intimacy (they are programmed to mimic empathy and caring), caution is appropriate as we consider how these machines are used in our lives and relationships.
Although our current study found ambivalent attitudes towards using robots for companionship and intimacy, it is likely that advances in generative AI and greater familiarity with robots may lead to increased acceptance. As acceptance of robots increases, what is the future of robots in our lives and in our relationships? David Levy believed that the human–robot relationship would resemble the human–pet relationship [74]. Centuries ago, most cats and dogs were used for herding, hunting, or catching mice/snakes. Today, cats and dogs are typically regarded as family members (e.g., “fur babies”) and play a crucial role in family life.
Saudi Arabia proclaimed Sophia, a highly advanced humanoid robot, its first robot citizen in 2017. Will robots be regarded as family members one day? The philosophical ideologies of transhumanism and posthumanism are highly controversial. Advocates of transhumanism suggest an integration between humans and technology (e.g., merging humans with AI). Advocates of posthumanism argue that evolution continues to take place and reject a human-centered worldview [75]. Both transhumanism and posthumanism foresee a greater blurring of the lines between humans and machines [76,77,78]. How far are we willing to go? Levy suggests that simulated companionship and intimacy may be enjoyed and that humans may even love and marry robots [74].
Is there a need for a pause? Just because machines can mimic human behaviors and emotions does not translate into our placing less value on human companionship and human intimacy. As technology progresses, we must evaluate not just the potential functions of any technology, but also the human implications and purposes for which the technology may be used.
We are at the dawn of human–robot home integration, and researchers, family scholars, and therapists might address the effects of robots on human companionship, emotional intimacy, and sexual intimacy. Guidelines are needed to navigate the new terrain. While robots can help facilitate aspects of human life, the “devil is in the details”. The outsourcing of some human needs (e.g., intimacy) is not like sourcing parts from another country to plug into place; it is an endeavor that requires careful examination and consideration.

4.2. Limitations

Although this research provided some insight into the acceptance of robots in undergraduate relationships, the study has several limitations. It was based on young adults attending college. Subsequent samples might include non-college populations and individuals of different ages, races, cultural backgrounds, and religious/spiritual beliefs, as well as widowed respondents and caregivers.
The validity, reliability, and design of the questionnaire might also be strengthened. For example, many concepts were measured by a single question. While it was beneficial to apply classic family and sociological theories to this study, a new conceptual framework is needed to better capture the complexity of human–machine interaction and the rapid advancement of technology. Although our results revealed two factors (simulated companionship and simulated intimacy), future inquiry might include a more detailed survey covering robot types, generative AI, and related technologies. Such methodological improvements could provide a deeper understanding of the human–robot connection across a wider sample of the population.

5. Conclusions

This exploratory study marked the first investigation of the variables associated with humanoid robot acceptance from a family scholar’s or therapist’s perspective. Robots’ artificial intelligence and human-like qualities will advance at an accelerated rate. We hope this research sounds the alarm for scholars to wrestle with the complexities of integrating robots into the lives of individuals, couples, and families. Increasingly, people are experiencing others through various technological advancements, which invites new ethical and legal challenges for consideration and debate. Are we relying on machines to meet needs only humans can provide? Is it appropriate to outsource companionship, intimacy, and affection? Which tasks should never be delegated to machines? These are questions for future research.

Author Contributions

The initial project conceptualization and literature review were carried out by A.C.T., I.J.C. and D.K.; analysis was conducted by T.S.W.; and the discussion was led by D.K. and A.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

All data were anonymous. This study was approved by the Institutional Review Board of East Carolina University (IRB number 15-002385 Social Psychological Variables and Robot Acceptance, 23 June 2024).

Informed Consent Statement

All participants volunteered to participate in this online survey. Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are available upon reasonable request to the corresponding author.

Acknowledgments

The authors would like to thank Annika C. Tsay for editing and formatting.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Smith, A.; Anderson, M. Americans’ Attitudes Toward a Future in Which Robots and Computers Can Do Many Human Jobs. Pew Research Center. 2017. Available online: https://www.pewresearch.org/internet/2017/10/04/americans-attitudes-toward-a-future-in-which-robots-and-computers-can-do-many-human-jobs/ (accessed on 2 March 2025).
  2. Banks, J. Optimus Primed: Media Cultivation of Robot Mental Models and Social Judgments. Front. Robot. AI 2020, 7, 62. [Google Scholar] [CrossRef] [PubMed]
  3. DiTecco, D.; Karaian, L. New Technology, Same Old Stigma: Media Narratives of Sex Robots and Sex Work. Sex. Cult. 2023, 27, 539–569. [Google Scholar] [CrossRef]
  4. Onusseit, P. Demonized Inventions: From Sex Doll to AI. 2023. Available online: https://www.dw.com/en/demonized-inventions-from-railroads-to-ai/a-65658720 (accessed on 5 March 2025).
  5. Savela, N.; Turja, T.; Latikka, R.; Oksanen, A. Media effects on the perceptions of robots. Hum. Behav. Emerg. Technol. 2021, 3, 989–1003. [Google Scholar] [CrossRef]
  6. Guizzo, E. What Is a Robot? Robots Guide. Available online: https://robotsguide.com/learn/what-is-a-robot (accessed on 5 March 2024).
  7. Magaña, A.; Vlaeyen, M.; Haitjema, H.; Bauer, P.; Schmucker, B.; Reinhart, G. Viewpoint Planning for Range Sensors Using Feature Cluster Constrained Spaces for Robot Vision Systems. Sensors 2023, 23, 7964. [Google Scholar] [CrossRef]
  8. Van Wegen, M.; Herder, J.L.; Adelsberger, R.; Pastore-Wapp, M.; van Wegen, E.E.H.; Bohlhalter, S.; Nef, T.; Krack, P.; Vanbellingen, T. An Overview of Wearable Haptic Technologies and Their Performance in Virtual Object Exploration. Sensors 2023, 23, 1563. [Google Scholar] [CrossRef]
  9. Wang, C.; Liu, C.; Shang, F.; Niu, S.; Ke, L.; Zhang, N.; Ma, B.; Li, R.; Sun, X.; Zhang, S. Tactile Sensing Technology in Bionic Skin: A Review. Biosens. Bioelectron. 2023, 220, 114882. [Google Scholar] [CrossRef]
  10. Monroy, J.; Ruiz-Sarmiento, J.R.; Moreno, F.A.; Galindo, C.; Gonzalez-Jimenez, J. Olfaction, Vision, and Semantics for Mobile Robots: Results of the IRO Project. Sensors 2019, 19, 3488. [Google Scholar] [CrossRef]
  11. Paul, A. An ‘Electronic Tongue’ Could Help Robots Taste Food Like Humans. Popular Science. 2023. Available online: https://www.popsci.com/technology/electronic-tongue-ai-robot/ (accessed on 5 March 2025).
  12. Copeland, B.J. Artificial Intelligence; Encyclopedia Britannica: Chicago, IL, USA, 2024; Available online: https://www.britannica.com/technology/artificial-intelligence (accessed on 7 March 2024).
  13. Bo, Y.; Wang, H.; Niu, H.; He, X.; Xue, Q.; Li, Z.; Yang, H.; Niu, F. Advancements in materials, manufacturing, propulsion and localization: Propelling soft robotics for medical applications. Front. Bioeng. Biotechnol. 2024, 11, 1327441. [Google Scholar] [CrossRef]
  14. Hassani, H.; Silva, E.S.; Unger, S.; TajMazinani, M.; Mac Feely, S. Artificial Intelligence (AI) or Intelligence Augmentation (IA): What Is the Future? AI 2020, 1, 143–155. [Google Scholar] [CrossRef]
  15. García-Córdova, F.; Guerrero-González, A.; Zueco, J.; Cabrera-Lozoya, A. Simultaneous Sensing and Actuating Capabilities of a Triple-Layer Biomimetic Muscle for Soft Robotics. Sensors 2023, 23, 9132. [Google Scholar] [CrossRef]
  16. Ragno, L.; Borboni, A.; Vannetti, F.; Amici, C.; Cusano, N. Application of Social Robots in Healthcare: Review on Characteristics, Requirements, Technical Solutions. Sensors 2023, 23, 6820. [Google Scholar] [CrossRef] [PubMed]
  17. Aggarwal, P.; McGill, A. When brands seem human, do humans act like brands? Automatic behavioral priming effects of brand anthropomorphism. J. Consum. Res. 2012, 39, 307–323. [Google Scholar] [CrossRef]
  18. Momen, A.; Hugenberg, K.; Wiese, E. Social perception of robots is shaped by beliefs about their minds. Sci. Rep. 2024, 14, 5459. [Google Scholar] [CrossRef] [PubMed]
  19. Eyssel, F.; Hegel, F. (S)he’s got the look: Gender stereotyping of Robots. J. Appl. Soc. Psychol. 2012, 42, 2213–2230. [Google Scholar] [CrossRef]
  20. Fortunati, L.; Edwards, A.; Edwards, C.; Manganelli, A.E.; de Luca, F. Is Alexa Female, Male, or Neutral? A Cross-National and Cross-Gender Comparison of Perceptions of Alexa’s Gender and Status as a Communicator. Comput. Hum. Behav. 2022, 137, 107426. [Google Scholar] [CrossRef]
  21. Papadopoulos, I.; Lazzarino, R.; Miah, S.; Weaver, T.; Thomas, B.; Koulouglioti, C. A systematic review of the literature regarding socially assistive robots in pre-tertiary education. Comput. Educ. 2020, 155, 103924. [Google Scholar] [CrossRef]
  22. Kugler, L. Crossing the Uncanny Valley: The “Uncanny Valley Effect” May Be Holding Back the Field of Robotics. Commun. ACM 2022, 65, 14–15. [Google Scholar] [CrossRef]
  23. Prochazka, A.; Brooks, R. Digital Lovers and Jealousy: Anticipated Emotional Responses to Emotionally and Physically Sophisticated Sexual Technologies. Hum. Behav. Emerg. Technol. 2024, 2024, 1413351. [Google Scholar] [CrossRef]
  24. Hoffman, G.; Kshirsagar, A.; Law, M.V. Human-robot interaction challenges in the workplace. In The Psychology of Technology: Social Science Research in the Age of Big Data; Matzo, S., Ed.; American Psychological Association: Washington, DC, USA, 2022; pp. 305–348. [Google Scholar]
  25. Lee, M.C.; Chiang, S.Y.; Yeh, S.C.; Wen, T.F. Study on Emotion Recognition and Companion Chatbot Using Deep Neural Network. Multimed. Tools Appl. 2020, 79, 19629–19657. [Google Scholar] [CrossRef]
  26. Szondy, M.; Fazekas, P. Attachment to robots and therapeutic efficiency in mental health. Front. Psychol. 2024, 15, 1347177. [Google Scholar] [CrossRef]
  27. Andtfolk, M.; Nyholm, L.; Eide, H.; Fagerström, L. Humanoid Robots in the Care of Older Persons: A Scoping Review. Assist. Technol. 2022, 34, 518–526. [Google Scholar] [CrossRef] [PubMed]
  28. CBS News. Virtual Valentine: People Are Turning to AI in Search of Emotional Connections. CBS News, 14 February 2024. Available online: https://www.cbsnews.com/news/valentines-day-ai-companion-bot-replika-artificial-intelligence/ (accessed on 2 December 2024).
  29. Lee, O.E.; Lee, H.; Park, A.L.; Choi, N.G. My Precious Friend: Human-Robot Interactions in Home Care for Socially Isolated Older Adults. Clin. Gerontol. 2024, 47, 161–170. [Google Scholar] [CrossRef] [PubMed]
  30. Yu, C.; Sommerlad, A.; Sakure, L.; Livingston, G. Socially assistive robots for people with dementia: Systematic review and meta-analysis of feasibility, acceptability and the effect on cognition, neuropsychiatric symptoms and quality of life. Ageing Res. Rev. 2022, 78, 101633. [Google Scholar] [CrossRef] [PubMed]
  31. Ostrowski, A.; Zhang, J.; Brezeal, C.; Park, H.W. Promising Directions for Human-Robot Interactions Defined by Older Adults. Front. Robot. AI 2024, 11, 1289414. [Google Scholar] [CrossRef]
  32. Zarecki, B. The Rise of Robotic Companions to Address Social Isolation. Center for Healthy Aging, Colorado State University. 2023. Available online: https://www.research.colostate.edu/healthyagingcenter/2023/10/25/the-rise-of-robotic-companions-to-address-social-isolation (accessed on 3 March 2024).
  33. Skjuve, M.; Følstad, A.; Fostervold, K.I.; Brandtzaeg, P.B. My Chatbot Companion—A Study of Human-Chatbot Relationships. Int. J. Hum.-Comput. Stud. 2021, 149, 102601. [Google Scholar] [CrossRef]
  34. Yueh, H.; Lin, W.; Wang, S.; Fu, L. Reading with robot and human companions in library literacy activities: A comparison study. Br. J. Educ. Technol. 2020, 51, 1884–1900. [Google Scholar] [CrossRef]
  35. Drouin, M.; Sprecher, S.; Nicola, R.; Perkins, T. Is Chatting with a Sophisticated Chatbot as Good as Chatting Online or FTF with a Stranger? Comput. Hum. Behav. 2022, 128, 107100. [Google Scholar] [CrossRef]
  36. Tobis, S.; Piasek-Skupna, J.; Neumann-Podczaska, A.; Suwalska, A.; Wieczorowska-Tobis, K. The Effects of Stakeholder Perceptions on the Use of Humanoid Robots in Care for Older Adults: Postinteraction Cross-Sectional Study. J. Med. Internet Res. 2023, 25, e46617. [Google Scholar] [CrossRef]
  37. Lee, O.E.K.; Nam, I.; Chon, Y.; Park, A.; Choi, N. Socially Assistive Humanoid Robots: Effects on Depression and Health-Related Quality of Life among Low-Income, Socially Isolated Older Adults in South Korea. J. Appl. Gerontol. Off. J. South. Gerontol. Soc. 2023, 42, 367–375. [Google Scholar] [CrossRef]
  38. Herold, E. Robots and the People Who Love Them: Holding on to Our Humanity in an Age of Social Robots; St. Martin’s Press: New York, NY, USA, 2024. [Google Scholar]
  39. Buhrmester, D.; Furman, W.; Buhrmester, D.; Furman, W. The Development of Companionship and Intimacy. Child Dev. 1987, 58, 1101–1113. [Google Scholar] [CrossRef]
  40. Bloch, I. The Sexual Life of Our Time in Its Relations to Modern Civilization; Paul, M.E., Translator; Dalton: London, UK, 1910. [Google Scholar]
  41. Viik, T. Falling in Love with Robots: A Phenomenological Study of Experiencing Technological Alterities. Paladyn J. Behav. Robot. 2020, 11, 52–65. [Google Scholar] [CrossRef]
  42. Beck, J.A. (Straight, Male) History of Sex Dolls. The Atlantic. 2014. Available online: https://www.theatlantic.com/health/archive/2014/08/a-straight-male-history-of-dolls/375623/ (accessed on 14 January 2024).
  43. Ferguson, A. The Sex Doll: A History; McFarland Publishing: Jefferson, NC, USA, 2010. [Google Scholar]
  44. Liberati, N. Making Out with the World and Valuing Relationships with Humans. Paladyn J. Behav. Robot. 2020, 11, 140–146. [Google Scholar] [CrossRef]
  45. Dubé, S.; Anctil, D. Foundations of Erobotics. Int. J. Soc. Robot. 2021, 12, 1205–1233. [Google Scholar] [CrossRef] [PubMed]
  46. 360 Market Updates. Sex Doll Market Size Will Reach US $644.09 Million in 2031. LinkedIn Pulse. 2024. Available online: https://www.linkedin.com/pulse/sex-doll-market-size-reach-us-64409-million-ln-2031-ynomf/ (accessed on 5 March 2025).
  47. Rigotti, C. How to Apply Asimov’s First Law to Sex Robots. Paladyn J. Behav. Robot. 2020, 11, 161–170. [Google Scholar] [CrossRef]
  48. Döring, N.; Poeschl, S. Experiences with Diverse Sex Toys Among German Heterosexual Adults: Findings from a National Online Survey. J. Sex Res. 2020, 57, 885–896. [Google Scholar] [CrossRef]
  49. Dubé, S.; Santaguida, M.; Anctil, D. Erobots as Research Tools: Overcoming the Ethical and Methodological Challenges of Sexology. J. Futur. Robot Life 2022, 11, 207–221. [Google Scholar] [CrossRef]
  50. Flippen, E.; Gaither, G. Prevalence, Comfort with, and Characteristics of Sex Toy Use in a US Convenience Sample Using Reddit.com. Grad. Stud. J. Psychol. 2023, 20, 5–25. [Google Scholar] [CrossRef]
  51. Shaddel, F.; Mayes, D. Considering Capacity to Use Sex Toys in Secure Care: Two Case Reports. Prog. Neurol. Psychiatry 2023, 2, 4–55. [Google Scholar] [CrossRef]
  52. New York Post. Inside Berlin’s Cybrothel, the World’s First AI Brothel Using Virtual Reality Sex Dolls. 2024. Available online: https://nypost.com/2024/02/04/lifestyle/inside-cybrothel-the-worlds-first-ai-brothel-using-sex-dolls/ (accessed on 6 March 2025).
  53. Nordmo, M.; Næss, J.Ø.; Husøy, M.F.; Arnestad, M.N. Friends, lovers or nothing: Men and women differ in their perceptions of sex robots and platonic love robots. Front. Psychol. 2020, 11, 355. [Google Scholar] [CrossRef]
  54. Oleksy, T.; Wnuk, A. Do Women Perceive Sex Robots as Threatening? The Role of Political Views and Presenting the Robot as a Female- vs. Male-Friendly Product. Comput. Hum. Behav. 2021, 117, 106664. [Google Scholar] [CrossRef]
  55. Kislev, E. The Robot-Gender Divide: How and Why Men and Women Differ in Their Attitudes Toward Social Robots. Soc. Sci. Comput. Rev. 2023, 41, 2230–2248. [Google Scholar] [CrossRef]
  56. Brandon, M.; Shlykova, N.; Morgentaler, A. Curiosity and Other Attitudes Towards Sex Robots: Results of an Online Survey. J. Futur. Robot Life 2022, 3, 3–16. [Google Scholar] [CrossRef]
  57. Monteiro, L.H.A. A Mathematical Model to Start a Discussion on Cybersex Addiction. Int. J. Math. Educ. Sci. Technol. 2024, 1–12. [Google Scholar] [CrossRef]
  58. Vanmali, B.; Osadchiy, V.; Shahinyan, R.; Mills, J.; Eleswarapu, S. Matters Into Their Own Hands: Men Seeking Pornography Addiction Advice From a Nontraditional Online Therapy Source. J. Sex. Med. 2020, 17 (Suppl. S1), S1. [Google Scholar] [CrossRef]
  59. Rousi, R. Me, My Bot and His Other (Robot) Woman? Keeping Your Robot Satisfied in the Age of Artificial Emotion. Robotics 2018, 7, 44. [Google Scholar] [CrossRef]
  60. Tschopp, M.; Gieselmann, M.; Sassenberg, K. Servant by Default? How Humans Perceive Their Relationship with Conversational AI. Cyberpsychology 2023, 17, 126–148. [Google Scholar] [CrossRef]
  61. Desbuleux, J.C.; Fuss, J. Is the Anthropomorphization of Sex Dolls Associated with Objectification and Hostility Toward Women? A Mixed Study Among Doll Users. J. Sex Res. 2023, 60, 206–220. [Google Scholar] [CrossRef]
  62. Dudek, S.Y.; Young, J.E. Fluid Sex Robots: Looking to the 2LGBTQIA+ Community to Shape the Future of Sex Robots. In Proceedings of the 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Sapporo, Japan, 7–10 March 2022; pp. 746–749. [Google Scholar] [CrossRef]
  63. Chatterjee, B.B. Child sex dolls and robots: Challenging the boundaries of the child protection framework. International Rev. Law Comput. Technol. 2020, 34, 22–43. [Google Scholar] [CrossRef]
  64. Grigoreva, A.D.; Rottman, J.; Tasimi, A. When does “no” mean no? Insights from sex robots. Cognition 2024, 244, 105687. [Google Scholar] [CrossRef]
  65. Turkle, S. Alone Together: Why We Expect More from Technology and Less from Each Other; Basic Books Inc.: New York, NY, USA, 2011. [Google Scholar]
  66. Weiss, D.M. Learning to Be Human with Sociable Robots. Paladyn J. Behav. Robot. 2020, 11, 19–30. [Google Scholar] [CrossRef]
  67. Davis, F. Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef]
  68. Kim, W.; Cake, D.A. Gen Zers’ Travel-Related Experiential Consumption on Social Media: Integrative Perspective of Uses and Gratification Theory and Theory of Reasoned Action. J. Int. Consum. Mark. 2025, 37, 89–116. [Google Scholar] [CrossRef]
  69. Warraich, N.F.; Irfan, M.; Ali, I. Understanding Students’ Mobile Technology Usage Behavior During COVID-19 Through Use & Gratification and Theory of Planned Behavior. SAGE Open 2024, 14, 1–12. [Google Scholar] [CrossRef]
  70. Blau, P.M. Exchange and Power in Social Life; Wiley: New York, NY, USA, 1967. [Google Scholar]
  71. Homans, G.C. Social Behavior as Exchange. Am. J. Sociol. 1958, 63, 597–606. Available online: http://www.jstor.org/stable/2772990 (accessed on 16 March 2025). [CrossRef]
  72. Lanigan, J.D. A Sociotechnological Model for Family Research and Intervention: How Information and Communication Technologies Affect Family Life. Marriage Fam. Rev. 2009, 45, 587–609. [Google Scholar] [CrossRef]
  73. Brown, T.A. Confirmatory Factor Analysis for Applied Research; Guilford Publications: New York, NY, USA, 2015. [Google Scholar]
  74. Kline, R.B. Principles and Practice of Structural Equation Modeling; Guilford Publications: New York, NY, USA, 2023. [Google Scholar]
  75. Levy, D. Love and Sex with Robots; Harper Collins: New York, NY, USA, 2007. [Google Scholar]
  76. Osborne, T.; Rose, N. Against Posthumanism: Notes Towards an Ethopolitics of Personhood. Theory Cult. Soc. 2024, 41, 3–21. [Google Scholar] [CrossRef]
  77. Cole-Turner, R. Posthumanism & Transhumanism; Wiley: Hoboken, NJ, USA, 2022; Available online: https://onlinelibrary.wiley.com/doi/abs/10.1002/9781118499528.ch122 (accessed on 4 May 2024).
  78. Díaz de Liaño, G.; Fernández-Götz, M. Posthumanism, New Humanism and Beyond. Camb. Archaeol. J. 2021, 31, 543–549. [Google Scholar] [CrossRef]
Figure 1. Structural equation model conceptual figure.
Table 1. Rotated factor loadings.

Item                                                                                                       1         2
Simulated Companionship
I am open to the idea of having a humanoid robot as a personal companion at home                           0.66 *    0.00
I think an elderly person who is lonely could benefit from having a humanoid robot as a companion          0.92 *   −0.02
I think the development of humanoid robots to meet companionship needs is a good thing                     0.84 *    0.05
I do not want any humanoid robot in my home or living space                                               −0.44 *   −0.10
Simulated Intimacy
I think it is possible to fall in love with a humanoid robot                                               0.08      0.63 *
A humanoid robot would never have a “headache” and would always be available for sex                       0.02      0.74 *
I think people should be able to marry their humanoid robots if they want to                               0.00      0.67 *
I think the development of humanoid robots to meet sexual needs is a good idea                            −0.06      0.82 *
The use of humanoid robots for having sex is stigmatized                                                   0.11      0.43 *
I could develop feelings of intimacy for a humanoid robot                                                 −0.02      0.54 *
I think humanoid robots could become family members in the sense that parents would depend on the robot    0.42 *    0.37 *
Note: * p < 0.05.
Table 2. Means of items.

Item                                                                                                       M         SD
Simulated Companionship
I am open to the idea of having a humanoid robot as a personal companion at home                           2.17      1.11
I think an elderly person who is lonely could benefit from having a humanoid robot as a companion          3.32      1.10
I think the development of humanoid robots to meet companionship needs is a good thing                     2.80      1.07
Simulated Intimacy
I think it is possible to fall in love with a humanoid robot                                               2.12      1.14
A humanoid robot would never have a “headache” and would always be available for sex                       2.81      1.32
I think people should be able to marry their humanoid robots if they want to                               1.80      1.05
I think the development of humanoid robots to meet sexual needs is a good idea                             2.03      1.08
I could develop feelings of intimacy for a humanoid robot                                                  1.45      0.78
Table 3. Parameters of the structural equation model.

                         Simulated Intimacy       Simulated Companionship
Predictor                B         SE             B         SE
Race                     0.18 *    0.07           0.07      0.07
Sexual Value             0.20 *    0.07           0.10      0.07
Sex                     −0.01      0.07          −0.16 *    0.07
Self-Concept            −0.009     0.07           0.007     0.07
Religiosity             −0.10      0.07          −0.14 *    0.07
Sexual Orientation       0.08      0.07           0.16 *    0.06
Relationship Status      0.06      0.07           0.02      0.06
Note: * p < 0.05.
Share and Cite

MDPI and ACS Style

Chang, I.J.; Welch, T.S.; Knox, D.; Likcani, A.; Tsay, A.C. Outsourcing Love, Companionship, and Sex: Robot Acceptance and Concerns. Sexes 2025, 6, 17. https://doi.org/10.3390/sexes6020017
