Article

Me, My Bot and His Other (Robot) Woman? Keeping Your Robot Satisfied in the Age of Artificial Emotion

Cognitive Science, Faculty of Information Technology, University of Jyvaskyla, 40014 Jyväskylän yliopisto, Finland
Robotics 2018, 7(3), 44; https://doi.org/10.3390/robotics7030044
Submission received: 14 June 2018 / Revised: 10 August 2018 / Accepted: 10 August 2018 / Published: 15 August 2018
(This article belongs to the Special Issue Love and Sex with Robot)

Abstract

Against a backdrop of action and science fiction movie horrors depicting dystopian relationships between humans and robots, the relational aspect of humans and sex robots has, to date and with the exception of ethical discussions, seemed surprisingly unproblematic. The attraction of sex robots is perhaps the promise of unproblematic affectionate and sexual interactions, without the need to consider the other's (the robot's) emotions or, indeed, preference of sexual partners. Yet, with rapid advancements in information technology and robotics, particularly in relation to artificial intelligence and, indeed, artificial emotions, it seems likely that sometime in the future robots, too, may love others in return. Who those others are, whether human or robot, remains to be speculated. In light of the laws of emotion, and particularly the cognitive-emotional theory of Appraisal, a reality in which robots experience their own emotions may not be as rosy as might be expected.

1. Introduction

For some reason, the idea of not only artificial intelligence but also artificial emotions seems enticing from the perspective of the humanoid robots being developed [1]. Arguments are being made for emotions in robots that attempt to substantiate a belief that, in order to have fully affective interactions between humans and robots, the robots themselves also need to feel [2,3]. By drawing on a hypothetical future scenario in which robots actually possess their own emotions, this article aims to reflect theoretically on what the world would be like if that were so. Thus, for this reason, please imagine that the year is 2050. Advances in robotics have come to such a point that autonomous machines are not simply a luxury or novelty in the world, but an essential part of people's daily existence. Not only do robots occupy both utilitarian and hedonic (pleasurable or non-utilitarian) roles, they have already become the companions of people for various purposes. At this point in time, marriage between humans and robots has been legal in many countries for the past two decades. The characteristic that differentiates a robot partner in 2050 from the older models evidenced in 2018 is that, in 2050, robots have the capacity to really think and choose for themselves. These later forms of robots possess their own emotions, meaning that they also have the propensity to love their human partner in return. Yet, more and more often, and much to the dismay of the human parties in this modern society, robots are choosing to be with other robots.
Thus, the abilities to care for, to empathize, to reciprocate emotions, and to experience the exhilaration of pure, uncontrollable, unabashed love are not held exclusively by human citizens, but also by robots. There are different types of love and emotions that arise from human-robot and robot-robot relationships. In relation to the type that leads to marriage (later described in terms of 'instrumentality'), it is observed that humans and robots alike increasingly want the opportunity to reproduce: to have children, either their own or adopted. Thus, not unlike the world of 2018, concepts of sex, gender and identity are constantly under debate. In the case of robotics, it is quite clear from a utilitarian perspective that biological sex does not fulfil any reproductive function. However, from the perspective of emotions, and of how and as whom robots wish to be identified, gendered identity can be rife with tensions and feelings.
What is more, in this era of advanced humanoid robotics, robots can be viewed as super humans. Being super humans, they are not only created by humans, but can create and replicate themselves. This stands to reason when considering the epitome of artificial intelligence (AI): they now possess the capacity to exist independently and autonomously, as well as to carry out activities involving creativity, flexible thinking and problem-solving [4,5]. In the year 2050, robots no longer feel like robots. They are global citizens with their own rights and liberties to exist side-by-side with humans in modern society. In reflection of Isaac Asimov's 'Three Laws of Robotics' [6], the robots are indeed obliged to obey the law: (1) robots are not permitted to injure human beings, or through passivity, allow humans to be injured; (2) robots are required to follow human-given orders, except if these orders should threaten the first law; and (3) robots should protect their own existence providing that this does not interfere with the first and second laws. Yet, given the possibility that robots will then be emotional beings in their own right, will these laws still apply? If they did, then there would be a clear case for a future vision of emotional and sexual slavery of robots. If not, robots with emotions would certainly be in the league of super humans with the power to overrule humans. Moreover, one extremely important fact about robots in the year 2050 is that, rather than being at the mercy of human-bound selection, robots have the right to express their own free will and exercise their powers of life and love partner selection.
This theoretical article uses a potential future scenario as a backdrop for understanding the implications of introducing emotions into robotic machinery. The method implemented in the article is a literature review that is applied to scope theories of love and its different types, before focusing more carefully on evolutionary biological theories of love to explain sexual desire and attraction from the perspective of robots. The article outlines love and its associated theories in reference to love and robotics, or Lovotics [7,8]. Love is treated as a complex, multilayered phenomenon that serves differing functions depending on varied circumstances and actors. The theoretical materials included in this article comprise studies and theories from the relevant fields of love and jealousy research (sociological and psychological), Lovotics, cognitive-affective approaches to emotions (Appraisal theory), as well as cultural studies citing discursive manifestations of robots in folklore and the public imagination. Overall, this article aims to problematize utopian imagery of docile, obedient, love-filled robots, such as those embodied in products like Hello Kitty, and to bring to light the potential reality of machines that possess their own emotions: emotions that serve the survival of a species and afford the power of discrimination.

2. Love as a Concept and Its Implications on Robotics

The topic and condition of love has perplexed philosophers and scientists alike throughout the history of humankind [9,10]. While the nature of love is highly debated [11,12], undeniable qualities of love as a construct and concept are that it is complex, dynamic, relatively unpredictable (although much effort has been placed towards understanding universal principles of love and attractiveness, see e.g., [13,14,15]) and reliant on a number of factors, physiological as well as intellectual and emotional [16]. From a love and robotics perspective, scholars in Lovotics [8] have been examining “love” as a concept through the lens of love-like relationships between humans and robots. Samani and colleagues [8] analyze love by looking at its historical origins in the teachings of philosophers such as Aristotle. Aristotle applied the term “philia” to describe mutually beneficial relationships experienced in displays of loyalty expressed by families, friends and communities [17,18,19]. Moreover, philia is particularly emphasized in cases where beauty and goodness are core qualities of the object of love (the beloved); thus, the love attraction, or love state of the beloved, is based on merits inherent in their personal characteristics [20].
While philia is a means of describing this form of kinship, warm or wholesome love, ‘eros’, another Greek term, was applied to describe a more passionate, sexual attraction and desire for someone or something [21,22]. Thus, there is a clear multidimensionality to the way in which love may be considered regarding love and robotics: that which is represented in philia, which can perhaps be connected to relationships evidenced in popular culture in films such as A.I. Artificial Intelligence (2001, directed by Steven Spielberg), Flight of the Navigator (1986, directed by Randal Kleiser), and the Star Wars movies (observing Lovotics through philia); and that which is concerned with the more passionate, sexually-oriented perspective of eros.
Moreover, it is fascinating to examine the conceptual construct of love through observing dictionary definitions. For instance, Merriam-Webster Dictionary [23] characterizes love as a strong affection, emerging from kinship and other types of personal connections. Merriam-Webster additionally illustrates love as being characterized by warmth and devotion towards people and phenomena. The eros mode of love is also accounted for in regards to the type of love typified by sexually-driven attraction.
In psychology, Robert Sternberg [24,25,26] has proposed the triangular theory of love. This triangular theory features three components: passion, intimacy and commitment. Similarly to the way in which the Greeks characterized eros [22], according to Sternberg, passion is the driver of sexual attraction. Intimacy, while existing in close connection with passion, refers to feelings of closeness and connectedness that people experience in relation to one another. Thus, intimacy can be linked to both platonic or friendship types of love, as well as passionate types. Commitment [24,25], on the other hand, possesses different types of qualities, both regarding a functional or reproductive mode of love, and in terms of long-term life-related plans in relation to a partner. Moreover, these longitudinal and reproductive qualities of commitment also imply different sub-categories, and complexly, from the perspective of Sternberg, this dimension includes instances in which love is not at play at all. The combinations of the three components thus yield the following types: non-love (neither passion, intimacy nor commitment); friendship and liking (warmth for one another); infatuated love (passionate arousal, also known as a crush, that exists outside of a mutual love relationship); fatuous love (whirlwind romance without intimacy); romantic love (emotional and physical bonding); empty love (commitment lacking passion and intimacy); companionate love (life-long partnerships and marriage); as well as consummate love (all-encompassing love, companionship and long-term intimacy).
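Because the triangular theory reduces to combinations of three components, its taxonomy can be captured compactly. The following minimal sketch is an illustrative encoding rather than part of Sternberg's own work: it treats passion, intimacy and commitment as simply present or absent and maps each combination to the love type listed above.

```python
# Illustrative encoding of Sternberg's triangular theory of love:
# each love type corresponds to which of the three components are present.
# The (passion, intimacy, commitment) tuples are an assumption made for clarity.
LOVE_TYPES = {
    (False, False, False): "non-love",
    (False, True,  False): "friendship / liking",
    (True,  False, False): "infatuated love",
    (True,  False, True):  "fatuous love",
    (True,  True,  False): "romantic love",
    (False, False, True):  "empty love",
    (False, True,  True):  "companionate love",
    (True,  True,  True):  "consummate love",
}

def classify(passion: bool, intimacy: bool, commitment: bool) -> str:
    """Return the love type named by the combination of components."""
    return LOVE_TYPES[(passion, intimacy, commitment)]

print(classify(passion=True, intimacy=True, commitment=False))  # "romantic love"
```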
From the neuroscientific perspective of emotions it is argued that basic emotions emerge out of particular systems or circuits of neural activity. This is a characteristic shared between mammals that directly affects behavior [27,28,29]. Neuroscientific research has revealed that there are three or more interrelated, discrete emotion-motivation systems that are involved in the functions of reproduction, mating, as well as parenting. These emotion-motivation systems or circuits are: attraction, lust, and attachment [30]. The attraction system as a whole is by nature experienced through sensations of higher energy levels as well as concentrated attention on a chosen subject (preferred mating partner). Attraction, and the sensations deriving from its activation in humans, is characterized by exhilaration, obsession and intrusive thinking about the preferred mate, passion and craving. Research suggests that these sensations and the states involved are primarily hormonally driven, seen in increased levels of norepinephrine (NE) and dopamine (DA) [30,31,32,33].
Historically, Sigmund Freud can be viewed as one of the pioneering love theorists, who typified love as a person’s unconscious desire and need to find their “ego ideal” [34]. That is, Freud argued that people are constantly in search of partners who embody what they themselves want to be; the inner image is molded upon people whom the beholder admires. Moreover, when observing Abraham Maslow’s [35] hierarchy of needs, it is evident that self-actualization (growing towards the attainment of an individual’s highest needs) gives an indication of the prospect of love and the character that love will take on (in reference to the different types of love listed above). Thus, self-actualization can be viewed as a mobilizer when people are searching for and selecting love partners. Of particular relevance here are the mechanisms and qualities of love and prospective partners that individuals select on the basis of their self-actualization process. If an individual’s goal in life is to be wealthy and famous, their partner will most likely have qualities that can assist the individual to attain these goals, and will most likely be wealthy and famous themselves. On the level of physio-psychology or evolutionary psychology [36,37], partners may be viewed as selected for the reproductive qualities they represent, and moreover, for what their physical and, more or less, mental qualities promise about the prospective offspring they are able to provide.
Thus, in terms of love, eros, and the types of love connected with sexual attraction in human-robot relationships, attention should be drawn towards the physical and non-physical (intellectual, personality and humor) qualities of potential partners. With the prospect of robots possessing emotional capacity, and indeed intelligence, numerous questions arise in relation to their behavior and sentiments when choosing a love partner. In other words, with the capability to actually think and feel, it should be expected that the robots themselves also make selections in terms of love and sexual partners, taking the power of choice out of the hands of the human parties and placing it, more or less, into the hands of the physical and intellectual superiors: the robots. Therefore, whom would a being that exhibits super human characteristics, both physically and intellectually, choose as a life-long and/or sexual partner? And, if in a relationship with a human, what would it take to maintain the interest of the robot?
While somewhat criticized, Sternberg’s model, mirroring the neoclassical psyche of cognition, affect, and conation [38], is ideal for the logic and rationale of this article, particularly from the perspective of analyzing the bases upon which robots select their own love and sexual partners. This is because, when considering a form of emotional intelligence in robots, perhaps the most logical approach is that of the cognitive-affective theory of Appraisal [39]. Appraisal theory draws on evolutionary psychology to explain the function of emotions in survival [40,41,42]. It applies understandings of cognitive evaluative processes to explain how emotions arise, in what capacity, and at what distance from the phenomena and situations encountered the emotional reaction occurs: whether the emotion is felt directly (primal response) or whether it occurs as the result of reflective and associative processes (higher-order cognition) [39,40,41]. The main premise of Appraisal theory is that humans are constantly evaluating, or appraising, what they encounter according to their core concerns (health, safety and wellbeing) [42]. These core concerns inevitably relate to the human striving for survival, whether through, e.g., concern for personal safety (recognizing an immediate threat, which in turn triggers fear, for example), or, for instance, well-being in terms of evaluating designs and brands through the social dimension and feeling emotions towards products not in terms of what they are, but what they can do for the person who consumes them [43,44,45,46]. The field of relationship science focuses mainly on close relationships [47]. Close relationships are described as the frequent or consistent, powerful yet varied interdependence between (human) beings that continues for a substantial duration of time [47].
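To make the appraisal loop described above concrete, the toy sketch below offers a simplified illustration rather than a published implementation of Appraisal theory: an agent evaluates events against the core concerns of health, safety and wellbeing, and the outcome of that evaluation, not the event itself, determines the emotion, with a rough distinction between a direct, primal response and a slower reflective one. The event structure and emotion labels are illustrative assumptions.

```python
# A simplified, illustrative appraisal loop: events are evaluated against
# core concerns (health, safety, wellbeing) and the outcome of that
# evaluation, not the event itself, determines the emotional response.
# The event fields and emotion labels are assumptions made for clarity.

CORE_CONCERNS = {"health", "safety", "wellbeing"}

def appraise(event: dict) -> str:
    """Map an event to an emotion by evaluating it against core concerns."""
    if event.get("concern") not in CORE_CONCERNS:
        return "indifference"                      # irrelevant to core concerns
    if event["effect"] == "threatens":
        # Primal route: an immediate threat is felt directly as fear;
        # a reflected-upon threat emerges as anxiety (higher-order cognition).
        return "fear" if event.get("immediate") else "anxiety"
    if event["effect"] == "advances":
        return "joy"                               # the concern is being served
    return "indifference"

print(appraise({"concern": "safety", "effect": "threatens", "immediate": True}))  # fear
print(appraise({"concern": "wellbeing", "effect": "advances"}))                   # joy
```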
When returning to scholarship on love and relationships, Finkel, Simpson and Eastwick [48] maintain that there are fourteen principles, which have been categorized in relationship science as pertaining to these partnerships. These include: (1) uniqueness—differing combinations of characteristics and elements in every partnership; (2) integration—the merging of cognition, affection, behavior and motivation between two people as their relationship develops towards interdependence; (3) trajectory—longitudinal goals inform the evaluation criteria for relationships, these are constantly updated and applied to evaluate the relationship during regular intervals as the relationship progresses (also linked to the Investment Model, e.g., see [49]); (4) evaluation—this evaluation occurs via reflection over positive and negative aspects in a couple’s relationship; (5) responsiveness—the receptive nature of individuals to their partners’ desires, needs and actions; (6) resolution—resilience to relationship turbulence; (7) maintenance—persistence in behavior and cognition, either through resilience or self-deceptive biases; (8) predisposition—attitudes and qualities that a person brings into relationships that may affect the relationship’s wellbeing; (9) instrumentality—the utilitarian view towards a partner for the purposes of allowing the individual to attain their goals, e.g., in terms of reproduction, marriage or financial gain; (10) standards—the criteria and principles one has formed through previous ideals, experience and earlier relationships, that are then applied to the perceptions of the current and future relationships; (11) diagnosticity—opportunities in which people are able to evaluate their partners’ relationship motivations and goals; (12) alternatives—the means by which individuals seek and consider alternative candidates to their current partners; (13) stress—external factors that impose stress can harm a relationship; and (14) culture—relationships are heavily culturally bound, encompassing numerous ideals of both the partners’ qualities as well as the relationship’s.

3. Lust, Passion and Uncontrollable Sexual Attraction

Love is one aspect to consider in relation to robots, love partnerships, marriage and romance. Yet, there is also the dimension of sexual relationships between humans and robots, or humans and humans (or otherwise), that connects with Sternberg’s [25] infatuated love: lust, passion or otherwise obsessive, uncontrollable sexual attraction. This could be argued both to drive sexual partnerships and to threaten them, due to the inability to maintain this type of intensity with one person (being) [50]. Steven Levine [51] has written about sexual desire, and has noted that there are five dark paradoxes pertaining to sexual desire that encompass individuals when they are not with those to whom they are attracted. These five paradoxes are: (1) non-synchronic motivations and drive; (2) the longing for infidelity while behaving according to fidelity; (3) the chance of lust even if it goes against one’s moral basis; (4) sexual interest decreasing with familiarity; and (5) sexual expression being enhanced by belittlement. Levine describes both terms, ‘lust’ and ‘passion’, as poorly defined, whereby passion in its connotation refers to the desire to reorganize one’s life through love, and lust indicates extremely high levels of sexual arousal.
Furthermore, Levine illustrates that sexual desire always maintains three elements: (1) mental states of sorrow or joy (personal); (2) mutual affection, disrespect, or disagreement (interpersonal); and (3) duration of relationship plus infidelity (social contexts). The four variables that are claimed to determine whether or not humans behave and experience sensations in a sexual way are: health, age, sex and social situation [52]. Poor mental and/or physical health can drastically either decrease or increase sexual drive. Conditions such as depression, for example, can be responsible for its decrease, while mania and sexual compulsivity can be seen to directly increase sexual drive. Sexual drive is shown to decrease with age [53,54,55], while in terms of biological sex, males are shown to have a higher sex drive from the time of puberty onwards, continuing in intensity and consistency [56]. The social situation, such as marital, family and relationship status, additionally affects one’s desire for sex; these social factors could easily be coupled with the physiological (hormonal) factors considering the state of individuals when they are in these social situations [32].
Thus, as seen in these variables and the other components mentioned, lust, passion and sexual attraction are driven not only by hormonal (neural) features and functions; the emotion-motivation system is also coupled with certain behavioral patterns that operate in conjunction with various elements of reproduction and human nurture [57]. Dividing human sexual drive into sex-specific characteristics, it has been noted that male sexual drive is primarily concentrated towards copulation and is also more constant [51,56], whereas female sexual drive is more sporadic, yet also intense, and comprises a broad range of stimuli that trigger sexual desire. Fisher [57] additionally notes higher rates of bisexuality among women. On the topic of stimuli, researchers [58,59] have also found that males are stimulated to a higher degree by visual pornographic material, while women are additionally stimulated by linguistic expressions (words, narratives, film themes and images) [58].
These scientific discoveries relating to lust, passion and sexual attraction are quite interesting from the perspective of sex robot design, as perhaps: (a) the desires and stimulatory system of the robot may be programmed on the basis of specific sex/gender characteristics and, in doing so, respond in appropriate ways to the human sex partner; (b) the robots’ physical and expressive features may be designed in ways that more readily and appropriately appeal to the sexual desires of people of various genders and sexualities; or (c) self-thinking, maybe self-creating (learning) robots that adapt to, or maybe adopt, a gender and sexual role can be better understood and catered for when they too are selecting their own sexual partners.
Either way, the main argument here is that, if and/or when robots develop or are implanted with some form of emotions and free will, the nature of lust (strong sexual attraction) needs to be comprehended in terms of its components and functions, and how it would possibly operate within the dynamics of human-robot sexual relationships. While biological sex (or gender) may not play a large part, or perhaps any part, from the perspective of the robots themselves, it must be considered that in the case of human romantic love, both men and women experience these states with the same level of intensity [32,60,61]. For both men and women, and arguably those who do not identify with any one particular gender, attractive traits in a prospective partner are health, reliability, kindness, warmth, socialness, domesticity, education and family [62]. Contrary to romantic love, lust or sexual attraction carries no guarantee of any longing for romantic love [32]. In fact, in experiments in which middle-aged people of both genders are administered testosterone, sexual desire increases, coupled with both sexual thoughts and activity, yet romantic passion and relationship attachment do not increase [63,64]. Thus, it seems that love and being “in love” are different from sexual desire, as well as from romantic passion. When considering the design of robots and their emotional capacities, tendencies, or even relationship purpose in regards to human-robot relationships, it is vital from this perspective to understand the dynamics between love and attachment, and lust with not so much attachment yet, quite potentially, obsession [65,66].

4. Loyalty and Faithfulness in Love Relationships

When considering human-robot sexual and love relationships, it often seems that robot faithfulness, devotion and submission would be a given. Yet, if considering the robot partner from the perspective of its potential emotional intelligence and free will, the chances of a robot remaining faithful and dedicated to any human partner may indeed be slim. From this viewpoint, questions surrounding infidelity versus loyalty are highly important to consider, perhaps even more so given the amount of scientific resources and effort devoted to developing artificial emotions [1,67]. To start with, from the point of view of the attachment dimension of love relationships, it can be seen that terms such as ‘faithfulness’ and ‘fidelity’ are used to describe strong allegiances with and support for people and phenomena [68]. Research often poses fidelity and infidelity, or faithfulness and unfaithfulness, in opposition to one another [69]. However, it should be noted that loyalty or commitment within attachment relationships has little correlation with the level of faithfulness experienced or expressed in regards to the relationship [16]. This means that faithfulness (unfaithfulness) and fidelity (infidelity) are two separate conditions. Infidelity, for instance, describes extra-marital relationships in cases where the marriage is otherwise characterized as a healthy relationship. Unfaithfulness, on the other hand, describes the instances in which partners have lost faith in their relationship. Thus, from the perspective of relationship science [70], it is possible to engage in sexual activity and romantic ventures outside one’s relationship while still remaining faithful to the attachment love union.
Delving deeper into the meaning of loyalty, it can be seen that loyalty as a concept refers to a relationship quality that is resistant to external pressure, stress and temptation from forces existing outside the relationship [16,70]. Otherwise known as ‘unconditional love’ [71,72], loyalty is a characteristic through which relationships maintain connection even during periods of disaffection and infidelity. Yet, a pre-condition of loyalty is agency on behalf of the individuals within the partnership. This agency takes shape in the form of maintenance strategies [73]. Maintenance strategies take on a range of forms, from task sharing and positivity to assurances. In accordance with the principles mentioned above [48], loyalty and/or the level of loyalty (contrary to unconditional love) may also be determined by principle 3 (trajectory), or the longitudinal goals of the partners entering the relationship; principle 4 (evaluation), or the reflection on the positive and negative aspects of the relationship; and principle 9 (instrumentality), or how a partner (potential partner) may be utilized to attain one’s life goals (e.g., reproduction and family ideals, economic and/or social status, etc.). This rationale can be coupled with thoughts on how emotions operate from the perspective of Appraisal theory. From this standpoint, love and sexual relationships could be emotionally rationalized as affording an individual the attainment and preservation of primary concerns (surviving and thriving life conditions, primal, social or otherwise) [39].

5. Jealousy, Infidelity and Attraction

Infidelity has been studied extensively for decades. It is often examined in male-female relationships in terms of demographic, biological and psychological tendencies, furthered by study of its social and physical ramifications [73,74]. There are a few different ways of understanding infidelity [70]. The various approaches include: retrospective and self-reporting techniques (descriptive); applying socio-normative frameworks to analyze self-report data (normative); reports (verbal or written) on the progress of relationships, including increases and fluctuations in feeling intensity, drifting out of commitment and ending the relationship (the investment model [49]); and concentrating on benefit exchange between partners, equity levels and subsequent satisfaction (the evolutionary approach).
The evolutionary approach is particularly relevant when considering infidelity in human-robot relationships from the perspective of the Appraisal theory of emotions. Thus, fidelity may be viewed to continue for as long as the relationship partners experience there to be mutual benefit, and in the case of human-human relationships, this often reduces to the biological needs of sexual reproduction [75]. Love and/or sexual relationships are thereby considered from a functional point of view, in relation to, for example, reproduction. The evolutionary model goes hand-in-hand with psychological theories of attractiveness and beauty [36,37], whereby partners are chosen and maintained on the basis of particular intellectual and physical qualities, including symmetry, strength, and youth. These qualities not only indicate the health and potential reproductive capacity of the prospective partner, but also the health and reproductive potential of their offspring. A strong, unconsciously experienced force in relation to particularly heterosexual conduct is the striving for reproductive success. The likelihood of jealousy can also increase in relationship situations where either partner perceives the potential for inequalities or unbalanced benefit within the partnership, which is subsequently coupled with the evaluation of individuals external to the relationship in terms of their suitability as a replacement for the other [76]. When pregnant, for instance, females are biologically linked to a male. For the pregnant female, there is little need to seek an extradyadic (external) partner, if not to gain a superior partner [75]. Males during pregnancy periods, however, are never certain about their paternal status. Therefore, males are more inclined to develop jealous thoughts and tendencies than their female partners. This is not to say, however, that pregnant women are entirely comfortable in their partnership situations. Quite the contrary, there is a high probability that women’s anxiety levels increase during pregnancy due to the possibility of abandonment [75].
When considering these characteristics of human-human relationships, it may be forecast that in an era when robots have the capacity to experience emotions, free thought and conscious intention, human-robot love and sexual partnerships may be challenged in the same way. In particular, the absence of the reproductive function may, from the robot’s perspective, severely harm their chances of maintaining the interest of their human partners, and may indeed also incite the possibility of jealousy on the robot’s part. That is, there could be the possibility that the robot partner feels threatened by potential substitute human partners for their loved one or lover. The jealousy may also be experienced from the reverse perspective, though in a different manner, whereby the intellect and the physical features amplifying beauty and health, such as the symmetry and strength of the robot, may be a constant point of comparison for the imperfect human partner. Thus, the human in the relationship may potentially evaluate every robot as a threat to their human-robot relationship. Therefore, the question to pose in the event of robots possessing emotions and free will is: will they (humans) be able to compete against robots in the race to gain physical and intellectual affection from a (‘their’) robot [74]?
When considering the role of evolution in development, humans are innately biologically programmed to search for and be attracted to symmetric, strong and healthy mates. From this standpoint also comes the question of whether humans will come to be attracted only to robots. Studies in the field of human-computer interaction have found that early adopters of technology have a preference for human-to-computer interaction over human-to-human interaction [77]. This may be an indication that global levels of reproduction could be severely hindered if humans begin engaging in human-robot relationships more often than human-human relationships.
Moreover, from a gender perspective, gender serves no functional role in robots. The robots, being machines, are only decorated or outfitted to appear and function like either gender, or potentially a gendered sexuality (heterosexual, homosexual or bisexual). In the event of robot emotions, gender, or the lack thereof, would affect the ways in which the robots themselves experience their attraction to, and purpose of being with, humans. What about the robot’s own sexual desires, if there are any, in the absence of a biological sex or culturally constructed gender [78]? Gender itself has been deemed the most intellectually challenging variable in relation to lust and love [51]. This intellectual challenge is presented through a combination of physical and psychological factors. Firstly, hormonally, from puberty onwards men have a higher sex drive than women. This drive is physical and visual, and is more direct in terms of the desire for concrete sexual satisfaction [56]. Women, on the other hand, are more likely to rely on the psychological dimension of sexuality to entice and sustain sexual engagement and fulfilment. This would be an additional factor to consider in light of robots created as human sex or love partners: which gender, for whom (will they be able to psychologically or physically fully fulfil the human partner?), and on this basis, how will robots be attracted to potential human partners (if at all)?
Another aspect to consider is the motivational factors for engaging in and maintaining a love or sexual relationship from the robot’s perspective. Some of this discussion has already been covered above; however, from a slightly different angle, the question must once more be posed as to why a robot would engage in a love or sex partnership with a human being if there is seemingly nothing to gain reproductively or even intellectually. In terms of sexual motivation, a study by Meston and Buss [79] revealed 237 reasons why humans wanted to engage in sex. From these 237 reasons four categories were derived: physical (pleasure seeking and attraction), goal attainment, insecurity, and emotional. The insecurity category is interesting from the perspective of human beings and their tendency to engage in sex either out of duty or to sustain a partnership and/or social acceptance; however, its relevance for robots may be questioned. By this, it is meant that, if robots are in any way superior physically and mentally to humans, the ‘out of duty’ element of the sexual relationship should be abandoned. On the other hand, a plausible factor behind a robot’s sexual drive may be seen in Meston and Buss’ observation that humans utilize sex as a way to show affection. In this light, maybe robots will want to express to their human partners the way that they feel, particularly if they actually harbor feelings of love and adoration, even in the absence of a biological propensity to engage in sex.
In terms of the capacity of robots to experience love and affection, it is worth noting Levy’s [80] insight into the words or representation of love, particularly in the sense that there is no biological function for sex between people and robots. In humans, particularly women, there is a strong psychological component behind sexual desire [53,54]. Thus, from the human being’s perspective, words, discourse, and even body and facial language may be the key to activating sex drive. In robots, there may be a possibility that this psychological component could be key to their sexual tendencies. While physical features and even body language may be strongly connected to reproductive function, perhaps discourse (discussion, poetics, wit and other culturally related, symbolic and linguistically based actions) may play an important role in attracting robots’ romantic emotions.
Yet, to superimpose the reproductive model onto robots, and perhaps imagine the true experience of love held within them on a pseudo-biological basis, Fehr and Russell’s [81,82] Love Prototype Model could be used as a point of reference. The Love Prototype Model presents various types of love, including those pertaining to family (parental, sibling etc.), those related to friendship (platonic), as well as those relating to mate-like relationships (passionate, romantic and sexual). These love types should be taken into account when considering human-robot partnerships and either the types of love to expect from robots in light of how they view and experience their relationships with humans, and/or how the robots are programmed to display affection towards humans in certain partnership situations. Of particular relevance to the approach of this article is the evolutionary, Appraisal-based [39,40] perspective of emotions and sexual partnerships, whereby beings (humans and potentially robots) seek partners with the conscious or unconscious biological drive for reproduction. If robots are somehow able to feel emotions through the same organic lens as humans, seeking symmetry, intelligence, health, strength, kindness etc., how would they appraise potential human partners? Would humans be able to compete with their perfect robot counterparts?

6. Who Wears the Pants in the Relationship and What Does That Have to Do with Ethics?

As robots become ever more integrated into human society and, more importantly, as they become ever more like humans, robots need to be considered from an ethical standpoint. Ethical debates are increasing in quantity and diversity, the issues raised including: matters of asymmetrical affection (seen in prostitution) [83,84]; pedophilia and the spread of robots that resemble children [85]; changes in ethical dynamics when moving from masturbation and sex toys to robots [86,87]; and robot rape and matters of consent [88,89]. Will robots need to give consent to sex (particularly if they harbor emotions)? Is there a danger that robots may rape humans? Will sex between a married human and a robot be classified as cheating [88]? There is additionally major discussion taking place in regards to human-to-human relationships in the era of sex robots [90,91,92]. Contention has often been expressed in relation to whether sex robots could remedy marital [92] and sexual problems [93], or whether robots will divert interest away from other humans as potential partners altogether [94].
Robot rights have been studied by scholars such as Hutan Ashrafian [95,96], who especially concentrates on rights from the perspective of robot consciousness. Mainly, Ashrafian’s concern is based on a potential future scenario in which robots could actually think for themselves. His arguments cover the ethics that would be implicated in humans engaging in sex with conscious robots, from the perspectives of sexual consent, slavery (the abolishment of robotic slavery) and legal protection for robots [97]. From the perspective of consent, Ashrafian argues that there are ramifications not simply for robots, but also for humans. Thus, robots would and should be empowered through the necessity to provide consent before sexual engagement, while humans would also develop greater awareness of the dynamics at play with other conscious beings. This also has consequences in human-to-human relationships, for, if humans become used to the fact that they can treat non-human humanoid sex robots as they like, there is a greater chance that they will be abusive and exploitative towards other human beings [98], as also seen in the media and entertainment industry (see e.g., [99,100]). The defining factors that differentiate robots from other sex toys, and particularly those that Ashrafian speaks of, are: (1) the ever more realistic likeness between the machines and human beings; and (2) the possible future ability for them not only to talk and walk, but also to think and feel for themselves. Thus, rather than viewing extra-marital human-robot sex as pure masturbation or engagement with a sex toy, sexual intercourse and engagement with a robot could very well be treated as though it were an affair.
Moreover, the balance of power in human-robot relationships, in the event that robots gain full consciousness, should also be reconsidered. If autonomous robots capable of feeling and thinking for themselves were to be realized, there would be a strong likelihood that humans would no longer suffice in satisfying any robot partner. Robot consumption models applied and understood in the year 2018 would no longer be valid in the time of fully conscious humanoid robots. Humans would hardly be able to dominate a world in which super-humans (conscious robots and potentially other cyborg organisms) were rife and were able to pick and choose partners and acquaintances. From this perspective, it is additionally pertinent to consider where, if at all, humans would rank in the hierarchy of beings: whether they would be slaves, or even exotic (sex) pets. Daniel William Mackenzie Wright [101] discusses these types of issues in his article “Hunting humans”. Here, Mackenzie Wright projects a future of dark tourism in which humans exist as nothing more than animals to be hunted and exploited. Mackenzie Wright’s article reflects the human history of joy derived from violence, death and exploitation. Here, it can be speculated as to whether or not robots that perhaps think in the same way as humans will also derive pleasure from the same phenomena.
Leading thinkers such as Bill Gates and Stephen Hawking have expressed their concerns about AI and its full realization [102]. Hawking stated that full AI could potentially “spell the end of the human race” [103]. From another perspective, that of the potential for humans and robots to be on an equal footing, perhaps humans will be kept in the running as love interests for their novelty value, exoticism and perhaps even the eroticism of organic, biologically reproducing sex toys. Perhaps even humans’ imperfections hold charm for the flawless humanoid robots. And maybe, perhaps, humans could preserve some persuasion in the bedroom. Additionally, if both robots and humans were emotional beings, there may even be the chance for genuine connection and mutual adoration between the two, particularly when considering the importance of intellect, wit and displays of affection in regards to what robots may find attractive in sexual and love partners. Chemistry may be just as important between humans and robots as it is between humans and humans, and perhaps even between robots and robots [95]. There may be apparently unexplainable connections and attractions between any of these entities, meaning that not only would chemistry draw these partners together, but it may also pull them apart and towards other partners, whether human or robot.

7. When Robots Start Cheating

If there is the possibility of chemistry between humans and robots, or robots and robots (the compelling emotional, and on the human’s behalf physical, drive towards potential partners), there is also the possibility that loyalty and faithfulness may not withstand within any human-robot relationship (or otherwise). At any moment of any relationship’s life, another being (human or robot) may come along and draw the attention of an otherwise faithful partner. From the perspective of human beings, it is inevitable that people do not stay deeply and passionately in love with just one partner [104], and in fact it is normal for humans and other animals to be promiscuous, or attracted to more than one potential partner at any one time [50]. Thus, attempting to maintain a purely psychologically and physically focused monogamous relationship may be an impossible task.
Therefore, cheating, or tendencies that are socially interpreted as cheating and being unfaithful, may be understood as an innate human (and thinking robot) trait. Moreover, in the fields of robotics and AI development, progress can already be seen in terms of the evolution of lying and cheating robots. Stuart Fox [105] reports on a scientific study carried out in Switzerland in which robots began to deceive one another. In the reported study, robots were embedded with blue lights, sensors and 264-bit binary codes (genomes), which determined their behavior mode in regards to different kinds of stimuli. There were 1000 robots that were divided into ten groups. The robots were programmed to turn on their blue light upon discovering a beneficial resource. The idea behind illuminating the blue light was to alert other robots as to the whereabouts of the beneficial resource. Higher points were allocated to robots for sitting on a beneficial resource, while minus points were given to robots residing near a poisoned resource. Higher-scoring genomes ‘mated’ or mutated randomly to develop a different kind of program, resulting in newer robot generations: robot-programmed generations. As the population increased and the robots grew cleverer at identifying the beneficial resources, overcrowding became an issue, and the original finders of the beneficial resources began to get bumped away. Thus, by the 500th generation, the robots had learned that they had more chance of keeping the beneficial resources if they did not illuminate their blue light. This means, in essence, that through this process of reproduction, scarce resources and the need for survival, robots had ‘naturally’ learned to lie and cheat in order to hoard beneficial resources for themselves.
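The dynamic reported by Fox can be illustrated with a highly simplified simulation sketch (an assumption-laden toy model, not the code of the Swiss study): each robot's genome is reduced to a single signalling probability, switching the light on attracts competitors and so lowers the signaller's own payoff, and selection with mutation then drives the population towards silence, that is, towards withholding information.

```python
import random

# Toy model of the evolution of deceptive silence: a genome is reduced to a
# single probability of signalling a found resource. Signalling attracts
# competitors (crowding cost), while silent robots still benefit from others'
# honest signals. Payoffs, population size and mutation rate are illustrative
# assumptions, not parameters of the original study.

POP, GENERATIONS, MUTATION = 200, 500, 0.05

def payoff(signals: bool, population_signal_rate: float) -> float:
    reward = 10.0                                   # sitting on a good resource
    crowding_cost = 8.0 if signals else 0.0         # the light draws rivals in
    freeride_bonus = 0.0 if signals else 3.0 * population_signal_rate
    return reward - crowding_cost + freeride_bonus

population = [random.random() for _ in range(POP)]  # signalling probabilities

for generation in range(GENERATIONS):
    signal_rate = sum(population) / POP
    scored = sorted(
        ((payoff(random.random() < p, signal_rate), p) for p in population),
        reverse=True,
    )
    parents = [p for _, p in scored[: POP // 2]]    # top-scoring half carries on
    children = [
        min(1.0, max(0.0, random.choice(parents) + random.gauss(0, MUTATION)))
        for _ in range(POP - len(parents))
    ]
    population = parents + children

print(f"Mean signalling probability after {GENERATIONS} generations: "
      f"{sum(population) / POP:.2f}")               # tends towards zero
```

Under these assumed payoffs, the mean signalling probability collapses over the generations, mirroring the reported outcome in which later robot generations stopped illuminating their lights.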
Interestingly, recent studies have discovered that, rather than being abnormal human personality traits, lying and cheating are in fact experienced by humans as being more human-like, or intentional, than many other traits [106,107,108]. Human detection of cheating and lying, particularly when these acts are directed against them, has evolved as a highly sophisticated function of self-preservation [109,110,111]. This function may be one explanatory factor in the Uncanny Valley phenomenon, for example [112,113], whereby humans naturally sense an attempt at deceit in almost life-like replicas of human beings [112]. With this interpretation, humans are, and perhaps still will be, innately alert to the deceptive qualities of ‘fake’ humans.
Studies in human-robot interaction have revealed the detection of cheating to be more prominent in robot actions than in verbal communication: errors in speech are often comprehended as syntactic errors, yet errors in action are less acceptable from the perspective of human perception [109]. Here, the paradox of human-robot interaction presents itself, for on the one hand, humans are predisposed to the projection of emotions and other human qualities onto inanimate (or animate) non-human objects, i.e., anthropomorphism [114]. Yet, humans are also highly attuned to the likelihood of other beings deceiving them. Re-visiting the Uncanny Valley example [113], the subtle (and sometimes not so subtle) differences in the movement and features of humanoid robots that are not quite right can very well be interpreted by a human as a potential threat. This is the threat to safety through intended deceit via impersonation, and possible takeover of the human’s role and positioning in the animal kingdom [115].
According to Frédéric Kaplan [116], the detection of threat as perceived in the other, as opposed to the opportunity to tame and integrate it, seems to be deeply entwined in culture. Post-Enlightenment Europe, with its philosophy and literature, is firmly embedded with binaries and contradictions, particularly in regards to human-technology or cultural relationships [117,118]. One of the most prominent binaries set forth by the Enlightenment is that of nature versus culture (technology), which possessed instrumental potency for colonialists who justified, for instance, Terra Nullius (nobody’s land) on the basis of indigenous peoples not displaying culture and therefore being categorized as belonging to nature [119]. Japanese culture presents a different perspective on this relationship, whereby the artificial is understood as a way in which nature may be reproduced [120]. Through the comparison between European structured thinking and Japanese thinking, it may be observed that the binaries constructed in European cultures do not hold for Japanese culture. Instead, holistic and systematic thinking are paramount to understanding Japanese societal approaches and principles. In particular, the pre-war Meiji political period in Japan, between the late 1800s and early 1900s, reveals sentiments along the lines of taming the weaponry technology that would be employed by the enemy [116,121]. Thus, ‘to tame’ is a key notion from the perspective of Japanese culture, as the idea is to carve, mold and construct machinery and the artificial to positively co-exist in human society. The humanoid robot, therefore, is understood in terms of being a harmonious copy of human-likeness [122]. Yet, even with this understanding, the humanoid robot in Japanese culture is not understood as human, but rather as tamed technology.
Fear, on the other hand, has been an emotional driver in terms of the way people comprehend robotic technology in European-centric cultures [123]. With this said, fear goes hand in hand with fascination. One categorized phenomenon expressed in human culture that encapsulates this simultaneous fear and fascination is known as the Frankenstein Syndrome [124]. The Frankenstein Syndrome describes the immoral act of creating artificial life (attempting to play God) that will ultimately turn against its creator [124]. This syndrome, or sentiment, which has permeated European cultures through literature and folktales, can be seen as being carried over into the realm of human perceptions of robot development. Thus, there are negative expectations, apprehension and anxiety held towards AI and humanoid robots, through the belief that these technologies will eventually fight back and take over the human race. It may also be observed that the level of paranoia in regards to these types of technologies is elevated in European-centric cultures as compared to, for example, Asian cultures. These observations are beginning to be confirmed by results in studies examining the Uncanny Valley, for example, in which Uncanny Valley effects appear to be greater among European-centric participants as compared to participants from Asian-based cultures [125].
“We see ourselves in the mirror of the machines we can build” [116] (p. 12), are the potent words of Hiroshi Ishiguro, whose observation is that people learn the essence of what it is to be human through the replication of humanity [126,127]. Likewise, Sparrow [89] argues that sex robots may teach people a great deal about human sexuality, even in regards to questions of consent: having robots that can refuse sex may encourage humans to rape them, yet having robots that cannot refuse is extremely morally problematic. Furthermore, there is also a flip side to the argument, and ironically something that can be seen in the tale of Mary Shelley’s Frankenstein, in which Frankenstein’s monster does not so much harbor hate towards his human maker as a fear of where he has come from and how he is controlled. Thus, if humanoid robots were to be endowed with the same qualities as humans, and were able to freely think and feel, it could be extremely natural for the robots themselves to experience fear towards humans. They may even hold an organic pre-disposition towards wanting to protect themselves against the threat of humans, and may themselves experience anxiety and feelings of inadequacy in light of the potential that their human partner may prefer other humans above them. Therefore, cheating and lying may indeed also be developed by robot lovers and sex partners as a means of maintaining their ‘beneficial resources’: the love and admiration of their human sex partners. Moreover, in this light, the human-robot love relationship itself may be viewed as one in which a robot seeks and strives to maintain the affection of a human partner in order to obtain self-affirmation and reinforce its own ‘human worth’.
Freud’s idea of the ego ideal [34] could, in fact, be understood as perhaps the driving factor within a robot’s love experience. This may be the secret behind its sexual drive and attraction, in that the robot may be searching for a partner whom it would itself desire to be. From this perspective, the feature of human emotions in robots could very well mean the aspiration to validate one’s own humanness through love and sexual engagement. Alternatively, it may be speculated from the evolutionary perspective that robots may seek their ideal reproductive partner. In this scenario, either: (a) humans may be ideal, as they are organic creators capable of biological reproduction; or (b) humans are inadequate, as they exhibit numerous flaws and imperfections, physically, intellectually and emotionally. Humans could face a very likely scenario in which they do not satisfy robots in the areas of strength, endurance, intellect, humor, appearance and maybe even experience.

8. Results and Discussion

While this is a theoretical article, there are some important matters to consider in terms of the actual realization of emotions in robots. The key argument has largely been against aspirations to instill emotions in robots, due to the potential negative outcomes that this eventuality could have, both on human beings as well as on the robots themselves. This is not to mention the ethical considerations that already need to be addressed from the perspective of non-thinking, non-feeling robots [88]. These ethical considerations mostly pertain to the ways in which humans treat and consider robots and their usage, particularly humanoid robots. Moreover, from a human-robot relationship perspective, when considering today’s state-of-the-art, the most important emotional component belongs to humans.
However, the interest in developing artificial emotions in robots stems from something deeper than the ability of robots to reciprocate human love. Rather, the interest comes from the basis of developing genuine, strong artificial intelligence [128] and the fact that emotions are a part of genuine intelligence. It is through emotions that thought is enabled [44]. Emotions guide attention and direct concentration towards specific states that enable meaning making to occur. For instance, positive emotions allow humans to favorably notice and concentrate on not simply the beneficial details of situations, objects and people; they also place humans in a state in which more elements of the phenomena can be mentally noted [129]. From the opposite perspective of negative emotions, not only do negative attitudes towards phenomena increase the likelihood of dislike, but negative emotions can also prevent individuals from perceiving much detail. Here, individuals may be rendered in a state of freeze, flight or fight, concentrating on either more negative details and/or fewer details: enough to either justify an argument, or assist in the individual’s exit from the situation or interaction.
While these explanations of the links between genuine artificial intelligence and emotions are very much about liking phenomena and situations, and how the individual perceives their circumstances, there is above all the matter of how beings understand phenomena. Thus, emotions shape understanding and interpretation. Interpretation is thought [44]. Thought, and machine thinking and learning in a true sense, cannot be considered without their coupling with emotions. From an engineering perspective, this means that any real aspiration towards genuinely thinking machines, sex robots or otherwise, necessitates the synthetization and fabrication of some form of emotional framework through which the machines can make meaning out of, interpret and react to their environment and its components. For this reason, the present article poses the fundamental question of whether humans really want, or should have, thinking machines, as this would entail implications of emotions that no doubt will not be beneficial for humankind in the long run. From this perspective, engineering feats should focus on developing ‘smart’ machines, capable of calculating and performing operations that exceed the capabilities of human beings in various situations. Even contextual awareness and appropriate reactions to human users are desired and beneficial; yet actual intelligence in machines perhaps should be kept for the movies.

9. Conclusions

The intention behind this theoretical article was to problematize human-robot love and sex relationships in the eventuality of artificial emotions. Thus, the idea was to speculate upon a world in which humanoid robots can not only think, but also feel and choose sexual and love partners for themselves. In a world in which robots possess their own emotions and thoughts, traditional human-machine relations are abandoned. No longer can people assume that intelligent machines are bought, sold and owned. Nor can it be assumed that robots, with their own emotions and therefore preferences, will actually like or be attracted to humans. The prospect of robots wanting to engage in love and sexual relationships with humans would never be a certainty. Yet, in circumstances where robots do choose to enter into human-robot (or robot-human) relationships, the relationship dynamics are open to numerous uncertainties, not unlike those already faced in human-to-human relationships.
There may be instances in which humans validate the worth of robots as thinking and feeling beings. Equally, there is the likelihood that humans do not match robots’ expectations in terms of psychological, intellectual and physical features. Moreover, the issue of ethics needs to gain a greater voice as robots become more human-like. For it is not simply a question of how humans treat robots, or how humans treat robots that look and act like humans, but of how humans treat and regard other humans.
Yet, in a world in which robots can experience emotionally and sexually satisfying partnerships, perhaps emphasis should once again be placed on the humans. Rather than observing the situation from the perspective of how robots make humans feel, attention should be directed towards how humans make robots feel. Alongside the evolution of loving, lying and cheating robots, an evolution of human beings may be observed, in which case the question needs to be asked: if human beings are to attract and maintain the interest of potential robot sex, love and life partners, how do they need to improve? Will emphasis be on the physicality of human existence, embodied in physical strength and endurance, sensory and cognitive capacity, memory or intellect? Or will emphasis be on uniqueness, limitations, personality, sense of humor and maybe even mortality itself? Obvious routes to human improvement can be seen in contemporary movements towards body and bio-hacking. Yet, what robots will be attracted to, and how potential human partners could compare with and compete against potential robot partners, remains unknown.
Another issue that has not been covered extensively in this article is that of gender, gender identification and indeed sexuality, especially in light of the fact that robots do not, and most probably will not, have any biological reproductive purpose for sex. Therefore, matters of gender identification in particular, and of how this leads to gender or sex preferences in human-robot partnerships, must once more be left to speculation. In this respect, there is room for further study that deeply probes the relationship between language, semiotics (sign systems), embodiment and emotions, and how these relate to gender constructs. While this would in essence be a human-bound issue, it may well hold the secret to understanding sexual drive from the perspective of the robot.
This leads to another important issue: robot jealousy. There would most probably be jealousy on the part of the human partner, yet how jealousy unfolds on the part of the robot could be both extremely interesting and serious (even dangerous). If humans are a source of self-validation for robots, as suggested by Freud’s ego ideal model, and robots become possessive of their human partners, there is a risk that humans could be held prisoner, or kept as (sex) slaves, by their robot partners. There may be both founded and unfounded jealousy on the part of the robot towards other human beings, or perhaps even other robots, additionally leading to situations of anger and betrayal [95,96]. In such instances, it is also plausible that notions of the Frankenstein Syndrome would arise in the form of violence, murder and divorce [70], through passionate bouts of rage and envy in which there is no stopping the resentment of a distrusting robot.
The reason for focusing on love from an evolutionary psychological perspective was to highlight the role of biology in sexual desire and attraction, particularly from a hypothetical robotic perspective. For, while a robot would not have the biological requirement of sexual reproduction, sexual drive would be more emotionally plausible if it mimicked the human emotional system and the human need to survive and reproduce.
Moreover, whenever humans are involved there is always the inevitable matter of mortality. This may be an object of desire for robots, but it may also prove a downfall for both parties. Perhaps robots will not appreciate the aging human body; perhaps the reality of humans being born, developing and dying around them will send robots into deep states of depression. For if robots harbor emotions and the ability to think, it is also important to consider the likelihood of mental disorder, especially depression; how to motivate robots to live long and happy lives even as their love and sex partners come and go may be a serious issue to deal with. And finally, there is the matter of the division between humans and robots: with the increase of technological enhancements (cyborgism), it is quite likely that humans themselves will be more robot-like by the year 2050. Perhaps robots will also be more human-like, even organically. Therefore, Marvin Minsky’s [130] prediction of the future status of robots and AI may prove apt: “Will robots inherit the earth? Yes, but they will be our children.” In the future, humanoid robots may be considered not purely as robots, but instead in terms of diversity in race, culture and origin.

Funding

This research has been funded by the University of Jyväskylä, Finland.

Acknowledgments

Particular thanks to Pertti Saariluoma for encouraging us to expand the boundaries of Cognitive Science.

Conflicts of Interest

The author declares that there are no conflicts of interest in this article.

References

  1. Cardon, A. Artificial consciousness, artificial emotions, and autonomous robots. Cogn. Process. 2006, 7, 245–267. [Google Scholar] [CrossRef] [PubMed]
  2. Ziff, P. The feelings of robots. Analysis 1959, 19, 64–68. [Google Scholar] [CrossRef]
  3. Levy, D. The ethical treatment of artificially conscious robots. Int. J. Soc. Robot. 2009, 1, 209–216. [Google Scholar] [CrossRef]
  4. Bellman, R. An Introduction to Artificial Intelligence: Can Computers Think? Thomson Course Technology: Boston, MA, USA, 1978. [Google Scholar]
  5. Russell, S.; Norvig, P. Artificial Intelligence: A Modern Approach; Prentice-Hall: Englewood Cliffs, NJ, USA, 1995. [Google Scholar]
  6. Asimov, I. Runaround. In I, Robot (Hardcover); Doubleday: New York, NY, USA, 1950. [Google Scholar]
  7. Cheok, A.D.; Levy, D.; Karunanayaka, K. Lovotics: Love and sex with robots. In Emotion in Games; Springer: Heidelberg, Germany, 2016; pp. 303–328. ISBN 978-3-319-41316-7. [Google Scholar]
  8. Samani, H.A.; Cheok, A.D.; Tharakan, M.J.; Koh, J.; Fernando, N. A design process for lovotics. In International Conference on Human-Robot Personal Relationship; Springer: Berlin/Heidelberg, Germany, 2010; pp. 118–125. ISBN 978-3-642-19384-2. [Google Scholar]
  9. Flacelière, F. Love in Ancient Greece; Crown Publishers: New York, NY, USA, 1962. [Google Scholar]
  10. Kierkegaard, S. Works of Love; Hong, H., Hong, E., Eds.; Harper Collins: New York, NY, USA, 2009. [Google Scholar]
  11. Galician, M. Sex, Love, and Romance in the Mass Media: Analysis and Criticism of Unrealistic; Lawrence Erlbaum Associates: Mahwah, NJ, USA, 2004. [Google Scholar]
  12. Harlow, H.F. The Nature of Love. Am. Psychol. 1958, 13, 673–685. [Google Scholar] [CrossRef]
  13. Cuddy, A.J.; Fiske, S.T.; Kwan, V.S.; Glick, P.; Demoulin, S.; Leyens, J.P.; Bond, H.; Croizet, J.C.; Ellemers, N.; Sleebos, E.; et al. Stereotype content model across cultures: Towards universal similarities and some differences. Br. J. Soc. Psychol. 2009, 48, 1–33. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  14. Kumar, M.; Garg, N. Aesthetic principles and cognitive emotion appraisals: How much of the beauty lies in the eye of the beholder? J. Consum. Psychol. 2010, 20, 485–494. [Google Scholar] [CrossRef]
  15. Solomon, R.C. The virtue of love. Midwest Stud. Philos. 1988, 13, 12–31. [Google Scholar] [CrossRef]
  16. Fletcher, G.P. Loyalty: An Essay on the Morality of Relationships; Oxford University Press: New York, NY, USA, 1995. [Google Scholar]
  17. Joachim, H.; Rees, D. Aristotle: The Nicomachean Ethics; Clarendon Press: Oxford, UK, 1951. [Google Scholar]
  18. Lewis, C. The Four Loves; Houghton Mifflin Harcourt: Boston, MA, USA, 1991. [Google Scholar]
  19. Soble, A. Eros, Agape, and Philia: Readings in the Philosophy of Love; Paragon House Publishers: Vadnais Heights, MN, USA, 1989. [Google Scholar]
  20. Bennett, H. Friendship. In The Stanford Encyclopedia of Philosophy; Zalta, E.N., Ed.; Stanford University: Stanford, CA, USA, 2017. [Google Scholar]
  21. Cooper, J.M. Aristotle on the Forms of Friendship. Rev. Metaphys. 1977, 30, 619–648. [Google Scholar]
  22. Liddell, H.G.; Scott, R.; Jones, H.S.; McKenzie, R. A Greek-English Lexicon, 9th ed.; Clarendon Press: Oxford, UK, 1940. [Google Scholar]
  23. Merriam-Webster Dictionary. Love. Available online: http://www.merriam-webster.com/dictionary/love (accessed on 24 May 2016).
  24. Sternberg, R.J. Triangulating Love. In The Altruism Reader: Selections from Writings on Love, Religion, and Science; Oord, T.J., Ed.; Templeton Foundation: West Conshohocken, PA, USA, 2007; pp. 331–347. ISBN 978-1-59947-127-3. [Google Scholar]
  25. Sternberg, R.J. A triangular theory of love. In Close Relationships; Reis, H.T., Rusbult, C.E., Eds.; Psychology Press: New York, NY, USA, 2004; pp. 258–276. ISBN 978-0-86377-596-3. [Google Scholar]
  26. Sternberg, R.J. Construct validation of a triangular love scale. Eur. J. Soc. Psychol. 1997, 27, 313–335. [Google Scholar] [CrossRef]
  27. Damasio, A.R. The Feeling of What Happens: Body and Emotion in the Making of Consciousness; Harcourt Brace: New York, NY, USA, 1999; ISBN 978-0-15601-075-7. [Google Scholar]
  28. Davidson, R.J. Complexities in the search for emotion-specific physiology. In The Nature of Emotion: Fundamental Questions; Ekman, P., Davidson, R.J., Eds.; Oxford University Press: New York, NY, USA, 1994; pp. 237–242. ISBN 978-0-19508-944-8. [Google Scholar]
  29. Panksepp, J. Affective Neuroscience: The Foundations of Human and Animal Emotions; Oxford University Press: New York, NY, USA, 1998; ISBN 978-0-19517-805-0. [Google Scholar]
  30. Fisher, H.E. Lust, attraction and attachment in mammalian reproduction. Hum. Nat. 1998, 9, 23–52. [Google Scholar] [CrossRef] [PubMed]
  31. Bartels, A.; Zeki, S. The neural basis of romantic love. Neuro Rep. 2000, 11, 1–6. [Google Scholar] [CrossRef]
  32. Fisher, H.E.; Aron, A.; Mashek, D.; Li, H.; Brown, L.L. Defining the brain systems of lust, romantic attraction, and attachment. Arch. Sex. Behav. 2002, 31, 413–419. [Google Scholar] [CrossRef] [PubMed]
  33. Wang, Z.; Yu, G.; Cascio, C.; Liu, Y.; Gingrich, B.; Insel, T.R. Dopamine D2 receptor-mediated regulation of partner preferences in female prairie voles (Microtus ochrogaster): A mechanism for pair bonding? Behav. Neurosci. 1999, 113, 602–611. [Google Scholar] [CrossRef] [PubMed]
  34. Salman, A. Comprehensive Dictionary of Psychoanalysis; Karnac Books: London, UK, 2009; p. 89. [Google Scholar]
  35. Maslow, A.H. A theory of human motivation. Psychol. Rev. 1943, 50, 360–396. [Google Scholar] [CrossRef]
  36. Panksepp, J.; Panksepp, J.B. The seven sins of evolutionary psychology. Evol. Cogn. 2000, 6, 108–131. [Google Scholar]
  37. Symons, D. Beauty is in the adaptations of the beholder: The evolutionary psychology of human female sexual attractiveness. In Sexual Nature, Sexual Culture; Abramson, P.R., Pinkerton, S.D., Eds.; University of Chicago Press: Chicago, IL, USA, 1995; pp. 80–118. [Google Scholar]
  38. Diessner, R.; Frost, N.; Smith, T. Describing the neoclassical psyche embedded in Sternberg’s triangular theory of love. Soc. Behav. Pers. Int. J. 2004, 32, 683–690. [Google Scholar] [CrossRef]
  39. Frijda, N.H. The place of appraisal in emotion. Cogn. Emot. 1993, 7, 357–387. [Google Scholar] [CrossRef] [Green Version]
  40. Frijda, N.H.; Zeelenberg, M. Appraisal: What is the dependent. In Series in Affective Science. Appraisal Processes in Emotion: Theory, Methods, Research; Scherer, K.R., Schorr, A., Johnstone, T., Eds.; Oxford University Press: New York, NY, USA, 2001; pp. 141–155. ISBN 978-0-19513-007-2. [Google Scholar]
  41. Lazarus, R.S. Relational meaning and discrete emotions. In Series in Affective Science. Appraisal Processes in Emotion: Theory, Methods, Research; Scherer, K.R., Schorr, A., Johnstone, T., Eds.; Oxford University Press: New York, NY, USA, 2001; pp. 37–67. ISBN 978-0-19513-007-2. [Google Scholar]
  42. Clore, G.L.; Ortony, A. Appraisal theories: How cognition shapes affect into emotion. In Handbook of Emotions; Lewis, M., Haviland-Jones, J.M., Barrett, L.F., Eds.; Guilford Press: New York, NY, USA, 2008; pp. 628–642. ISBN 978-1-59385-650-2. [Google Scholar]
  43. Fournier, S. Consumers and their brands: Developing relationship theory in consumer research. J. Constr. Res. 1998, 24, 343–373. [Google Scholar] [CrossRef]
  44. Rousi, R. From cute to content: User experience from a cognitive semiotic perspective. Jyväskylä Studies in Computing 171; University of Jyväskylä Press: Jyväskylä, Finland, 2013; ISBN 978-9-51395-388-1. [Google Scholar]
  45. Watson, L.; Spence, M.T. Causes and consequences of emotions on consumer behaviour: A review and integrative cognitive appraisal theory. Eur. J. Mark. 2007, 41, 487–511. [Google Scholar] [CrossRef]
  46. Yilmaz, V. Consumer behavior in shopping center choice. Soc. Behav. Pers. Int. J. 2004, 32, 783–790. [Google Scholar] [CrossRef]
  47. Kelley, H.H.; Berscheid, E.; Christensen, A.; Harvey, J.H.; Huston, T.L. Close Relationships; Freeman: New York, NY, USA, 1983; ISBN 978-0971242784. [Google Scholar]
  48. Finkel, E.J.; Simpson, J.A.; Eastwick, P.W. The psychology of close relationships: Fourteen core principles. Ann. Rev. Psychol. 2017, 68, 383–411. [Google Scholar] [CrossRef] [PubMed]
  49. Rusbult, C.E. A longitudinal test of the investment model: The development (and deterioration) of satisfaction and commitment in heterosexual involvements. J. Pers. Soc. Psychol. 1983, 45, 101–117. [Google Scholar] [CrossRef]
  50. Wickler, W.; Seibt, U. Monogamy: An ambiguous concept. Mate Choice 1983, 33, 33–50. [Google Scholar]
  51. Levine, S.B. The nature of sexual desire: A clinician’s perspective. Arch. Sex. Behav. 2003, 32, 279–285. [Google Scholar] [CrossRef] [PubMed]
  52. Laumann, E.O.; Michael, R.T. (Eds.) Sex, Love, and Health in America: Private Choices and Public Policies; University of Chicago Press: Chicago, IL, USA, 2001; ISBN 978-0-22646-967-6. [Google Scholar]
  53. Basson, R. The female sexual response: A different model. J. Sex Marit. Ther. 2000, 26, 51–65. [Google Scholar] [CrossRef] [PubMed]
  54. Basson, R. Women’s desire deficiencies and avoidance. In Handbook of Clinical Sexuality for Mental Health Professionals; Levine, S.B., Risen, C.B., Althof, S.E., Eds.; Brunner/Routledge: New York, NY, USA, 2003; pp. 111–130. ISBN 1-58391-331-9. [Google Scholar]
  55. Shifren, J.L.; Braunstein, G.D.; Siman, J.A.; Casson, P.R.; Buster, J.E.; Redmond, G.P. Transdermal testosterone treatment in women with impaired sexual function after oophorectomy. N. Engl. J. Med. 2000, 343, 682–688. [Google Scholar] [CrossRef] [PubMed]
  56. Levine, S.B. Male heterosexuality. In Masculinity and Sexuality: Selected Topics in the Psychology of Men; Friedman, R.C., Downey, J.I., Eds.; American Psychiatric Press: Washington, DC, USA, 1999; pp. 29–54. ISBN 978-0-88048-962-1. [Google Scholar]
  57. Fisher, H.E. The First Sex: The Natural Talents of Women and How They are Changing the World; Random House: New York, NY, USA, 1999. [Google Scholar]
  58. Ellis, B.J.; Symons, D. Sex differences in sexual fantasy: An evolutionary psychological approach. J. Sex Res. 1990, 27, 527–555. [Google Scholar] [CrossRef]
  59. Laumann, E.O.; Gagnon, J.H.; Michael, R.T.; Michaels, S. The Social Organization of Sexuality: Sexual Practices in the United States; University of Chicago Press: Chicago, IL, USA, 1994; ISBN 978-0-22647-020-7. [Google Scholar]
  60. Hatfield, E.; Rapson, R.L. Love, Sex, and Intimacy: Their Psychology, Biology, and History; Harper Collins College Publishers: New York, NY, USA, 1993; ISBN 978-0-065-00702-2. [Google Scholar]
  61. Tennov, D. Love and Limerence: The Experience of Being in Love; Stein and Day: New York, NY, USA, 1979; ISBN 978-0-81282-328-8. [Google Scholar]
  62. Buss, D.M. The Evolution of Desire: Strategies of Human Mating; Basic Books: New York, NY, USA, 1994; ISBN 978-0-46509-776-0. [Google Scholar]
  63. Sherwin, B.B.; Gelfand, M.M. The Role of Androgen in the Maintenance of Sexual Functioning in Oophorectomized Women. Psychosom. Med. 1987, 49, 397–409. [Google Scholar] [CrossRef] [PubMed]
  64. Sherwin, B.B.; Gelfand, M.M.; Brender, W. Androgen enhances sexual motivation in females: A prospective cross-over study of sex steroid administration in the surgical menopause. Psychosom. Med. 1985, 7, 339–351. [Google Scholar] [CrossRef]
  65. Cupach, W.R.; Spitzberg, B.H. The Dark Side of Relationship Pursuit: From Attraction to Obsession and Stalking; Routledge: London, UK, 2014. [Google Scholar] [CrossRef]
  66. Hourigan, D. Obsession. In Encyclopedia of Consumer Culture; Southerton, D., Ed.; Sage Publications: Thousand Oaks, CA, USA, 2011; pp. 1056–1057. ISBN 978-0-87289-601-7. [Google Scholar]
  67. Salmeron, J.L. Fuzzy cognitive maps for artificial emotions forecasting. Appl. Soft Comput. 2012, 12, 3704–3710. [Google Scholar] [CrossRef]
  68. Oxford English Dictionary. Fidelity. Available online: https://en.oxforddictionaries.com/definition/fidelity (accessed on 18 May 2018).[Green Version]
  69. Mattingly, B.A.; Wilson, K.; Clark, E.M.; Bequette, A.W.; Weidler, D.J. Foggy faithfulness: Relationship quality, religiosity, and the perceptions of dating infidelity scale in an adult sample. J. Fam. Issues 2010, 31, 1465–1480. [Google Scholar] [CrossRef]
  70. Drigotas, S.M.; Barta, W. The cheating heart: Scientific explorations of infidelity. Curr. Dir. Psychol. Sci. 2001, 10, 177–180. [Google Scholar] [CrossRef]
  71. Beauregard, M.; Courtemanche, J.; Paquette, V.; St-Pierre, É.L. The neural basis of unconditional love. Psychiatry Res. Neuroimaging 2009, 172, 93–98. [Google Scholar] [CrossRef] [PubMed]
  72. Welwood, J. On love: Conditional and unconditional. J. Transpers. Psychol. 1985, 17, 33. [Google Scholar]
  73. Stafford, L.; Canary, D.J. Maintenance strategies and romantic relationship type, gender and relational characteristics. J. Soc. Pers. Relatsh. 1991, 8, 217–242. [Google Scholar] [CrossRef]
  74. Daly, M.; Wilson, M. Homicide; Aldine de Gruyter: Hawthorne, NJ, USA, 1988; ISBN 978-0-20236-644-9. [Google Scholar]
  75. Buss, D. Evolutionary Psychology: The New Science of the Mind; Allyn & Bacon: Needham Heights, MA, USA, 1998; ISBN 978-0-20519-358-5. [Google Scholar]
  76. Knobloch, L.K.; Solomon, D.H.; Cruz, M.G. The role of relationship development and attachment in the experience of romantic jealousy. Pers. Relationsh. 2001, 8, 205–224. [Google Scholar] [CrossRef]
  77. Rusbult, C.E.; Drigotas, S.M.; Verette, J. The investment model: An interdependence analysis of commitment processes and relationship maintenance phenomena. In Communication and Relational Maintenance; Canary, D., Stafford, L., Eds.; Academic Press: San Diego, CA, USA, 1994; pp. 115–139. [Google Scholar]
  78. Ortner, S.B.; Whitehead, H. (Eds.) Sexual Meanings: The Cultural Construction of Gender and Sexuality; Cambridge University Press: Cambridge, UK, 1981; ISBN 978-0-52128-375-5. [Google Scholar]
  79. Meston, C.M.; Buss, D.M. Why humans have sex. Arch. Sex. Behav. 2007, 36, 477–507. [Google Scholar] [CrossRef] [PubMed]
  80. Levy, D. Love and Sex with Robots: The Evolution of Human-Robot Relationships; Harper Collins: New York, NY, USA, 2007; ISBN 978-0-06135-980-4. [Google Scholar]
  81. Fehr, B.; Russell, J.A. The concept of love viewed from a prototype perspective. J. Pers. Soc. Psychol. 1991, 60, 464–486. [Google Scholar] [CrossRef]
  82. Russell, J.A.; Fehr, B. Fuzzy concepts in a fuzzy hierarchy: Varieties of anger. J. Pers. Soc. Psychol. 1994, 67, 186–205. [Google Scholar] [CrossRef] [PubMed]
  83. Richardson, K. The asymmetrical ‘relationship’: Parallels between prostitution and the development of sex robots. ACM SIGCAS Comput. Soc. 2016, 45, 290–293. [Google Scholar] [CrossRef]
  84. Mackenzie, R. Sexbots: Replacements for sex workers? Ethical constraints on the design of sentient beings for utilitarian purposes. In Proceedings of the 2014 Workshops on Advances in Computer Entertainment Conference, Funchal, Portugal, 11–14 November 2014; ACM: New York, NY, USA, 2014. [Google Scholar] [CrossRef]
  85. Royakkers, L.; Van Est, R. A literature review on new robotics: Automation from love to war. Int. J. Soc. Robot. 2015, 7, 549–570. [Google Scholar] [CrossRef]
  86. Adams, A.A. Virtual sex with child avatars. In Emerging Ethical Issues of Life in Virtual Worlds; Wankel, C., Malleck, S., Eds.; Information Age Publishing: Charlotte, NC, USA, 2010; pp. 55–72. ISBN 978-1-60752-377-2. [Google Scholar]
  87. Danaher, J. Robotic rape and robotic child sexual abuse: Should they be criminalised? Crim. Law Philos. 2017, 11, 71–95. [Google Scholar] [CrossRef]
  88. Lin, P.; Adney, K.; Bekey, G.A. Robot Ethics: The Ethical and Social Implications of Robotics; MIT Press: Cambridge, MA, USA, 2011; ISBN 978-0-26201-666-7. [Google Scholar]
  89. Sparrow, R. Robots, rape, and representation. Int. J. Soc. Robot. 2017, 1–13. [Google Scholar] [CrossRef]
  90. Snell, J. Sexbots: An editorial. Psychol. Educ. Interdiscip. J. 2005, 42, 49–50. [Google Scholar]
  91. Turkle, S. Alone Together. Why We Expect More from Technology and Less from Each Other; Basic Books: New York, NY, USA, 2012; ISBN 978-0-46503-146-7. [Google Scholar]
  92. Danaher, J.; McArthur, N. (Eds.) Robot Sex: Social and Ethical Implications; MIT Press: Cambridge, MA, USA, 2017; ISBN 978-0-26203-668-9. [Google Scholar]
  93. Ziaja, S. Homewrecker 2.0: An exploration of liability for heart balm torts involving AI humanoid consorts. In Proceedings of the International Conference on Social Robotics, Amsterdam, The Netherlands, 24–25 November 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 114–124. [Google Scholar]
  94. Hinman, L. Robotic Companions: Some Ethical Questions to Consider. 2009. Available online: https://www.researchgate.net/profile/Lawrence_Hinman/publication/242691647_Robotic_Companions_Some_ethical_questions_to_consider/links/542c2fe40cf27e39fa931378.pdf (accessed on 2 June 2018).
  95. Ashrafian, H. AlonAI: A humanitarian law of artificial intelligence and robotics. Sci. Eng. Ethics 2015, 21, 29–40. [Google Scholar] [CrossRef] [PubMed]
  96. Ashrafian, H. Artificial intelligence and robot responsibilities: Innovating beyond rights. Sci. Eng. Ethics 2015, 21, 317–326. [Google Scholar] [CrossRef] [PubMed]
  97. Madden, J. Should Having Sex with a Robot Count as Cheating? BBC. Available online: http://www.bbc.co.uk/bbcthree/item/0c4f5093-ed7d-4fad-97cf-b93b9afb1679 (accessed on 2 June 2017).
  98. Darling, K. Extending legal protection to social robots: The effects of anthropomorphism, empathy, and violent behavior towards robotic objects. In Robot Law Proceedings of the Robot Conference 2012; Calo, R., Froomkin, M., Kerr, I., Eds.; University of Miami: Miami, FL, USA, 2012. [Google Scholar] [CrossRef]
  99. Mullin, C.R.; Linz, D. Desensitization and resensitization to violence against women: Effects of exposure to sexually violent films on judgments of domestic violence victims. J. Pers. Soc. Psychol. 1995, 69, 449–459. [Google Scholar] [CrossRef] [PubMed]
  100. Malamuth, N.M.; Briere, J. Sexual violence in the media: Indirect effects on aggression against women. J. Soc. Issues 1986, 42, 75–92. [Google Scholar] [CrossRef]
  101. Mackenzie Wright, D.W. Hunting humans: A future for tourism in 2200. Futures 2016, 78–79, 34–46. [Google Scholar] [CrossRef]
  102. Rawlinson, K. Microsoft’s Bill Gates Insists AI Is a Threat. BBC News. Available online: http://www.bbc.com/news/31047780 (accessed on 2 June 2017).
  103. Lewis, T. Stephen Hawking: Artificial Intelligence Could End Human Race. Live Science. Available online: https://www.livescience.com/48972-stephen-hawking-artificial-intelligence-threat.html (accessed on 2 June 2017).
  104. Tucker, P.; Aron, A. Passionate love and marital satisfaction at key transition points in the family life cycle. J. Soc. Clin. Psychol. 1993, 12, 135–147. [Google Scholar] [CrossRef]
  105. Fox, S. Evolving Robots Learn to Lie to Each Other. Popular Science. Available online: http://www.popsci.com/scitech/article/2009-08/evolving-robots-learn-lie-hide-resources-each-other (accessed on 7 July 2017).
  106. Litoiu, A.; Ullman, D.; Kim, J.; Scassellati, B. Evidence that robots trigger a cheating detector in humans. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction, Portland, OR, USA, 2–5 March 2015; ACM: New York, NY, USA, 2015; pp. 165–172. [Google Scholar] [CrossRef]
  107. Short, E.; Hart, J.; Vu, M.; Scassellati, B. No fair!! An interaction with a cheating robot. In Proceedings of the 5th ACM/IEEE International Conference on Human-Robot Interaction 2010, Osaka, Japan, 2–5 March 2010; ACM/IEEE: New York, NY, USA, 2010; pp. 219–226. [Google Scholar] [CrossRef]
  108. Ullman, D.; Leite, I.; Phillips, J.; Kim-Cohen, J.; Scassellati, B. Smart human, smarter robot: How cheating affects perceptions of social agency. In Proceedings of the 36th Annual Conference of the Cognitive Science Society 2014, 36, Cognitive Science meets Artificial Intelligence: Human and Artificial Agents in Interactive Contexts, Quebec City, QC, Canada, 23–26 July 2014; ISBN 978-1-63439-116-0. [Google Scholar]
  109. Cosmides, L. The logic of social exchange: Has natural selection shaped how humans reason? Studies with the Wason selection task. Cognition 1989, 31, 187–276. [Google Scholar] [CrossRef]
  110. Cosmides, L.; Tooby, J. Cognitive adaptations for social exchange. In The Adapted Mind–Evolutionary Psychology and the Generation of Culture; Barkow, J.H., Cosmides, L., Tooby, J., Eds.; Oxford University Press: New York, NY, USA, 1992; pp. 163–228. ISBN 978-0-19510-107-2. [Google Scholar]
  111. Verplatse, J.; Vanneste, S.; Braekman, J. You can judge a book by its cover: The sequel: A kernel of truth in predictive cheating detection. Evol. Hum. Behav. 2007, 28, 260–271. [Google Scholar] [CrossRef]
  112. MacDorman, K.F. Subjective ratings of robot video clips for human likeness, familiarity, and eeriness: An exploration of the uncanny valley. In Proceedings of the ICCS/CogSci-2006 Long Symposium: Toward Social Mechanisms of Android Science, Vancouver, BC, Canada; 2006; pp. 26–29. [Google Scholar]
  113. Mori, M.; MacDorman, K.F.; Kageki, N. The uncanny valley [from the field]. IEEE Robot. Autom. Mag. 2012, 19, 98–100. [Google Scholar] [CrossRef]
  114. Hutson, M. The 7 Laws of Magical Thinking: How Irrational Beliefs Keep Us Happy, Healthy, and Sane; Hudson Street Press: New York, NY, USA, 2012; pp. 165–171. ISBN 978-0-45229-890-3. [Google Scholar]
  115. Hanson, D.; Olney, A.; Prilliman, S.; Mathews, E.; Zielke, M.; Hammons, D.; Fernandez, R.; Stephanou, H. Upending the uncanny valley. In Proceedings of the 20th National Conference on Artificial Intelligence; MIT Press: Cambridge, MA, USA, 2005; p. 1728. [Google Scholar]
  116. Kaplan, F. Who is afraid of the humanoid? Investigating cultural differences in the acceptance of robots. Int. J. Humanoid Rob. 2004, 1, 465–480. [Google Scholar] [CrossRef]
  117. Staszak, J.F. Other/otherness. In International Encyclopaedia of Human Geography; Kitchin, R., Thrift, N., Eds.; Elsevier: Oxford, UK, 2009; pp. 43–47. ISBN 978-0-08044-911-1. [Google Scholar]
  118. Suzuki, T. Word in Context: A Japanese Perspective on Language and Culture; Kodansha International: Tokyo, Japan, 2001; ISBN 4-7700-2780-X. [Google Scholar]
  119. Asch, M. From terra nullius to affirmation: Reconciling Aboriginal rights with the Canadian Constitution. Can. J. Law Soc. 2002, 17, 23–39. [Google Scholar] [CrossRef]
  120. Berque, A.; Schwartz, R. Japan Nature, Artifice and Japanese Culture; Pilkington: Yelvertoft Manor, UK, 1997; ISBN 9781899044153. [Google Scholar]
  121. Fontichiaro, K. Taming the Technology Leadership Dragon. In The Many Faces of School Library Leadership; Coatney, S., Harada, V.H., Eds.; Libraries Unlimited: Denver, CO, USA, 2017; pp. 119–132. ISBN 978-1-44084-897-1. [Google Scholar]
  122. Mori, M. The Buddha in the Robot; Kosei Publishing Company: Tokyo, Japan, 1981; ISBN 9784333010028. [Google Scholar]
  123. Starobinski, J. Jean-Jacques Rousseau, Transparency and Obstruction; Goldhammer, A., Ed.; University of Chicago Press: Chicago, IL, USA, 1988; ISBN 978-0-22677-126-7. [Google Scholar]
  124. Syrdal, D.; Nomura, T.; Hirai, H.; Dautenhahn, K. Examining the Frankenstein syndrome. Soc. Robot. 2011, 125–134. [Google Scholar] [CrossRef] [Green Version]
  125. Li, D.; Rau, P.P.; Li, Y. A cross-cultural study: Effect of robot appearance and task. Int. J. Soc. Robot. 2010, 2, 175–186. [Google Scholar] [CrossRef]
  126. ATR Home: Hiroshi Ishiguro Laboratories. Available online: http://www.geminoid.jp/en/index.html (accessed on 13 December 2017).
  127. Guizzo, E. Hiroshi Ishiguro: The Man Who Made a Copy of Himself. IEEE Spectrum. 2010. Available online: https://spectrum.ieee.org/robotics/humanoids/hiroshi-ishiguro-the-man-who-made-a-copy-of-himself (accessed on 13 December 2017).
  128. Haugeland, J.M. (Ed.) Mind Design II: Philosophy, Psychology, Artificial Intelligence; MIT Press: Cambridge, MA, USA, 1997. [Google Scholar]
  129. Norman, D. Emotion & design: Attractive things work better. Interactions 2002, 9, 36–42. [Google Scholar]
  130. Minsky, M.L. Will robots inherit the earth. Sci. Am. 1994, 271, 108–113. [Google Scholar] [CrossRef] [PubMed]
