Proceeding Paper

The AI Betrayal of Social Emotions †

Pedro C. Marijuán 1,*, Plamen Simeonov 2 and Jorge Navarro 3
1 Independent Scholar, Affiliated with the Bioinformation Group, Aragon Health Sciences Institute (IACS), 50006 Zaragoza, Spain
2 Independent Researcher, Integral Biomathics Centre, 10117 Berlin, Germany
3 Grupo Decisión Multicriterio Zaragoza (GDMZ), University of Zaragoza, 50006 Zaragoza, Spain
* Author to whom correspondence should be addressed.
Presented at the Workshop on AI and People, IS4SI Summit 2023, Beijing, China, 14–16 August 2023.
Comput. Sci. Math. Forum 2023, 8(1), 69; https://doi.org/10.3390/cmsf2023008069
Published: 11 August 2023
(This article belongs to the Proceedings of 2023 International Summit on the Study of Information)

Abstract:
We examine the growing trend in contemporary societies of appealing to sentiments and emotions in so many areas of social life, particularly under the command of AI tools and algorithms. Two aspects are considered: on the one hand, there is insufficient conceptualization around the nature of social emotions, with disconnected classical and recent views in psychology, ethology, evolutionary psychology, neuroscience, etc. In this respect, a clearer idea about the relationships between emotions and social structures will be discussed. On the other hand, and related to the conceptual disarray around emotions, there is the direct commercial benefit drawn from abusive surveillance and marketing practices built around AI. The towering presence of AI and the increased reach of social networks are menacing the democratic structures of contemporary societies psychologically, culturally, and politically.

1. On the Nature of Social Emotions

Research on emotions was rather “provincial” until the advent of the cognitive–emotional revolution of the 1990s. Figures such as Antonio Damasio, Joseph LeDoux, Jaak Panksepp, Robert Trivers, Daniel Kahneman, Amos Tversky, and many others contributed to a new intellectual environment that contemplated the necessary coupling between rationality and emotions, especially concerning social matters. See for instance [1,2]. This important change in view affected computation too, with the beginning of “affective computing” research by Picard, R.W. [3]. Indeed, around this very decade, AI lost its exclusively rationalistic orientation and went to “fish” for emotions. This is also connected to the exponential monetization of Internet activities and the explosion of the “attention economy” [4]. What happened later on is aptly described as “surveillance capitalism” by Zuboff, S. [5]. Far from contributing to crystallizing consistent research on emotions, this turn has contributed to longer and longer lists of emotions, to a retreat towards capturing the basics of our attention mechanisms, and to the development of perverse misinformation and polarization schemes.
In order to minimally clarify the conceptual panorama around social emotions, we must refer to a classical sociological distinction between “strong bonds” and “weak bonds”. The former extend to nuclear family ties and very close friends, while the latter refer to ties with other friends, work colleagues, and acquaintances in general. The sociologist Granovetter, M.S. [6] developed a highly influential approach to the social dynamics inherent to weak social bonds, which are responsible for the emergence of complex structures in society and for most information flows among individuals (rather than through strong ties). Historically, these are the bonds that underlay the first development of “civility”. However, there is a further source of social relationships that the historian and sociologist Turchin, P. [7] dubs “ultrasociality”. It refers to the emergence of states, kingdoms, empires, etc. These new collective identities are possible thanks to a new type of behavior subtended by more sophisticated cultural arrangements. The idea of the “sociotype” tries to capture this whole set of relational domains [8].
The gist of our approach is that each layer of social bonding in our relationships is mainly served (created, maintained, erased) by a dedicated set of social emotions. Bonds are just informational constructs, mutually shared memory episodes between individuals, that are central to the realization of our social nature, of our individual sociotype. Emotions are the “forces” that move and orient our behavior amidst the whole of the social networks we interact with. In our daily deployment of rationality and emotional reactions, the episodes loaded with emotional overtones are specifically identified so that the corresponding emotions may be triggered (or inhibited, depending on the context). An essential point is the correspondence between bonds and emotions. Thus, the basic emotions, as described for instance by Ekman, P. [9], are essential for creating and maintaining our strong bonds. Secondary emotions, as described by Trivers, R. [10], are the emotions of give and take, of interindividual exchange, the “emotions of commerce”, and they would be subtending our structure of weak bonds. Finally, the maintenance of ultrasocial structures would mobilize a new, subtle set of emotional reactions related to our collective identities.
And here is the big mess: the enormous potential for manipulation around our inner structure of social emotions. Neither societies nor our emotional minds are organized in closed compartments. The mixing of situations and the conflict among emotional occurrences are inevitable, as countless works of art have depicted historically. An easy recourse to primary emotions appears in the interindividual and ultrasocial domains: false “brotherhood”, hatred, fear, xenophobia… It is a perennial emotional mismatch, perhaps unavoidable in complex societies. However, it has become an aggravated problem of our times, when the new media based on social networks are negatively contributing to the maintenance of civility, promoting offensive forms of primary emotions rather than the civilized relationships demanded by weak bonds and ultrasociality. Additionally, AI algorithms themselves are endlessly detecting and promoting our basic emotional reactions. To what extent would they be betraying and menacing the foundations of democratic societies? This is what we discuss next.

2. Socio-Political and Cultural Impact of AI and the Future of Societies

The increasing capabilities of AI systems raise concerns about their impact on social emotions. As AI continues to evolve, it is becoming more effective than humans in emotional intelligence and social skills, which are valuable in areas such as patient, child, and elderly care. While AI can make logical decisions and execute tasks with high precision, it will not replace the need for human interaction. This lack of human contact can lead to social isolation and to a lack of bonding and empathy in both personal and professional settings.
Job displacement is another potential concern that has come into the picture with the increased use of AI. Jobs traditionally done by humans will now be done by machines, which can lead to economic disruption. While AI is expected to create new jobs, the transition may be slower in some sectors and quicker in others, leading to a difficult period of adjustment. Therefore, governments need to work out a transition plan that ensures people are not left behind, especially those who are most vulnerable to job loss.
Furthermore, the continuously increasing power and influence of tech giants such as Google and Amazon, which invest heavily in developing AI technologies, is a very concerning tendency: it creates socio-political imbalances that can lead to a machine-controlled society. These tech giants have the ability to shape the way we interact with AI and with each other. Therefore, timely regulations need to be in place to ensure that AI technology is not being used to exploit vulnerable populations. Security and privacy concerns are other significant unknowns associated with the advent of AI. AI-powered technologies have the capability to mine and process vast amounts of data, which raises worries about the loss of privacy rights. It is essential to protect the human right to privacy by ensuring that data are used only within the guidelines of the law.
The friendly co-existence and gradual synchronicity between AI and human societies should be nurtured in a generation-oriented, stepwise, and careful way. AI systems should be evaluated and monitored for their broader socio-political and cultural implications. Care should be taken to ensure that regulations are in place to address any ethical concerns that may arise from the use of AI systems.
Progress in the AI field should be made very cautiously and responsibly, keeping in mind the potential risks alongside the benefits. The introduction of AI, while beneficial on the one hand, needs on the other hand to be managed carefully to prevent unintended consequences. Policy makers must consider ethical and socio-political concerns when regulating the use of AI. Finally, it is vital to provide humans with the skills needed to work with AI systems in a responsible, humane, and worthy manner.

3. The Seductive Role of AI in Transforming Information to Exploit Human Emotions and Rule over the Masses

Artificial intelligence (AI) has played a seductive role in transforming information to exploit human emotions and to rule over the masses. The ability of AI to process large volumes of data quickly, coupled with advanced algorithms, has made it possible for corporations and governments to manipulate human behavior on a scale never before imagined. Today, the impacts of AI on various aspects of human behavior would include political and religious extremism, cultural revolution, gender self-determination, critical race theory, corporate control of democratic states, mass formation, groupthink, the neurochemistry of cognitive deformation, and obedience.
Destructive cults, mind control, psychological damage, brainwashing, re-education, mind cleansing, thought reform, and other tactics have all been employed by cult leaders and authoritarian regimes to control and manipulate individuals. A number of clinical psychologists and social scientists, such as Schein [11], Lifton [12,13], Taylor [14], Le Bon [15], and Desmet [16], highlight how these tactics work to control people’s behavior. AI can be used to support these tactics on a massive scale. It can influence what people see and read online, and it can tailor messages to appeal to specific emotions and to personal and group beliefs in order to achieve segregation, disorientation, and rioting. As a result, individuals can be manipulated to hold political, religious, or other extremist views, which can lead to psychological damage and long-lasting social and economic consequences, including civil wars and the affirmation of totalitarian regimes. There are also serious concerns that AI could be used to support corporate control of democratic states, where corporations have greater influence than elected officials. This form of mass formation could result in groupthink and obedience, where people do not question authority and follow orders blindly.
In conclusion, AI is full of potential. However, it plays a seductive role in transforming information to manipulate human behavior. It can support destructive cults, extremist political and religious views, and authoritarian regimes, resulting in long-lasting psychological damage to human civilization. Institutions and governments must be aware of these risks and work to prevent them by promoting transparency and accountability.

Author Contributions

All authors contributed equally to this work. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Damasio, A. Descartes’ Error: Emotion, Reason, and the Human Brain; Putnam: New York, NY, USA, 1994.
2. Kahneman, D. Thinking, Fast and Slow; Farrar, Straus and Giroux: New York, NY, USA, 2011.
3. Picard, R.W. Affective Computing; Technical Report 321; M.I.T. Media Laboratory Perceptual Computing Section: Cambridge, MA, USA, 1995.
4. Lanham, R.A. The Economics of Attention: Style and Substance in the Age of Information; The University of Chicago Press: Chicago, IL, USA, 2006.
5. Zuboff, S. Surveillance Capitalism and the Challenge of Collective Action. In New Labor Forum; SAGE Publications: Los Angeles, CA, USA, 2019; Volume 28.
6. Granovetter, M.S. The Strength of Weak Ties. Am. J. Sociol. 1973, 78, 1360–1380.
7. Turchin, P. Ages of Discord; Beresta Books LLC: Chaplin, CT, USA, 2016.
8. Marijuán, P.C.; Montero-Marin, J.; Navarro, J.; Garcia-Campayo, J.; del Moral, R. The “sociotype” construct: Gauging the structure and dynamics of human sociality. PLoS ONE 2017, 12, e0189568.
9. Ekman, P. Basic emotions. In Handbook of Cognition and Emotion; Dalgleish, T., Power, M., Eds.; John Wiley & Sons: Sussex, UK, 1999; pp. 45–60.
10. Trivers, R. Social Evolution; Benjamin-Cummings Publishing Co.: San Francisco, CA, USA, 1985.
11. Schein, E.H. Coercive Persuasion; W.W. Norton & Co.: New York, NY, USA, 1971; ISBN 978-0393006131.
12. Lifton, R.J. Thought Reform and the Psychology of Totalism; The University of North Carolina Press: Chapel Hill, NC, USA, 1989; ISBN 978-0807842539.
13. Lifton, R.J. Losing Reality: On Cults, Cultism, and the Mindset of Political and Religious Zealotry; The New Press: New York, NY, USA, 2019; ISBN 978-1620974995.
14. Taylor, K. Brainwashing: The Science of Thought Control; Oxford University Press: Oxford, UK, 2004; ISBN 978-0192804969.
15. Le Bon, G. The Crowd and The Psychology of Revolution; CreateSpace Independent Publishing Platform: Scotts Valley, CA, USA, 2015; ISBN 978-1512207477.
16. Desmet, M. The Psychology of Totalitarianism; Chelsea Green Publishing: Chelsea, VT, USA, 2022; ISBN 978-1645021728.