Trust and Privacy in Our Networked World
Remarking on the relationship between the concepts of trust and privacy, Charles Fried [1] (p. 56) writes:
Trust is the attitude of expectation that another will behave according to the constraints of morality… There can be no trust where there is no possibility of error. More specifically, man cannot know that he is trusted unless he has the right to act without constant surveillance so that he knows he can betray the trust. Privacy confers that essential right… Without privacy and the possibility of error which it protects that aspect of his humanity is denied to him.
The important relationship between trust and privacy that Fried describes is often overlooked in the contemporary literature on privacy, as well as in recent publications that focus on trust and trust-related topics. The six essays included in this special issue of Information, however, give us some additional insights into certain conceptual and practical connections involving the notions of trust and privacy. In this respect, the contributing authors expand upon the insight in Fried's classic work on the interconnection between the two concepts.
The above-cited passage from Fried was written in the pre-Internet era, when the kinds of concerns that people typically had about privacy violations and breaches of trust had not yet been exacerbated by information technology in general, and the World Wide Web in particular. However, the essays selected for inclusion in this special issue of Information provide readers with an analysis of some relatively recent privacy- and trust-related issues, especially as they arise in our increasingly “networked world”.
In the first essay, “Some Forms of Trust”, Willem deVries (University of New Hampshire, USA) examines three different “forms”, as he calls them, that the concept of trust can take: topic-focused trust, general trust, and personal trust. Of these, deVries argues that personal trust is the most fundamental, pointing out that it is the one “most deeply connected with the construction of one's self”. He also argues that various forms of information technology (IT) now pose some serious challenges for us, both in “assessing and developing appropriate forms of the trust that is central to our personhood”. As deVries points out, however, his essay does not intend to address particular trust- and privacy-related issues affecting IT, but instead aims to “help sharpen our thinking” about the nature of these concepts and to show why issues involving trust and privacy that arise in the context of IT “should not be treated lightly”. In this sense, deVries' essay provides both a philosophical and conceptual overview of the concept of trust in general and a rationale for why the specific kinds of trust- and privacy-related issues examined in the five remaining essays in this special issue are indeed significant.
DeVries' essay is followed by two articles that examine issues affecting personal identity and the self in the context of trust- and privacy-related concerns involving IT in general, and computer networks in particular. Massimo Durante (University of Turin, Italy), in his essay on “Personal Identity through Trust and Privacy”, argues that there is a strong link between privacy and trust in their “competing attitudes to define the limits” of what he calls the “networked construction of personal identity”. Durante also argues that the two notions, trust and privacy, do not “oppose” one another, since both “cooperate to construct and define our personal identity”. He further argues that in order to better understand the “networked constructed” aspect of our personal identities, we first need to understand the “theoretical tension” that exists between privacy and trust.

Next, Soraj Hongladarom (Chulalongkorn University, Thailand), in an essay titled “Pervasive Computing, Privacy and the Distribution of the Self”, describes some connections involving privacy, trust, and the notion of self in the context of an emerging technology, i.e., ambient intelligence (sometimes also referred to as ubiquitous computing). Here, Hongladarom investigates some implications for our notions of self and personal identity in what he calls an “ambient intelligence environment”. He argues that “since information about oneself can be actively distributed”, one's self can be “distributed throughout a pervasive or ubiquitous computing network”. Because of this phenomenon, Hongladarom believes that privacy protection needs to be extended to include personal information that is distributed across ubiquitous networks.
Next, Christian Fuchs (Uppsala University, Sweden) examines some privacy-related issues in the context of social networking sites (SNSs) in his essay, “An Alternative View of Privacy on Facebook”. Fuchs rejects the kinds of conceptual frameworks that have been used thus far to analyze privacy issues on Facebook, as well as on other SNSs (and in what he describes as the “Web 2.0 environment” in general). He worries that conventional privacy frameworks, including a model he refers to as “liberal privacy philosophy”, often ignore the key privacy concerns at stake in SNSs. Fuchs also worries that Facebook's understanding of privacy is based on a model that, by default, stresses: (a) self-regulation, and (b) an “individualistic understanding of privacy”. He then argues that this model needs to be replaced with a framework that connects “the phenomenon of online privacy” with the “political economy of capitalism”, in order to grasp the full dimensions of the privacy issues affecting Facebook.
Fuchs' article is followed by Jeff Buechner's essay titled “Trust, Privacy, and Frame Problems in Social and Business E-Networks, Part 1”. Buechner (Rutgers University, USA) analyzes some trust- and privacy-related controversies that arise in the context of artificial agents (AAs). Specifically, he asks whether one can trust AAs with personal information in cases where an AA either: (i) “must access (or store) personal information”; or (ii) is not expected to access (or store) that information. Buechner asks how we could determine that an AA could be trusted in either case, since no humans possess the “computational resources” needed to make that determination. He also believes that this problem can be couched in terms of one aspect of the classic “frame problem” in the field of artificial intelligence (AI). Buechner argues that the “trust relation between two agents (human or artificial)” serves as a “solution” to one variation of the frame problem in AI. He concludes by noting that the frame problem, as it relates to the trust relationship between AAs, can be solved without the need for what Buechner calls a “computationally infeasible explicit solution”.
In the closing essay, “Designing Data Protection Safeguards Ethically”, Ugo Pagallo (University of Turin, Italy) examines some privacy- and design-related issues. Pagallo, like Buechner, draws upon some investigations involving AI research. However, Pagallo's essay focuses more specifically on connections between AI and the law, especially with respect to what he calls “values and modalities of design”. In order to resolve some of the current issues affecting privacy and data protection, Pagallo argues that we need to “adopt a stricter, more effective version” of the privacy-by-design model. He also believes that doing this will help to “reinforce people's pre-existing autonomy, rather than having to build it from scratch”.
The guest editors wish to express their gratitude to the authors who contributed their essays to this special issue. We believe that the readers of this issue of Information will find these essays stimulating, and especially informative, as we struggle to understand the plethora of trust- and privacy-related controversies that continue to emerge in our networked world.
Reference
- Fried, C. Privacy: A rational context. In Computers, Ethics, and Society; Ermann, M.D., Williams, M.B., Gutierrez, C., Eds.; Oxford University Press: New York, NY, USA, 1990. [Reprinted from Fried, C. An Anatomy of Values: Problems of Personal and Social Choice; Harvard University Press: Cambridge, MA, USA, 1970; Chapter IX.]