Special Issue "Trust and Privacy in Our Networked World"

A special issue of Information (ISSN 2078-2489).

Deadline for manuscript submissions: closed (30 November 2010)

Special Issue Editors

Guest Editor
Dr. Dieter M. Arnold

Albis Technologies Ltd., Albisriederstrasse 199, CH-8047 Zürich, Switzerland
Interests: information theory; communication theory and technology; information security
Guest Editor
Prof. Dr. Herman Tavani

Philosophy Department, Rivier University, 420 South Main Street, Nashua, NH 03060, USA
Interests: information and computer ethics; machine ethics; privacy; ethical aspects of bioinformatics; computational genomics; emerging technologies

Special Issue Information

Dear Colleagues,

Smart networks are becoming part of our daily lives. Social networks and novel infrastructure networks are all around us, promising exciting new opportunities for communication, collaboration, and doing business. These new ventures come at a high price, though: our privacy.

Online privacy is one of the major concerns in today's society. Far too often we trade privacy for trust in order to reap the promised potential of such smart networks. Since the basis of rights has shifted from the person to social circumstances, it is increasingly difficult to protect the privacy of individuals sharing a public space or performing public activities.

This issue collects tutorials and original contributions in the area of trust and privacy in networks. Contributions are welcome that offer theoretical work on trust and privacy in networks as well as practical examples from our daily lives.

Dr. Dieter M. Arnold
Guest Editor

Keywords

  • metrics for and fundamental limits of entropy in networks
  • network topologies of social networks
  • distributed trust models and security protocols
  • practical examples of security in infrastructure networks such as eGovernment, eHealth, and SmartGrid

Published Papers (7 papers)

Displaying articles 1-7

Editorial


Open Access Editorial: Trust and Privacy in Our Networked World
Information 2011, 2(4), 621-623; doi:10.3390/info2040621
Received: 8 October 2011 / Accepted: 10 October 2011 / Published: 11 October 2011
Cited by 1
Abstract
Remarking on the relationship between the concepts of trust and privacy, Charles Fried (1990, p. 56) [1] writes: "Trust is the attitude of expectation that another will behave according to the constraints of morality… There can be no trust where there is no possibility of error. More specifically, man cannot know that he is trusted unless he has the right to act without constant surveillance so that he knows he can betray the trust. Privacy confers that essential right… Without privacy and the possibility of error which it protects that aspect of his humanity is denied to him." The important relationship between trust and privacy that Fried describes is often overlooked in the contemporary literature on privacy, as well as in recent publications that focus on trust and trust-related topics. The six essays included in this special issue of Information, however, give us some additional insights into certain conceptual and practical connections involving the notions of trust and privacy. In this respect, the contributing authors expand upon the insight in Fried's classic work on the interconnection between the two concepts.
(This article belongs to the Special Issue Trust and Privacy in Our Networked World)

Research


Open Access Article: The Online Construction of Personal Identity Through Trust and Privacy
Information 2011, 2(4), 594-620; doi:10.3390/info2040594
Received: 4 August 2011 / Revised: 6 September 2011 / Accepted: 26 September 2011 / Published: 11 October 2011
Cited by 2
Abstract
Constructing a personal identity is an activity much more complex than elaborating a series of online profiles, which are only digital hints of the Self. The construction of our personal identity is a context-mediated activity. Our hypothesis is that young people are enabled, as digital natives and social network users, to co-construct the "context of communication" in which their narrative identities will be interpreted and understood. In particular, the aim of this paper is to show that such "context of communication", which can be seen as the hermeneutical counterpart of the "networked publics" elaborated by Danah Boyd, emerges out of the tension between trust and privacy. In other terms, it is, on the one hand, the outcome of a web of trustful relations and, on the other, the framework in which the informational norms regulating teens' expectations of privacy protection are set and evaluated. However, these expectations can be frustrated, since the information produced in such contexts can be disembedded and re-contextualized across time. The general and widespread use of information technology is, in fact, challenging our traditional way of thinking about the world and our identities in terms of stable and durable structures; they are reconstituted, instead, into novel forms.
Open Access Article: Pervasive Computing, Privacy and Distribution of the Self
Information 2011, 2(2), 360-371; doi:10.3390/info2020360
Received: 5 April 2011 / Accepted: 20 May 2011 / Published: 27 May 2011
Cited by 1
Abstract
The emergence of what is commonly known as "ambient intelligence" or "ubiquitous computing" means that our conception of privacy and trust needs to be reconsidered. Many have voiced their concerns about the threat to privacy and the more prominent role of trust that have been brought about by emerging technologies. In this paper, I will present an investigation of what this means for the self and identity in our ambient intelligence environment. Since information about oneself can be actively distributed and processed, it is proposed that in a significant sense it is the self itself that is distributed throughout a pervasive or ubiquitous computing network when information pertaining to the self of the individual travels through the network. Hence privacy protection needs to be extended to all types of information distributed. It is also recommended that appropriately strong legislation on privacy and data protection regarding this pervasive network is necessary, but at present not sufficient, to ensure public trust. What is needed is a campaign on public awareness and positive perception of the technology.
Open Access Article: Designing Data Protection Safeguards Ethically
Information 2011, 2(2), 247-265; doi:10.3390/info2020247
Received: 12 February 2011 / Revised: 2 March 2011 / Accepted: 14 March 2011 / Published: 29 March 2011
Cited by 5
Abstract
Since the mid-1990s, lawmakers and scholars have worked on the idea of embedding data protection safeguards in information and communication technology (ICT), with the aim of accessing and controlling personal data in compliance with current regulatory frameworks. This effort has been strengthened by the capacities of computers to draw upon the tools of artificial intelligence (AI) and operations research. However, work on AI and the law entails crucial ethical issues concerning both values and modalities of design. On one hand, design choices might result in conflicts of values and, vice versa, values may affect design features. On the other hand, the modalities of design cannot only limit the impact of harm-generating behavior but also prevent such behavior from occurring via self-enforcement technologies. In order to address some of the most relevant issues of data protection today, the paper suggests we adopt a stricter, yet more effective version of "privacy by design." The goal should be to reinforce people's pre-existing autonomy, rather than having to build it from scratch.
Open Access Article: Trust, Privacy, and Frame Problems in Social and Business E-Networks, Part 1
Information 2011, 2(1), 195-216; doi:10.3390/info2010195
Received: 24 December 2010 / Revised: 21 February 2011 / Accepted: 22 February 2011 / Published: 1 March 2011
Cited by 1
Abstract
Privacy issues in social and business e-networks are daunting in complexity—private information about oneself might be routed through countless artificial agents. For each such agent, in that context, two questions about trust are raised: Where an agent must access (or store) personal information, can one trust that artificial agent with that information and, where an agent does not need to either access or store personal information, can one trust that agent not to either access or store that information? It would be an infeasible task for any human being to explicitly determine, for each artificial agent, whether it can be trusted. That is, no human being has the computational resources to make such an explicit determination. There is a well-known class of problems in the artificial intelligence literature, known as frame problems, where explicit solutions to them are computationally infeasible. Human common sense reasoning solves frame problems, though the mechanisms employed are largely unknown. I will argue that the trust relation between two agents (human or artificial) functions, in some respects, as a frame problem solution. That is, a problem is solved without the need for a computationally infeasible explicit solution. This is an aspect of the trust relation that has remained unexplored in the literature. Moreover, there is a formal, iterative structure to agent-agent trust interactions that serves to establish the trust relation non-circularly, to reinforce it, and to "bootstrap" its strength.
Open Access Article: An Alternative View of Privacy on Facebook
Information 2011, 2(1), 140-165; doi:10.3390/info2010140
Received: 18 November 2010 / Revised: 20 January 2011 / Accepted: 7 February 2011 / Published: 9 February 2011
Cited by 4
Abstract
The predominant analysis of privacy on Facebook focuses on personal information revelation. This paper is critical of this kind of research and introduces an alternative analytical framework for studying privacy on Facebook, social networking sites and web 2.0. This framework connects the phenomenon of online privacy to the political economy of capitalism—a focus that has thus far been rather neglected in research literature about Internet and web 2.0 privacy. Liberal privacy philosophy tends to ignore the political economy of privacy in capitalism that can mask socio-economic inequality and protect capital and the rich from public accountability. Facebook is in this paper analyzed with the help of an approach in which privacy for dominant groups, in regard to the ability of keeping wealth and power secret from the public, is seen as problematic, whereas privacy at the bottom of the power pyramid for consumers and normal citizens is seen as a protection from dominant interests. Facebook's privacy concept is based on an understanding that stresses self-regulation and on an individualistic understanding of privacy. The theoretical analysis of the political economy of privacy on Facebook in this paper is based on the political theories of Karl Marx, Hannah Arendt and Jürgen Habermas. Based on the political economist Dallas Smythe's concept of audience commodification, the process of prosumer commodification on Facebook is analyzed. The political economy of privacy on Facebook is analyzed with the help of a theory of drives that is grounded in Herbert Marcuse's interpretation of Sigmund Freud, which allows us to analyze Facebook based on the concept of play labor (= the convergence of play and labor).
Open Access Article: Some Forms of Trust
Information 2011, 2(1), 1-16; doi:10.3390/info2010001
Received: 11 November 2010 / Revised: 25 November 2010 / Accepted: 14 December 2010 / Published: 10 January 2011
Cited by 5
Abstract
Three forms of trust are distinguished: topic-focused trust, general trust, and personal trust. Personal trust is argued to be the most fundamental form of trust, deeply connected with the construction of one's self. Information technology has posed new problems for us in assessing and developing appropriate forms of the trust that is central to our personhood.

Journal Contact

MDPI AG
Information Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
information@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18