What the Web Has Wrought
Abstract
Richard Mason: New York City, Mr. Dundee. Home to seven million people.
Michael J. “Crocodile” Dundee: That’s incredible. Imagine seven million people all wanting to live together. Yeah, New York must be the friendliest place on earth.
1. Heading towards a Digital Dystopia
“I think people’s fear of bad things happening on the internet is becoming, justifiably, greater and greater,” Berners-Lee, the inventor of the web, told the Guardian. “If we leave the web as it is, there’s a very large number of things that will go wrong. We could end up with a digital dystopia if we don’t turn things around. It’s not that we need a 10-year plan for the web, we need to turn the web around now.”[2]
2. An Avalanche of the Banal, Insipid, or Pathological
While we have yet to determine what, exactly, is the online moral equivalent of shouting “Fire!” in a crowded theater, it seems clear that societies that are significant users of online resources need a way to cope with a wide range of harms that malevolent users might visit upon others. The need is transnational in scope, and users aren’t exempt from responsibility. Adopting safe networking practices (supported by cooperating online services companies) should be a high priority, and providing technical means to implement them should be the business of the computing and networking community. Finding mutually supportive legal agreements between nations to sanction harmful online behaviors will be a challenge worth exploring.[3]
… these media would exert an enormously beneficial influence on the shaping of American culture. Americans of every class, most particularly children, would, many for the first time, be exposed to the correctly spoken word, to great literature, great drama … The technological dream was more than realized.... But the cultural dream was cruelly mocked. [A] magnificent technology... [an] exquisitely refined combination of some of the human species’ highest intellectual achievements.... delivering an occasional gem buried in immense avalanches of everything that is most banal or insipid or pathological in our civilization. (emphasis added)[4]
3. Technology—Cause and Cure
- Google and Google Translate have been found to perpetuate racism, sexism, and other forms of discrimination.
  - General forms of machine learning quickly reproduce biases found in the general population at hand. See, for instance, the article from Vox.com, and the review of Noble’s book Algorithms of Oppression: How Search Engines Reinforce Racism: ‘These search algorithms aren’t merely selecting what information we’re exposed to; they’re cementing assumptions about what information is worth knowing in the first place. That might be the most insidious part of this.’
  - ‘But know: Machine learning has a dark side. “Many people think machines are not biased,” Princeton computer scientist Aylin Caliskan says. “But machines are trained on human data. And humans are biased.” Computers learn how to be racist, sexist, and prejudiced in a similar way that a child does, Caliskan explains: from their creators.’ [5,6]
- Microsoft’s Twitter chatbot, Tay, turned into the apotheosis of a racist troll within 24 hours of its launch, and was rapidly shut down.
  - ‘Last year, a Microsoft chatbot called Tay was given its own Twitter account and allowed to interact with the public. It turned into a racist, pro-Hitler troll with a penchant for bizarre conspiracy theories in just 24 hours. “[George W] Bush did 9/11 and Hitler would have done a better job than the monkey we have now,” it wrote. “Donald Trump is the only hope we’ve got.”’ [7]
- In a paper about the new study in the journal Science, the researchers wrote: ‘Our work has implications for AI and machine learning because of the concern that these technologies may perpetuate cultural stereotypes.’ [10]
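The bias measurements reported by Caliskan and colleagues rest on a simple idea: words are represented as vectors, and a word’s association with an attribute is its cosine similarity to that attribute’s vector. The sketch below illustrates the principle with tiny hand-made two-dimensional vectors; all the words and numbers here are invented for illustration (the actual study used large pre-trained embeddings derived from web text):

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Toy, hand-crafted "embeddings" -- purely illustrative, not real data.
vec = {
    "flower":     (0.90, 0.10),
    "insect":     (0.10, 0.90),
    "pleasant":   (0.95, 0.05),
    "unpleasant": (0.05, 0.95),
}

def association(word, attr_a, attr_b):
    # WEAT-style score: positive if `word` sits closer to attr_a than to attr_b.
    return cosine(vec[word], vec[attr_a]) - cosine(vec[word], vec[attr_b])

print(association("flower", "pleasant", "unpleasant"))  # positive
print(association("insect", "pleasant", "unpleasant"))  # negative
```

The point of the paper is that when the vectors are learned from human-generated text rather than hand-crafted, the same arithmetic surfaces the culture’s stereotypes: the geometry merely records whatever associations the training data contained.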
Machine learning algorithms in recommender systems are typically classified into two categories—content based and collaborative filtering methods although modern recommenders combine both approaches. Content based methods are based on similarity of item attributes and collaborative methods calculate similarity from interactions. (emphasis added)[11]
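The distinction quoted above can be made concrete: both families reduce to a nearest-neighbour computation and differ only in which vectors are compared, item attributes or user interactions. A minimal sketch, in which the item names, attribute vectors, and interaction data are all invented for illustration:

```python
from math import sqrt

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Content-based: items described by attribute vectors (hypothetical genres).
item_attrs = {
    "A": [1, 0, 1],   # e.g. [news, sport, politics]
    "B": [1, 0, 1],
    "C": [0, 1, 0],
}

# Collaborative: items described by which users interacted with them.
interactions = {          # columns: users u1..u4; 1 = clicked/watched
    "A": [1, 1, 0, 0],
    "B": [1, 0, 0, 0],
    "C": [0, 0, 1, 1],
}

def most_similar(target, vectors):
    """Recommend the nearest neighbour of `target` under cosine similarity."""
    return max((i for i in vectors if i != target),
               key=lambda i: cosine(vectors[target], vectors[i]))

print(most_similar("A", item_attrs))     # content-based neighbour
print(most_similar("A", interactions))   # collaborative neighbour
```

A hybrid recommender, as the quotation notes, would combine the two similarity signals, e.g. by weighting and summing the scores before ranking.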
- ‘The Great Black Friday Swindle—95% of ‘bargains’ shown to be more expensive than at other times’, followed by an advertising feature for ‘AMAZING Black Friday deals’;
- A report on the attack in the UK on London Bridge in which two people died after being stabbed, accompanied by an advert for the film ‘Knives Out’ [12].
- To what extent should algorithms be designed to detect and marginalize or exclude specific types of link?
- What specific issues should be addressed in the design and development of algorithms?
- Should algorithms be designed to prohibit certain categories of data, messages, and web sites, or should they offer alternative opinions?
(1) … down-voting any search results that read like a “how-to manual” for queries relating to suicide until the National Suicide Prevention Lifeline came up as the top result. According to the contractor, Google soon after put out a message to the contracting firm that the Lifeline should be marked as the top result for all searches relating to suicide so that the company algorithms would adjust to consider it the top result.

(2) … employees made a conscious choice for how to handle anti-vax messaging: One of the first hot-button issues surfaced in 2015, according to people familiar with the matter, when some employees complained that a search for “how do vaccines cause autism” delivered misinformation through sites that oppose vaccinations. At least one employee defended the result, writing that Google should “let the algorithms decide” what shows up, according to one person familiar with the matter. Instead, the people said, Google made a change so that the first result is a site called howdovaccinescauseautism.com—which states on its home page in large black letters, “They f—ing don’t.” (The phrase has become a meme within Google.)
4. Moving beyond Algorithms and Malgorithms
A prominent theme in theories claiming YouTube is a radicalizing agent is the recommendation engine (‘the algorithm’), coupled with the default option to ‘auto-play’ the top recommended video after the current one finishes playing.
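The claimed mechanism is easy to caricature: if each video’s top recommendation is marginally more ‘engaging’ than the last, then a default that always auto-plays the top recommendation walks a session, step by step, away from its starting point with no user choice involved. A deliberately simplistic sketch of that theory, with wholly invented video labels and recommendation lists:

```python
# Toy model of the 'radicalization by recommendation' theory.
# Every name below is hypothetical; real recommenders are vastly more complex.
recommendations = {
    "news_clip":      ["commentary", "news_clip2"],
    "commentary":     ["strong_opinion", "news_clip"],
    "strong_opinion": ["extreme_take", "commentary"],
    "extreme_take":   ["extreme_take2", "strong_opinion"],
    "extreme_take2":  ["extreme_take"],
    "news_clip2":     ["news_clip"],
}

def autoplay_session(start, steps):
    """Follow the top recommendation for `steps` videos, as auto-play would."""
    path = [start]
    for _ in range(steps):
        nxt = recommendations.get(path[-1])
        if not nxt:
            break
        path.append(nxt[0])   # default auto-play: top-ranked item, no user choice
    return path

print(autoplay_session("news_clip", 4))
```

Whether real viewing behaviour actually resembles this greedy walk is precisely what the critics discussed below dispute: actual sessions involve searches, subscriptions, and off-platform links, not a passive chain of top recommendations.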
… a genre of folklore comprising stories circulated as true, especially as having happened to a friend or family member, often with horrifying or humorous elements. These legends can be entertainment, but often concern mysterious peril or troubling events, such as disappearances and strange objects. They may also be moralistic confirmation of prejudices or ways to make sense of societal anxieties.
… men driving white vans are kidnapping women all across the United States for sex trafficking and to sell their body parts. While there is no evidence to suggest this is happening, much less on a national, coordinated scale, a series of viral Facebook posts created a domino effect that led to the mayor of a major American city issuing a warning based on the unsubstantiated claims.[23]
5. Metaphors of Communication
… is incomplete, and potentially misleading. And we think that it has rapidly gained a place in the center of the study of media and politics on YouTube because it implies an obvious policy solution—one which is flattering to the journalists and academics studying the phenomenon. If only Google (which owns YouTube) would accept lower profits by changing the algorithm governing the recommendation engine, the alternative media would diminish in power and we would regain our place as the gatekeepers of knowledge. This is wishful thinking that undersells the importance of YouTube politics as a whole.
6. Technology is Society
Of course technology does not determine society. Nor does society script the course of technological change, since many factors, including individual intuitiveness and entrepreneurialism, intervene in the process of scientific discovery, technological innovation, and social applications, so that the final outcome depends on a complex pattern of interaction. Indeed the dilemma of technological determinism is probably a false problem, since technology is society, and society cannot be understood or represented without its technological tools. (my emphasis)[29]
Deterministic 1—TV was invented as a result of scientific and technical research. Its power as a medium of social communications was then so great that it altered many of our institutions and forms of social relationships.
Deterministic 2—TV was invented as a result of scientific and technical research, and developed as a medium of entertainment and news. It then had unforeseen consequences, not only on the other entertainment and news media... but on some of the central processes of family, cultural and social life.
Symptomatic 1—TV, discovered as a possibility by scientific and technical research, was selected for investment and promotion as a new and profitable phase of a domestic consumer economy.
Symptomatic 2—TV became available as a result of scientific and technical research, and its character and uses exploited and emphasised elements of a passivity, a cultural and psychological inadequacy, which had always been latent in people.[30]
7. Achieving Technological Equilibrium
1. Initial Optimism: The innovation is welcomed and recognized as bringing significant benefits.
2. Anxiety and Doubt; Disavowals and Repudiations: Various aspects of the innovation begin to foster unease, and warnings result from its use/misuse. This leads to discussions regarding the balance between the ‘goods’ and the ‘ills’ of the innovation: do the benefits outweigh the harms, or vice versa? Those in favour of the innovation, particularly if they are benefitting politically and/or commercially, will respond with protestations, including disavowals and repudiations of harm. Others will highlight the harms or noxiants and downplay the benefits, particularly if they see themselves and their interests as being impaired or diminished in the wake of the innovation.
3. Precarious Equilibrium: Eventually, with some level of acceptance regarding possible or actual harm, efforts will be made to encourage and promote self-regulation, often with an explicit or implicit threat of official and formal regulation. There may be several successive periods with differing levels of equilibrium.
4. Formal regulation and control: Failure of self-regulation may result in calls for more effective regulation, including formal legislation. The prospect of such constraints will elicit counter-efforts from opposing interests. Eventually, some forms of legislation may be enacted, although this may not necessarily be matched by actual enforcement.
5. Further development: Like sharks, technologies must keep moving forward or they die. (This is only true for some types of shark. In Annie Hall, Woody Allen’s character says to Annie, ‘you know relationships are like sharks, they have to keep moving forward or they die’.) Assuming a technology does not become obsolete, there will be forms of iteration around some or all of the preceding stages, potentially resulting in new forms of contestation, equilibrium, and regulation/legislation. ‘Mature’ technologies exemplify this, particularly if they are long-lived, and in some cases may evolve in ways completely at odds with their initial appearance and objectives.
1. Printing was welcomed since it meant that books would be far more widely available.
2. People began to express their unease that wider access to printed texts would undermine the authority of the church in the interpretation of scripture and other matters. By the 16th and early 17th centuries, the wide availability of printing presses resulted in a massive proliferation of printed matter and the development of mass communication—hence the concept of ‘the press’ as ‘the fourth estate’, in addition to ‘clergy’, ‘nobility’, and ‘commoners’. Was the availability of books and newspapers a benefit to society or a threat? Would this result in greater demands for increased levels of literacy and education, leading to challenges to traditional authority? Or would there be a net benefit to all from a more educated and informed populace?
3. There was intense debate and dispute between contending interests, continuing through the 15th and 16th centuries. The technology itself developed in the ensuing centuries, and printed matter became far more widely and readily available to an increasingly literate population.
4. Efforts were made by governments in the 17th and 18th centuries to control the press and to censor what could be published. Various forms of taxation were introduced. In the UK, the first newspaper taxes were introduced in the 18th century, specifically to rein in the free press. Other governments followed suit, and in the 19th century the UK introduced the Stamp Tax, which raised the price of any newspaper to well above what any worker could afford.
5. The battle between advocates of press freedom and freedom of speech and those seeking some form of control or censorship continues unabated. In some jurisdictions, such as the USA, freedom of speech is enshrined as a fundamental principle. In others, laws of defamation and libel curtail these freedoms. The advent of the internet>web has heralded a new stage given the new opportunities available for ‘publishing’—e.g., blogs, social networks, and the like.
1. Initial Optimism: The internet was developed in the 1960s, but the World Wide Web only appeared in 1989, and the first optimistic phase dates from this time.
2. Anxiety and Doubt; Disavowals and Repudiations: Initial appreciation of the web was quickly followed by recognition of some of its drawbacks. Articles began to appear, both in print and online, listing the dangers of using the internet>web. (The two terms were often used synonymously or interchangeably.) A Google search using ‘dangers of the web before:1995’ listed articles focusing on pornography, fraud in e-commerce, cyber-stalking, and the dangers for children in chatrooms and MUDs. A search using ‘before:2000’ produced items relating to cyber-bullying, child pornography, and online recruitment to cults. By 2008, Nicholas Carr was arguing that ‘Google is making us stupid’ in a widely circulated article in The Atlantic, and soon after, Susan Greenfield, a professor of neurology in the UK, was claiming that social websites were harming children’s brains. Her claims have been heavily criticized by many of her neurological peers, and the term ‘eminence-based research’ has been applied to her writings on this topic.
3. Precarious Equilibrium: The Contract is firmly based in stage 3, hence the support for this strategy of self-regulation forthcoming from Google and Facebook, amongst many others. In this instance, however, the self-regulation called for in The Contract applies across the board, including commercial interests, governments, and individual users. For many critics, it is unlikely to prove any more effective than earlier internet>web strategies, all of which have proven to be a case of whack-a-mole, a term that according to The Online Slang Dictionary refers to ‘the practice of repeatedly getting rid of something, only to have more of that thing appear. For example, deleting spammers’ e-mail accounts, closing pop-up windows in a web browser, etc.’ (http://onlineslangdictionary.com/meaning-definition-of/whack-a-mole)
4. Formal regulation and control: This is yet to develop, despite increasing calls for precisely such actions. In some countries and regions, such as China and Russia, levels of control and constraint are already widespread, albeit emanating from different motivations (see, for instance, https://borgenproject.org/internet-censorship-in-russia-and-china/).
5. Further development: The appearance of the smartphone in 2007 seems to represent a plateau of sorts in the rapid development of internet>web technologies, although there have been significant advances in areas such as financial technology (fintech), AI/robotics, and big data. Offering predictions regarding technology, however, is even more hazardous than other forms of forecasting. (Examples of poor predictions relating to technology abound—see, for instance, http://www.rinkworks.com/said/predictions.shtml.) As will be argued in the later sections, strategies such as Berners-Lee’s Contract are unlikely to prove effective, leading to increased calls for formal regulation. This may take the form of an international effort, or other countries may try to follow the example of ‘The Great Firewall of China’, although this may not be an easy strategy to duplicate elsewhere [33]. Some internet>web technologies will evolve, outpacing or out-manoeuvring such efforts. Eventually, there may be periods of equilibrium between the contending forces, although we must also be aware of the increasing likelihood that the effects of climate change may result in a global deterioration and degradation of all existing technologies.
8. The Ambivalence of Technology and the Paradox of Social Contracts
A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts... A graphic representation of data abstracted from banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding...[38]
- a network of interrelations;
- a preference for constituent parts of the system to combine with each other, rather than with non-system components;
- parts of the system modify each other’s behaviour, so encouraging innovation rather than repetition;
- the system as a totality can enter into relationships with other systems;
- existence of feedback structures.
Everyone has a role to play in safeguarding the future of the Web. The Contract for the Web was created by representatives from over 80 organizations, representing governments, companies and civil society, and sets out commitments to guide digital policy agendas. To achieve the Contract’s goals, governments, companies, civil society and individuals must commit to sustained policy development, advocacy, and implementation of the Contract text. (emphasis added)[2]
The basic idea seems simple: in some way, the agreement of all individuals subject to collectively enforced social arrangements shows that those arrangements have some normative property (they are legitimate, just, obligating, etc.). Even this basic idea, though, is anything but simple, and even this abstract rendering is objectionable in many ways. (emphasis added)
… that persons are primarily self-interested, and that a rational assessment of the best strategy for attaining the maximization of their self-interest will lead them to act morally (where the moral norms are determined by the maximization of joint interest) and to consent to governmental authority. Contractarianism argues that we each are motivated to accept morality “first because we are vulnerable to the depredations of others, and second because we can all benefit from cooperation with others”
The original position is a central feature of John Rawls’s social contract account of justice, “justice as fairness,” set forth in A Theory of Justice. The original position is designed to be a fair and impartial point of view that is to be adopted in our reasoning about fundamental principles of justice. In taking up this point of view, we are to imagine ourselves in the position of free and equal persons who jointly agree upon and commit themselves to principles of social and political justice. The main distinguishing feature of the original position is “the veil of ignorance”: to insure impartiality of judgment, the parties are deprived of all knowledge of their personal characteristics and social and historical circumstances. They do know of certain fundamental interests they all have, plus general facts about psychology, economics, biology, and other social and natural sciences.
9. In Conclusions … for Now …
A consensus statement adopted in 2005 at the World Summit on the Information Society, which set the ground for the creation of the Internet Governance Forum (IGF). While currently primarily a discussion space, there are efforts towards formalizing the outputs from the activities that take place there. The IGF, and other IGF like efforts, should be looked at with consideration of the recommendations made in The Age of Digital Interdependence report from the UN High-Level Panel on Digital Cooperation for furthering and improving the effectiveness of the current models of Internet governance and regulatory function. (p.19, emphasis added)[2]
Earlier this month, management of the .org top-level domain underwent a radical shift: first, ICANN dropped price-caps on .org domains, and then the Internet Society (ISOC) flogged the registry off to Ethos Capital, a private equity fund, and a consortium of three families of Republican billionaires: the Perots, the Romneys, and the Johnsons.
This doesn’t just mean that nonprofits—for whom the .org top-level domain was created—will pay higher prices to maintain their domains, and it doesn’t just mean that private equity funds—rather than a transparent, nonprofit NGO—will be able to censor what gets posted to .org domains, by kicking out any domain that it doesn’t like (remember when everyone was cheering because Nazi websites were being stripped of their domain names by registrars? This cuts both ways: if registrars have the power and duty to respond to speech they object to by taking away organizations’ domains, then that duty and power also applies to billionaires and private equity-appointed administrators). (This echoes the point made earlier with regards to the way in which some Google employees have re-directed anti-vax enquiries.)
The proposed acquisition of Public Interest Registry (PIR) by Ethos Capital was announced on 13 November 2019 by the parties and the Internet Society (ISOC). This announcement has raised many questions. In light of this, we want to be transparent about where we are in the process.
On 14 November 2019, PIR formally notified ICANN of the proposed transaction. Under the .ORG Registry Agreement, PIR must obtain ICANN’s prior approval before any transaction that would result in a change of control of the registry operator. Typically, similar requests to ICANN are confidential; we asked PIR for permission to publish the notification and they declined our request.
According to the .ORG Registry Agreement and our processes for reviewing such requests, ICANN has 30 days to request additional information about the proposed transaction including information about the party acquiring control, its ultimate parent entity, and whether they meet the ICANN-adopted registry operator criteria (as well as financial resources, and operational and technical capabilities).[45]
[W]hen ISOC (The Internet Society) originally proposed transferring management of .ORG to PIR in 2002, ISOC’s then President and CEO Lynn St. Amour promised that .ORG would continue to be driven by the NGO community—in her words, PIR would “draw upon the resources of ISOC’s extended global network to drive policy and management.” As long-time members of that global network, we insist that you keep that promise.
Today, the ICANN Board made the decision to reject the proposed change of control and entity conversion request that Public Interest Registry (PIR) submitted to ICANN.
After completing extensive due diligence, the ICANN Board finds that withholding consent of the transfer of PIR from the Internet Society (ISOC) to Ethos Capital is reasonable, and the right thing to do.[46]
… defined as: “a new economic order that claims human experience as the raw material for hidden commercial practices of extraction, prediction and sales”. Having originated at Google, it was then conveyed to Facebook in 2008 when a senior Google executive, Sheryl Sandberg, joined the social media giant. So Sandberg became, as Zuboff puts it, the “Typhoid Mary” who helped disseminate surveillance capitalism.
Author, activist and journalist Naomi Klein says the coronavirus crisis, like earlier ones, could be a catalyst to shower aid on the wealthiest interests in society, including those most responsible for our current vulnerabilities, while offering next to nothing to most workers and small businesses.[56]
Poor countries have advice to offer.
Contact tracing is used all over the world, including in the U.S. The idea is to track down anyone in recent contact with a newly diagnosed patient, then monitor the health of these contacts. In the developing world, it’s been a valuable tool in fighting infectious diseases like Ebola and tuberculosis. Public health workers there have lots of experience. ...
Partners in Health, which is known for its work in Haiti, Rwanda and Peru, is helping to set up a coronavirus contact tracing program in Massachusetts.
‘Whoever says that the Internet>Web needs to be nice and be run according to Twitter or Facebook’s policies on speech? Other than for laudable moral reasons, why is this a good idea? What gives the person a right to make this decision? I don’t pay anything to Facebook, YouTube, Twitter, etc.; what gives me the right to expect freedom of speech or respect of different cultures? The Twitter/FB as representing a “public square” argument aside, the right to post on social media isn’t a naturally-born one.’ Furthermore, the reviewer added: ‘Why is it so bad that algorithms reflect societal biases? The author makes a great case for why it should be expected, but what does this say about society? Is most of the interconnected world just plain mean and worse?’
“People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices.”[58]
Constitutional and cultural differences mean that the private sector, rather than the federal and state governments, currently takes the lead in these practices… But the trend toward greater surveillance and speech control here, and toward the growing involvement of government, is undeniable and likely inexorable.
Funding
Acknowledgments
Conflicts of Interest
References
- TBL Interview with Tim Berners-Lee. Available online: https://www.theguardian.com/technology/2019/nov/24/tim-berners-lee-unveils-global-plan-to-save-the-internet (accessed on 24 November 2019).
- The Contract for the Web. Available online: https://contractfortheweb.org/ (accessed on 10 May 2020).
- Cerf, V.G. What Hath We Wrought? IEEE Internet Computing, Backspace. 2017. Available online: https://www.academia.edu/33412643/What_Hath_We_Wrought (accessed on 10 May 2020).
- Weizenbaum, J. Once More, the Computer Revolution. In The Information Technology Revolution; Forester, T., Ed.; MIT Press: Cambridge, MA, USA, 1980; pp. 550–570. [Google Scholar]
- Vox.com. 2017. Available online: https://www.vox.com/science-and-health/2017/4/17/15322378/how-artificial-intelligence-learns-how-to-be-racist (accessed on 10 May 2020).
- Vox.com. 2018. Available online: https://www.vox.com/2018/4/3/17168256/google-racism-algorithms-technology (accessed on 10 May 2020).
- Griffin, A. Tay Tweets: Microsoft Shuts down AI Chatbot Turned into a pro-Hitler Racist Troll in Just 24 Hours. Available online: https://www.independent.co.uk/life-style/gadgets-and-tech/news/tay-tweets-microsoft-ai-chatbot-posts-racist-messages-about-loving-hitler-and-hating-jews-a6949926.html (accessed on 24 March 2016).
- Angwin, J.; Larson, J.; Mattu, S.; Kirchner, L. Machine Bias: There’s Software Used across the Country to Predict Future Criminals. And It’s Biased against Blacks. Available online: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing (accessed on 23 May 2016).
- Johnston, I. AI Robots Learning Racism, Sexism and Other Prejudices from Humans Study. Available online: https://www.independent.co.uk/life-style/gadgets-and-tech/news/ai-robots-artificial-intelligence-racism-sexism-prejudice-bias-language-learn-from-humans-a7683161.html (accessed on 13 April 2017).
- Caliskan, A.; Bryson, J.; Narayanan, A. Semantics derived automatically from language corpora contain human-like biases. Science 2017, 356, 183–186. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Kordik, P. Machine Learning for Recommender Systems—Part 1 (Algorithms, Evaluation and Cold Start). 2018. Available online: https://medium.com/recombee-blog/machine-learning-for-recommender-systems-part-1-algorithms-evaluation-and-cold-start-6f696683d0ed (accessed on 10 May 2020).
- Private Eye print edition #1511, 13–20 December 2019. Available online: https://www.private-eye.co.uk/covers/cover-1511 (accessed on 10 May 2020).
- Bisgaard Munk, T. 100,000 False Positives for Every Real Terrorist: Why Anti-terror Algorithms Don’t Work. First Monday, Volume 22, Number 9, 4 September 2017. Available online: https://firstmonday.org/ojs/index.php/fm/article/view/7126/6522 (accessed on 10 May 2020).
- Helles, R.; Bruhn Jensen, K. Introduction to the Special Issue ‘Making Data—Big Data and Beyond’. First Monday, Volume 18, Number 10, 7 October 2013. Available online: https://firstmonday.org/ojs/index.php/fm/article/view/4860/3748 (accessed on 10 May 2020).
- Lynch, C. Stewardship in the “Age of Algorithms”. First Monday, Volume 22, Number 12, 4 December 2017. Available online: https://firstmonday.org/ojs/index.php/fm/article/view/8097/6583 (accessed on 10 May 2020).
- Tufekci, Z. Engineering the Public: Big Data, Surveillance and Computational Politics. Available online: https://firstmonday.org/ojs/index.php/fm/article/view/4901/4097 (accessed on 7 July 2014).
- Criado Perez, C. Invisible Women: Data Bias in a World Designed for Men; Chatto & Windus: London, UK, 2019. [Google Scholar]
- Cox, K. Google Search Results Have More Human Help than You Think, Report Finds. Available online: https://arstechnica.com/tech-policy/2019/11/google-search-results-have-more-human-help-than-you-think-report-finds/ (accessed on 15 November 2019).
- Munger, K.; Phillips, J. A Supply and Demand Framework for YouTube Politics. Pew Report. 2019. Available online: https://osf.io/73jys/ (accessed on 10 May 2020).
- Lewis, R. Alternative Influence: Broadcasting the Reactionary Right on YouTube; Data & Society: New York, NY, USA, 2018. [Google Scholar]
- Krasodomski-Jones, A.; Smith, J.; Jones, E.; Judson, E.; Miller, C. Warring Songs: Information Operations in the Digital Age. 2019. Available online: https://demos.co.uk/project/warring-songs-information-operations-in-the-digital-age/ (accessed on 10 May 2020).
- Guy, J. Fake News Sparks Anti-Roma Violence in France. Available online: https://edition.cnn.com/2019/03/27/europe/paris-fake-kidnapping-scli-intl/index.html (accessed on 27 March 2019).
- O’Sullivan, D. A Facebook Rumor about White Vans is Spreading Fear across America. Available online: https://edition.cnn.com/2019/12/04/tech/facebook-white-vans/index.html (accessed on 5 December 2019).
- Stewart, J. Man in White Van an Urban Myth; ABC News: New York, NY, USA, 2009. [Google Scholar]
- Williams, M. First Aid for Zombie Bites. CPRCertified. 2014. Available online: https://www.cprcertified.com/blog/first-aid-for-zombie-bites (accessed on 10 May 2020).
- Ribeiro, M.H.; Ottoni, R.; West, R.; Almeida, V.; Meira, W. Auditing Radicalization Pathways on YouTube. arXiv preprint 2019, arXiv:1908.08313. [Google Scholar]
- Martineau, P. Maybe It’s Not YouTube’s Algorithm That Radicalizes People. Available online: https://www.wired.com/story/not-youtubes-algorithm-radicalizes-people/ (accessed on 23 October 2019).
- The Social Shaping of Technology, 2nd ed.; MacKenzie, D.; Wajcman, J. (Eds.) Open University Press: Buckingham, UK, 1998. [Google Scholar]
- Castells, M. The Rise of the Network Society; Wiley-Blackwell: Hoboken, NJ, USA, 1996; Volume 1. [Google Scholar]
- Williams, R. Television: Technology and Cultural Form; Fontana: London, UK, 1974. [Google Scholar]
- Winston, B. Media, Technology and Society: A History—From the Printing Press to the Superhighway; Routledge: London, UK, 1998. [Google Scholar]
- Bryant, A. Thinking Informatically; Mellen: Concord, NH, USA, 2006. [Google Scholar]
- Economy, E. The Great Firewall of China: Xi Jinping’s Internet Shutdown. Available online: https://www.theguardian.com/news/2018/jun/29/the-great-firewall-of-china-xi-jinpings-internet-shutdown (accessed on 29 June 2018).
- Vaidhyanathan, S. Facebook and the Folly of Self-Regulation. WIRED. 2020. Available online: https://www.wired.com/story/facebook-and-the-folly-of-self-regulation/ (accessed on 10 May 2020).
- Miller, W.M. A Canticle for Leibowitz; Lippincott: Philadelphia, PA, USA, 1959. [Google Scholar]
- Gibson, W. The Peripheral; Berkley: New York, NY, USA, 2014. [Google Scholar]
- Doctorow, C. William Gibson Interviewed: Archangel, the Jackpot, and the Instantly Commodifiable Dreamtime of Industrial Societies. Available online: https://boingboing.net/2017/09/22/the-jackpot.html (accessed on 22 September 2017).
- Gibson, W. Neuromancer; ACE: New York, NY, USA, 1984. [Google Scholar]
- Ellul, J. The Technological System; Continuum: London, UK, 1980. [Google Scholar]
- Mowshowitz, A. On Approaches to the Study of Social Issues in Computing. Commun. ACM 1981, 24, 146–155. [Google Scholar] [CrossRef]
- Hoffman, J.; Hoffman, M. What Is Greenwashing? Scientific American. 2009. Available online: https://www.scientificamerican.com/article/greenwashing-green-energy-hoffman/ (accessed on 10 May 2020).
- Rawls, J. A Theory of Justice; Harvard University Press: Cambridge, MA, USA, 1971. [Google Scholar]
- Hobbes, T. Leviathan. 1651. Available online: https://gutenberg.org/files/3207/3207-h/3207-h.htm (accessed on 10 May 2020).
- Doctorow, C. Civil Society Groups Protest the Sale of ORG to a Private Equity Fund and a Collection of Republican Billionaires. Available online: https://boingboing.net/2019/11/22/save-dot-org.html (accessed on 10 May 2020).
- Marby, G.; Botterman, M. .ORG Update. Available online: https://www.icann.org/news/blog/org-update (accessed on 9 December 2019).
- Botterman, M. ICANN Board Withholds Consent for a Change of Control of the Public Interest Registry (PIR). Available online: https://www.icann.org/news/blog/icann-board-withholds-consent-for-a-change-of-control-of-the-public-interest-registry-pir (accessed on 30 April 2020).
- ACLU. What Is Net Neutrality? December 2017. Available online: https://www.aclu.org/issues/free-speech/internet-speech/what-net-neutrality (accessed on 10 May 2020).
- Finley, K. The WIRED Guide to Net Neutrality. Available online: https://www.wired.com/story/guide-net-neutrality/ (accessed on 6 May 2020).
- Pensworth, L. Net Neutrality: History, Present Impacts, and Future. Available online: https://www.dailywireless.org/internet/net-neutrality-history-present-impacts-and-future/ (accessed on 7 March 2020).
- Turner, J. Five Eyes Governments Want to Force Tech Companies into De-Encryption. Available online: https://tech.co/news/five-eyes-government-tech-data-encryption-2018-09 (accessed on 3 September 2018).
- Peterson, M. The U.S. Government Wants Access to Your Data, Threatens Apple to Get It. Available online: https://www.idropnews.com/news/the-u-s-government-wants-access-to-your-data-threatens-apple-to-get-it/124957/ (accessed on 11 December 2019).
- Naughton, J. Slouching towards Dystopia: The Rise of Surveillance Capitalism and the Death of Privacy. Available online: https://www.newstatesman.com/2020/02/slouching-towards-dystopia-rise-surveillance-capitalism-and-death-privacy (accessed on 26 February 2020).
- Zuboff, S. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power; Public Affairs: New York, NY, USA, 2019. [Google Scholar]
- Naughton, J. When Covid-19 Has Done with Us, What Will be the New Normal? Available online: https://www.theguardian.com/commentisfree/2020/apr/18/when-covid-19-has-done-with-us-what-will-be-the-new-normal (accessed on 18 April 2020).
- Klein, N. The Shock Doctrine: The Rise of Disaster Capitalism; Metropolitan Books: New York, NY, USA, 2007. [Google Scholar]
- Klein, N. Coronavirus Capitalism: Naomi Klein’s Case for Transformative Change amid Coronavirus Pandemic. Available online: https://www.democracynow.org/2020/3/19/naomi_klein_coronavirus_capitalism (accessed on 19 March 2020).
- Beaubien, J. How Do You Do Contact Tracing? Poor Countries Have Plenty of Advice. Available online: https://www.npr.org/sections/goatsandsoda/2020/04/22/840232210/how-do-you-do-contract-tracing-poor-countries-have-plenty-of-advice?utm_source=twitter.com&utm_medium=social&utm_term=nprnews&utm_campaign=npr&t=1588414769812&t=1589355578586 (accessed on 22 April 2020).
- Smith, A. An Inquiry into the Nature and Causes of the Wealth of Nations. 1776. Available online: https://en.wikisource.org/wiki/The_Wealth_of_Nations (accessed on 10 May 2020).
- Hoffman, C. How the Great Firewall of China Works to Censor China’s Internet. Available online: https://www.howtogeek.com/162092/htg-explains-how-the-great-firewall-of-china-works/ (accessed on 10 September 2017).
- Taibbi, M. The Inevitable Coronavirus Censorship Crisis is Here. Available online: https://taibbi.substack.com/p/temporary-coronavirus-censorship (accessed on 30 April 2020).
© 2020 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Bryant, A. What the Web Has Wrought. Informatics 2020, 7, 15. https://doi.org/10.3390/informatics7020015