Article

Surveillance Capitalism in Mental Health: When Good Apps Go Rogue (and What Can Be Done about It)

School of Business, National College of Ireland, D01Y300 Dublin, Ireland
*
Author to whom correspondence should be addressed.
Soc. Sci. 2023, 12(12), 679; https://doi.org/10.3390/socsci12120679
Submission received: 28 September 2023 / Revised: 1 November 2023 / Accepted: 4 December 2023 / Published: 8 December 2023

Abstract

Research shows that a large proportion of the world’s population has experience with mental health difficulties, and reliable as well as scalable care is urgently needed. Digital mental health seems to be an obvious solution to provide the better delivery of care but also the delivery of better care. With an imagined future of real-time information sharing, improved diagnosis and monitoring of mental health conditions, and remote care, supported by advances in artificial intelligence, many tech companies have emerged over the last three decades to plug the treatment gap and provide services. The evidence base seems compelling: some online treatments can treat individuals quite successfully. However, the introduction, utilisation, and expansion of digital mental health technologies have not always focused solely on public health. Using a surveillance capitalism perspective, this paper approaches the democratisation–privatisation dichotomy in digital mental health with a critical lens. In particular, the paper details how (commercially valuable) mental health data are extracted, “shared”, and claimed as an asset by big tech companies. Reviewing the terms, conditions, and practices of ten popular mental health apps, the paper polemically argues that digital mental health cannot unlock real value for society—better treatment, good quality care, and efficient delivery—if power, politics, and profits remain in the hands of big tech companies. To conclude, the paper draws attention to contemporary discourses that seek to promote democracy and public value for digital mental health apps, technologies, and solutions.

1. Introduction

Hang on! Help is on the way…your app is downloading. There are apps for disorders such as depression, anxiety, gender dysphoria, and bipolar disorder but also for mental-health-related issues such as mood, sleep, mindfulness, concentration, and unhelpful thinking patterns, and these apps are very popular! The sheer volume of mental health apps (up to 20,000 on the market), their download numbers (in the millions), the valuations of these app companies (also in the millions), and the relative longevity of the apps speak to this point. But how do these apps really work, what value do they create, and, more importantly, for whom?
These issues are important: mental health is not just a defining frontier of modernity (Bemme and Kirmayer 2020) but also a profound public health concern (Jack 2020). Mental health is a “state of well-being in which an individual realizes his or her own abilities, can cope with the normal stresses of life, can work productively and is able to make a contribution to his or her community”, according to the World Health Organization (WHO 2022, n.p.). Public health is “the science of protecting and improving the health of people and their communities” (CDC 2023a, n.p.). While in-person treatments for mental disorders, e.g., anxiety, depression, bipolar disorder, post-traumatic stress disorders, schizophrenia, dissocial disorders, neurodevelopmental disorders, and eating disorders, exist, many people do not have access to effective care (CDC 2023a). Digital health appears to be an obvious solution here to provide treatment at scale and thus improve the health of populations (WHO 2023a). Digital health encompasses mobile health (e.g., apps), health information technology, wearable devices, eHealth, telehealth and telemedicine, and personalised medicine, and these leverage data, algorithms and artificial intelligence, computing platforms, connectivity, software, and sensors to work (FDA 2020). Topol (2012) argues that technology can create better healthcare in and through the “creative destruction of medicine”, and the promises of digital health are both significant and compelling—to accurately diagnose and treat disease, with better quality of care for the individual, operational efficiencies, scalable healthcare, and patient empowerment and patient-centredness, just to name a few (FDA 2022; Kraus et al. 2021). Yet, many modern public healthcare systems have struggled with the modernisation and digitalisation of health systems and infrastructures (CDC 2023b; EU Parliament 2023). In response, commercial digital health technology companies have sprung up over the past three decades to provide essential services and innovate how mental health is approached and delivered in practice (Geiger and Gross 2017; Pickersgill 2019). Contemporary digital mental health services include wellbeing and mental health apps, online platforms, internet-delivered Cognitive Behavioural Therapy (iCBT), and chatbots (Pickersgill 2019). Apps are of particular interest to us, as there are between 10,000 and 20,000 mental wellbeing apps available on the market now, and these have become particularly popular in the US and the EU (Wylie 2023). Examples include apps and platforms such as Calm, Happify, Headspace, Sanvello, BetterHelp, Talkspace, Circles Up, Moodfit, and Moodkit.
It is clear that technology companies have now become part and parcel of public health systems and infrastructures, yet critical commentators have long warned against putting public health matters into the hands of these for-profit companies. Many tech companies turn digital information and data resources into assets (Birch et al. 2021; Geiger and Gross 2021) and letting them extract value from public health in this way is leading to a “tragedy of digital commons in health” (Prainsack 2019; Sharon 2018). The tragedy of the commons is the loss of the common good as a result of the single-minded pursuit of the individual good (Hardin 1968). This is particularly a problem for mental health: mental health difficulties and conditions are highly prevalent around the world. In some European countries (EU) as well as the United States (US), over 40 percent of adults meet the criteria for a mental health condition (WHO 2022); many people are in desperate need of good-quality and accessible care. Promising technology solutions and services are available, but many of these companies have multi-sided platforms: their revenue model is built around the collection and sale of data. Data can be demographic and geographic information but also psychographic and behavioural insights. The latter two are important predictors of consumer behaviour (Gajanová et al. 2019; Samuel 2016), and our argument is that mental health apps have privileged access when it comes to a person’s (or patient’s) problems, personality, lifestyle, emotions, activities, interests, opinions, or attitudes—arguably more than other apps or platforms do. During the pandemic in particular, mental health apps connected with people when they were “at their most vulnerable” and made them “part of a hidden supply chain for the marketplace” (Cosgrove et al. 2020, p. 611). This value-creation-and-extraction process is not always readily clear or visible to the app’s users or indeed the wider public—hence this paper—yet platform business models like Google and Facebook have showcased just how profitable behavioural data really are. Up to 90 percent of Google’s revenue was made from selling personal data, and this amounted to USD 282.84 billion in 2022 alone (Investopedia 2023). Facebook made nearly USD 114 billion in 2022 from ads (Statista 2023).
This provocative paper speaks about the intricate relationship between democratisation and privatisation within the realm of digital mental health. In terms of organisation, we review how mental health has emerged as a commercial market before we discuss the promises and perils of mental health apps. We then expand on surveillance capitalism as the theoretical lens that has inspired this paper. Surveillance capitalism, according to Shoshana Zuboff (Zuboff 2019), is the widespread collection and commodification of personal data by corporations. Using five themes prevalent in the surveillance capitalism literature—the behavioural surplus, data accumulation and data sharing, institutionalised secrecy, breaking the social contract, and asserting rights—this paper abductively lays out how commercially valuable mental health data are accumulated, extracted, “shared”, and claimed as an asset by big tech companies. We use empirical illustrations from ten popular mental health apps to substantiate our points. Our main argument is the following: while digital mental health holds a lot of promise, the current surveillance capitalist practices of tech businesses fail to deliver public value. We also make the larger point in the paper that global public health issues cannot and should not be addressed in and through the services of private tech businesses. We conclude our paper by drawing together contemporary discourses that seek to push back against powerful and data-hungry tech business models.
But let us take a step back for a moment: how did we even get to a point where no data seem off-limits?

2. Theoretical Framework

2.1. Digital Mental Health as a “Market”

Technologies have been used for the management and treatment of mental health for over 30 years, including the electronic management of patient records, telepsychiatry, online therapy portals, internet Cognitive Behavioural Therapy (iCBT), mobile apps, and chatbots (Andrews et al. 2018; Kambeitz-Ilankovic et al. 2022; Pickersgill 2019). In the 1990s, digital mental health was based on biomedical virtues (Pickersgill 2019), and the imagined future was focused on improving public health: better connections, real-time information sharing, improved diagnosis and monitoring of chronic conditions, remote care, and disease prevention. These promises shaped the market (Araujo 2009) until around 2010, when technology development suddenly exploded, and an unprecedented number of highly competitive tech players entered the healthcare market to unlock this “emergent opportunity” (FDA 2022; Geiger and Gross 2017). Tech players received significant support in the United States and the EU at that time as states moved to adopt “market-based solutions” to deal with a looming crisis in public health (Powell and Arvanitis 2015; Zuboff 2019). However, faced with complex regulations (e.g., clinical trials and efficiency) and reluctant patients and healthcare providers, as well as sluggish payment systems (Geiger and Kjellberg 2021), many tech companies could not break into the public healthcare market. In response, they started to re-orientate their business models towards the direct-to-consumer market between 2011 and 2015 (Geiger and Gross 2017)—a market that needed no costly clinical trials or regulatory approvals: consumer apps and platforms are not medical devices after all. Even data protection legislation was light during that time, as landmark legislation like GDPR or the American Data Privacy and Protection Act did not come into effect until later. What is more, since the early and mid-2010s, consumer-patients have become empowered, ready, and willing to become enmeshed with these cool and seemingly helpful technologies (Geiger and Gross 2017; Powell and Arvanitis 2015). Even when data protection legislation eventually came into effect (e.g., GDPR in 2018), consumer apps found ways (and they still do!) to work within the confines of the laws to extract and assetise data—without providing much public value in return (Zuboff 2019). The recent pandemic has also acted as an accelerator here: when people became overwhelmed with the stresses and strains of the pandemic, and in-person mental health treatments became even less available, many resorted to downloading mental health apps, often with the approval and support of the leading medical journals and regulators (Cosgrove et al. 2020). As a result, downloads boomed in 2020 (Kirkpatrick 2022). GDPR legislation became truly tested (Christofidou et al. 2021), and when the EU decided to relax the GDPR rules to allow for contact tracing and technology-enabled care (EU Commission 2020), personal and medical data flowed at unprecedented levels.
Consumer-facing mental health apps are now quite abundant and widely used, mainly because they are accessible, easy to use, discreet, convenient, portable, and often cheap (Investopedia 2023; Wylie 2023). They are, commercially speaking at least, highly valuable, too. The digital mental health market was valued at USD 19.5 billion in 2022 and is projected to grow to USD 72.3 billion by 2032 (Market Research Future 2023). The mental health app market was worth USD 5.19 billion in 2022 and is projected to reach USD 26 billion by 2032 (Precedence Research 2023). Calm, one of the most popular mental health apps, generated revenue of USD 200 million in 2020 alone (Precedence Research 2023).
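To put these projections in perspective, a simple back-of-the-envelope calculation (assuming the 2022 baselines and 2032 projections cited above, over a ten-year horizon) shows the compound annual growth rates they imply:

```python
# Implied compound annual growth rate (CAGR) for the market projections cited above.
# Figures are taken from the text; the ten-year 2022-2032 horizon is an assumption.

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two values over a number of years."""
    return (end_value / start_value) ** (1 / years) - 1

digital_mental_health = cagr(19.5, 72.3, 10)  # USD billions, 2022 -> 2032
mental_health_apps = cagr(5.19, 26.0, 10)     # USD billions, 2022 -> 2032

print(f"Digital mental health market: ~{digital_mental_health:.1%} per year")  # ~14.0%
print(f"Mental health app market:     ~{mental_health_apps:.1%} per year")     # ~17.5%
```

Both projections imply annual growth of roughly 14–18 percent, underscoring just how commercially attractive this space has become.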
It is worth mentioning here that some tech companies did not follow the direct-to-consumer market pathway, however. Digital therapeutics (DTx) companies use evidence-based therapeutic interventions driven by software to prevent, manage, or treat a medical disorder or disease (European Data Protection Supervisor 2023). DTx interventions and tech solutions are classified as medical devices (Fürstenau et al. 2023). As these apps, platforms, and tech services deliver clinically evaluated software to patients, they are arguably a real alternative to in-person treatments (Digital Therapeutics Alliance 2023). DTx is often directed towards a specific medical diagnosis, condition, and purpose, such as anxiety, depression, substance use disorders, autism spectrum disorder, chronic pain, post-traumatic stress disorder (PTSD), stress and burnout, attention-deficit hyperactivity disorder (ADHD), or insomnia (Fürstenau et al. 2023). The DTx market was valued at USD 5.09 billion in 2022 (Vaidya 2023), and examples of DTx companies include Pear Therapeutics, Mindstrong, Silvercloud, and Akili. DTx companies make clear and compelling contributions to public health: patient education, precision medicine, better clinical and patient decision making, scale, quality of care, and the generation of new knowledge (Fürstenau et al. 2023; Rohaj and Bulaj 2023). Yet, many DTx companies are struggling to move from bench to bedside, mainly due to fragmented healthcare approaches and regulations, the reluctance of healthcare providers and patients to adopt, the high costs of tech development, and a lack of change in antiquated payment systems (Wang et al. 2023). Some DTx companies, like Pear and Mindstrong, have even gone out of business in recent times.
Many patients are still left in a position where they cannot get access to in-person treatments or access DTx solutions, so they download and use direct-to-consumer apps. These widely distributed apps also provide (some) mental health solutions and services to people who are in need of care and support; however, as they have built their business models around their users’ data (Lupton 2014; Torous et al. 2016), these apps contribute greatly to the commodification of health and therefore to the loss of healthcare as a common good. The next section highlights the promises that these direct-to-consumer apps make but also lays out evidence to further explore what value these apps create and for whom.

2.2. The Promises and Perils of Consumer Mental Health Apps

Mental health apps make a lot of promising claims to provide help and care, such as “hundreds of hours of guided meditation covering anxiety, stress, sleep” (Calm), “overcome negative thoughts, stress and life challenges” (Happify), “apply effective strategies of professional psychology to your everyday life” (Moodkit), “human-to-human support by Ginger, backed by science, and boosted by technology” (Headspace Health), and “talk about the ups and downs of grief, divorce, infertility, neurodiversity, and more” with a “diverse blend of community members, licensed experts, and thought leaders who want to help you through and share the hope” (Circles Up).
These promising claims have since been put to the test. iCBT, for instance, is a great showcase of how effective digital mental health approaches and treatments can be. Evidence suggests that iCBT, first introduced in 1990 and later delivered over the internet (including through smartphone apps), not only is effective for anxiety and depressive disorders but may also be as effective as face-to-face CBT, the traditional gold-standard intervention for these disorders, demonstrating the genuine potential of digital interventions to reduce mental distress and improve mental health outcomes (Andrews et al. 2018; Kambeitz-Ilankovic et al. 2022). Beyond iCBT, research over the past few years has shown that other digital mental health approaches, such as those based on meditation, also show the potential to improve mental health outcomes (e.g., stress reduction). For example, in a single randomised controlled trial of the Calm app, the participants showed significantly improved stress levels, as well as better mindfulness scores (O’Daffer et al. 2022). Fourteen further studies of the Headspace app also showed significant positive effects: an improved mental disposition, better (learning) retention, less stress and aggression, greater feelings of happiness and satisfaction, less depression, better resilience and improved wellbeing, more compassion, and the feeling of social support (O’Daffer et al. 2022).
Beyond the use of specific brands, Gál et al. (2021) conducted a meta-analysis of 34 randomised controlled trials examining the effectiveness of mindfulness meditation apps to improve mental health outcomes, including 7566 participants in total. Small-to-moderate positive effects of mindfulness meditation apps on mental health outcomes such as stress, anxiety, and depression were reported compared to control interventions. However, only very few studies used control conditions that were carefully matched to the experimental conditions, such as a sham meditation using the same app and interface (e.g., Noone and Hogan’s 2018 study used a sham meditation provided by Headspace). The use of this type of control condition can provide stronger evidence that the effects observed are attributable to mindfulness meditation itself (the claim that is usually made), rather than another feature of the intervention. Gál and colleagues (Gál et al. 2021) note that the meta-analysis results should be interpreted with caution due to the small number of studies published and the uncertain risk of bias (e.g., selection bias). Overall, the data are promising, but more studies evaluating the effectiveness of meditation apps, including more carefully designed control conditions, would help further strengthen the current evidence.
Evaluating the effectiveness of app-based interventions for mental health more generally, another recent paper that reviewed “14 meta-analyses representing 145 randomized controlled trials and 47,940 participants” concluded the following: more than half of the mobile-phone-based interventions for mental health on the market rely on “half-baked science”, have zero clinical robustness, and show no net health benefits (Goldberg et al. 2022, n.p.). It is equally evident and concerning that the value generated by so many mental health apps—dare we even say most of them—is neither based on clinical effectiveness nor focused on public health outcomes. The next section discusses what value creation really means for digital mental health apps and tech companies: the extraction and commodification of personal (or literally any!) data.

2.3. Surveillance Capitalism

Fourcade and Healy (2017) have long recognised that modern organisations are driven by a data extraction imperative, whereby maximising all data from all sources is part and parcel of value creation and value extraction. While the datafication and commodification of healthcare are nothing new (Hoeyer et al. 2019; Ruckenstein and Schull 2016; Timmermans and Almeling 2003), what is new is the “unprecedented” scale at which this is happening (Zuboff 2019). Once platform business models like Google and Facebook discovered the “behavioural surplus” in the 2000s, which is the behavioural data of the user, it did not take long for those companies to engage in surveillance capitalism (Zuboff 2019). Ever since, many tech platform providers have followed the surveillance capitalist’s business model, turning their users’ experience, information, and data into raw materials for use in marketing and advertising.
Surveillance capitalism, according to Zuboff (2019), is the widespread collection and commodification of personal data by corporations. It is the unilateral claiming of private human experiences as free raw material for translation into behavioural data and thus into assets. In practical terms, once a mental health portal is used or an app is downloaded by the user, the data become dispossessed (from the user) by incursion and are extracted with high velocity (and tech companies are quite “adaptable, flexible and dynamic” in their extraction processes to work around regulations!) before being directed into the fold of the tech companies’ business models (Zuboff 2019), where they become a ‘prized’ asset (Birch and Muniesa 2020; Geiger and Gross 2017, Pickersgill 2019). Any tech platform that enters the market nowadays has almost no choice but to engage in surveillance capitalism (Zuboff 2019), and as the paper will showcase, this is no different for mental health apps. But how does surveillance capitalism in mental health apps really work, and why are these practices so problematic?

3. Materials and Methods

We started this paper interested in the intersection between mental health, technology, and business practices. To explore the connection between the theory, the context, and the empirical world, we chose to work abductively (Sætre and Van de Ven 2021). Abduction enabled us to engage in “observing and confirming an anomaly, and generating and evaluating hunches that may explain the anomaly, for subsequent deductive constructing and inductive testing” (Sætre and Van de Ven 2021, p. 684). In essence, we were able to systematically combine and re-combine our ideas with relevant empirical data. Figure 1 illustrates our research process and method.

3.1. The Case and Theoretical Lens

This paper started with our selection of mental health as an interesting and important case or context to work with (Yin 1994). Many people worldwide experience mental health problems or disorders in their lifetime (Dattani et al. 2021; WHO 2022). The recent COVID-19 pandemic had a compounding effect on mental health and wellbeing, whereby people prone to psychological problems suffered most (Cullen et al. 2020). In 2023, depression was the leading cause of disability worldwide, and suicide was the fourth leading cause of death among 15–29-year-olds (WHO 2023b). The numbers are compelling: mental, emotional, and psychological health is a significant public health issue (Jack 2020), and effective as well as scalable treatments are urgently needed. Research also strongly suggests that great technologies are available but that they have yet to unlock their full value to society.
In and through our research, we have monitored the digitalisation, datafication, and commodification trends in healthcare over the last decade. This led us to choose surveillance capitalism as the theoretical perspective—or “hunch”, as Sætre and Van De Ven (2021) called it—to work with. We felt that the digital mental health literature had yet to discuss surveillance capitalism in more depth; however, we also understood that surveillance capitalism scholars would benefit from insight into the practices of direct-to-consumer mental health apps, particularly with regard to the types of data (personal, psychographic, and behavioural) that these apps have such privileged access to. We wanted to position this paper as a polemic piece, whereby the data collected would provide insights as well as serve to illustrate our points.

3.2. The Empirical World

To research how mental health data are extracted, shared, and claimed by big tech companies in and through consumer apps, the researchers looked towards the so-called “top 10” mental health apps of 2023. By Googling keywords such as “best mental health apps” and “mental health apps that work”—just like a consumer-patient would—we found that the following app brands came up quite frequently (see, for example, Dowart 2023, Leamy 2023, or Modglin 2023): Calm, Happify, Headspace, Circles Up, Moodfit, Moodkit, Sanvello, Breathe Think Do with Sesame, Talkspace, and BetterHelp. While this selection is not unbiased, and other apps could have been chosen (we found around 25 initially), we performed serious checks to ensure that our selection was robust, representative, and insightful. First, we checked that these apps are in fact direct-to-consumer/wellness apps. Unlike DTx, consumer apps are freely available on the market and not FDA-approved. Second, we checked that these apps are provided by for-profit companies. Third, we checked that the apps were successful: by success, we mean longevity, valuation, and downloads. Fourth, we carefully reviewed each app’s “therapeutic focus” to ensure that we had included a good selection here.

3.3. A Multi-Method Data Collection

We deployed a multi-method qualitative approach (Gross and Geiger 2023) to collect relevant data from the ten apps selected for further investigation and to generate a rich illustration of the theoretical points chosen. To gather data (see Figure 2), we researched external sites, such as Crunchbase, Owler, TechCrunch, Forbes, The Wall Street Journal, or StatNews, to explore the apps’ business models and gather financial/funding information, pricing, or download information. Academic journals were also consulted to research the business models, as well as the clinical evidence base, of direct-to-consumer mental health apps (but also DTx).
We also looked the apps up in the two main app stores: iTunes and Google PlayStore. This step was taken to understand what they do/sell, how they position themselves, and how many downloads they have but also to corroborate that consumer-patients place their trust in these apps. We downloaded the apps’ terms and conditions, consent information, and all available privacy policy documents—223 typed pages, which made up the bulk of our data—to further explore the surveillance capitalism practices and approaches in mental health. All data were stored in shared folders. To triangulate, we visited the websites of the 10 companies chosen and took selective screenshots of their landing pages, marketing materials (website, PR releases, and selected social media posts), and cookie agreements. These materials were also stored in our shared folders. We chose the marketing materials reflexively (Gross and Geiger 2023)—whatever piqued our interest; however, as the folder grew, we quickly realised that the apps followed a similar pattern of making big marketing promises whilst deploying secretive data extraction practices. Furthermore, as both authors had personal experience with some of these mental health apps, e.g., the Calm and Headspace apps, we decided to include certain personal insights in the data collection, mainly to supplement and triangulate the data. While personal experience can introduce bias into data collection, a debriefing session allowed us to understand that our experiences fell in line with those of ordinary consumer-patients. We had trusted these apps, fallen for their click-and-wrap agreements, failed to read the terms and conditions, and been naïve about the data collected from us. When we used them, we found them fun and relaxing, though it is difficult for us to ascertain whether or how much they worked to improve our wellbeing.
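For readers interested in how this document-gathering step could be reproduced programmatically rather than manually, the sketch below fetches and stores a set of publicly posted policy pages. It is a minimal illustration only: the URLs and file names are hypothetical placeholders, and our own collection was carried out by hand through the app stores and company websites.

```python
# Illustrative sketch only: downloading publicly posted privacy policies for later analysis.
# The URLs below are hypothetical placeholders, not the exact sources used in this study.
import pathlib
import requests

POLICY_PAGES = {
    "calm": "https://www.calm.com/en/privacy",                 # hypothetical path
    "headspace": "https://www.headspace.com/privacy-policy",   # hypothetical path
}

out_dir = pathlib.Path("policy_documents")
out_dir.mkdir(exist_ok=True)

for app_name, url in POLICY_PAGES.items():
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    # Store the raw HTML so the wording of the terms and policies is preserved verbatim.
    (out_dir / f"{app_name}_privacy.html").write_text(response.text, encoding="utf-8")
    print(f"Saved {app_name}: {len(response.text)} characters")
```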
Our data collection was predominantly based on secondary materials. As no individual subjects (other than us and our personal experience) were involved in the research, the research was exempt from an institutional ethical review. However, we acknowledge that identifying mental health apps and calling out their business practices with regard to surveillance capitalism has the potential (and a purpose!) to undermine big tech’s business models and initiate a privacy awakening among customers.

3.4. Theory-Led Content Analysis and Data Presentation

We deployed a content analysis (Mayring 2000) to analyse our data. Content analysis allows researchers to analyse a wide variety of texts and communications, make qualitative text interpretations, and test ideas inductively. Having stored our data (Figure 1) centrally, we reviewed and evaluated the documents and artefacts carefully. Inspired by Zuboff’s (2019) work on surveillance capitalism, we chose a directed content analysis (Hsieh and Shannon 2015), which is theory-led. While prior research exists in the field, we felt that digital mental health would reveal further insights with regard to surveillance capitalism, mainly because of the magnitude and type of data generated; thus, further research was warranted. Revisiting Zuboff’s work, we extrapolated theoretical constructs that we could use as pre-determined categories to look at the data but also to present them in the findings section: the behavioural surplus, data accumulation and data sharing, institutionalised secrecy, breaking the social contract, and asserting rights. The variety of data collected enabled us to triangulate our findings and establish trustworthiness (Elo et al. 2014). Our long-standing expertise when it comes to data collection, analysis, and reporting (including publishing) supported our quest to make the research credible and dependable. Our research background—one author is in psychology and the other is in business and society—allowed us to acknowledge and reduce biases in our interpretation. Multiple collection sources have also ensured that our findings are transferable to other contexts where apps collect and monetise highly personal (and/or medical) data. A minimal illustration of how such theory-led coding might be operationalised is sketched below; our findings are then presented and discussed in the next section.
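The sketch below shows, in purely illustrative terms, how excerpts from the collected documents might be tagged against the five pre-determined categories. The keyword lists and the sample excerpt are assumptions made for the example; our actual coding was interpretive and performed by the researchers, not keyword-driven.

```python
# Minimal illustration of directed (theory-led) content analysis:
# excerpts are assigned to pre-determined categories drawn from Zuboff (2019).
# The keyword lists and the sample excerpt are illustrative assumptions only.
CATEGORIES = {
    "behavioural_surplus": ["usage data", "log data", "location", "cookies"],
    "data_sharing": ["third parties", "partners", "affiliates", "business associates"],
    "institutionalised_secrecy": ["we reserve the right", "from time to time", "at our discretion"],
    "breaking_social_contract": ["no warranty", "not medical advice", "at your own risk"],
    "asserting_rights": ["intellectual property", "license", "retain", "modify these terms"],
}

def code_excerpt(excerpt: str) -> list[str]:
    """Return the pre-determined categories whose keywords appear in an excerpt."""
    text = excerpt.lower()
    return [category for category, keywords in CATEGORIES.items()
            if any(keyword in text for keyword in keywords)]

sample = "We may share usage data and cookies with third parties and affiliates."
print(code_excerpt(sample))  # ['behavioural_surplus', 'data_sharing']
```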

4. Findings

4.1. The Behavioural Surplus

Many digital technologies have a clear extraction imperative (Zuboff 2019). Previous research (Cosgrove et al. 2020, p. 618) highlights that many mental health apps, in that case, Mindstrong, “operate at a powerful intersection of digital surveillance technologies in the service of markets and the cultural legitimacy granted to the psy-disciplines”. The mental health apps examined in this paper also work to procure consumer data at scale and on an ongoing basis. An examination of the terms and conditions (T&Cs), as well as the privacy policies (PPs), showcases that these apps absorb the user’s personal information (which may include name, DOB, phone number and email, home address, social media info to log in, feedback, goals, reflections, biomedical info like weight and moods, etc.). Furthermore, they also take usage data, transactional data, log data, device data, recorded phone and video, location, analytics, and cookies data. Circles Up, for instance, collects any data and content that may come out of a circle—threads, chats, and phone calls (recordings and transcripts) between customers, peers, and/or facilitators. Consumers are also encouraged to keep the information flowing through surveys (e.g., Calm), optional features (e.g., Moodkit, see Table 1), or “My Progress” or “Buddy” features. Headspace’s Buddy features —depicted in marketing materials in and through cute, smiley and trustworthy-looking animations—invite customers to share more (and more private) data: “Our community is full of like-minded people on a path to more mindfulness. And with the newest version of our Buddy feature, you can now add unlimited buddies, share your progress, and send messages to inspire one another” (Headspace 2019).
In addition, apps like Sesame Workshop (Breathe, Think, Do with Sesame) and Thriveport (Moodkit) have multiple apps in their portfolio. This enables them to extract data across multiple platforms and deepen data profiles so that they tap into lucrative future behavioural markets (Cosgrove et al. 2020; Zuboff 2019), including the sale of data to third parties but also the targeting of their own content and downstream products. Cosgrove et al. (2020, p. 618) stated it well: mental health apps cloak their value proposition in the “objective and scientific language of mental disorders”, yet by targeting behavioural surplus data and putting the needs of the market first, they violate “a humanistic focus on dignity, meaning making, and the sociopolitical determinants of well-being”. Not only is the intersection between the apps (“the market”) and public health a showcase of ethical entanglements (Zuboff 2019), but it also opens the wider debate around the politics of healthcare and the commodification of health in particular.

4.2. Data Accumulation and Data “Sharing”

Having amassed millions of downloads (see Table 2), mental health apps have economies of scale (Zuboff 2019). This scale ensures that there is enough behavioural information coming in so that it can be aggregated and subsequently “shared”. These apps have attracted millions of dollars in investment and have high valuations, which speaks to the point that future behavioural data—and mental health data are arguably the crème de la crème here—are extremely valuable in the market.
Data sharing means that the extracted data are being transferred to “others”. Others refers to anyone and anything that trades in markets in future behaviour (Cosgrove et al. 2020; Zuboff 2019), and that means that any actor with an interest in buying behavioural information about people or influencing their behaviour through data can “pay to play in markets where the behavioural fortunes of individuals, groups, bodies and things are told and sold” (Zuboff 2019, p. 96). However, the process and practice of data sharing are deeply buried in the platforms’ T&Cs and PPs, cloaked under descriptions such as third parties, partners, affiliates, business associates, sponsors, groups, communities, or research collaborators (see Table 3).
While some apps claim that they do not “sell data”, it is evident that data are still being shared, and this is a common practice of surveillance capitalists according to Zuboff (2019). What is clear is that customers/users/patients have very little information about, insight into, or choice of what happens to their data once they have given their “informed consent”.

4.3. Institutionalised Secrecy

Institutionalised secrecy, according to Zuboff (2019), refers to the process of keeping the user in the dark when it comes to the “dispossession” of their data. Secrecy is created by obscuring what is being collected, how this is occurring, what is being sold (or leaked), and to whom. The behavioural surplus section illustrated what is being collected, and the data accumulation/sharing section showcased with whom the apps share. The “how” and “what is sold” questions are equally problematic. All ten platforms feature T&Cs, PPs, and cookie preference agreements. For instance, Calm’s (2023, n.p.) cookie agreement, which is set by default to ‘accept’, reads as follows:
“Calm uses cookies to understand the way you use our website and help us to improve it, as well as to personalise content and target ads, including by working with third party analytics partners. By clicking ‘Accept All’ below, you consent to our use of cookies. You can withdraw your consent or learn more information on our Cookie Policy”.
Similarly, Talkspace (2023, n.p.) states that “By clicking “Accept All Cookies”, you agree to the storing of cookies on your device to enhance site navigation, analyze site usage, and assist in our marketing efforts”. Again, the default was set to ‘accept all cookies’. Only by clicking into the cookie settings, and specifically ‘targeting cookies’, would a user find out that Talkspace’s (2023, n.p.) “cookies may be set through our site by our advertising partners. They may be used by those companies to build a profile of your interests and show you relevant adverts on other sites”.
“Click and wrap” agreements ensure that users engage with the app and platform quickly and in a frictionless manner; yet the cookie agreements alone point to the extraction imperative of these apps. Zuboff (2019) explains that popular sites collect as many as 6000 cookies, 83 percent of which are from unrelated third-party sites. Illustratively, the Calm app cookie agreement shows a reference to third parties’ analytics partners. Almost all users click these “blurbs” away without understanding what happens to their data, where they go, and how they are being used. Many sites also give users no option but to agree to their data extraction intents and practices. Happify (2023), for instance, states in bold letters: “IF YOU DO NOT AGREE TO THE TERMS OF THIS PRIVACY POLICY AND THE USE OF YOUR PERSONAL DATA AS STATED HEREIN IN THEIR ENTIRETY, YOU MAY NOT ACCESS OR OTHERWISE USE THE SITE AND/OR HAPPIFY™ OFFERINGS.” Circles Up has a similar statement: while users are told in the Privacy Policy that they do not have to submit any personal information, “User may not be able to become a Group Member or Peer Group Member and/or use the Services” (Circles Up 2023, n.p.)
These apps also have lengthy, cumbersome (and at times well hidden) PPs and T&C contracts. Personal experience with these apps indicates that the terms, conditions, and privacy policies are accessible and readable on the surface but could be challenging to understand at a deeper level for users lacking knowledge and training on these issues, raising the possibility of many users simply being unaware of how companies are using their data. However, once users agree—or ignore, in the case of most T&Cs or PPs—their personal information, biomedical data, and any other experience/data become “for sale” (while being fully GDPR-compliant!). And even when consumer-patients are fully aware of cookies and consent and the apps have reasonable data security features in place, the data are still not safe. A recent investigation of one million websites by Norton Labs showed that 80 percent of websites leak user data (Kats et al. 2022). Up to 90 percent of the leaked data surreptitiously end up in the hands of major data extractors, including Google, Facebook, and Twitter (Zuboff 2019). This not only underscores the pervasive and secretive nature of surveillance capitalism but also highlights the practical shortcomings of data protection laws like GDPR.
Finally, secrecy also means keeping consumers in the dark about what kind of personal, behavioural, and psychographic data are for sale on the market. A recent report by Keegan and Eastwood (2023) gives a rare insight into how advertisers label people—anything from “depression-prone”, “easily-deflated”, “getting a raw deal out of life”, “trapped neurotic”, “receptive to emotional messaging”, “aspiration/happiness seeker”, “having bottled up stress”, “lone wolf”, and “concerned with self-image” to “stress-reactor”. There are as many as 650,000 ways or micro-categories to label a person, which means that their personality and problems, as well as their medical history and health, are up for “commercial grabs” (more data mean better targeting and therefore higher profits for data extractors and data buyers alike). Zuboff (2019, p. 45) problematises this point particularly well: “Our expectations of psychological self-determination are the grounds upon which our dreams unfold, so the losses we experience in the slow burn of rising inequality, exclusion, pervasive competition, and degrading stratification are not only economic”.

4.4. Breaking the Social Contract

Mental health apps make a lot of promising claims, some of which have been mentioned earlier. On its landing page, Happify (2023, n.p.) states that the app helps to “break old patterns and form new habits”. Specifically, the app advertises itself as follows:
“How you feel matters! Whether you’re feeling sad, anxious, or stressed, Happify brings you effective tools and programs to help you take control of your feelings and thoughts. Our proven techniques are developed by leading scientists and experts who’ve been studying evidence-based interventions in the fields of positive psychology, mindfulness, and cognitive behavioral therapy for decades” (Happify 2023, n.p.).
However, these apps do not provide clinical or medical advice or care (see Calm, Happify, Headspace, and Moodfit), nor can their services be equated to those of a doctor (see Sanvello) (see Table 4). Users can only use the apps at their own risk and without any warranty (see Moodkit and Sesame Workshop), and no platform is ever liable for the services provided, not even when the service is provided by licensed therapists (see Talkspace and Betterhelp). Previous research has found that platform business models in health tend to valorise elements like empowerment (you can be a quasi-doctor or scientist), self-care (look after yourself, you matter), and scientific progress to the user (Geiger and Gross 2021). Yet, they also obscure the lack of biomedical virtues, lack of science, and uncertainties around data ownership (Geiger and Gross 2021). This is the same for mental health apps here: they collect precious data related to mental health, emotions, and wellbeing only to assetise such data if and when the opportunity arises. It is thus fair to say that these apps fail to give a fair, reciprocal value back to their users, an act that Zuboff (2019) has previously compared to a “one way mirror”.

4.5. Asserting Rights

Finally, the success of a surveillance capitalist business model depends on the tech platform businesses’ decision to claim property rights over their users’ behavioural (and/or any other) data. As their T&Cs as well as PPs show (see Table 5), the users’ data become part of the tech platform’s intellectual property and thus can be sold as an asset. An asset is both a resource and property that generates an income (Birch et al. 2021). The Wall Street Journal estimates that personal data were worth USD 455.3 billion in 2021 (Haggin 2021), and medical information is worth as much as USD 1000 per person/year (Stack 2017). Whilst some companies agree to delete the data after a reasonable time, some companies stake a claim on the data for a long time after the user has ceased to use the service: for instance, “Happify™ will not retain data for more than 7 years beyond the date the user last logs in to Happify” (Happify 2023). Retaining data for as long as possible gives the apps the chance to maximise value, which means accumulating, analysing, and assetising the consumer’s data for as long as possible. In addition, all companies reserve the right to change or modify their services, T&Cs, and PPs at any time and in any way they see fit: “We reserve the right to make changes to, or to suspend or discontinue (temporarily or permanently), any portion of the Services.” (e.g., Sanvello).
Summing up, it becomes clear that (1) behavioural and other user data related to health are commercially valuable, (2) data collection and extraction are pervasive practices in the mental health app industry, and (3) these practices are exploitative, whereby the tech platform sees the user through a “one way mirror” (Sadowski 2019; Zuboff 2019). What is problematic here is that the behavioural and/or any other data of mental health app users are not neutral, nor are these data used for light-hearted marketing and advertising purposes only. Mental health and wellbeing data are behavioural data, which reveal how people make sense of, interpret, and interact with the world (Kitchin 2014; Zuboff 2019). In the world of commerce, behavioural data become (valuable) psychographic information that can be used to influence consumer behaviour (Gajanová et al. 2019; Samuel 2016; Zuboff 2019). This paper illustrates just how hard big tech works to collect these data and ensure that the data keep coming (Geiger and Gross 2021). Equally, it also shows that the power–knowledge relationship is off-kilter, as tech companies have managed to make significant profits from data kidnapping, cornering, and competing over the past decade (Thatcher et al. 2016; Timmermans and Almeling 2003; Zuboff 2019)—without delivering any significant public health benefits in return. The next section will discuss contemporary discourses that are pushing back against powerful and profit-hungry tech business models.

5. Democracy, Public Health, and Health as a Common Good

In digital mental health, surveillance capitalism has taken the reins, and economic profits have been prioritised over privacy, public health, and health as a common good. While these developments seem pervasive, we also believe that practices are not set in stone and that how the mental health crisis is addressed in and through digital technologies can be re-orientated. The following subsections map three contemporary discourses that act to push back on the surveillance capitalist model in consumer mental health apps: privacy awakening, data justice, and a data solidarity movement.

5.1. A Privacy Awakening

As far back as 1890, critics recognised that people have a right to be left alone (Warren and Brandeis 1890). Similarly, Zuboff (2019) states that people have a right to dignity, privacy, and the opportunity to live an effective life. Yet, surveillance capitalism has turned people into objects from which raw materials (i.e., data) are often secretly extracted and expropriated for present and future market transactions (Zuboff 2019). The business models and practices of consumer tech have rendered the concept of privacy essentially dead. That said, civic society has recently started to wake up to this “epistemic trauma” (Hayles 2009; Powell 2020; Zuboff 2019). Looking past the alluring promises of technology and the facade of “freely flowing information,” people have started to grow concerned about their data and are beginning to doubt the trustworthiness of apps. Perhaps they are awakening to the extent of what they have already surrendered. Consumers should (and have started to) demand dignity, autonomy, privacy, and sanctuary in the digital space, including health (Zuboff 2019). Sanctuary means a place (and a future) where people feel emancipated, valued, and safe. However, given the power asymmetries that are in place—created by secretive practices, complex business models, and a lack of meaningful state intervention—civic society has limited options when it comes to launching lasting action. People can stop downloading and using apps or platforms or delete them altogether. There has been some evidence of push-back from “the market”, i.e., users and healthcare providers, as mental health app installations dropped by as much as 30 percent between 2020 and 2021 (Kirkpatrick 2022). However, there is still scope for consumer education as well as public support to guide this privacy-awakening process further.
Efforts to educate the public on the issue of data privacy and empower them with knowledge could follow the “boost approach” described by Hertwig and Grüne-Yanoff (2017). A boost is a behavioural science intervention that aims to improve people’s competence to make their own decisions, for example, educating people about how to understand statistics, health information, and financial decision making. Lorenz-Spreen and colleagues (Lorenz-Spreen et al. 2021) illustrated that a short boost intervention that made participants reflect on their personality managed to increase their ability to identify advertising targeting them by up to 26 percent. We suggest that these kinds of boost strategies could potentially help empower citizens to identify situations in which their behaviour is being exploited for commercial interests.

5.2. A Data Justice Movement

The data justice movement recognises that “the way data is generated, collected and used in society and everyday life has become an increasingly prominent and contentious issue” (Dencik et al. 2022, p. 873). The datafication of society, coupled with the emergence of surveillance capitalism, has resulted in power asymmetries that require both further research and critique. Data justice acknowledges that social justice is at risk in our datafied society and thus needs to be protected (Taylor 2017). Data are neither good nor bad; however, the processing of data has an impact on what is knowledge and what is known, what kinds of information are of “value” and how this information is valued, and how this information is acted upon, by whom, and why. As different actors, interests, and social forces come together in the market, data thus need to be understood in relation to a broad set of social practices (Dencik et al. 2022). For digital mental health, it is worth drawing a distinction between DTx and consumer apps here once again. DTx solutions, even if they are provided by for-profit companies, are FDA-approved medical devices that are prescribed by healthcare providers to deliver clinically evaluated software to patients (Digital Therapeutics Alliance 2023; Fürstenau et al. 2023; Vaidya 2023). As they are being reimbursed like other medical treatments (e.g., through the national healthcare system or insurance), they do not rely on selling data on multi-sided market platforms—though DTx companies also tend to collect and analyse data. As this paper has shown, the value creation process for consumer mental health apps is highly contentious from a data justice perspective: secretive, unfair, and possessive.
Taylor (2017) suggests the following three pillars of data justice to connect digital rights and freedoms globally. One is “visibility”, which deals with the need for both privacy and further representation, which means understanding how much of these data are considered a common good. This raises the crucial question for both citizens and regulators: should any health-related data have been allowed for commercialisation in the first place, or indeed any longer? Of course, this means pushing back against the corporate reality of big tech and big data and creating an ecosystem that supports the public good. And some progress has been made here in recent times: academics have developed and launched a publicly available software tool called PLUTO, which measures the public value of (specific) data (El-sayed and Prainsack 2022). Audits, as well as initiatives like this, help to ensure that digital mental health technologies, apps, and platforms (and their business models!) stay closely connected with public health purposes. However, given that big tech’s business model is deeply anchored in power, politics, and profits, more significant changes—beyond CSR or other company-led voluntary efforts—will need to be initiated and directed in and through regulation and legislation. The second, “engagement with technology”, relates to sharing the data’s benefits and enhancing people’s autonomy regarding the technology (i.e., the choices they have) (Taylor 2017). For mental health apps, this means fully laying out—in an accessible, fair, and clear way—how their data-driven business model really works and giving people more and more informed choices when it comes to the collection, processing, and sharing of their data. Again, voluntary approaches or soft standards are not likely to dislodge power relations and create a lasting change here. Third is non-discrimination, which relates to the ability to challenge bias and prevent discrimination (Taylor 2017). This means privacy awakening and education, unveiling secretive business and market practices (see findings sections), and again, enforcement via much stricter regulations and laws than the current ones.

5.3. Data Solidarity

When surveillance capitalism took over the “market”, a tragedy of digital commons in public health also unfolded (Greco and Floridi 2003; Prainsack 2019; Sharon 2018). An antidote to surveillance is a deeper focus on data solidarity, which is “to distribute the risks and benefits of digital practices more equitably by facilitating data uses that create public value and by preventing and mitigating harm”—bringing sharing and caring back in line (Prainsack 2023, n.p.). Data solidarity acknowledges that fairness and equality cannot easily be achieved, as there are many entrenched “social structures of recognition and concrete experiences of discrimination and injustice” (Braun and Hummel 2022, n.p.). Data solidarity relates to the shared practices of individuals or groups, particularly when it comes to the risks and benefits that come with technology and data. Some suggestions have been made above when it comes to organisational and regulatory approaches that facilitate solidarity. Another way to approach solidarity is to use shared data pools. Data pools are an emergent reality in Europe whereby tech companies and public institutions come together to innovate and advance digital health but also to balance the public good with commercial interests—though policies and regulations (data protection, single market, and competition law), or amendments to them, are still likely to be needed to facilitate solidarity (Schneider 2022).
Data solidarity also relates to the creation of social bonds and shared goals, and it is a recognition of what has previously been excluded from social practices. Data justice means bringing previously marginalised groups—the user/consumer/patient, in our case—back into the fold of public (mental) health, either through digital (e.g., DTX or heavily “tamed” consumer apps) or traditional approaches. When it comes to mental health apps, this means engaging in business and data practices that give people meaningful control over data but also facilitate fairness and public value (Prainsack and El-Sayed 2023; Prainsack et al. 2022). It also means detecting individual as well as epistemic injustices, facilitating solidaristic movements (such as the aforementioned privacy awakening), and developing a dedication to shared value. Lastly, solidarity needs to reach far beyond the simple flagging of the issues at hand: it is about creating change by establishing rules, standards, and structures, and our paper has given suggestions here to that point.
Summing up, Prainsack et al. (2022) make the point that we need to rethink how we own, oversee, and govern data and make changes so that data can be put to good use. By good use, Prainsack and colleagues mean preventing and mitigating harm and/or returning profits back to the common good: we fully agree with this point. We believe that as long as power, politics, and profits remain in the hands of big tech, public value cannot be delivered to people and their communities. Our paper contributes by building an evidence base that scrutinises, judges, and critiques the practices of big tech, and this evidence is important for dislodging the instrumentarian power of big tech, strengthening collective control, and re-establishing a focus on democracy. We also wanted to reemphasise the point that better rules, standards, and structures are needed to ensure that the benefits, costs, and risks of digital health are borne collectively, fairly, and democratically (Prainsack et al. 2022). Figure 3 summarises the discussion points brought forward in this paper.

6. Concluding Remarks

This paper has showcased just how vulnerable public health and mental healthcare are to the unprecedented, strong presence of surveillance capitalism in digital mental health, which is now a reality. Yet, the future is not set in stone. In the words of Zuboff (2019, p. 62),
“If the digital future is to be our home, then it is we who must make it so. We will need to know. We will need to decide. We will need to decide who decides. This is our fight for a human future.”
Critics have emerged, and consumer-patients are starting to awaken: the surge in the literature, as well as the falling number of app downloads (and their increasing failure, too), speaks to that point. Taming surveillance capitalism starts with unravelling, naming, and shaming (Zuboff 2019). This paper has added to the debate by mapping out business practices and friction points in digital mental health. Change will not be easy, given the entrenched nature of power, the pervasiveness of politics, and the influence of profits. Also, the unprecedented nature of surveillance capitalism has so far escaped any major contest (Zuboff 2019). Nevertheless, we have outlined several approaches that can initiate meaningful changes. In this paper, we have spoken out about the future that we want and suggested actions that will get us there, and we would encourage others—including citizens, organisations, regulators, and legislators—to similarly embrace these ideas and take further action.

Author Contributions

Conceptualization: N.G. and D.M.; methodology and analysis: N.G.; validation: N.G. and D.M.; writing (original and revision): N.G. and D.M.; project administration: N.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are publicly accessible but can also be made available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Andrews, Gavin, Ashna Basu, Pim Cuijpers, Michelle G. Craske, Peter McEvoy, Cara L. English, and Jill M. Newby. 2018. Computer therapy for the anxiety and depression disorders is effective, acceptable and practical health care: An updated meta-analysis. Journal of Anxiety Disorders 55: 70–78. [Google Scholar] [CrossRef] [PubMed]
  2. Araujo, Luis, and Hans Kjellberg. 2009. Shaping exchanges, performing markets: The study of marketing practices. In The SAGE Handbook of Marketing Theory. Edited by Pauline Maclaran, Michael Saren, Barbara Stern and Mark Tadajewski. Newcastle upon Tyne: Sage, pp. 195–218. [Google Scholar]
  3. Bemme, Dörte, and Laurence Kirmayer. 2020. Global Mental Health: Interdisciplinary challenges for a field in motion. Transcultural Psychiatry 57: 3–18. [Google Scholar] [CrossRef] [PubMed]
  4. Birch, Kean, and Fabian Muniesa. 2020. Turning Things into Assets in Technoscientific Capitalism. London: MIT Press. [Google Scholar]
  5. Birch, Kean, D. T. Cochrane, and Callum Ward. 2021. Data as asset? The measurement, governance, and valuation of digital personal data by Big Tech. Big Data & Society 8: 20539517211017308. [Google Scholar] [CrossRef]
  6. Braun, Matthias, and Patrick Hummel. 2022. Data justice and data solidarity. Patterns 23: 1–8. [Google Scholar] [CrossRef]
  7. Calm. 2023. Cookie Preferences. Available online: https://www.calm.com/ (accessed on 27 October 2023).
  8. CDC. 2023a. What Is Public Health? Available online: https://www.cdcfoundation.org/what-public-health, (accessed on 25 September 2023).
  9. CDC. 2023b. CDC’s Global Digital Health Strategy. Available online: https://www.cdc.gov/globalhealth/topics/gdhs/index.html (accessed on 25 September 2023).
  10. Christofidou, Maria, Nathan Lea, and Pascal Coorevits. 2021. A Literature Review on the GDPR, COVID-19 and the Ethical Considerations of Data Protection During a Time of Crisis. Yearbook of Medical Informatics 30: 226–32. [Google Scholar] [CrossRef] [PubMed]
  11. Circles Up. 2023. PRIVACY Policy. Available online: https://circlesup.com/privacy/ (accessed on 27 October 2023).
  12. Cosgrove, Lisa, Justin M. Karter, Zenobia Morrill, and Mallaigh McGinley. 2020. Psychology and Surveillance Capitalism: The Risk of Pushing Mental Health Apps During the COVID-19 Pandemic. Journal of Humanistic Psychology 60: 611–25. [Google Scholar] [CrossRef]
  13. Cullen, Walter, Gautam Gulati, and Brendan D. Kelly. 2020. Mental health in the COVID-19 pandemic. QJM 113: 311–12. [Google Scholar] [CrossRef]
  14. Dattani, Saloni, Lucas Rodes-Guirao, Hannah Ritchie, and Max Roser. 2021. Mental Health. Available online: https://ourworldindata.org/mental-health (accessed on 9 June 2023).
  15. Dencik, Lina, Arne Hintz, Joanna Redden, and Emiliano Treré. 2022. Data Justice. New York: Sage Publications. [Google Scholar]
  16. Digital Therapeutics Alliance. 2023. What Is a DTx? Available online: https://dtxalliance.org/understanding-dtx/what-is-a-dtx/ (accessed on 25 September 2023).
  17. Dorwart, Laura. 2023. Best Mental Health Apps. Available online: https://www.verywellmind.com/best-mental-health-apps-4692902 (accessed on 9 June 2023).
  18. Elo, Satu, Maria Kääriäinen, Outi Kanste, Tarja Pölkki, Kati Utriainen, and Helvi Kyngäs. 2014. Qualitative Content Analysis: A Focus on Trustworthiness. SAGE Open 4: 2158244014522633. [Google Scholar] [CrossRef]
  19. El-Sayed, Seliem, and Barbara Prainsack. 2022. Assessing Public Value: A Tool for Structured Assessment. Available online: https://digitize-transformation.at/news-und-events/detailansicht/news/plutopubval-public-value-tool/?tx_news_pi1%5Bcontroller%5D=News&tx_news_pi1%5Baction%5D=detail&cHash=793c41c8614e5c82ae5df20fa6c72168 (accessed on 3 August 2023).
  20. EU Commission. 2020. Assessment of the EU Member States’ Rules on Health Data in the Light of GDPR. Available online: https://health.ec.europa.eu/system/files/2021-02/ms_rules_health-data_en_0.pdf (accessed on 29 June 2022).
  21. EU Parliament. 2023. Public Health. Available online: https://www.europarl.europa.eu/factsheets/en/sheet/49/public-health#:~:text=EU%20public%20health%20policy%20aims,prevent%20and%20address%20future%20pandemics (accessed on 25 September 2023).
  22. European Data Protection Supervisor. 2023. Digital Therapeutics (DTx). Available online: https://edps.europa.eu/press-publications/publications/techsonar/digital-therapeutics-dtx_en (accessed on 25 September 2023).
  23. FDA. 2020. What Is Digital Health. Available online: https://www.fda.gov/medical-devices/digital-health-center-excellence/what-digital-health (accessed on 9 June 2023).
  24. FDA. 2022. Focus Area: Digital Health Technologies. Available online: https://www.fda.gov/science-research/focus-areas-regulatory-science-report/focus-area-digital-health-technologies (accessed on 25 September 2023).
  25. Fourcade, Marion, and Kieran Healy. 2017. Seeing like a market. Socio Economic Review 15: 9–29. [Google Scholar] [CrossRef]
  26. Fürstenau, Daniel, Martin Gersch, and Stefanie Schreiter. 2023. Digital Therapeutics (DTx): Apps on Prescription. Business & Information Systems Engineering 65: 349–60. [Google Scholar]
  27. Gajanová, Ľubica, Margaréta Nadányiová, and Dominika Moravčíková. 2019. The Use of Demographic and Psychographic Segmentation to Creating Marketing Strategy of Brand Loyalty. Scientific Annals of Economics and Business 66: 65–84. [Google Scholar] [CrossRef]
  28. Gál, Eva, Simona Ștefan, and Ioana A. Cristea. 2021. The efficacy of mindfulness meditation apps in enhancing users’ well-being and mental health related outcomes: A meta-analysis of randomized controlled trials. Journal of Affective Disorders 279: 131–42. [Google Scholar] [CrossRef] [PubMed]
  29. Geiger, Susi, and Hans Kjellberg. 2021. Market mash ups: The process of combinatorial market innovation. Journal of Business Research 124: 445–57. [Google Scholar] [CrossRef]
  30. Geiger, Susi, and Nicole Gross. 2017. Does hype create irreversibilities? Affective circulation and market investments in digital health. Marketing Theory 17: 435–54. [Google Scholar]
  31. Geiger, Susi, and Nicole Gross. 2021. A tidal wave of inevitable data? Assetization in the consumer genomics testing industry, Business & Society 60: 614–49. [Google Scholar]
  32. Goldberg, Simon B., Sin U. Lam, Otto Simonsson, John Torous, and Shufang Sun. 2022. Mobile phone-based interventions for mental health: A systematic meta-review of 14 meta-analyses of randomized controlled trials. PLOS Digital Health 1: e0000002. [Google Scholar] [CrossRef]
  33. Greco, Gian Maria, and Luciano Floridi. 2003. The Tragedy of the Digital Commons. Ethics and Information Technology 6: 73–81. [Google Scholar] [CrossRef]
  34. Gross, Nicole, and Susi Geiger. 2023. A Multimethod Qualitative Approach to Exploring Multisided Platform Business Models in Health Care. In Sage Research Methods: Business. New York: SAGE Publications Ltd. [Google Scholar]
  35. Haggin, Patience. 2021. Personal Data Is Worth Billions. These Startups Want You to Get a Cut. Wall Street Journal. December 4. Available online: https://www.wsj.com/articles/personal-data-is-worth-billions-these-startups-want-you-to-get-a-cut-11638633640 (accessed on 9 June 2023).
  36. Happify. 2023. Legal. Available online: https://www.happify.com/public/legal/#legal (accessed on 25 September 2023).
  37. Hardin, Garrett. 1968. The Tragedy of the Commons. Science 162: 1243–48. [Google Scholar] [CrossRef]
  38. Hayles, Katherine. 2009. Waking up to the Surveillance Society. Surveillance & Society 6: 313–16. [Google Scholar]
  39. Headspace. 2019. X. October 9. Available online: https://twitter.com/Headspace/status/1182005140238172161 (accessed on 27 October 2023).
  40. Hertwig, Ralph, and Till Grüne-Yanoff. 2017. Nudging and Boosting: Steering or Empowering Good Decisions. Perspectives on Psychological Science 12: 973–86. [Google Scholar] [CrossRef]
  41. Hoeyer, Klaus, Susanne Bauer, and Martyn Pickersgill. 2019. Datafication and accountability in public health: Introduction to a special issue. Social Studies of Science 49: 459–75. [Google Scholar] [CrossRef] [PubMed]
  42. Hsieh, Hsiu-Fang, and Sarah E. Shannon. 2005. Three Approaches to Qualitative Content Analysis. Qualitative Health Research 15: 1277–88. [Google Scholar] [CrossRef] [PubMed]
  43. Investopedia. 2023. How Google (Alphabet) Makes Money: Advertising and Cloud. Available online: https://www.investopedia.com/articles/investing/020515/business-google.asp (accessed on 25 September 2023).
  44. Jack, Leonard. 2020. Mental Health Is a Global Public Health Issue. Available online: https://www.cdc.gov/pcd/collections/Mental_Health_Is_a_Global_Public_Health_Issue.htm (accessed on 25 September 2023).
  45. Kambeitz-Ilankovic, Lana, Uma Rzayeva, Laura Völkel, Julian Wenzel, Johanna Weiske, Frank Jessen, Ulrich Reininghaus, Peter J. Uhlhaas, Mario Alvarez-Jimenez, and Joseph Kambeitz. 2022. A systematic review of digital and face-to-face cognitive behavioral therapy for depression. NPJ Digital Medicine 5: 144. [Google Scholar] [CrossRef] [PubMed]
  46. Kats, Daniel, Johann Routurier, and David Luz Silva. 2022. 8 in 10 Websites Leak Your Search Terms. Available online: https://www.nortonlifelock.com/blogs/norton-labs/search-privacy-research (accessed on 9 June 2023).
  47. Keegan, Jon, and Joel Eastwood. 2023. From “Heavy Purchasers” of Pregnancy Tests to the Depression-Prone: We Found 650,000 Ways Advertisers Label You. The MarkUp. June 8. Available online: https://themarkup.org/privacy/2023/06/08/from-heavy-purchasers-of-pregnancy-tests-to-the-depression-prone-we-found-650000-ways-advertisers-label-you (accessed on 26 October 2023).
  48. Kirkpatrick, Tara. 2022. Mental health app installs decline more than 30% since January 2021. Apptopia. May 10. Available online: https://blog.apptopia.com/mental-health-app-installs-decline-more-than-30-percent (accessed on 9 June 2023).
  49. Kitchin, Rob. 2014. The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences. London: Sage. [Google Scholar]
  50. Kraus, Sascha, Francesco Schiavone, Anna Pluzhnikova, and Anna Chiara Invernizzi. 2021. Digital transformation in healthcare: Analyzing the current state-of-research. Journal of Business Research 123: 557–67. [Google Scholar] [CrossRef]
  51. Leamy, Taylor. 2023. 7 Best Mental Health Apps to Fight Depression and Anxiety. Available online: https://www.cnet.com/health/mental/7-best-mental-health-apps-to-start-using-today/ (accessed on 9 June 2023).
  52. Lorenz-Spreen, Philipp, Michael Geers, Thorsten Pachur, Ralph Hertwig, Stephan Lewandowsky, and Stefan M. Herzog. 2021. Boosting people’s ability to detect microtargeted advertising. Scientific Reports 11: 15541. [Google Scholar] [CrossRef]
  53. Lupton, Deborah. 2014. Critical perspectives on digital health technologies. Sociology Compass 8: 1344–59. [Google Scholar] [CrossRef]
  54. Market Research Future. 2023. Global Digital Mental Health Market Overview. Available online: https://www.marketresearchfuture.com/reports/digital-mental-health-market-11062 (accessed on 25 September 2023).
  55. Mayring, Philipp. 2000. Qualitative Content Analysis. Forum Qualitative Sozialforschung 1: 20. [Google Scholar]
  56. Modglin, Lindsay. 2023. Best Mental Health Apps to Try in 2023. Available online: https://www.forbes.com/health/mind/best-mental-health-apps/ (accessed on 9 June 2023).
  57. Noone, Chris, and Michael J. Hogan. 2018. A randomised active-controlled trial to examine the effects of an online mindfulness intervention on executive control, critical thinking and key thinking dispositions in a university student sample. BMC Psychology 6: 13. [Google Scholar] [CrossRef]
  58. O’Daffer, Alison, Susannah F. Colt, Akash R. Wasil, and Nancy Lau. 2022. Efficacy and Conflicts of Interest in Randomized Controlled Trials Evaluating Headspace and Calm Apps: Systematic Review. JMIR Mental Health 9: e40924. [Google Scholar] [CrossRef]
  59. Pickersgill, Martyn. 2019. Digitising psychiatry? Sociotechnical expectations, performative nominalism and biomedical virtue in (digital) psychiatric praxis. Sociology of Health and Illness 41: 16–30. [Google Scholar] [CrossRef]
  60. Powell, Alvin. 2020. An awakening over data privacy. The Harvard Gazette. February 27. Available online: https://news.harvard.edu/gazette/story/2020/02/surveillance-capitalism-author-sees-data-privacy-awakening/ (accessed on 25 September 2023).
  61. Powell, John, and Theodoros N. Arvanitis. 2015. Welcome to the Digital Health revolution. Digital Health 1: 2055207614561571. [Google Scholar] [CrossRef] [PubMed]
  62. Prainsack, Barbara. 2019. Logged out: Ownership, exclusion and public value in the digital data and information commons. Big Data & Society 6: 2053951719829773. [Google Scholar] [CrossRef]
  63. Prainsack, Barbara. 2023. Data solidarity: Why sharing is not always caring. UNESCO. January 16. Available online: https://en.unesco.org/inclusivepolicylab/analytics/data-solidarity-why-sharing-not-always-caring%C2%A0#:~:text=Data%20governance%20lies%20at%20the,by%20preventing%20and%20mitigating%20harm. (accessed on 25 September 2023).
  64. Prainsack, Barbara, and Seliem El-Sayed. 2023. Beyond Individual Rights: How Data Solidarity Gives People Meaningful Control over Data. The American Journal of Bioethics 23: 36–39. [Google Scholar] [CrossRef] [PubMed]
  65. Prainsack, Barbara, Seliem El-Sayed, Nikolaus Forgo, Lukasz Szoszkiewicz, and Philipp Baumer. 2022. Data solidarity: A blueprint for governing health futures. The Lancet Digital Health 4: e773–74. [Google Scholar] [CrossRef] [PubMed]
  66. Precedence Research. 2023. The Global Mental Health Apps Market Size. Available online: https://www.precedenceresearch.com/mental-health-apps-market (accessed on 25 September 2023).
  67. Rohaj, Aaraushi, and Grzegorz Bulaj. 2023. Digital Therapeutics (DTx) Expand Multimodal Treatment Options for Chronic Low Back Pain: The Nexus of Precision Medicine, Patient Education, and Public Health. Healthcare 11: 1469. [Google Scholar] [CrossRef] [PubMed]
  68. Ruckenstein, Minna, and Natasha Schull. 2016. The Datafication of Health. Annual Review of Anthropology 46: 1–18. [Google Scholar] [CrossRef]
  69. Sadowski, Jathan. 2019. When data is capital: Datafication, accumulation, and extraction. Big Data & Society 6: 2053951718820549. [Google Scholar] [CrossRef]
  70. Sætre, Alf Steinar, and Andrew H. Van de Ven. 2021. Generating Theory by Abduction. Academy of Management Review 46: 684–701. [Google Scholar]
  71. Samuel, Alexandra. 2016. Psychographics Are Just as Important for Marketers as Demographics. Harvard Business Review. Available online: https://hbr.org/2016/03/psychographics-are-just-as-important-for-marketers-as-demographics (accessed on 25 September 2023).
  72. Schneider, Giulia. 2022. Health Data Pools Under European Data Protection and Competition Law: Health as a Digital Business. Berlin: Springer. [Google Scholar]
  73. Sharon, Tamar. 2018. When digital health meets digital capitalism, how many common goods are at stake? Big Data & Society 5: 2053951718819032. [Google Scholar] [CrossRef]
  74. Stack, Brian. 2017. Here’s How Much Your Personal Information Is Selling for on the Dark Web. Experian. December 6. Available online: https://www.experian.com/blogs/ask-experian/heres-how-much-your-personal-information-is-selling-for-on-the-dark-web/ (accessed on 25 September 2023).
  75. Statista. 2023. Advertising revenues generated by Facebook worldwide from 2017 to 2027. Available online: https://www.statista.com/statistics/544001/facebooks-advertising-revenue-worldwide-usa/#:~:text=Global%20Facebook%20advertising%20revenue%202017%2D2027&text=In%202022%2C%20Facebook%20generated%20nearly,of%20the%20global%20ad%20revenue (accessed on 25 September 2023).
  76. Talkspace. 2023. Cookie Preferences. Available online: https://www.talkspace.com/ (accessed on 27 October 2023).
  77. Taylor, Linnet. 2017. What is data justice? The case for connecting digital rights and freedoms globally. Big Data & Society 4: 2053951717736335. [Google Scholar] [CrossRef]
  78. Thatcher, Jim, David O’Sullivan, and Dillon Mahmoudi. 2016. Data colonialism through accumulation by dispossession: New metaphors for daily data. Environment and Planning D: Society and Space 34: 990–1006. [Google Scholar] [CrossRef]
  79. Timmermans, Stefan, and Rene Almeling. 2009. Objectification, standardization, and commodification in health care: A conceptual readjustment. Social Science & Medicine 69: 21–27. [Google Scholar]
  80. Topol, Eric. 2012. The Creative Destruction of Medicine: How the Digital Revolution Will Create Better Health Care. New York: Basic Books. [Google Scholar]
  81. Torous, John, Steven Chan, and John Luo. 2016. Are your patients using ‘digital supplements’? Psychiatric Times. Available online: https://psychnews.psychiatryonline.org/doi/full/10.1176/appi.pn.2016.11b16 (accessed on 27 October 2023).
  82. Vaidya, Anuja. 2023. Key Technologies in the Digital Therapeutics Arena. mHealthIntelligence. July 31. Available online: https://mhealthintelligence.com/features/key-technologies-in-the-digital-therapeutics-arena (accessed on 25 September 2023).
  83. Wang, Changwong, Chungkeun Lee, and Hangsik Shin. 2023. Digital therapeutics from bench to bedside. npj Digital Medicine 6: 38. [Google Scholar] [CrossRef] [PubMed]
  84. Warren, Samuel D., and Louis D. Brandeis. 1890. The Right to Privacy. Harvard Law Review 4: 193–220. [Google Scholar] [CrossRef]
  85. WHO. 2022. Mental Health. Available online: https://www.who.int/news-room/fact-sheets/detail/mental-health-strengthening-our-response (accessed on 9 June 2023).
  86. WHO. 2023a. Digital Health. Available online: https://www.who.int/health-topics/digital-health#tab=tab_1 (accessed on 25 September 2023).
  87. WHO. 2023b. Depression. Available online: https://www.who.int/health-topics/depression#tab=tab_1 (accessed on 25 September 2023).
  88. Wylie, Louise. 2023. Wellness App Revenue and Usage Statistics (2023). Available online: https://www.businessofapps.com/data/wellness-app-market/ (accessed on 25 September 2023).
  89. Yin, Robert K. 1994. Case Study Research: Design and Methods. New York: Sage Publications. [Google Scholar]
  90. Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: Public Affairs. [Google Scholar]
Figure 1. The abductive process.
Figure 2. Multi-method qualitative data collection.
Figure 3. A paradigm shift towards public health and value in digital mental health.
Table 1. Examples of additional data collection.
App Name | Evidence from Terms and Conditions and Privacy Policies
Calm
  • “We offer various financial incentives. For example, we may provide incentives to customers who participate in a survey or provide testimonials.”
Headspace Health
  • “Headspace’s legitimate interest…”: “Our business interest in requesting that you partake in Product surveys in order to better understand your needs and expectations.”
  • “If you choose to engage with the “My Progress” feature, we will collect and store your responses to both the Perceived Stress Scale questionnaire and free-form text box.”
Moodkit
  • “We ask for your explicit permission to access your facial data, which is accessed through Apple Face Tracking. If you permit Moodnotes to access this data we will utilise Apple’s Face Tracking sensor to map certain facial data points and Moodnotes will attempt to guess your mood which you can verify in app. The purpose of this optional feature is to help you gain a better understanding of your mood patters.”
Table 2. Top 10 mental health apps and their therapeutic foci and key metrics.
App Name | Therapeutic Area/App Focus | Key Metrics
Calm
Meditation
Sleep
Stress and Anxiety
Focus
Self-improvement
Founded: 2012
Type: for-profit; VC-backed
Funding amount: USD 218m to date (crunchbase.com, accessed on 10 November 2023)
Downloads: 2.33m in Q4 2022 (down from peak 9.54m in Q2 2020) (statistica.com, accessed on 10 November 2023)
Cost: 7 days free and then USD 69.99/month or USD 399.99/life
Happify Health
Positive Psychology
Mindfulness
Pattern breaking
Cognitive Behavioural Therapy (CBT)
Founded: 2012
Type: for-profit; VC-backed
Funding amount: USD 118.7m to date (crunchbase.com, accessed on 10 November 2023)
Downloads: 500,000+ (https://play.google.com/, accessed on 10 November 2023)
Cost: USD 14.99/month or USD 139.99/year
Headspace Health
Meditation
Mindfulness
Sleep
Happiness
Resilience
Focus
Founded: 2010
Type: for-profit; VC-backed
Funding amount: USD 215m to date (crunchbase.com, accessed on 13 November 2023)
Downloads: 1.32m in Q4 2022 (down from peak 4.18m in Q4 2018) (statistica.com, accessed on 13 November 2023)
Cost: 14 days free and then USD 12.99/month or USD 69.99/year
Circles Up
Online Emotional Support
Wellbeing
Personal Struggles (Grief, Anxiety, Fertility, Relationships, Parenting)
Peer Support
Professional Support
Founded: 2021
Type: for-profit; VC-backed
Funding amount: USD 16.5m to date (crunchbase.com, accessed on 13 November 2023)
Downloads: 10,000+
Cost: freemium but extended version is USD 49.99 per month or USD 149.99/year
Moodfit
Mood
Gratitude
CBT
Mindfulness
Sleep
Self-esteem
Negative habits
Founded: 2017
Type: for-profit; VC-backed
Funding amount: USD 580k to date (crunchbase.com, accessed on 13 November 2023)
Downloads: 50,000+ (https://play.google.com/, accessed on 13 November 2023)
Cost: freemium, premium subscription is tiered (USD 9.99, USD 19.99, and USD 39.99)
Thriveport/Moodkit
Mood and feelings
Bias and distorted thinking
Stress
CBT
Founded: 2010
Type: for-profit; VC-backed; Thriveport also owns Moodnotes and Sleepzy
Funding amount: n/a but estimate of USD 10,919 in 2020 based on ad revenue (https://thriveport.com.siteindices.com/, accessed on 13 November 2023)
Downloads: 10,000+ in 2015 (https://techcrunch.com/, accessed on 13 November 2023)
Cost: USD 4.99 for the download
Sanvello Health
Anxiety
Depression
Stress
Mood
Emotions
CBT
Online Therapy
Founded: 2019
Type: for-profit
Revenue: USD 7.3m in 2022 (https://growjo.com/, accessed on 13 November 2023)
Downloads: 3m+ (https://www.sanvello.com/, accessed on 13 November 2023)
Cost: freemium; premium subscription is tiered USD 8.99/month, USD 53.99/year, or USD 199.99/life
Sesame Workshop/Breathe, Think, Do with Sesame
Emotions
Breathing Techniques
Self-control
Resilience
Founded: 1969
Type: for-profit; also has 18 other children’s apps in portfolio
Funding amount: USD 100M to date, some grant-based (crunchbase.com, accessed on 14 November 2023)
Downloads: 100,000+ (https://play.google.com/, accessed on 14 November 2023)
Cost: free
Talkspace
Online therapy for:
Depression
Anxiety
Bipolar
OCD and PTSD
Postpartum Depression
Panic Disorders
Founded: 2012
Type: for-profit; VC-backed; IPO 2021
Funding amount: USD 413. to date (crunchbase.com, accessed on 14 November 2023)
Downloads: 500,000+ (https://play.google.com/, accessed on 14 November 2023)
Cost: app/site is free, but therapy sessions cost between USD 276 and USD 436 per month (https://www.everydayhealth.com/, accessed on 14 November 2023)
Teladoc Health/Betterhelp
Online therapy for:
Depression
Anxiety and Coping
Gender and Sex
Unhelpful thinking patterns
Founded: 2013
Type: for-profit; acquired by Teladoc Health in 2015 for USD 4.5m (crunchbase.com); Teladoc IPO 2015; Teladoc has many other services in portfolio (general medical, primary, mental health, specialists, and wellness)
Funding amount: USD 172m to date for Teladoc (crunchbase.com)
Downloads: 350,000 users in 2022 (https://bhbusiness.com/)
Cost: app/site is free, but therapy sessions cost between USD 240 and USD 360 per month (https://www.betterhelp.com/)
Table 3. “Data Sharing” as per T&Cs and PPs.
App Name | Evidence from Terms and Conditions and Privacy Policies
Calm
  • “We share information [usage, transactional, log, device, recorded phone and video, location, cookies] about you as follows… with companies and contractors that perform services for us… (with your consent and your direction) with third-party social media services via the integrated tools we provide via our services”.
  • “We use the information we collect to…personalize your online experience and the advertisements you see on other platforms based on your preferences, interests, and browsing behaviour”.
Happify Health
  • Disclosure of personal information: “We may share your data with our affiliates … Happify does NOT sell personal information to third parties… Partners: We may share your data [personal information, non-personal Information, and aggregate data] with other companies, such as companies with whom we jointly offer products and services. Disclosure Of Personal Information…Third Party Service Providers: We may share personal information with certain service providers…these include….data optimization and marketing services, content providers…
Headspace Health
  • “We use information [personal data, device information, tracking data and internet activity, ‘my progress’ and ‘buddy’ data] held about you in the following ways… To inform Partners about your registration and use of the Products as described under Corporate and Other Community Sharing… To serve our advertisements to you through third party platforms, such as Facebook or Google, on other sites and apps or across your devices”
  • “The security of your personal data is important to us… we do not provide your personal data to any third party without your specific consent… We do not sell your personal information to third parties.”
  • “Headspace is also offered through partnerships with organizations [‘partners’—corporations, governments, hospitals, universities and other organizations and groups] … when an entity provides access to Headspace to others, we call those “Communities” … The Partner may also have access to your community’s aggregated and anonymized general usage data (including ‘my progress’)”
Circles Up
  • “The Site and the Content may contain icons and links to third party websites (“Third Party Websites“), as well as other content from third parties (collectively “Third Party Content“). Circles has no control over the terms of use and privacy policies of third party websites and User accesses any such third party website at User’s own risk…Each User is advised to thoroughly review such third parties’ privacy policies and terms of use before making any use of such third party’s products and services…By clicking on a link and/or icon to a third-party website or service, a third party may transmit cookies to User.”
  • “To facilitate and customize the User’s experience of the Services and to track User’s use of the Services Circles may utilize cookies and other industry standard technologies.”
  • “Circles may be required to disclose Personal Information to relevant national, state and local law enforcement authorities, whom may further disclose such Personal Information”
Moodfit
  • “We do not sell your User Provided data to third parties. Only aggregated, anonymized data may be periodically transmitted to external services to help us improve the application and our service. We will share your information with third parties only in the ways that are described in this privacy statement”
Thriveport/Moodkit
  • “We may use the information we collect for a variety of purposes…Internal Research… Auditing Interactions with Consumers… Advertising/Marketing…
  • “We may match information collected from you through different means or at different times, including both personal information and Automated Information, and use such information along with information obtained from other sources. We may also aggregate and/or de-identify any information that we collect, such that the information no longer identifies any specific individual. We may use, disclose and otherwise process such information for our own legitimate business purposes—including historical and statistical analysis and business planning—without restriction.”
  • “We may share information about you with the following categories of third-party providers… Customer Communications and Insights Platforms… Internal Business Insights Platforms… Customer support… Measurement and Attribution… Other technology providers… Advertising/Marketing providers”
Sanvello Health
  • “We may use or disclose your health …to Business Associates that perform functions on our behalf or provide us with services if the information is necessary for such functions or services. Our business associates are required, under contract with us and pursuant to federal law, to protect the privacy of your information and are not allowed to use or disclose any information other than as specified in our contract and permitted by law… For Research Purposes such as research related to the evaluation of certain treatments or the prevention of disease or disability”
Sesame Workshop/Breathe, Think, Do with Sesame review
  • “We use third party service providers to help us collect and understand Usage Information and to support our marketing efforts.”
  • “We use personal information…to provide support to you when you request it, services, and security in partnership with our third-party service providers” and “to operate our organization, including by sharing with our subsidiaries, affiliates and other related entities.
  • “We may receive Personal Information about you from other sources, including our data broker services, data enhancement companies, list rental services, third-party analytics providers, and social media-owned databases, including via your interaction with our social media pages (this includes aggregate data on our social media followers (e.g., age, gender and location), engagement data (e.g., “likes,” comments, shares, reposts and clicks), awareness data (e.g., number of impressions and reach) and individual users’ public profiles).”
Talkspace
  • “What we do with it [personal data] … build, modify, and develop new products, features, and services… conduct clinical and other academic research, internally and with approved research partners and identify summary trends or insights for use in external communications… create anonymized and/or aggregated data to improve and deliver our services… analyze how our services are used so we can improve your experience… marketing, including tailoring advertising”
  • “Talkspace conducts or participates in research studies with select universities… Information from these studies may be published by third parties including through various media platforms/academic journals.”
Teladoc Health/Betterhelp
  • “We Process Visitor Data, Onboarding Data, Account Registration Data, User ID, Transaction Data, Therapy Quality Data, Therapist Data and Therapist Engagement Data to connect you with therapy services…communicate with you… to monitor and improve therapy quality…to personalize your web or app experience… to understand how you use our services, how we can improve our products and services to make them more effective and convenient, and offer you new features”
  • “We may share certain data with Service Providers… examples include… Data hosting and storage providers, Technology Service Providers…Customer Service Providers…Email management and communication Service Providers… Reporting and analytics Service Providers”
Table 4. App disclaimers.
App Name | Disclaimers in Terms and Conditions and Privacy Policies
Calm
  • “The Services, Products and Content you receive from Calm are non-clinical in nature, provided for informational purposes only” and “Calm is not a licensed medical care provider”.
Happify
  • “HAPPIFY™ IS NOT A MEDICAL OR HEALTH SERVICES ORGANIZATION PROVIDER” and “WE DO NOT ENGAGE IN PATIENT DIAGNOSIS OR THE PRACTICE OF MEDICINE”
Headspace Health
  • “We are not a health care or medical device provider, nor should our Products be considered medical advice.”
Circles Up
  • “CIRCLES DOES NOT PROVIDE ANY KIND OF MEDICAL SERVICES OR EMERGENCY SERVICE.”
  • … IN CASE YOU ARE SEEKING PROFESSIONAL PSYCHOLOGICAL COUNSELING, PLEASE REFER TO A PSYCHOTHERAPIST OR A PSYCHOLOGIST OR ANY OTHER MENTAL HEALTH PROFESSIONAL. PLEASE NOTE THAT CIRCLES DOES IN NO WAY PROVIDE ANY PSYCHOLOGICAL ADVICE.”
  • User hereby acknowledges that Circles does not in any way represent, warrant or guarantee any specific outcome or result of User’s use of the Services”
Thriveport/Moodkit
  • “YOUR USE OF THE APP AND ANY INFORMATION OR RECOMMENDATIONS PROVIDED IN THE APPS ARE AT YOUR SOLE RISK. THE ENTIRE RISK ARISING OUT OF USE OR PERFORMANCE OF THE APPS, INCLUDING ANY INFORMATION OR SUGGESTIONS PROVIDED IN ANY APP, REMAINS SOLELY WITH YOU.”
Sanvello Health
  • “Sanvello is the brand name…Sanvello, Inc. does not practice clinical social work or any other licensed profession and does not interfere with the practice of healthcare professionals, each of whom is responsible for his or her services and compliance with the requirements applicable to his or her profession and license.”
Sesame Workshop/Breathe, Think, Do with Sesame review
  • “THE SITES AND THE SITE CONTENT ARE PROVIDED “AS IS” AND “AS AVAILABLE” WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTY FOR INFORMATION, DATA, DATA PROCESSING SERVICES, UPTIME OR UNINTERRUPTED ACCESS, ANY WARRANTIES CONCERNING THE AVAILABILITY, ACCURACY OR USEFULNESS OF SITE CONTENT AND ANY IMPLIED WARRANTIES OF MERCHANTABILITY, SATISFACTORY QUALITY, FITNESS FOR A PARTICULAR PURPOSE, TITLE OR NONINFRINGEMENT, WHICH ARE EXCLUDED FROM THIS AGREEMENT TO THE EXTENT THAT THEY MAY BE EXCLUDED AS A MATTER OF LAW”
Talkspace
  • “DO NOT USE THIS SERVICE FOR EMERGENCY MEDICAL NEEDS” … “UNDER NO CIRCUMSTANCES SHALL TALKSPACE, ANY TALKSPACE LICENSOR OR SUPPLIER, OR ANY THIRD PARTY WHO PROMOTES THE SERVICE OR PROVIDES YOU WITH A LINK TO THE SERVICE BE LIABLE IN ANY WAY FOR YOUR USE OF THE SERVICE OR ANY OF ITS CONTENT”
Teladoc Health/Betterhelp
  • “The Therapists are independent providers who are neither our employees nor agents nor representatives. The Platform’s role is limited to enabling the Therapist Services. The Therapists themselves are responsible for the performance of the Therapist Services.”
Table 5. Staking a claim on the user’s data.
App Name | Evidence from Terms and Conditions and Privacy Policies
Calm
  • “Calm and its licensors exclusively own all right, title and interest in and to the Services, Products and Content, including all associated intellectual property rights. “and “Subject to these Terms, Calm grants you a limited, non-exclusive, non-transferable, non-sublicensable, revocable license to access and use the Content solely… solely for your personal and non-commercial purposes. “
Happify
  • “Happify™ reserves the right to transfer and/or sell aggregate or group data about Happify™ users (including Usage Data not linked to Personal Information)”
  • “We may share personal information with third parties in connection with a transaction, such as a merger, sale of company assets or shares, reorganization, financing, change of control or acquisition of all or a portion of our business, or in the event of a bankruptcy or related or similar proceedings.”
Headspace Health
  • In the event that we sell or buy any business or assets, in which case we may disclose your personal data to the prospective seller or buyer of such business or assets… If Headspace or substantially all of our assets are acquired by a third party, in which case personal data held by us about our customers will be one of the transferred assets”.
  • “All materials (including software and content whether downloaded or not) contained in the products are owned by Headspace (or our affiliates and/or third-party licensors, where applicable), unless indicated otherwise”
Circles Up
  • “In the event that Circles is sold, whether by merger, sale of assets or otherwise, Personal Information collected hereunder may be one of the assets sold in connection with such transaction. Personal Information collected hereunder may also be disclosed in connection with a commercial transaction where Circles is seeking financing, investment, or support.”
Moodfit
  • If the Company is involved in a merger, acquisition, or sale of all or a portion of its assets, you will be notified via email and/or a prominent notice on our Web site of any change in ownership or uses of this information, as well as any choices you may have regarding this information.”
Thriveport/Moodkit
  • “We may share information about you in connection with (including during the evaluation or negotiation of) a corporate change or dissolution, including for example a merger, acquisition, reorganization, consolidation, bankruptcy, liquidation, sale of assets or wind-down of a business (each a “Corporate Transaction”)”
Sanvello Health
  • “All rights, title and interest in and to the Website, including the Content, and all intellectual property rights, including all copyright, trademark, patent and trade secret rights therein shall remain with the Company and our licensors and vendors, and no ownership interest is transferred to you or any other entity.”
Sesame Workshop/Breathe, Think, Do with Sesame review
  • “To transfer your Personal Information to a new or reorganized entity in the event of a reorganization, merger, sale, assignment, bankruptcy, or similar change, for the new entity to use in accordance with this Privacy Policy.”
Talkspace
  • “All Content available on or through the Service is the property of Talkspace or its licensors and is protected by copyright, trademark, patent, trade secret and other intellectual property law.”
  • “Talkspace hereby grants you a limited, revocable, non-transferable, and non-exclusive license to use the software, network facilities, content, and documentation on and in the Service to the extent, and only to the extent, necessary to access and use the Service.”
Teladoc Health/Betterhelp
  • “We aren’t paid by anyone for any data. However, in California, the laws define “sale” broadly to include the sharing of personal information in exchange for anything of value”.
  • “We may share some of your data in connection with an asset sale, merger or bankruptcy.”
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
