Article

Exploring Privacy Leakage in Platform-Based Enterprises: A Tripartite Evolutionary Game Analysis and Multilateral Co-Regulation Framework

1 School of Humanities, Jiaozuo University, Jiaozuo 454000, China
2 School of Information Management, Zhengzhou University, Zhengzhou 450001, China
* Authors to whom correspondence should be addressed.
Information 2025, 16(3), 193; https://doi.org/10.3390/info16030193
Submission received: 22 January 2025 / Revised: 24 February 2025 / Accepted: 25 February 2025 / Published: 2 March 2025

Abstract
Currently, some platform-based enterprises sell users' private information to capture high revenue, which poses a serious threat to users' privacy and security and challenges the work of regulators. Drawing on routine activity theory, this paper constructs a tripartite evolutionary game model among platform-based enterprises, users, and regulators. It explores the equilibrium strategies for the leakage of users' private information on online platforms under the concept of multiparty governance and designs four simulation experiments based on the revenue intervals of platform-based enterprises' decisions. Finally, the model is validated by means of simulation. The results show that when platform-based enterprises profit little from compromising users' privacy, users are motivated to participate in shared governance with minimal incentives from the regulator. Regulators can effectively deter privacy leakage by fining platform-based enterprises that leak user privacy information and requiring them to compensate users for their losses; in this case, traditional regulation suffices to control privacy leakage effectively. When platform-based enterprises profit greatly from leaking users' privacy, they adhere to their privacy-leaking strategy. At this point, the privacy security problem is at its most serious, and users' privacy rights are not effectively protected. Users and regulators will tend to cooperate to form a multiparty regulatory system, but this does not influence the strategy choice of platform-based enterprises.

1. Introduction

Online platforms have become indispensable intermediaries in the deep integration of information technology with socio-economic development, serving as critical bridges connecting consumers and enterprises. By December 2024, China had 1.108 billion internet users and an internet penetration rate of 78.6%, with each user spending an average of approximately 28.7 h online every week [1]. These statistics position China as the largest and most dynamic digital society globally. Within this digital ecosystem, users have become accustomed to a wide range of platform services, often voluntarily disclosing personal information to access efficient, convenient, and personalized offerings [2]. Concurrently, platform-based enterprises leverage advanced data mining techniques to collect, aggregate, and analyze fragmented user data from websites, mobile devices, applications, and sensors. These data collection, aggregation, and analysis practices enable precise recommendations and targeted marketing strategies, significantly enhancing customer engagement and conversion rates [3]. The inherent attributes of online behavior—specifically its anonymity, virtual nature, and indirect interactions—give rise to significant information asymmetries between users and platforms. As a result, when privacy breaches occur, users often face considerable disadvantages, hindered by the complexity of rights protection processes, prohibitive costs, and the difficulties of evidence collection [4]. In some instances, users choose to remain silent due to factors such as an inability to effectively address the issue, inadequate responses to their concerns, privacy concerns regarding data collection, or technical complexities in smart environments [5].
The structural characteristics of online platforms—characterized by diverse stakeholders, varied activities, and multifaceted behavioral patterns—significantly complicate the detection of user privacy breaches. These structural factors, combined with the economic factors of low breach costs and high illicit rewards, create conditions that incentivize unauthorized acquisition and sale of personal information. Existing research highlights mobile applications, malicious software development kits (SDKs), and Internet of Things (IoT) devices as prominent channels for privacy leakage. The financial incentives driving the sale of personal data have given rise to a complex criminal and gray market supply chain, encompassing upstream, midstream, and downstream segments [6]. While online platforms provide users with efficient and personalized services, they have also become primary vectors for privacy breaches. Relying solely on game-theoretical interactions between users and platform enterprises is insufficient to safeguard consumer privacy. Consequently, robust governmental regulatory measures are imperative. Such regulations should clearly define the rights and obligations of platforms in handling personal information and establish comprehensive frameworks for personal data protection. These efforts are essential not only for ensuring the orderly development of the digital economy but also for enhancing user security, well-being, and satisfaction.
At the legal level, the Data Security Law of the People's Republic of China [7] and the Personal Information Protection Law of the People's Republic of China [8] explicitly address platform misconduct, including tampering, destruction, disclosure, unauthorized acquisition, misuse, excessive collection, and even the public sale of personal information. These laws integrate such actions into a stringent regulatory framework, establishing an "information notification" system that enables users to control their data through explicit consent requirements, and imposing specific privacy protection obligations on large platform-based enterprises, including conducting privacy impact assessments and appointing data protection officers. China's approach emphasizes centralized governance and proactive regulatory obligations, in contrast with the U.S. sectoral model, which relies heavily on self-regulation and ex-post enforcement, as exemplified by laws like the California Consumer Privacy Act (CCPA). Both jurisdictions face similar challenges in balancing privacy protection with technological innovation, though China's comprehensive oversight model incurs substantial financial and administrative burdens, particularly in enforcing regulations across diverse and segmented industries [9]. Moreover, traditional regulatory information channels remain limited, reducing the efficiency of detecting privacy breaches by platform enterprises [10]. To bridge the divide between limited regulatory resources and the extensive scope of regulatory subjects, and to address challenges such as user 'silent governance' and regulatory authorities' 'limited governance', the China Personal Information Security and Privacy Protection Report [11] proposes a multi-stakeholder collaborative governance model. This approach positions privacy leakage by platform enterprises as a complex societal issue, unsolvable through corporate self-regulation or government oversight alone. Instead, it emphasizes the active participation of diverse societal actors in the governance process, advocating a transition from the traditional, government-centric regulatory model to a diversified, multi-stakeholder framework. This paradigm shift holds both theoretical and practical significance. It enhances the efficiency of identifying regulatory breaches, fosters broader societal participation in addressing privacy violations, and reduces the likelihood of privacy leakage by platform enterprises. Consequently, this collaborative governance model emerges as a robust and effective strategy for mitigating the risks associated with personal data breaches.
To enhance the effective protection of user privacy rights and improve the regulation of platform-based enterprises’ disclosure of user privacy information, this study investigates the privacy decision-making behavior of platform-based enterprises. An evolutionary game model is constructed to analyze interactions among three key stakeholders—platform-based enterprises, users, and regulatory agencies—under various payoff scenarios. First, a theoretical analysis of platform-based enterprises’ privacy disclosure behavior is conducted using routine activity theory. Second, guided by the principles of multi-stakeholder governance, an evolutionary game model is developed to capture the dynamic interactions among platform-based enterprises, users, and regulatory agencies. Finally, numerical simulations are employed to evaluate equilibrium strategy choices among these stakeholders, assess the robustness of the findings, and identify directions for future research. This study extends the application of multi-stakeholder governance to the domain of privacy breach research, offering novel insights into the privacy decision-making processes of platform-based enterprises. It also provides practical implications for designing and implementing systematic and comprehensive privacy protection frameworks, contributing to both theoretical advancement and improved regulatory practices.

2. Literature Review

2.1. Privacy Leakage-Related Research

Privacy leakage refers to the unauthorized disclosure of personal or organizational transaction data, email data, social media activity, location information, medical records, or identity information to insecure or harmful environments, whether intentionally or unintentionally [12]. Privacy leakage can be categorized into active leakage and passive leakage [13]. Active leakage pertains to the intentional act of individuals or platforms disclosing private information to third parties during online activities, often to obtain convenient services or excessive profit, as exemplified by Facebook’s Cambridge Analytica scandal, where user data were shared without explicit consent for political advertising purposes. In contrast, passive leakage refers to the unintentional collection of personal or platform-related data by third-party organizations through illegal methods, such as hacking, malicious web scraping, or unauthorized data trading (e.g., the 2020 Weibo user data leak involving 538 million records). The specific form of privacy leakage investigated in this study primarily concerns online platforms disclosing user privacy to third parties without users’ awareness. Current research on privacy leakage predominantly focuses on identifying its underlying causes, conceptual measurement, detection, early warning systems, risk assessment, and the development of protective technologies, with particular attention to emerging challenges such as cross-border data flows [14] and AI-driven privacy violations [15].
In terms of causal analysis, Xu et al. [16] integrated the Theory of Planned Behavior and Privacy Calculus Theory to categorize the factors influencing users’ voluntary disclosure of private information. These factors are conceptualized as a cost–benefit trade-off between perceived benefits and privacy concerns, with privacy concerns influenced by elements such as privacy sensitivity, perceived privacy risk, information control, and subjective norms. Several theoretical frameworks have been applied to analyze the voluntary disclosure of private information: Rational Choice Theory [17] emphasizes cost–benefit tradeoffs in decision-making; Social Cognitive Theory [18] examines how social observation and self-efficacy shape information control; Privacy Calculus Theory [19] quantifies risk-reward dynamics; Dual-Process Theory (System II) [20] explains deliberate disclosure patterns under high cognitive load; Agency Theory [21] reveals power imbalances in data control negotiations; and Social Contract Theory [22] frames disclosure as an act of fulfilling digital citizenship obligations.
In terms of conceptual measurement, Zhu et al. proposed a privacy disclosure measurement algorithm utilizing WordNet reasoning. They validated the model’s effectiveness using publicly accessible information, and their findings demonstrated that this algorithm could more accurately identify actual instances of privacy disclosure through reasoning and measurement [23]. Many scholars, both domestically and internationally, have conducted extensive research on privacy leakage detection methods within social networks such as Weibo, WeChat, Facebook, and Twitter, proposing various detection and early warning models. For instance, Xu et al. explored the aggregation effects of different channels through which privacy information is leaked, particularly when sensitive information is embedded in unpublished text. They developed a multi-dimensional feature-based Convolutional Neural Network (MF-CNN) model for the rapid detection of privacy leakage in real-time published texts. The results demonstrated that this approach achieved high detection accuracy and met real-time detection requirements [24]. Similarly, Kang et al. highlighted a limitation in most existing privacy leakage detection systems, arguing that these systems are primarily designed for companies and enterprises and are, therefore, inadequate for individual user analysis. To address this, they conducted a survey of the WeChat platform and identified eight key social network features as indicators of potential privacy leakage. These features included account passwords shared during chats, WeChat wallet consumption records with non-friends, WeChat wallet transfer records with friends, strangers’ Moments settings, nearby person settings, friend addition settings, friends’ Moments settings, and information access from mini-programs. Based on these insights, they proposed an intelligent privacy leakage detection method utilizing Support Vector Machines (SVM) to identify and predict potential breaches [25].
Privacy leakage risk assessment serves as a crucial early warning mechanism to evaluate the extent to which platform-based enterprises infringe on user privacy. It aims to reduce the risk of user privacy leakage and promote a healthy online ecosystem. This research area has garnered significant attention among scholars. For instance, Zhu et al. introduced a risk coefficient based on shared preference probability distributions to measure both the access permission distribution of mobile applications and the patterns of privacy violations. To address the balance between user personalization preferences and the risks of privacy infringement, they developed an application recommendation algorithm called AppURank [26].
Current research on privacy protection primarily focuses on three main strategies: data distortion techniques, data encryption techniques, and restricted data disclosure techniques. Among these, data distortion techniques include randomization, blocking, aggregation, and differential privacy protection; data encryption techniques involve secure multi-party computation (SMC) and distributed anonymization; and restricted data disclosure techniques encompass methods such as K-anonymity, L-diversity, and T-closeness. For instance, Li and Chen proposed a personalized trajectory privacy protection scheme (PTPP) based on relationship strength. Their research explored location obfuscation algorithms utilizing noise radius methods to ensure geographical irreversibility and location clustering techniques. Furthermore, they introduced a hybrid social relationship strength computation model (HCM), which improves the logical accuracy of social relationship strength assessment. They then analyzed the usability and security of the PTPP algorithm within social network application scenarios, demonstrating its effectiveness in protecting user trajectory privacy data [10].

2.2. Privacy Regulation-Related Research

In response to the dynamic and complex online environment and the serious societal consequences of user privacy leakage, scholars have increasingly focused on exploring regulatory issues related to privacy security. Hoffmann et al. found that government adoption of intervention policies depends on factors such as the level of competition among senders, whether recipients exhibit caution regarding the collection of personalized data by senders, and whether companies possess the capability to implement differential pricing strategies. They noted that when sufficient competition exists among senders, policy intervention becomes unnecessary. However, in cases involving personalized pricing and insufficient competition, policy intervention proves beneficial by reducing the risk of user privacy exposure while ensuring that users can fully exercise their rights [27]. Tang Yaijia contended that open and transparent market mechanisms foster the efficient flow of privacy information and enhance the operational performance of the digital economy, thereby promoting overall social welfare. However, factors such as cognitive biases, bounded rationality, and heuristic constraints impair consumers’ ability to make fully rational privacy decisions. Furthermore, significant information asymmetries, insufficient bargaining power, and operational obstacles within personal privacy data markets exacerbate the tendency of platform-based enterprises to excessively disclose, misuse, or sell user privacy information without authorization. These issues lead to market failures in privacy protection, underscoring the necessity for strong market regulatory interventions to ensure effective privacy security governance. Importantly, Tang Yaijia emphasized that market mechanisms and government regulation should not be considered opposing forces but rather complementary and mutually reinforcing approaches. Coordinated efforts between these two mechanisms are essential for the successful governance of privacy leakage and the comprehensive protection of user privacy information [28].
In addition, several scholars have applied game theory methods to investigate privacy security issues. Du et al. developed an evolutionary game theory model incorporating community structures to simulate and analyze privacy protection within social networks exhibiting such structures. Their findings indicated that the protection of users’ private information is influenced not only by individual users’ strategies but also by the strategies adopted by other users [29]. Gopal et al. proposed a bilateral economic model, revealing that the revenue of publisher websites initially increases and then decreases as users’ privacy concerns grow. Meanwhile, the user surplus on platforms initially declines and later rises with increasing user privacy concerns, while the surplus of third-party entities consistently declines as user privacy concerns intensify. Furthermore, they observed that higher market concentration leads to greater information control by a few third parties, thereby heightening information asymmetries and exacerbating privacy security challenges [30]. Sun et al. developed an incomplete information evolutionary game model, incorporating adversarial deep learning incentives and the assumption of user-bounded rationality. This model seeks to aid users in balancing immediate privacy costs against long-term service quality benefits. Simulation experiments were conducted, and the results validated the effectiveness of the proposed strategy [31].
The findings from the above research indicate the following key points: Firstly, current studies on privacy leakage predominantly focus on technical dimensions, such as model development and algorithmic improvements. Few studies adopt a systems science perspective to explore the internal mechanisms underlying platform-based enterprises’ privacy decision-making and the interactive processes among various stakeholders. Secondly, while the existing literature on privacy breach regulation often highlights government regulatory efforts as the primary mechanism for addressing market failures in privacy protection, limited research has compared the differences in regulatory efficiency across various approaches and their distinct impacts on privacy governance. Lastly, most current research examines competitive relationships among platforms, users, third parties, hackers, and software vendors from an enterprise operation standpoint. However, there is a noticeable gap in analyses from a social governance perspective, especially regarding multi-party co-governance and the cooperative evolutionary game dynamics among diverse stakeholders.

3. Model Assumptions and Construction

The Routine Activity Theory [32] identifies three key elements that lead to criminal behavior: a motivated offender (Motive), a suitable target (Opportunity), and a lack of capable guardianship (Control). This framework highlights how crime arises from the convergence of these factors in specific situations. It has been widely applied to both traditional and modern contexts, such as cybercrime, where offenders exploit vulnerabilities in the absence of effective safeguards. In this study, motive is defined as the intrinsic driving force that prompts an actor to take action in order to achieve their intended goals. This concept is used to explain why platform-based enterprises disclose user privacy information. Opportunity refers to advantageous circumstances characterized by timeliness, risk, concealment, and competitiveness. It is used to explain why platform-based enterprises have the ability to disclose user privacy information. Control refers to the monitoring and corrective processes carried out by supervisory entities to ensure that organizational plans are implemented as intended. It explains why regulatory agencies have the responsibility to protect user privacy information from the risks associated with opportunities and motivations.
The motivated offender is the platform-based enterprise; the opportunity's target is the user; and the source of control is primarily the regulatory agencies. Because all participants are boundedly rational and subject to information asymmetry, an evolutionary game model grounded in biological evolution and genetic theory can depict the stable evolution strategies among the three parties more accurately than non-cooperative games based on the assumption of complete rationality. Routine activity theory [32] suggests that the motivation for platform-based enterprises to disclose user privacy information arises from the fact that the costs of privacy leakage are currently low while the benefits are high. Driven by opportunistic motives, platforms may disclose user privacy to third parties in order to gain substantial profit returns. Therefore, in the model, the benefits of platform-based enterprises disclosing user privacy information are assumed to be higher than the benefits of not disclosing user privacy information. Secondly, opportunities are primarily reflected in users, as they are the main participants in and experiencers of the daily activities of platform-based enterprises. The extent to which users are willing or able to report actual situations to regulatory agencies through multi-party governance when their own or others' privacy is leaked and they suffer losses is a key factor. Finally, control mainly comes from regulatory agencies, which, in the model, is primarily reflected as oversight of whether platform-based enterprises disclose user privacy information. This oversight can be categorized into two methods: traditional regulation and co-governance regulation. Traditional regulation refers to the operation of government regulatory systems and monitoring functions relying solely on government mechanisms. In contrast, co-governance regulation is a collaborative oversight model guided by the concept of social co-governance. This model involves multiple parties (including users, industry associations, and other relevant stakeholders) working together. Utilizing advances in information dissemination technologies and increasing user participation in privacy protection, regulatory agencies conduct multidimensional analyses of evaluations and feedback from these various parties regarding platform privacy leakage. On this basis, they establish privacy-breach early-warning systems and protection mechanisms, thereby developing a scientific and efficient approach to regulating privacy leakage. Therefore, although regulatory agencies maintain consistent procedures, investment costs, and penalties for addressing identified instances of platform-based enterprises disclosing user privacy information, the co-governance regulatory model requires a significant upfront investment from regulatory agencies. Based on the above analysis, this study proposes the following hypotheses.
Hypothesis 1.
The game dynamics of privacy leakage and governance involve three key participants: users (Participant 1), platform-based enterprises (Participant 2), and regulatory agencies (Participant 3). All participants operate under bounded rationality (reflecting limited cognitive resources, incomplete information, and time constraints) and information asymmetry, both of which influence their decision-making processes. Over time, their strategic choices evolve through iterative interactions, gradually converging toward a state of equilibrium.
Hypothesis 2.
The profits earned by platform-based enterprises from the legitimate use of user privacy information for commercial purposes are denoted as $I$, primarily reflecting gains from precision marketing and personalized advertising, which lower advertising costs, enhance efficiency, and improve marketing effectiveness. The operational cost for a platform-based enterprise not disclosing user privacy information is denoted as $C_p$, while the cost for disclosing user privacy information is denoted as $C_n$, with $C_p > C_n$. The cost for a user to participate in governance (e.g., providing feedback to regulatory agencies on platform privacy practices) is denoted as $C_k$, whereas the cost of not participating is 0. The cost incurred by regulatory agencies for enforcement is denoted as $C_e$, and the initial investment cost for adopting a co-governance regulatory approach when users participate in governance is represented by $C_d$.
Hypothesis 3.
The reputation of platform-based enterprises is critical in fostering a positive corporate image and generating reputation-driven economic benefits. When platform-based enterprises choose not to disclose user privacy information, user participation in cooperative governance signals positive evaluations of the platform to society and regulatory agencies. This reputation effect yields social benefits for the platform, denoted as $\eta_1$, and provides a direct benefit to the participating users, represented by $G_a$ (primarily reflected in the minimum gain offered by regulatory agencies to incentivize participation in cooperative governance). Conversely, if users abstain from cooperative governance, their corresponding benefit is 0.
When platform-based enterprises adopt a strategy of disclosing user privacy, they gain direct profits denoted as $H$. If users choose not to participate in cooperative governance, they incur a loss represented by $K$. Conversely, if users choose to engage in cooperative governance, the affected users receive compensation from the platform, denoted as $G_b$. Furthermore, the platform suffers reputation losses, represented by $\eta_2$, as a result of users providing negative feedback to regulatory agencies.
Hypothesis 4.
The strategy space for platform-based enterprises is defined as $\beta = \{\beta_1, \beta_2\} = \{\text{do not disclose user privacy information}, \text{disclose user privacy information}\}$, where the proportion of platform-based enterprises choosing $\beta_1$ is $x$ $(0 \le x \le 1)$ and the proportion choosing $\beta_2$ is $1-x$. The strategy space for users is defined as $\alpha = \{\alpha_1, \alpha_2\} = \{\text{participate in co-governance}, \text{do not participate in co-governance}\}$, where the proportion of users choosing $\alpha_1$ is $y$ $(0 \le y \le 1)$ and the proportion choosing $\alpha_2$ is $1-y$. The strategy space for regulatory agencies is defined as $\gamma = \{\gamma_1, \gamma_2\} = \{\text{co-governance regulation}, \text{traditional regulation}\}$, where the proportion of regulatory agencies choosing $\gamma_1$ is $z$ $(0 \le z \le 1)$ and the proportion choosing $\gamma_2$ is $1-z$.
Hypothesis 5.
When regulatory agencies adopt a co-governance regulation strategy, the probability of identifying platform-based enterprises' behavior of disclosing user privacy information is $\varepsilon_1$ $(0 \le \varepsilon_1 \le 1)$. When regulatory agencies adopt a traditional regulation strategy, the probability of identifying such behavior is $\varepsilon_2$ $(0 \le \varepsilon_2 \le 1)$. Furthermore, because co-governance supervision relies on information technology and other methods to improve the efficiency of information transmission and analysis, $\varepsilon_1 > \varepsilon_2$. The maximum penalty amount imposed by regulatory agencies on platform-based enterprises for disclosing user privacy information after detection is $M$. The penalty severity is denoted as $\varphi$ $(0 \le \varphi \le 1)$. Whether through co-governance regulation or traditional regulation, the supervision by regulatory agencies of whether platform-based enterprises leak users' privacy serves to protect users' legitimate privacy rights from infringement while fostering a favorable environment for industry development. This brings social benefits $S$ to the regulatory agencies, including reputation, public trust, and user confidence.
Based on the above assumptions, the main parameters of the constructed model are shown in Table 1:
Based on the above assumptions, this study constructs the mixed-strategy game matrix for platform-based enterprises, users, and regulatory agencies, as shown in Table 2:
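To make the payoff structure concrete, the sketch below encodes the pure-strategy payoffs of Table 2 as a Python function, reconstructed from the expected-payoff expressions derived in Section 4.1. It is an illustrative reconstruction rather than code from the study; in particular, the numeric example at the end assumes a split of $\eta_1 + \eta_2$ and of $C_p - C_n + H$, whereas the paper fixes only the sums.

```python
from dataclasses import dataclass

@dataclass
class Params:
    I: float      # platform profit from legitimate commercial use of user data
    Cp: float     # platform operating cost when not disclosing (C_p)
    Cn: float     # platform operating cost when disclosing (C_n, with C_p > C_n)
    H: float      # platform's direct profit from disclosing user privacy
    Ck: float     # user cost of participating in co-governance
    K: float      # user loss when privacy is disclosed
    Ga: float     # user gain from co-governance when the platform does not disclose
    Gb: float     # compensation paid to users when disclosure is detected
    eta1: float   # platform reputation gain (no disclosure, users participate)
    eta2: float   # platform reputation loss (disclosure, users participate)
    Ce: float     # regulator enforcement cost
    Cd: float     # regulator's initial co-governance investment
    M: float      # maximum penalty
    phi: float    # penalty severity, 0 <= phi <= 1
    eps1: float   # detection probability under co-governance regulation
    eps2: float   # detection probability under traditional regulation (eps1 > eps2)
    S: float      # regulator's social benefit

def payoffs(p: Params, disclose: bool, participate: bool, co_gov: bool):
    """Pure-strategy payoffs (platform, user, regulator) for one profile."""
    # Co-governance raises the detection probability only when users participate,
    # and the regulator's initial investment C_d is incurred only in that case.
    eps = p.eps1 if (co_gov and participate) else p.eps2
    cd = p.Cd if (co_gov and participate) else 0.0
    if not disclose:
        platform = p.I - p.Cp + (p.eta1 if participate else 0.0)
        user = (p.Ga - p.Ck) if participate else 0.0
        regulator = p.S - p.Ce - cd
    else:
        platform = (p.I + p.H - p.Cn
                    - (p.Gb + p.M * p.phi) * eps
                    - (p.eta2 if participate else 0.0))
        user = p.Gb * eps - p.K - (p.Ck if participate else 0.0)
        regulator = (p.S + p.M * p.phi) * eps - p.Ce - cd
    return platform, user, regulator

# Example: eta1=7 / eta2=8 and Cp=8 / Cn=5 / H=12 are assumed splits of the
# sums eta_1 + eta_2 = 15 and C_p - C_n + H = 15 used later in Section 5.
p = Params(I=10, Cp=8, Cn=5, H=12, Ck=5, K=5, Ga=10, Gb=35, eta1=7, eta2=8,
           Ce=20, Cd=5, M=50, phi=0.7, eps1=0.6, eps2=0.3, S=100)
print(payoffs(p, disclose=True, participate=True, co_gov=True))
```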

4. Evolutionary Game Model Analysis

4.1. Model Building and Solving

In evolutionary game models, the behavior strategies of different participants mutually influence and constrain each other. To maximize expected payoffs, participants continuously adjust their strategies based on the actions of other participants, ultimately reaching a long-term dynamic equilibrium.
Based on this, the mixed-strategy game matrix of platform-based enterprises, consumers, and regulatory agencies (Table 2) is used to establish the dynamic replicator equations for each participant. These equations are then employed to compute the equilibrium points of the tripartite evolutionary game system, thus enabling analysis of the formation conditions and processes of the evolutionary game.
The solution process is as follows:
(1) Platform-based enterprises side
The expected payoff for a platform-based enterprise choosing not to disclose consumer privacy information is given by
$$E_{x1} = zy\left(I + \eta_1 - C_p\right) + (1-z)y\left(I + \eta_1 - C_p\right) + (1-y)z\left(I - C_p\right) + (1-y)(1-z)\left(I - C_p\right) = y\left(I + \eta_1 - C_p\right) + (1-y)\left(I - C_p\right) = I - C_p + \eta_1 y.$$
The expected payoff for a platform-based enterprise choosing to disclose consumer privacy information is given by
$$E_{x2} = zy\left(I + H - C_n - G_b\varepsilon_1 - \eta_2 - M\varphi\varepsilon_1\right) + (1-z)y\left(I + H - C_n - G_b\varepsilon_2 - \eta_2 - M\varphi\varepsilon_2\right) + (1-y)z\left(I + H - C_n - M\varphi\varepsilon_2 - G_b\varepsilon_2\right) + (1-y)(1-z)\left(I + H - C_n - M\varphi\varepsilon_2 - G_b\varepsilon_2\right) = I + H - C_n - y\eta_2 - \left(G_b + M\varphi\right)\left[yz\varepsilon_1 + (1-yz)\varepsilon_2\right].$$
The average expected payoff for a platform-based enterprise’s strategy choice is given by
$$\bar{E}_x = xE_{x1} + (1-x)E_{x2}.$$
It can be observed that the replication dynamic equation for a platform-based enterprise not disclosing user privacy information is given by
$$F(x) = \frac{dx}{dt} = x\left(E_{x1} - \bar{E}_x\right) = x\left[E_{x1} - xE_{x1} - (1-x)E_{x2}\right] = x(1-x)\left(E_{x1} - E_{x2}\right) = x(1-x)\left[y\eta_1 + y\eta_2 - C_p - H + C_n + \left(G_b + M\varphi\right)\left(yz\varepsilon_1 + (1-yz)\varepsilon_2\right)\right].$$
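As a sanity check on this reduction, the following sympy sketch (an illustrative aid, not part of the original analysis) verifies symbolically that $E_{x1} - E_{x2}$ equals the bracketed term in $F(x)$:

```python
import sympy as sp

y, z = sp.symbols("y z")
I, H, Cp, Cn, Gb, M, phi = sp.symbols("I H C_p C_n G_b M varphi")
eta1, eta2, e1, e2 = sp.symbols("eta_1 eta_2 varepsilon_1 varepsilon_2")

# Expected payoffs of not disclosing (Ex1) and disclosing (Ex2), as derived above.
Ex1 = y * (I + eta1 - Cp) + (1 - y) * (I - Cp)
Ex2 = (z * y * (I + H - Cn - Gb * e1 - eta2 - M * phi * e1)
       + (1 - z) * y * (I + H - Cn - Gb * e2 - eta2 - M * phi * e2)
       + (1 - y) * (I + H - Cn - (M * phi + Gb) * e2))

# Bracketed term of F(x) from the replicator equation.
bracket = (y * eta1 + y * eta2 - Cp - H + Cn
           + (Gb + M * phi) * (y * z * e1 + (1 - y * z) * e2))

print(sp.simplify(Ex1 - Ex2 - bracket))  # prints 0 if the reduction holds
```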
(2) User side
The expected payoff for a user choosing to participate in co-governance is given by
$$E_{y1} = xz\left(G_a - C_k\right) + x(1-z)\left(G_a - C_k\right) + (1-x)z\left(G_b\varepsilon_1 - C_k - K\right) + (1-x)(1-z)\left(G_b\varepsilon_2 - C_k - K\right) = x\left(G_a - C_k\right) - (1-x)\left(C_k + K\right) + (1-x)zG_b\varepsilon_1 + (1-x)(1-z)G_b\varepsilon_2 = xG_a - C_k - (1-x)K + (1-x)G_b\left[z\varepsilon_1 + (1-z)\varepsilon_2\right].$$
The expected payoff for a user choosing not to participate in co-governance is given by
$$E_{y2} = (1-x)z\left(G_b\varepsilon_2 - K\right) + (1-x)(1-z)\left(G_b\varepsilon_2 - K\right) = (1-x)\left(G_b\varepsilon_2 - K\right).$$
The average expected payoff for a user is given by
$$\bar{E}_y = yE_{y1} + (1-y)E_{y2}.$$
Based on the above, the replicator dynamic equation for users participating in co-governance is given by
$$F(y) = \frac{dy}{dt} = y\left(E_{y1} - \bar{E}_y\right) = y\left[E_{y1} - yE_{y1} - (1-y)E_{y2}\right] = y(1-y)\left(E_{y1} - E_{y2}\right) = y(1-y)\left[xG_a - C_k + (1-x)G_b z\left(\varepsilon_1 - \varepsilon_2\right)\right].$$
(3) Regulatory agencies side
The expected payoff for the regulatory agencies choosing co-governance regulation is given by
$$E_{z1} = xy\left(S - C_e - C_d\right) + x(1-y)\left(S - C_e\right) + (1-x)y\left(S\varepsilon_1 + M\varphi\varepsilon_1 - C_e - C_d\right) + (1-x)(1-y)\left(S\varepsilon_2 - C_e + M\varphi\varepsilon_2\right) = xS - C_e - yC_d + (1-x)\left(S + M\varphi\right)\left[y\varepsilon_1 + (1-y)\varepsilon_2\right].$$
The expected payoff for the regulatory agencies choosing traditional regulation is given by
$$E_{z2} = xy\left(S - C_e\right) + x(1-y)\left(S - C_e\right) + (1-x)y\left(S\varepsilon_2 + M\varphi\varepsilon_2 - C_e\right) + (1-x)(1-y)\left(S\varepsilon_2 - C_e + M\varphi\varepsilon_2\right) = xS + (1-x)\left(S + M\varphi\right)\varepsilon_2 - C_e.$$
The average expected payoff for the regulatory agencies’ strategy choice is given by
$$\bar{E}_z = zE_{z1} + (1-z)E_{z2}.$$
Based on the above, the replication dynamic equation for the regulatory agencies’ choice of co-governance is given by
$$F(z) = \frac{dz}{dt} = z\left(E_{z1} - \bar{E}_z\right) = z(1-z)\left(E_{z1} - E_{z2}\right) = z(1-z)\left[-yC_d + (1-x)\left(S + M\varphi\right)y\left(\varepsilon_1 - \varepsilon_2\right)\right].$$
The Jacobian matrix is constructed as follows.
$$J = \begin{bmatrix} (1-2x)\left[y\eta_1 + y\eta_2 - C_p - H + C_n + (G_b + M\varphi)\left(yz\varepsilon_1 + (1-yz)\varepsilon_2\right)\right] & x(1-x)\left[\eta_1 + \eta_2 + (G_b + M\varphi)z\left(\varepsilon_1 - \varepsilon_2\right)\right] & x(1-x)(G_b + M\varphi)y\left(\varepsilon_1 - \varepsilon_2\right) \\ y(1-y)\left[G_a - G_b z\left(\varepsilon_1 - \varepsilon_2\right)\right] & (1-2y)\left[xG_a - C_k + (1-x)G_b z\left(\varepsilon_1 - \varepsilon_2\right)\right] & y(1-y)(1-x)G_b\left(\varepsilon_1 - \varepsilon_2\right) \\ -z(1-z)(S + M\varphi)y\left(\varepsilon_1 - \varepsilon_2\right) & z(1-z)\left[(1-x)(S + M\varphi)\left(\varepsilon_1 - \varepsilon_2\right) - C_d\right] & (1-2z)\left[(1-x)(S + M\varphi)y\left(\varepsilon_1 - \varepsilon_2\right) - yC_d\right] \end{bmatrix}$$
The system equilibrium points and their corresponding eigenvalues can be determined by solving $F(x) = 0$, $F(y) = 0$, $F(z) = 0$, as shown in Table 3.
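The stability classification in Table 3 can be reproduced numerically. The sketch below re-implements the replicator vector field using the Section 5 baseline parameters (with an assumed $C_p - C_n + H = 15$), approximates the Jacobian at each of the eight pure-strategy corner points by central differences, and inspects the eigenvalues; any zero eigenvalue means the point cannot be classified as asymptotically stable. The finite-difference step size is an assumption of this sketch.

```python
import itertools
import numpy as np

# Replicator vector field with the Section 5 baseline values; delta stands for
# the combined term C_p - C_n + H, and Mphi for M * phi.
def field(x, y, z, eta12=15.0, delta=15.0, Ga=10.0, Gb=35.0, Ck=5.0,
          Cd=5.0, Mphi=35.0, e1=0.6, e2=0.3, S=100.0):
    fine = Gb + Mphi
    det = y * z * e1 + (1 - y * z) * e2   # expected detection probability
    return np.array([
        x * (1 - x) * (y * eta12 - delta + fine * det),
        y * (1 - y) * (x * Ga - Ck + (1 - x) * Gb * z * (e1 - e2)),
        z * (1 - z) * y * ((1 - x) * (S + Mphi) * (e1 - e2) - Cd),
    ])

def jacobian(f, pt, h=1e-6):
    """Central-difference Jacobian of the vector field f at point pt."""
    pt = np.asarray(pt, dtype=float)
    J = np.zeros((3, 3))
    for j in range(3):
        e = np.zeros(3)
        e[j] = h
        J[:, j] = (f(*(pt + e)) - f(*(pt - e))) / (2 * h)
    return J

# Classify the eight pure-strategy corner points E(x, y, z).
for corner in itertools.product([0.0, 1.0], repeat=3):
    eig = np.linalg.eigvals(jacobian(field, corner)).real
    verdict = "asymptotically stable" if (eig < 0).all() else "not asymptotically stable"
    print(f"E{corner}: eigenvalue real parts {np.round(eig, 2)} -> {verdict}")
```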

4.2. Analysis of the System Equilibrium Points in the Tripartite Evolutionary Game

(1) The replicator dynamic equation for the probability that platform-based enterprises choose the "not to disclose user privacy" strategy is
$$F(x) = x(1-x)\left[y\eta_1 + y\eta_2 - C_p - H + C_n + \left(G_b + M\varphi\right)\left(yz\varepsilon_1 + (1-yz)\varepsilon_2\right)\right].$$
Let
$$D(z) = y\eta_1 + y\eta_2 - C_p - H + C_n + \left(G_b + M\varphi\right)\left(yz\varepsilon_1 + (1-yz)\varepsilon_2\right).$$
According to the stability theorem of differential equations, the probability that the platform-based enterprises choose the “not to disclose user privacy” strategy must satisfy the following conditions to be in a stable state:
$$F(x) = 0 \quad \text{and} \quad dF(x)/dx < 0.$$
Since $\partial D(z)/\partial z > 0$, $D(z)$ is an increasing function with respect to $z$. Therefore, taking the first-order derivative of $F(x)$ with respect to $x$ yields:
$$\frac{dF(x)}{dx} = (1-2x)\left[y\eta_1 + y\eta_2 - C_p - H + C_n + \left(G_b + M\varphi\right)\left(yz\varepsilon_1 + (1-yz)\varepsilon_2\right)\right].$$
Therefore, when $z = \frac{y(\eta_1 + \eta_2) - C_p - H + C_n + (G_b + M\varphi)\varepsilon_2}{y(G_b + M\varphi)(\varepsilon_2 - \varepsilon_1)}$, we have $D(z) = 0$, $dF(x)/dx \equiv 0$, and $F(x) \equiv 0$. At this point, all $x$ are in a stable evolution state.
Based on the analysis, we can derive the following:
When $z > \frac{y(\eta_1 + \eta_2) - C_p - H + C_n + (G_b + M\varphi)\varepsilon_2}{y(G_b + M\varphi)(\varepsilon_2 - \varepsilon_1)}$, $D(z) > 0$ and $\left.dF(x)/dx\right|_{x=1} < 0$, so $x^* = 1$ is the stable evolution strategy. Conversely, when $z < \frac{y(\eta_1 + \eta_2) - C_p - H + C_n + (G_b + M\varphi)\varepsilon_2}{y(G_b + M\varphi)(\varepsilon_2 - \varepsilon_1)}$, $D(z) < 0$ and $\left.dF(x)/dx\right|_{x=0} < 0$, so $x^* = 0$ is the stable evolution strategy.
(2) The replication dynamic equation for the probability of users choosing to participate in co-governance is expressed as
$$F(y) = y(1-y)\left[xG_a - C_k + (1-x)G_b z\left(\varepsilon_1 - \varepsilon_2\right)\right].$$
Let
$$J(z) = xG_a - C_k + (1-x)G_b z\left(\varepsilon_1 - \varepsilon_2\right).$$
For the probability that users choose to participate in co-governance to remain in a stable state, it must satisfy $F(y) = 0$ and $dF(y)/dy < 0$.
Given that $\partial J(z)/\partial z > 0$, $J(z)$ is an increasing function with respect to $z$. Therefore, taking the first derivative of $F(y)$ with respect to $y$ gives
$$\frac{dF(y)}{dy} = (1-2y)\left[xG_a - C_k + (1-x)G_b z\left(\varepsilon_1 - \varepsilon_2\right)\right].$$
Therefore, when $z = \frac{C_k - xG_a}{(1-x)G_b(\varepsilon_1 - \varepsilon_2)}$, we have $J(z) = 0$, $dF(y)/dy \equiv 0$, and $F(y) \equiv 0$. Under these conditions, all values of $y$ are in a stable evolution state.
It can be derived that when $z > \frac{C_k - xG_a}{(1-x)G_b(\varepsilon_1 - \varepsilon_2)}$, $J(z) > 0$ and $\left.dF(y)/dy\right|_{y=1} < 0$, so $y^* = 1$ is the stable evolution strategy. When $z < \frac{C_k - xG_a}{(1-x)G_b(\varepsilon_1 - \varepsilon_2)}$, $J(z) < 0$ and $\left.dF(y)/dy\right|_{y=0} < 0$, so $y^* = 0$ is the stable evolution strategy.
(3) The differential equation for the evolutionary dynamics of the probability of the regulatory agencies choosing a co-governance strategy can be expressed as
$$F(z) = z(1-z)\left[-yC_d + (1-x)\left(S + M\varphi\right)y\left(\varepsilon_1 - \varepsilon_2\right)\right].$$
Let
$$R(x) = -yC_d + (1-x)\left(S + M\varphi\right)y\left(\varepsilon_1 - \varepsilon_2\right).$$
For the probability that regulatory agencies choose co-governance regulation to remain in a stable state, it must satisfy $F(z) = 0$ and $dF(z)/dz < 0$.
Since $\partial R(x)/\partial x < 0$, $R(x)$ is a decreasing function with respect to $x$. Therefore, taking the first derivative of $F(z)$ gives
$$\frac{dF(z)}{dz} = (1-2z)\left[-yC_d + (1-x)\left(S + M\varphi\right)y\left(\varepsilon_1 - \varepsilon_2\right)\right].$$
Therefore, when $x = 1 - \frac{C_d}{(S + M\varphi)(\varepsilon_1 - \varepsilon_2)}$, we have $R(x) = 0$, $dF(z)/dz \equiv 0$, and $F(z) \equiv 0$. At this point, all values of $z$ are in a stable evolution state.
It follows that when $x > 1 - \frac{C_d}{(S + M\varphi)(\varepsilon_1 - \varepsilon_2)}$, $R(x) < 0$ and $\left.dF(z)/dz\right|_{z=0} < 0$, so $z^* = 0$ is the stable evolution strategy. Conversely, when $x < 1 - \frac{C_d}{(S + M\varphi)(\varepsilon_1 - \varepsilon_2)}$, $R(x) > 0$ and $\left.dF(z)/dz\right|_{z=1} < 0$, so $z^* = 1$ is the stable evolution strategy.
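As a worked illustration of these thresholds, the following sketch evaluates the platform threshold $z^*$, the user threshold $z^*$, and the regulator threshold $x^*$ at an assumed interior mix $(x, y) = (0.5, 0.5)$, using the Section 5 baseline parameters and $C_p - C_n + H = 15$. The chosen mix is an assumption for illustration; a threshold outside $[0, 1]$ simply means the corresponding replicator term keeps one sign over the whole unit interval.

```python
# Section 5 baseline parameters; Cp_Cn_H stands for the combined C_p - C_n + H.
eta12, Cp_Cn_H = 15.0, 15.0
Ga, Gb, Ck, Cd = 10.0, 35.0, 5.0, 5.0
M, phi, eps1, eps2, S = 50.0, 0.7, 0.6, 0.3, 100.0
x, y = 0.5, 0.5                 # assumed interior population mix
fine = Gb + M * phi             # G_b + M * phi

# Platform threshold on z: solve D(z) = 0.
z_platform = (y * eta12 - Cp_Cn_H + fine * eps2) / (y * fine * (eps2 - eps1))
# User threshold on z: solve J(z) = 0.
z_user = (Ck - x * Ga) / ((1 - x) * Gb * (eps1 - eps2))
# Regulator threshold on x: solve R(x) = 0.
x_regulator = 1 - Cd / ((S + M * phi) * (eps1 - eps2))

print(f"z* (platform)  = {z_platform:.2f}")   # negative here: D(z) > 0 on [0, 1]
print(f"z* (user)      = {z_user:.2f}")       # 0 here: participation favored for z > 0
print(f"x* (regulator) = {x_regulator:.2f}")  # co-governance favored while x < x*
```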
Based on Table 3, when all the eigenvalues of the Jacobian matrix at an equilibrium point have negative real parts, the equilibrium point of the replicator dynamic system is asymptotically stable; however, if at least one eigenvalue is positive, the equilibrium point of the replicator dynamic system is either unstable or meaningless [33]. From the above analysis of the evolutionary game model's system equilibrium points, it can be observed that the stable evolution strategies among the platform-based enterprises, users, and regulatory agencies exhibit the following four equilibrium strategy combinations.

4.3. Combination Strategy

4.3.1. High-Loss Scenario

The high-loss scenario is defined by the condition $C_p - C_n + H < (M\varphi + G_b)\varepsilon_2$. This condition indicates that the platform-based enterprise's benefits from leaking users' privacy information are relatively small: less than the sum of the fines paid to the regulatory agencies and the compensation paid to users under the traditional regulatory strategy. Under this scenario, the equilibrium points are located at $E_5(1,0,0)$, $E_6(1,0,1)$, and $E_7(1,1,0)$. It can be observed that when the platform-based enterprise's benefits from leaking user privacy information are small, it tends to avoid loss by adopting the strategy of not leaking user privacy information. In this case, the loss from user privacy being leaked is minimized, and privacy-related issues are effectively controlled. Whether the regulatory agencies use co-governance or traditional regulatory methods does not affect the effectiveness of privacy governance. For users, when $G_a < C_k$, meaning that the user's benefits from participating in co-governance are less than the costs incurred, the stable equilibrium points are $E_5(1,0,0)$ and $E_6(1,0,1)$. Under these conditions, if the platform-based enterprise remains stable in not leaking user privacy information, the loss from privacy leakage is low, making users unwilling to incur additional costs to gather information and provide feedback to the regulatory agencies. As a result, users tend to adopt the strategy of not participating in co-governance. In this scenario, any regulatory strategy, whether traditional or co-governance-based, suffices. When $G_a > C_k$, meaning that users derive greater benefits from participating in co-governance than the costs they incur, the stable equilibrium point is $E_7(1,1,0)$. That is, the platform-based enterprise adopts a strategy of not leaking user privacy information, while users stabilize at the strategy of participating in co-governance. Users' stable strategy is incentivized by the minimum gain provided by the regulatory agencies to encourage their participation. Under this condition, the regulatory agencies can increase rewards for users' participation in co-governance. This encourages user involvement while simultaneously placing pressure on platform-based enterprises by monitoring their actions, thereby stabilizing the strategy of not leaking user privacy information. In this case, multi-party participation in co-governance is enhanced, and the incentive structure ensures that platform-based enterprises behave in ways that minimize the losses from leaking user privacy information.

4.3.2. Low-Yield Scenario

In the low-yield scenario, the condition is defined as $(M\varphi + G_b)\varepsilon_2 < C_p - C_n + H < (M\varphi + G_b)\varepsilon_2 + \eta_1 + \eta_2$. This condition indicates that the expected benefits for platform-based enterprises from disclosing user privacy information are higher than the sum of the fines paid to the regulatory authority and the compensation paid to users under the traditional regulatory strategy. However, these benefits are still lower than the combined costs of fines paid to the regulatory authority, compensation paid to users, and the reputation losses experienced by the platform-based enterprise due to user participation in co-governance under the traditional regulatory strategy.
At this point, depending on the initial strategy selection of the three parties, the equilibrium points are located at $E_1(0,0,0)$, $E_2(0,0,1)$, and $E_7(1,1,0)$. Specifically, when $G_b(\varepsilon_1 - \varepsilon_2) < C_k$, the equilibrium condition corresponding to $E_2(0,0,1)$ is satisfied, and when $G_a > C_k$, the equilibrium condition corresponding to $E_7(1,1,0)$ is satisfied.
From the above equilibrium strategy combinations, it can be observed that the strategy choices of platform-based enterprises are largely influenced by user strategies. Specifically, when users adopt a non-participation strategy in co-governance, platform-based enterprises tend to choose to disclose users' private information. Conversely, when users stabilize in a co-governance participation strategy, platform-based enterprises stabilize in a strategy of not disclosing users' private information.
Therefore, regulatory agencies should enhance guidance and incentives for users to participate in co-governance and increase penalties for platform-based enterprises disclosing users’ private information, along with compensation for users. These measures are essential to effectively avoid stable strategy combinations such as (disclosing users’ private information, non-participation in co-governance, traditional regulation) or (disclosing users’ private information, non-participation in co-governance, co-governance regulation).
Moreover, when G a > C k , meaning that the benefits users gain from participating in co-governance exceed the costs they incur, users are incentivized to participate in co-governance to safeguard their rights while deriving additional benefits. Consequently, user strategies stabilize through co-governance participation. Simultaneously, platform-based enterprises stabilize with a strategy of not disclosing users’ private information, while regulatory agencies stabilize at traditional regulation.
Under this strategy combination, regulatory agencies can take two approaches to enhance the effectiveness of user participation in co-governance. First, they can increase guidance and incentives for users to participate in co-governance. Second, they can improve the exposure of platform-based enterprises’ privacy violations during user co-governance efforts, such as adding offending platforms to cybersecurity governance blacklists. These measures strengthen the rewards-and-penalties mechanism and amplify its impact.

4.3.3. Medium-Yield Scenario

This scenario occurs when $(M\varphi + G_b)\varepsilon_2 + \eta_1 + \eta_2 < C_p - C_n + H < (M\varphi + G_b)\varepsilon_1 + \eta_1 + \eta_2$. This condition indicates that the benefits a platform-based enterprise gains from disclosing user privacy information exceed the sum of penalties paid to regulatory agencies, compensation paid to users, and reputation losses incurred due to user participation in co-governance under traditional regulation. However, these benefits are still lower than the combined penalties, compensation, and reputation losses that would arise under co-governance regulation.
At this stage, depending on the initial strategy selection points of the three parties, the stable equilibrium points are $E_1(0,0,0)$ and $E_2(0,0,1)$. For users, when $G_b(\varepsilon_1 - \varepsilon_2) < C_k$, the conditions for the equilibrium point $E_2(0,0,1)$ are satisfied. In this scenario, regulatory efforts at privacy governance lack effectiveness and fail to sufficiently constrain the behavior of platform-based enterprises. Consequently, the benefits of disclosing user privacy for platform-based enterprises remain high, resulting in significant user privacy losses. Due to information asymmetry and the inefficiency of traditional regulatory measures, users tend to stabilize at the strategy of not participating in co-governance. At the same time, regardless of whether the regulatory authority adopts co-governance regulation or traditional regulation, it cannot alter the strategy choices of platform-based enterprises.

4.3.4. High-Yield Scenario

This scenario occurs when $(M\varphi + G_b)\varepsilon_1 + \eta_1 + \eta_2 < C_p - C_n + H$. This condition indicates that the revenue gained by platform-based enterprises from disclosing user privacy exceeds the combined costs of fines paid to regulatory agencies, compensation paid to users, and the reputation losses incurred due to user participation in co-governance under co-governance regulation. In this scenario, the system's stable points are located at $E_1(0,0,0)$, $E_2(0,0,1)$, and $E_4(0,1,1)$. It can be observed that, for the platform-based enterprise, regardless of the strategies adopted by users and regulatory agencies, the platform consistently chooses the strategy of disclosing users' private information. For users, when $G_b(\varepsilon_1 - \varepsilon_2) < C_k$, the conditions corresponding to the equilibrium point $E_2(0,0,1)$ are satisfied. In this case, the system's stable equilibrium point is located at $E_2(0,0,1)$, where users stabilize at the strategy of not participating in co-governance, while the regulatory agencies adopt a co-governance regulatory strategy. When $G_b(\varepsilon_1 - \varepsilon_2) > C_k$ and $C_d < (S + M\varphi)(\varepsilon_1 - \varepsilon_2)$, the conditions corresponding to the equilibrium point $E_4(0,1,1)$ are satisfied. In this case, the system's stable equilibrium point is located at $E_4(0,1,1)$, where users stabilize at the strategy of participating in co-governance, while the regulatory agencies adopt a co-governance regulatory strategy. The emergence of this combination strategy can be attributed to the following reasons. In this case, the profit from platform-based enterprises disclosing user privacy is exceptionally high, surpassing even the sum of fines paid to regulatory agencies, compensation paid to users, and the reputation losses associated with user participation in co-governance. This significantly heightens the tendency of platform-based enterprises to engage in risky behavior by disclosing user privacy to gain substantial profits. Consequently, regardless of the strategies adopted by users and regulatory agencies, the platform-based enterprises' choice remains unaffected. Furthermore, when the compensation received by users from platform-based enterprises for privacy disclosure is lower than the cost of user participation in co-governance, combined with factors such as unequal bargaining power and information asymmetry, users stabilize at the strategy of not participating in co-governance. However, when the compensation paid by platform-based enterprises for user privacy disclosure exceeds the cost of user participation in co-governance and the initial investment costs for regulatory agencies to implement co-governance are relatively low, users are incentivized to participate in co-governance, and the regulatory agencies opt for a co-governance regulatory approach. This leads to the establishment of a multi-party co-governance system.

5. Simulation

To validate the effectiveness of the four combination strategies analyzed above, this study first establishes the initial constraints of the model based on the existing assumptions. These initial constraints are as follows: $I - C_p < 0$, $S + M\varphi\varepsilon_1 - C_e < S$, $H + I - C_n - \eta_2 - (M\varphi + G_b)\varepsilon_1 > 0$, $S - C_e > 0$, $G_a - C_k > 0$, $G_b\varepsilon_2 - C_k - K > 0$.
Next, based on the initial constraints, the following assumptions are made. The regulatory authority incurs a response cost when adopting either the co-governance or the traditional regulatory strategy, denoted as $C_e = 20$. When users participate in co-governance and the regulatory authority implements a co-governance strategy, the initial investment cost of the regulatory authority is $C_d = 5$. To encourage user participation in co-governance, the regulatory authority offers a minimum gain of $G_a = 10$. The cost of user participation in co-governance is $C_k = 5$. The reputation benefit and loss for platform-based enterprises sum to $\eta_1 + \eta_2 = 15$. The maximum penalty amount the regulatory authority can impose on a platform-based enterprise for disclosing user privacy is $M = 50$. The intensity of the penalty is $\varphi = 0.7$. Users who suffer losses and choose to participate in co-governance receive compensation from the platform-based enterprise for user privacy disclosure, denoted as $G_b = 35$. The probability that co-governance regulation identifies platform-based enterprises' user privacy disclosure behavior is $\varepsilon_1 = 0.6$. The probability that traditional regulation identifies such behavior is $\varepsilon_2 = 0.3$. The social benefit obtained by the regulatory authority is $S = 100$.
Finally, under the assumption that platform-based enterprises initially adopt the strategy of disclosing user privacy information, the parameters defined above, the replicator dynamic equations, and the combination-strategy scenarios are incorporated into MATLAB (R2024b) to conduct a simulation analysis. This analysis focuses on the evolutionarily stable strategies of the three parties involved, namely platform-based enterprises, users, and regulatory agencies, in the context of their strategic interactions. The simulation is carried out as follows:
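For readers without MATLAB, the sketch below approximates the same experiment in Python: it integrates the replicator system from an assumed interior starting point $(0.5, 0.5, 0.5)$ for the four scenario values of $C_p - C_n + H$ used in Sections 5.1, 5.2, 5.3 and 5.4. The solver choice, time horizon, and initial condition are assumptions of this sketch, not details reported in the paper.

```python
from scipy.integrate import solve_ivp

# Section 5 baseline parameters.
BASE = dict(eta12=15.0, Ga=10.0, Gb=35.0, Ck=5.0, Cd=5.0,
            M=50.0, phi=0.7, eps1=0.6, eps2=0.3, S=100.0)

def rhs(t, v, delta, p):
    """Replicator system; delta stands for the combined term C_p - C_n + H."""
    x, y, z = v
    fine = p["Gb"] + p["M"] * p["phi"]
    det = y * z * p["eps1"] + (1 - y * z) * p["eps2"]
    return [x * (1 - x) * (y * p["eta12"] - delta + fine * det),
            y * (1 - y) * (x * p["Ga"] - p["Ck"]
                           + (1 - x) * p["Gb"] * z * (p["eps1"] - p["eps2"])),
            z * (1 - z) * y * ((1 - x) * (p["S"] + p["M"] * p["phi"])
                               * (p["eps1"] - p["eps2"]) - p["Cd"])]

for label, delta in [("high-loss", 15), ("low-yield", 30),
                     ("medium-yield", 45), ("high-yield", 60)]:
    sol = solve_ivp(rhs, (0, 200), [0.5, 0.5, 0.5], args=(delta, BASE))
    x, y, z = sol.y[:, -1]
    print(f"{label:>12} (C_p - C_n + H = {delta}): x={x:.2f}, y={y:.2f}, z={z:.2f}")
```

Note that Sections 5.1 and 5.3 also vary $G_a$ (2 versus 10) and $C_k$ (12 instead of 5) to illustrate the alternative equilibria; those runs can be reproduced by overriding the corresponding entries in BASE.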

5.1. High-Loss Scenario

Given that $C_p - C_n + H = 15$, the equilibrium points are located at $E_5(1,0,0)$, $E_6(1,0,1)$, and $E_7(1,1,0)$. However, according to the Lyapunov indirect method, an equilibrium point whose Jacobian matrix has eigenvalues with zero real parts cannot be confirmed as asymptotically stable, even if all of its remaining eigenvalues have negative real parts; at such points, the participants are unable to achieve a fully stable equilibrium state.
Since $E_5(1,0,0)$ and $E_6(1,0,1)$ include eigenvalues with a real part of zero, the equilibrium point at which all three parties, platform-based enterprises, users, and regulatory agencies, achieve a stable state is located at $E_7(1,1,0)$. The simulation results are shown below (Figure 1, Figure 2, Figure 3 and Figure 4):
Figure 1, Figure 2, Figure 3 and Figure 4 depict the strategy evolution paths of the platform, users, and regulatory agencies under the condition that $C_p - C_n + H < (M\varphi + G_b)\varepsilon_2$. From the figures, it can be observed that when the benefits to the platform from disclosing user privacy are relatively low, less than the sum of the fines paid to the regulatory agencies and the compensation paid to users under the traditional regulatory strategy, the platform lacks the incentive to disclose user privacy for profit. As a result, it avoids loss and stabilizes at the strategy of not disclosing user privacy. For users, whether they participate in co-governance is determined by the minimum gain paid by the regulatory agencies to encourage their participation. When $G_a = 2 < C_k$, users do not participate in co-governance. However, when $G_a = 10 > C_k$, users choose to participate in co-governance.
Based on Table 3, it can be observed that the eigenvalues at the equilibrium points $E_5(1,0,0)$ and $E_6(1,0,1)$ are 0. Only when $G_a > C_k$ can all three parties, platform-based enterprises, users, and regulatory agencies, simultaneously achieve stability. In this case, the stable equilibrium point is located at $E_7(1,1,0)$, with the regulatory agencies remaining stable under traditional regulation.

5.2. Low-Yield Scenario

Set $C_p - C_n + H = 30$ in this case. The stable equilibrium points are located at $E_1(0,0,0)$, $E_2(0,0,1)$, and $E_7(1,1,0)$. However, because $E_1(0,0,0)$ and $E_2(0,0,1)$ contain eigenvalues equal to 0, the equilibrium point at which the platform-based enterprises, users, and regulatory agencies all achieve a stable state is located at $E_7(1,1,0)$. The simulation results are shown below (Figure 5, Figure 6 and Figure 7).
Figure 5, Figure 6 and Figure 7 show the strategy evolution paths of platform-based enterprises, users, and regulatory agencies when $(M\varphi + G_b)\varepsilon_2 < C_p - C_n + H < (M\varphi + G_b)\varepsilon_2 + \eta_1 + \eta_2$. From the graphs, it can be observed that when $G_a > C_k$, all three parties reach a stable equilibrium point located at $E_7(1,1,0)$.
At this point, the regulatory authority monitors and exerts control over platform-based enterprises' leakage of user privacy information through various means that affect platform reputation, achieving effective regulation. As a result, platform-based enterprises tend to adopt strategies that avoid leaking user privacy information. Moreover, users find that the benefits of participating in governance outweigh the associated costs, leading to a stable strategy of continued participation in governance. Finally, under conditions of a well-functioning privacy and security market, the regulatory authority remains stable in adopting the traditional regulatory strategy to reduce costs.

5.3. Medium-Yield Scenario

At this point, let $C_p - C_n + H = 45$. The stable points are located at $E_1(0,0,0)$ and $E_2(0,0,1)$. However, since $E_1(0,0,0)$ and $E_2(0,0,1)$ contain zero eigenvalues, only a bilateral equilibrium between the platform-based enterprise and the users can be achieved, while the stable strategy of the regulatory agencies remains uncertain. The simulation results are shown in Figure 8, Figure 9 and Figure 10:
Figure 8, Figure 9 and Figure 10 illustrate the strategy evolution paths of the platform, users, and regulatory agencies when $(M\varphi + G_b)\varepsilon_2 + \eta_1 + \eta_2 < C_p - C_n + H < (M\varphi + G_b)\varepsilon_1 + \eta_1 + \eta_2$. Since the equilibrium points are located at $E_1(0,0,0)$ and $E_2(0,0,1)$ and the eigenvalues of the regulatory agencies at these two points are both 0, the regulatory agencies' strategy is unstable (Figure 10). If $G_b(\varepsilon_1 - \varepsilon_2) < C_k = 12$, then the platform-based enterprise and users can achieve a bilateral equilibrium. In this case, for platform-based enterprises, disclosing user privacy information yields profits that exceed the expected punishment and reputation losses under traditional regulation, and the instability of the regulatory agencies' strategy encourages platform-based enterprises to settle on a strategy of disclosing user privacy. On the other hand, for users, the instability of the regulatory agencies' strategy reduces their confidence in participating in co-governance and introduces significant uncertainty regarding the benefits of participation, leading users to remain in a state of non-participation in co-governance.

5.4. High-Yield Scenario

In this case, let $C_p - C_n + H = 60$. The stable equilibrium points are located at $E_1(0,0,0)$, $E_2(0,0,1)$, and $E_4(0,1,1)$.
However, since $E_1(0,0,0)$ and $E_2(0,0,1)$ contain eigenvalues equal to 0, achieving a stable equilibrium among the platform-based enterprise, users, and regulatory agencies requires that $G_b(\varepsilon_1 - \varepsilon_2) > C_k$ and $C_d < (S + M\varphi)(\varepsilon_1 - \varepsilon_2)$. Under these conditions, the stable equilibrium point for the three parties' strategies is located at $E_4(0,1,1)$. The simulation results are shown in Figure 11, Figure 12 and Figure 13.
Figure 11, Figure 12 and Figure 13 illustrate the strategy evolution paths of platform-based enterprises, users, and regulators when $(M\varphi + G_b)\varepsilon_1 + \eta_1 + \eta_2 < C_p - C_n + H$.
Since the eigenvalues for the regulatory agencies at the two stable points $E_1(0,0,0)$ and $E_2(0,0,1)$ are both 0, the three parties can only achieve equilibrium stability if $G_b(\varepsilon_1 - \varepsilon_2) > C_k$ and $C_d < (S + M\varphi)(\varepsilon_1 - \varepsilon_2)$. In this case, platform-based enterprises can achieve excessive profits by disclosing user privacy, which may even exceed the sum of fines paid to regulatory agencies, compensation to users, and reputation losses caused by user participation in governance under co-governance regulation. As a result, their strategy stabilizes on disclosing user privacy, and no matter how users or regulatory agencies adjust their strategies, the platform-based enterprises' choice will not be affected. For users, facing the severe issue of privacy leakage, they will actively participate in co-governance to protect their personal privacy and minimize the loss from privacy exposure, thereby safeguarding their information rights. For regulatory agencies, on the one hand, as the primary authority responsible for social governance, they bear an unavoidable responsibility for ensuring privacy security and promoting digital ecological stability; in response to serious privacy leakage, they must adopt more efficient regulatory measures. On the other hand, the diversity of co-governance participants and communication channels has reduced the costs associated with regulatory agencies' participation in co-governance. These factors collectively determine that, in this context, regulatory agencies must adopt a co-governance regulatory strategy.

6. Discussion and Conclusions

6.1. Discussion

With the rapid advancement of the digital economy, online platforms have delivered efficient and convenient services to users but have also emerged as significant sources of privacy leakage. Driven by the pursuit of substantial profits, some platforms engage in the sale of user data, fostering a black-gray industrial chain composed of upstream, midstream, and downstream criminal networks. These activities pose severe threats to users’ privacy rights, undermine social information security, and endanger the sustainable development of the digital economy. The sheer number and scale of platforms, coupled with behaviors characterized by anonymity, virtuality, and indirectness, often leave users in a disadvantaged position when confronting privacy breaches. Regulatory agencies face additional challenges, including the high costs of comprehensive oversight, the inefficiency of traditional regulatory approaches, and difficulties in penetrating all sectors effectively. These systemic shortcomings create an environment conducive to opportunistic behaviors by platforms engaging in unauthorized privacy disclosures.
To address the mismatch between finite regulatory resources and the expansive number of regulatory targets, and to resolve the dilemmas of users' 'silent governance' and regulatory agencies' 'limited governance', this study adopts a theoretical framework rooted in routine activity theory. It examines the strategic behaviors of platform-based enterprises, users, and regulatory agencies through the lenses of motivation, opportunity, and control. Building on this theoretical foundation, the study develops a tripartite evolutionary game model to analyze the strategic interactions among these three stakeholders within the context of multi-party co-governance. The model explores equilibrium strategies for addressing privacy disclosure by integrating theoretical modeling with replicator dynamics and scenario-based analysis. Simulation experiments, incorporating initial constraints, replicator dynamic equations, and scenario combination strategies, are conducted to validate the equilibrium outcomes. The results empirically support the theoretical conclusions, demonstrating the model's effectiveness in elucidating the strategic dynamics among platform-based enterprises, users, and regulatory agencies. These findings offer valuable insights for developing robust multi-stakeholder governance frameworks to mitigate privacy leakage and promote the sustainable growth of the digital economy.
The study demonstrates the following:
(1) The motivation for platform-based enterprises to disclose user privacy information primarily stems from the substantial profits generated by such activities.
When the benefits of disclosing user privacy information are relatively low, falling below the combined costs of fines imposed by regulatory agencies and compensation paid to users under traditional regulatory frameworks, platform-based enterprises typically avoid risk by adopting a strategy of non-disclosure. In scenarios where the benefits of disclosure exceed the sum of fines and compensation under traditional regulatory strategies but remain lower than the combined costs of fines, compensation, and the reputation losses arising from user participation in co-governance initiatives, the strategic choices of platform-based enterprises are heavily influenced by user behavior. Specifically, when users adopt a non-participatory stance in co-governance, platform-based enterprises tend to disclose user privacy information; when users consistently participate in co-governance, platforms stabilize on a non-disclosure strategy. This establishes a reciprocal relationship in which user participation in co-governance reinforces platforms' non-disclosure behavior, and vice versa.
However, when the benefits of disclosing privacy information surpass the combined costs of fines, compensation, and the reputation losses associated with user participation in co-governance under the traditional regulatory framework, platform-based enterprises tend to stabilize in a disclosure strategy that traditional interventions alone cannot reverse. If the benefits exceed the corresponding total under the co-governance regulatory strategy as well, platforms persistently engage in disclosure, regardless of changes in user or regulatory strategies.
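Writing $\Delta = C_p - C_n + H$ for the platform's net disclosure gain, the four regimes described above can be summarized compactly (a restatement of the thresholds used in Section 5 and Table 3, not a new result):

$$\begin{aligned}
&\text{(i)}\ \Delta < (M\varphi + G_b)\varepsilon_2 &&\Rightarrow\ \text{stable non-disclosure;}\\
&\text{(ii)}\ (M\varphi + G_b)\varepsilon_2 < \Delta < (M\varphi + G_b)\varepsilon_2 + \eta_1 + \eta_2 &&\Rightarrow\ \text{disclosure hinges on user participation;}\\
&\text{(iii)}\ (M\varphi + G_b)\varepsilon_2 + \eta_1 + \eta_2 < \Delta < (M\varphi + G_b)\varepsilon_1 + \eta_1 + \eta_2 &&\Rightarrow\ \text{disclosure; regulator strategy unstable;}\\
&\text{(iv)}\ (M\varphi + G_b)\varepsilon_1 + \eta_1 + \eta_2 < \Delta &&\Rightarrow\ \text{persistent disclosure.}
\end{aligned}$$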
To effectively deter platform-based enterprises from disclosing user privacy information, regulatory agencies must implement robust control measures. These include imposing significant fines for disclosure violations, ensuring adequate compensation for affected users, and addressing reputation losses to strengthen the impact of co-governance mechanisms.
(2) Users’ motivation to participate in co-governance is shaped by several factors, including the minimum incentives provided by regulatory agencies to encourage participation, the strategic approaches of these agencies, and users’ awareness of their rights to protect personal information.
When the profits of platform-based enterprises are relatively low, users’ engagement in co-governance is primarily driven by the rewards offered by regulatory agencies. When the benefits of participation exceed the associated costs, users are inclined to report observed privacy disclosure behaviors by platform-based enterprises to regulatory authorities. This reporting reduces information asymmetry faced by regulatory agencies, expands information acquisition channels, and enhances governance capacity and regulatory efficiency. However, when platform-based enterprises’ profits from disclosing user privacy information fall between the combined costs of fines, compensation, and reputation losses under traditional and co-governance regulatory strategies, inconsistent regulatory actions undermine users’ confidence in participating in co-governance. This increased uncertainty regarding the benefits of participation leads users to stabilize in a non-participation strategy. Furthermore, when the profits gained by platform-based enterprises from disclosing user privacy information exceed the total costs of fines, compensation, and reputation losses under co-governance regulatory strategies, neither user actions nor regulatory strategies effectively influence the enterprises’ decisions to disclose privacy information. Such scenarios result in a significant social privacy and security crisis.
In these circumstances, users are compelled to actively participate in co-governance to maximize their privacy protection and minimize the adverse impacts of privacy breaches. By doing so, they strive to uphold their legal rights and mitigate the risks associated with personal information disclosure. Regulatory agencies, such as the Cyberspace Administration of China (CAC) and the Ministry of Industry and Information Technology (MIIT), play a pivotal role in enforcing privacy protection laws like the Personal Information Protection Law (PIPL). These agencies impose significant penalties, including fines of up to 5% of annual revenue or RMB 50 million, to deter unauthorized data disclosures. Despite these measures, the scale of platform operations and the prevalence of black-gray industrial chains exacerbate privacy risks, necessitating more robust governance mechanisms. For example, our simulation model reveals that when platform-based enterprises’ profits from disclosing user privacy information exceed the combined costs of fines, compensation, and reputation losses, neither user actions nor regulatory strategies effectively deter privacy breaches. This finding underscores the limitations of current governance mechanisms and highlights the need for stronger incentives, such as increased fines and user compensation, to align platform behavior with societal expectations.
(3) For regulatory agencies, when the profits earned by platform-based enterprises from disclosing user privacy information are relatively low, effective privacy governance can be achieved by increasing the incentives for user participation in co-governance and imposing stricter penalties on enterprises that engage in privacy disclosure.
Under these conditions, platform-based enterprises are likely to adopt a non-disclosure strategy due to concerns about reputation damage and financial repercussions. However, when the profits derived from disclosing user privacy information surpass a critical threshold, exceeding the combined costs of fines, user compensation, and the reputation losses incurred through user participation in co-governance, platform-based enterprises consistently opt for privacy disclosure. In such cases, privacy and security challenges escalate to unprecedented levels, leaving users' privacy rights inadequately protected. Faced with these circumstances, both users and regulatory agencies are incentivized to adopt cooperative strategies: users stabilize on active participation in co-governance, while regulatory agencies commit to co-governance regulatory approaches. This collaboration facilitates the establishment of a multi-party co-governance framework. Nevertheless, despite these coordinated efforts, the framework struggles to exert effective control over privacy leakage, underscoring the limitations of current governance mechanisms in addressing the pervasive issue of user privacy breaches.
Compared to studies focusing on Western contexts, such as Zuboff’s analysis of surveillance capitalism [34], our findings reveal unique challenges in China’s digital ecosystem. For example, while the European Union’s General Data Protection Regulation [35] emphasizes stringent penalties and user consent mechanisms, our study suggests that China’s regulatory framework must also address the high costs of comprehensive oversight and the inefficiency of traditional governance methods. Furthermore, our simulation model provides a novel contribution by integrating routine activity theory with evolutionary game theory to analyze strategic interactions among stakeholders. This approach complements the work of Xu et al. [36], who focused on the privacy paradox in personalized services, by offering a dynamic perspective on how user participation and regulatory interventions can collectively influence platform behavior.

6.2. Conclusions

This study advances the theoretical understanding of privacy governance by constructing a tripartite co-governance framework that integrates routine activity theory with evolutionary game theory. Through this framework, we have systematically analyzed the strategic interactions among platform-based enterprises, users, and regulatory agencies, highlighting the critical roles of economic incentives, regulatory interventions, and user participation in shaping platform behavior. The findings reveal that when the profits from disclosing user privacy information exceed the combined costs of fines, compensation, and reputation losses, traditional governance mechanisms often fail to deter privacy breaches. However, user participation in co-governance can significantly enhance regulatory efficiency, particularly when supported by adequate incentives and consistent enforcement. These insights underscore the importance of aligning platform behavior with societal expectations through robust regulatory measures and active user engagement.
Despite its contributions, this study has several limitations that warrant further investigation. First, the theoretical assumptions underpinning the simulation model may not fully capture the complexities of real-world scenarios. Future research should incorporate empirical data to refine the model’s parameters and validate its applicability across diverse regulatory environments. Second, the focus on China’s digital ecosystem limits the generalizability of the findings. Comparative studies across jurisdictions with varying levels of regulatory stringency and technological adoption are essential to develop a more comprehensive and globally relevant framework. Additionally, emerging technologies such as blockchain and artificial intelligence offer promising avenues for enhancing privacy protection and governance efficiency, warranting further exploration. By addressing these gaps, researchers can contribute to the development of more effective and inclusive privacy governance frameworks that balance the interests of platform-based enterprises, users, and regulatory agencies, fostering a safer and more sustainable digital ecosystem.

Author Contributions

Conceptualization, Z.S.; methodology, P.X.; software, J.L.; validation, J.L.; formal analysis, P.X.; investigation, P.X.; resources, P.X.; data curation, J.L.; writing—original draft preparation, P.X. and J.L.; writing—review and editing, Z.S.; visualization, P.X.; supervision, Z.S.; project administration, Z.S.; funding acquisition, Z.S. All authors have read and agreed to the published version of the manuscript.

Funding

The research was supported by the China Postdoctoral Science Foundation (2023M743213), the Humanities and Social Sciences Research Project of Henan Province Colleges and Universities (2024-ZZJH-283), and the Henan Key R&D and Promotion Project (Soft Science) (242400411171).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. China Internet Network Information Center. The 55th Statistical Report on Internet Development in China. Available online: https://www.cnnic.net.cn/n4/2025/0117/c88-11229.html (accessed on 17 January 2025).
2. Le, C.; Zhang, Z.; Liu, Y. Research on Privacy Disclosure Behavior of Mobile App Users from Perspectives of Online Social Support and Gender Differences. Int. J. Hum.–Comput. Interact. 2025, 41, 861–875.
3. Zhang, F.; Pan, Z.; Lu, Y. AIoT-enabled smart surveillance for personal data digitalization: Contextual personalization-privacy paradox in smart home. Inf. Manag. 2023, 60, 103736.
4. Xin, X.; Yang, J.; Wang, H.; Ma, J.; Ren, P.; Luo, H.; Shi, X.; Chen, Z.; Ren, Z. On the user behavior leakage from recommender system exposure. ACM Trans. Inf. Syst. 2023, 41, 1–25.
5. Al-Huthaifi, R.; Li, T.; Huang, W.; Gu, J.; Li, C. Federated learning in smart cities: Privacy and security survey. Inf. Sci. 2023, 632, 833–857.
6. Spagnoletti, P.; Ceci, F.; Bygstad, B. Online Black-Markets: An Investigation of a Digital Infrastructure in the Dark. Inf. Syst. Front. 2022, 24, 1811–1826.
7. Standing Committee of the National People's Congress. Data Security Law of the People's Republic of China. Available online: https://www.gov.cn/xinwen/2021-06/11/content_5616919.htm (accessed on 1 September 2021).
8. Standing Committee of the National People's Congress. Personal Information Protection Law of the People's Republic of China. Available online: https://www.samr.gov.cn/wljys/gzzd/art/2023/art_3ef1e889c1e644d4b65b5f5c7f432386.html (accessed on 1 November 2021).
9. Drapalova, E.; Wegrich, K. Platforms' regulatory disruptiveness and local regulatory outcomes in Europe. Internet Policy Rev. 2024, 13, 1745.
10. Li, J.; Chen, G. A personalized trajectory privacy protection method. Comput. Secur. 2021, 108, 102323.
11. Internet Rule of Law Research Center, China Youth University for Political Studies, Cover Think Tank. China Personal Information Security and Privacy Protection Report. Available online: https://www.thecover.cn/news/158619 (accessed on 22 November 2016).
12. Zhang, L.; Xue, J. Analysis of Personal Privacy Leakage Under the Background of Digital Twin. Sci. Soc. Res. 2024, 6, 168–175.
13. Sun, Z.; Xie, S.; Xu, W.; Xu, L.; Li, H. User-tailored privacy: Unraveling the influences of psychological needs and message framing on app users' privacy disclosure intentions. Curr. Psychol. 2024, 43, 33893–33907.
14. Vandana, G. Cross-border flow of personal data (digital trade) ought to have data protection. J. Data Prot. Priv. 2024, 7, 61–79.
15. Walker, K.L.; Milne, G.R. AI-driven technology and privacy: The value of social media responsibility. J. Res. Interact. Mark. 2024, 18, 815–835.
16. Xu, H.; Teo, H.-H.; Tan, B.C.; Agarwal, R. Research note—Effects of individual self-protection, industry self-regulation, and government regulation on privacy concerns: A study of location-based services. Inf. Syst. Res. 2012, 23, 1342–1363.
17. Alcántara, J.C.; Tasic, I.; Cano, M.-D. Enhancing Digital Identity: Evaluating Avatar Creation Tools and Privacy Challenges for the Metaverse. Information 2024, 15, 624.
18. Yu, L.; Li, H.; He, W.; Wang, F.-K.; Jiao, S. A meta-analysis to explore privacy cognition and information disclosure of internet users. Int. J. Inf. Manag. 2020, 51, 102015.
19. Dinev, T.; Hart, P. An extended privacy calculus model for e-commerce transactions. Inf. Syst. Res. 2006, 17, 61–80.
20. Liu, Z.; Wang, X.; Min, Q.; Li, W. The effect of role conflict on self-disclosure in social network sites: An integrated perspective of boundary regulation and dual process model. Inf. Syst. J. 2019, 29, 279–316.
21. Kim, J.; Kim, J. A Study on the Causes of Information Privacy Concerns and Protective Responses in e-Commerce: Focusing on the Principal-Agent Theory. J. Inf. Syst. 2014, 23, 119–145.
22. Jozani, M.; Ayaburi, E.; Ko, M.; Choo, K.-K.R. Privacy concerns and benefits of engagement with social media-enabled apps: A privacy calculus perspective. Comput. Hum. Behav. 2020, 107, 106260.
23. Zhu, N.; Chen, B.; Wang, S.; Teng, D.; He, J. Ontology-Based Approach for the Measurement of Privacy Disclosure. Inf. Syst. Front. 2021, 24, 1689–1707.
24. Xu, Y.; Meng, X.; Li, Y.; Xu, X. Research on privacy disclosure detection method in social networks based on multi-dimensional deep learning. Comput. Mater. Contin. 2020, 62, 137–155.
25. Kang, H.; Xiao, Y.; Yin, J. An Intelligent Detection Method of Personal Privacy Disclosure for Social Networks. Secur. Commun. Netw. 2021, 2021, 5518220.
26. Zhu, K.; He, X.; Xiang, B.; Zhang, L.; Pattavina, A. How dangerous are your smartphones? App usage recommendation with privacy preserving. Mob. Inf. Syst. 2016, 2016, 6804379.
27. Hoffmann, F.; Inderst, R.; Ottaviani, M. Persuasion through selective disclosure: Implications for marketing, campaigning, and privacy regulation. Manag. Sci. 2020, 66, 4958–4979.
28. Taneja, A.; Vitrano, J.; Gengo, N.J. Rationality-based beliefs affecting individual's attitude and intention to use privacy controls on Facebook: An empirical investigation. Comput. Hum. Behav. 2014, 38, 159–173.
29. Du, J.; Jiang, C.; Chen, K.-C.; Ren, Y.; Poor, H.V. Community-structured evolutionary game for privacy protection in social networks. IEEE Trans. Inf. Forensics Secur. 2017, 13, 574–589.
30. Gopal, R.D.; Hidaji, H.; Patterson, R.A.; Rolland, E.; Zhdanov, D. How much to share with third parties? User privacy concerns and website dilemmas. MIS Q. 2018, 42, 143–164.
31. Sun, Z.; Yin, L.; Li, C.; Zhang, W.; Li, A.; Tian, Z. The QoS and privacy trade-off of adversarial deep learning: An evolutionary game approach. Comput. Secur. 2020, 96, 101876.
32. Mushtaq, S.; Shah, M. Critical Factors and Practices in Mitigating Cybercrimes Within E-Government Services: A Rapid Review on Optimising Public Service Management. Information 2024, 15, 619.
33. Zhang, P.; Wu, H.; Li, H.; Zhong, B.; Fung, I.W.; Lee, Y.Y.R. Exploring the adoption of blockchain in modular integrated construction projects: A game theory-based analysis. J. Clean. Prod. 2023, 408, 137115.
34. Zuboff, S. Surveillance Capitalism and the Challenge of Collective Action. New Labor Forum 2019, 28, 10–29.
35. European Parliament; Council of the European Union. General Data Protection Regulation. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679 (accessed on 25 May 2018).
36. Xu, H.; Luo, X.; Carroll, J.M.; Rosson, M.B. The personalization privacy paradox: An exploratory study of decision making process for location-aware marketing. Decis. Support Syst. 2010, 51, 42–52.
Figure 1. Evolution path of network platform strategies.
Figure 2. Evolution path of user strategies when $G_a = 2 < C_k$.
Figure 3. Evolution path of user strategies when $G_a = 10 > C_k$.
Figure 4. Evolution path of regulatory agencies' strategies.
Figure 5. Network platform strategy evolution path.
Figure 6. User strategy evolution path.
Figure 7. Regulatory authority strategy evolution path.
Figure 8. Platform-based enterprise strategy evolution path.
Figure 9. User strategy evolution path when $G_b(\varepsilon_1 - \varepsilon_2) < C_k = 12$.
Figure 10. Strategy evolution path of the regulatory authority.
Figure 11. Platform strategy evolution path.
Figure 12. User strategy evolution path.
Figure 13. Regulator strategy evolution path.
Table 1. Model parameters and description.
$x$: Proportion of platform-based enterprises choosing not to leak user privacy information
$y$: Proportion of users choosing to participate in co-governance
$z$: Proportion of regulatory agencies choosing co-governance regulation
$I$: Profit gained by platform-based enterprises from the legal use of user privacy information for commercial activities
$C_p$: Operating cost for platform-based enterprises when not leaking user privacy information
$C_n$: Operating cost for platform-based enterprises when disclosing user privacy information
$C_k$: Cost of user participation in co-governance
$C_e$: Regulatory agencies' response cost under both the co-governance and traditional regulation strategies
$C_d$: Initial investment cost incurred by regulatory agencies under the co-governance regulation strategy
$\eta_1$: Reputation benefit for platform-based enterprises that do not disclose user privacy information, arising from user participation in co-governance
$\eta_2$: Reputation loss for platform-based enterprises due to user participation in co-governance following privacy leakage
$G_a$: Benefits to users from participating in co-governance
$G_b$: Compensation received by users through co-governance from platform-based enterprises for privacy leakage
$H$: Direct profits of platform-based enterprises from disclosing user privacy
$K$: Losses to non-cooperative users when platform-based enterprises disclose user privacy information
$\varepsilon_1$: Identification probability of a platform privacy breach under co-governance regulation
$\varepsilon_2$: Identification probability of a platform privacy breach under traditional regulation
$M$: Maximum penalty amount for platforms disclosing user privacy information
$\varphi$: Penalty severity for platforms disclosing user privacy information
$S$: Social benefits of regulatory agencies
Table 2. Tripartite benefits matrix for participating subjects. Each cell lists the payoffs of the platform-based enterprise, the user, and the regulatory agency, in that order.
(1) Not disclosing user privacy information ($x$), user participates in co-governance ($y$):
Co-governance regulation ($z$): $I + \eta_1 - C_p$; $G_a - C_k$; $S - C_e - C_d$
Traditional regulation ($1 - z$): $I + \eta_1 - C_p$; $G_a - C_k$; $S - C_e$
(2) Not disclosing ($x$), user does not participate ($1 - y$):
Co-governance regulation ($z$): $I - C_p$; $0$; $S - C_e$
Traditional regulation ($1 - z$): $I - C_p$; $0$; $S - C_e$
(3) Disclosing user privacy information ($1 - x$), user participates ($y$):
Co-governance regulation ($z$): $I + H - C_n - G_b\varepsilon_1 - \eta_2 - M\varphi\varepsilon_1$; $G_b\varepsilon_1 - C_k - K$; $S\varepsilon_1 + M\varphi\varepsilon_1 - C_e - C_d$
Traditional regulation ($1 - z$): $I + H - C_n - G_b\varepsilon_2 - \eta_2 - M\varphi\varepsilon_2$; $G_b\varepsilon_2 - C_k - K$; $S\varepsilon_2 + M\varphi\varepsilon_2 - C_e$
(4) Disclosing ($1 - x$), user does not participate ($1 - y$):
Both regulation strategies: $I + H - C_n - M\varphi\varepsilon_2 - G_b\varepsilon_2$; $G_b\varepsilon_2 - K$; $S\varepsilon_2 - C_e + M\varphi\varepsilon_2$
Table 3. Stability analysis of pure strategy equilibria (eigenvalues $\lambda_1$, $\lambda_2$, $\lambda_3$ of the Jacobian matrix).
$E_1(0,0,0)$: $\lambda_1 = (M\varphi + G_b)\varepsilon_2 - (C_p - C_n + H)$; $\lambda_2 = -C_k$; $\lambda_3 = 0$
$E_2(0,0,1)$: $\lambda_1 = (M\varphi + G_b)\varepsilon_2 - (C_p - C_n + H)$; $\lambda_2 = G_b(\varepsilon_1 - \varepsilon_2) - C_k$; $\lambda_3 = 0$
$E_3(0,1,0)$: $\lambda_1 = \eta_1 + \eta_2 + (M\varphi + G_b)\varepsilon_2 - (C_p - C_n + H)$; $\lambda_2 = C_k$; $\lambda_3 = (S + M\varphi)(\varepsilon_1 - \varepsilon_2) - C_d$
$E_4(0,1,1)$: $\lambda_1 = \eta_1 + \eta_2 + (M\varphi + G_b)\varepsilon_1 - (C_p - C_n + H)$; $\lambda_2 = C_k - G_b(\varepsilon_1 - \varepsilon_2)$; $\lambda_3 = (S + M\varphi)(\varepsilon_2 - \varepsilon_1) + C_d$
$E_5(1,0,0)$: $\lambda_1 = (C_p - C_n + H) - (M\varphi + G_b)\varepsilon_2$; $\lambda_2 = G_a - C_k$; $\lambda_3 = 0$
$E_6(1,0,1)$: $\lambda_1 = (C_p - C_n + H) - (M\varphi + G_b)\varepsilon_2$; $\lambda_2 = G_a - C_k$; $\lambda_3 = 0$
$E_7(1,1,0)$: $\lambda_1 = (C_p - C_n + H) - \eta_1 - \eta_2 - (M\varphi + G_b)\varepsilon_2$; $\lambda_2 = C_k - G_a$; $\lambda_3 = -C_d$
$E_8(1,1,1)$: $\lambda_1 = (C_p - C_n + H) - \eta_1 - \eta_2 - (M\varphi + G_b)\varepsilon_1$; $\lambda_2 = C_k - G_a$; $\lambda_3 = C_d$