Article

PPVC: Towards a Personalized Local Differential Privacy-Preserving Scheme for V2G Charging Networks

Key Laboratory of Aerospace Information Security and Trusted Computing, School of Cyber Science and Engineering, Ministry of Education, Wuhan University, Wuhan 430072, China
* Author to whom correspondence should be addressed.
Mathematics 2023, 11(20), 4257; https://doi.org/10.3390/math11204257
Submission received: 1 September 2023 / Revised: 27 September 2023 / Accepted: 10 October 2023 / Published: 12 October 2023
(This article belongs to the Section Mathematics and Computer Science)

Abstract

The rapid development of electric vehicles brings convenience to users' daily lives. When users enjoy V2G charging services, leakage of their charging location is a crucial privacy issue. Existing privacy-preserving algorithms for EV access to charging locations cannot defend against background knowledge attacks or privacy attacks by untrustworthy third parties. We propose a personalized location privacy protection scheme (PPVC) based on differential privacy to meet users' personalized EV charging requirements while protecting their privacy. First, PPVC constructs a decision matrix to describe the utility and privacy effects of the recommended routes. Then, a utility model is built based on multiattribute theory, and the user's privacy preferences are integrated into the model to provide the route with the best utility. Finally, considering users' privacy preference needs, the Euclidean distance share is used to assign appropriate privacy budgets and determine the generation range of false locations, producing the service request location with the highest utility. The experimental results show that the proposed personalized location privacy protection scheme meets users' service demands while reasonably protecting their privacy, providing higher service quality. Compared with existing solutions, PPVC improves charging efficiency by up to 25% and 8%, respectively, at the same privacy protection level.

1. Introduction

With the rapid development of new energy technologies and the gradually rising popularity of electric vehicles (EVs), vehicle-to-grid (V2G) charging networks are being improved [1]. Due to the limitation of onboard battery technology, EVs need to find charging posts frequently. In interacting with charging posts, data centers will collect EV charging location data, which can provide data services to third-party research institutions and companies [2]. For example, charging service providers and grid companies can optimize the placement of charging stations.
The location information of users obtained by the management and services of V2G may cause privacy leakage problems. The charging locations frequently accessed by EVs can be linked to a user’s residence, workplace, and points of interest. These sensitive locations can easily expose specific users’ home addresses, health conditions, personal preferences, social connections, and other private information [3]. In the event of a background knowledge attack [4], the user bears a significant risk of personal privacy leakage [5]. Therefore, handling these location data from a privacy protection perspective is crucial for V2G charging network development.
In recent years, researchers have proposed EV location privacy protection methods that use anonymous pseudonym techniques to hide the user's actual ID [6,7]. The user provides their exact location to the grid control center to obtain charging services, so that unauthorized parties cannot determine the user's real identity. By using partially restricted blind signature techniques when EVs communicate with charging stations, only the smart grid server can map the pseudonym to the actual vehicle identity. Moreover, the pseudonym changes when the EV moves from one charging station to another. However, there are two main problems with the above identity-anonymity-based protection mechanisms:
  • Poor flexibility in sharing data among multiple entities for analysis due to the presence of mostly trusted third parties and critical escrow problems.
  • Privacy protection is not strong enough; trusted third parties are vulnerable to attacks and cannot resist background knowledge attacks.
Another protection method hides part of the location information by scrambling or otherwise obfuscating the user's location data. Existing location privacy protection schemes fall into the following categories: encryption mechanisms, caching strategies, anonymization techniques, and differential privacy. Anonymization protects location privacy using generalization theory [8]. The user's actual location is sent to the charging service provider together with the locations of $k-1$ other users to generate an anonymous region that hides the user's actual location information. However, the level of privacy protection is weakened by the high-speed autonomous mobility of vehicle nodes. The encryption-based privacy protection scheme [9] processes the user's location information through a one-way irreversible encryption function, ensuring service availability without revealing the user's identity and location information. However, its high cost and computational complexity make data sharing and data mining difficult. Solutions based on caching mechanisms [10] reduce the possibility of privacy leakage by reducing the number of interactions between users and charging service providers. However, these caching strategies rarely consider users' complex needs. Thus, traditional location privacy protection methods do not provide sufficient privacy guarantees for in-vehicle networks.
Differential privacy does not depend on any background knowledge of the attacker, has a strong privacy guarantee, and can completely cut off the possibility of user privacy disclosure from the data source [11]. Dwork et al. [12] first proposed the concept of differential privacy in 2006. The main idea of differential privacy is to add interference noise to the published original data and generate false data to protect the potential user privacy information in the data. Therefore, it has been increasingly introduced into the Internet of vehicles in recent years to protect the location privacy of on-board users [13]. According to whether the third-party data aggregation server is trusted, differential privacy can be divided into centralized differential privacy and local differential privacy. Centralized differential privacy assumes that the third party is trusted, and each user sends his or her real data to the data aggregation server, which then processes the data through perturbation algorithms that satisfy differential privacy. However, not all third parties are trustworthy.
We propose a personalized charging station location privacy protection scheme (PPVC) based on sensitive location information to address the above issues. The scheme can meet the user’s privacy needs for personalized charging station selection while protecting the user’s location privacy. The research content and contributions of this paper are as follows:
  • We select the route with the highest utility from the navigation routes and build a utility model based on multiattribute decision theory. The route utility and privacy effects are described by a normalized decision matrix, and then privacy preferences are added to this model to build a multiattribute utility function to quantify the utility of the routes and select the highest-utility route for the user.
  • We propose a personalized privacy assignment algorithm to satisfy users’ personalized privacy settings. Firstly, we use the distance as an indicator to assign an appropriate privacy budget to the user and determine the range of false locations that the user can receive. While satisfying the user’s personalized privacy needs, we achieve the user’s charging service needs.
  • We evaluate PPVC and prove its privacy and utility using relevant theorems. Meanwhile, based on a real dataset, experimental simulations compare the performance of PPVC with the existing Shift Route [14] and ATGD [15] methods. PPVC improves the charging service quality by 25% and 8%, respectively, while the accuracy of the charging station is only slightly affected. PPVC not only guarantees privacy protection during the charging request process of EV users but also meets users' personalized privacy needs and provides higher service quality.
The rest of this paper is organized as follows. Section 2 reviews related work on location privacy protection. Section 3 introduces the preliminaries. Section 4 describes the system model and presents the specifics of our scheme. Section 5 details the personalized privacy budget allocation algorithm. Section 6 presents comparative experiments and the corresponding analysis of our approach against the comparison approaches. Section 7 gives the conclusion and future work.

2. Related Work

Existing location privacy protection schemes fall into the following categories: encryption mechanisms, caching strategies, anonymization techniques, and differential privacy. Anonymization techniques use generalization theory for location privacy protection [8]. Specifically, the user's actual location is combined with the locations of $k-1$ other users to generate an anonymous region, which is sent to the charging service provider. However, the level of privacy protection is weakened by the vehicle nodes' high-speed autonomous mobility. Therefore, the anonymization scheme cannot be directly applied in V2G. The encryption-based privacy protection scheme [9] processes the user's location information through a one-way irreversible encryption function so that the user's identity and location information cannot be disclosed while ensuring service availability. However, the encryption scheme has high resource consumption and computational complexity, making data sharing and mining challenging. Schemes based on caching mechanisms [16] reduce the possibility of privacy leakage by reducing the number of interactions between users and charging service providers. However, these caching strategies rarely consider users' complex needs. Thus, traditional location privacy protection methods cannot provide adequate privacy guarantees for V2G networks.
Yin et al. [17], aiming at the characteristics of high dispersion and low density of location data, established a multilevel location information tree model by combining practicality and privacy and added noise to the access frequency of selected data by using the Laplace scheme. Xiong et al. [18] proposed a randomized differential privacy method for location datasets, using private location clustering to narrow the random field to hide the user’s exact location. Andres et al. [19] proposed a geographical method, which provides a formal privacy model for demonstrable privacy protection. Jiang et al. [20] performed noise processing on sensitive locations frequently visited on the driving track, thus protecting users’ location privacy.
Some obfuscation-based schemes focus on location privacy in road networks. The Geo-I Satisfying Map Index scheme (GEM) evaluates the level of privacy protection and data utility of traditional Geo-I in road networks [21]. The GEM sets connections in a road network as obfuscation candidates. The scheme is based on Geo-I directly obfuscating the driver’s location with the connection. The scheme in [22] discretizes the road network with the same length intervals. The authors use the path distance between two intervals to measure the indistinguishability of Geo-I. However, setting the same length intervals in the road network is challenging. If the intervals are short to accommodate short roads, computational consumption increases [23]. The correlation between connectivity and privacy can be ignored if the interval contains several short roads.
However, these traditional differential privacy protection methods provide the same level of privacy protection for all of a user's charging service request locations and cannot handle sensitive location queries in a personalized manner. Hence, personalized differential privacy protection schemes have been proposed. Li et al. [24] proposed a personalized differential privacy protection method for repeated queries, which generates a new privacy protection specification based on the querying user's privileges and the number of identical queries. Li et al. [25] proposed a personalized sensitive-range privacy protection scheme that uses a map storage algorithm to facilitate the storage of 2D local maps and reduce the storage cost. Existing personalized differential privacy protection schemes divide privacy into different levels.
Various schemes protect drivers' sensitive locations at different levels for personalization. The personalization scheme in [26] measures privacy requirements by individual attributes such as access duration, frequency, and regularity, and formulates an incomplete information game to balance service quality and privacy protection. Zhong et al. [27] investigated privacy requirements based on movement regularity for personalized pseudonym exchange. The scheme in [28] measures privacy requirements using intimacy, which specifies the density of community edges in a social network; differential privacy and generative adversarial networks are used to add noise to the raw data. The scheme in [29] specifies that the personal privacy requirement of a location is negatively correlated with the number of hops to the sensitive location. The algorithm in [30] coordinates semantic privacy and location privacy based on drivers' requirements, which are measured in terms of the relationships between drivers; a game-theoretic model is then constructed to protect location and differential privacy based on social distance. The scheme in [31] designs the privacy requirement to be negatively correlated with the Euclidean distance between the current location and the last inferred location, and uses this requirement to compute the obfuscation privacy budget. This design reduces the exposure probability but requires real-time computation of privacy requirements.

3. Preliminaries

3.1. Local Differential Privacy

The localized differential privacy technique [32,33] is inherited from the centralized differential privacy technique while extending it with new features, giving it two main characteristics: (1) it fully considers the background knowledge of an arbitrary attacker and quantifies the degree of privacy protection; (2) it perturbs data locally to resist privacy attacks from untrustworthy third-party data collectors.
Local differential privacy [34] has recently surfaced as a strong measure of privacy in contexts where personal information remains private even from data analysts.
($\epsilon$-LDP): Given $n$ users, each corresponding to one record, and a privacy algorithm $M$ with domain $Dom(M)$ and range $Range(M)$, the algorithm $M$ satisfies $\epsilon$-LDP if, for any two records $t, t' \in Dom(M)$ and any output $t^* \in Range(M)$, the following inequality holds:
$$\frac{\Pr[M(t) = t^*]}{\Pr[M(t') = t^*]} \le e^{\epsilon}$$
where $\epsilon$ is the privacy budget; the smaller its value, the stronger the privacy protection provided by the algorithm $M$.
It can be seen that perturbing any single record with the privacy algorithm $M$ yields only a small change in the probability distribution of the output, with the probability ratio not exceeding $e^{\epsilon}$. This means that even if an attacker observes the output, they cannot deduce which record was the input, ensuring that the data are protected against untrustworthy third parties.

3.2. Randomized Response

The randomized response technique [35] is a basic method for achieving local differential privacy. It was originally used in questionnaire studies: instead of responding directly with sensitive information, respondents return plausible results so that the collector cannot determine any individual's true answer, yet accurate statistical results can still be obtained. The basic process of implementing LDP with randomized response is as follows: let $L = \{l_1, l_2, \ldots, l_k\}$ denote the possible sensitive charging locations, where $k$ is the number of all charging locations. The true location is sent to the central aggregator with probability $p$; otherwise, a disguised location is sent with probability $1 - p$. Let $p_{r,i}$ denote the probability that the true location $l_i$ is reported as $l_r$; the location probability matrix $M$ is then
$$M = \begin{pmatrix} p_{1,1} & \cdots & p_{1,k} \\ \vdots & \ddots & \vdots \\ p_{k,1} & \cdots & p_{k,k} \end{pmatrix}$$
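As an illustration of this probability matrix (not the paper's exact mechanism), the following sketch implements generalized randomized response over $k$ locations with the standard choice $p = e^{\epsilon}/(e^{\epsilon} + k - 1)$; the function names are ours:

```python
import math
import random

def k_rr(true_index: int, k: int, epsilon: float) -> int:
    """Generalized randomized response over k charging locations.

    The true index is kept with probability p and replaced by a uniformly
    random other index with probability (1 - p) / (k - 1), which satisfies
    epsilon-LDP: any two entries of the same matrix column differ by at
    most a factor of e^epsilon.
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p:
        return true_index
    # pick a disguised location uniformly among the other k - 1 indices
    other = random.randrange(k - 1)
    return other if other < true_index else other + 1

def probability_matrix(k: int, epsilon: float):
    """Entry M[r][i] is the probability that true location i reports r."""
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    q = (1 - p) / (k - 1)
    return [[p if r == i else q for i in range(k)] for r in range(k)]
```

Each column of the resulting matrix sums to one, and the ratio $p/q = e^{\epsilon}$ is exactly the LDP bound from Section 3.1.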

3.3. Geo-Indistinguishability

$\varepsilon$-Geo-indistinguishability [19] introduces differential privacy into systems that provide services based on location data. When privacy protection is imposed on two geographically close points, the two points should have similar probabilities of generating the same fake location reported to the location server. The set of locations is denoted as $X$, and the Euclidean distance between locations together with the privacy parameter jointly determine the level of privacy protection of the location data.
For $\varepsilon$-geo-indistinguishability, a mapping mechanism $K$ satisfies the definition of geo-indistinguishability determined by the parameter $\varepsilon$ on the location set $X$ if and only if any two location points $x$ and $x'$ satisfy
$$K(x)(z) \le e^{\varepsilon d(x, x')} K(x')(z), \quad \forall x, x' \in X,\; z \in Z$$
where $d(\cdot)$ denotes the Euclidean distance between two locations and $X$ represents the set of locations in this region. With geo-indistinguishability imposed, the privacy leakage risk is limited to a defined range determined by the distance $d(\cdot)$ and the parameter $\varepsilon$. This privacy-preserving mechanism is said to provide an $\varepsilon$-privacy guarantee on $\mathbb{R}^2$.
Since the Laplace mechanism of differential privacy applies to one-dimensional space, if it is applied to two-dimensional space, it is first necessary to define a continuous mechanism for the continuous plane that satisfies geographical indistinguishability.
Planar Laplace distribution [36]: Given $\varepsilon > 0$ and an actual position $x \in \mathbb{R}^2$, for any approximate position $z \in \mathbb{R}^2$ generated by the mechanism, the probability density function is
$$D_\varepsilon(x)(z) = \frac{\varepsilon^2}{2\pi} e^{-\varepsilon d(x, z)}$$
The above equation corresponds to the Laplace distribution in the plane centered at $x$. In polar coordinates with $x$ as the origin, the probability density function factors as
$$D_{\varepsilon,\Theta}(\theta) \, D_{\varepsilon,R}(r) = \frac{\varepsilon^2}{2\pi} \, r \, e^{-\varepsilon r}$$
where the angle $\Theta$ is uniform on $[0, 2\pi)$ and the radius $R$ follows $D_{\varepsilon,R}(r) = \varepsilon^2 r e^{-\varepsilon r}$, i.e., a gamma distribution with shape 2 and scale $1/\varepsilon$.
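The polar factorization suggests a simple sampler for the planar Laplace mechanism. The sketch below (names ours, not the paper's implementation) draws a uniform angle and a radius from the gamma distribution with shape 2 and scale $1/\varepsilon$, which matches the radial density $\varepsilon^2 r e^{-\varepsilon r}$:

```python
import math
import random

def planar_laplace(x0: float, y0: float, epsilon: float):
    """Sample a noisy location around (x0, y0) under geo-indistinguishability.

    The polar density (eps^2 / 2*pi) * r * e^(-eps*r) factors into a
    uniform angle and a radius r ~ Gamma(shape=2, scale=1/eps), so the
    radius can be sampled with the standard library instead of inverting
    the CDF via the Lambert W function.
    """
    theta = random.uniform(0.0, 2.0 * math.pi)   # uniform angle
    r = random.gammavariate(2.0, 1.0 / epsilon)  # radial distance, mean 2/eps
    return x0 + r * math.cos(theta), y0 + r * math.sin(theta)
```

The expected displacement is $2/\varepsilon$, so a smaller privacy budget produces noisier reported locations, consistent with the LDP definition above.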

4. System Overview

4.1. Problem Statement

In the electric vehicle charging scenario, the system scenario is shown in Figure 1: the vehicle user is located at point A, the destination is point E, and the sensitive locations set by the user are F, H, and I. The sensitive locations are defined by the user according to the requirements, and different users will set different sensitive locations according to their own requirements.
In this case, the navigation system plans four routes from the user to the charging station. The user must choose the most efficient of these recommended routes as the driving route. On that route, service request location updates are periodically submitted to the charging service system for service queries. However, these frequent queries can leak sensitive information, so the privacy of the user's charging service request locations A, B, C, D, and E needs to be protected. The distance between these query locations A, B, C, D, E and their nearest sensitive locations F, H, I varies, so different levels of privacy protection should be provided for different query locations. However, existing location privacy protection cannot answer sensitive attribute queries in a personalized, differentially private way [24,25], which still results in location privacy leakage.

4.2. Personalized Location Privacy Protection

This paper proposes a differential privacy-based privacy protection scheme for personalized charging locations. The scheme can protect the privacy of charging service requests at different locations on the optimal driving route according to the privacy requirements of the user. The personalized location privacy protection scheme is used to protect the user’s location information before the charging service request, as shown in Figure 2.
The user sets their service requirements, including sensitive location points, the privacy level $\varepsilon$, target points of interest, and the acceptable error distance range $\Delta$ between the false location and the actual location. The navigation system recommends $m$ driving routes to the target charging station based on these requirements. The user builds a multiattribute route selection benefit decision matrix from the recommended driving routes. Based on this matrix, a multiattribute route selection benefit function is established using a weight assignment algorithm grounded in information entropy theory, which jointly considers the distance from the user's starting point to the charging station on each recommended route and the sum of the distances from each request location to its nearest sensitive location. The benefit value of each recommended driving route is calculated with the multiattribute benefit function, and the route with the highest benefit is selected as the user's driving route via a ranking algorithm.
The radius $R$ of the sensitive circle is calculated for each sensitive location point based on the user's previously set acceptable error distance value $\Delta$. Based on the selected driving route, the privacy budget is assigned to the user according to their individual needs. Each charging service request location outside the sensitive circle is assigned a privacy budget according to its share of the sensitive distance, while the remaining privacy budget is distributed equally among the charging service request locations inside the sensitive circle. Noise is then added to each request location according to its allocated privacy budget to generate a false location. The false location is sent to the charging service provider for the charging service request, protecting the user's actual location information. The charging service provider returns the service information results to the user based on the submitted information.

4.3. Multiattribute Decision Model Based on Information Entropy

PPVC establishes a multiattribute decision model based on information entropy, which integrates the influence of two attributes on route selection, namely the total route length and the Euclidean distance between service request locations and sensitive locations, to obtain the relatively optimal road as the user's driving route. In practice, the driving route chosen by the user is mainly determined by two factors: a route with low driving cost is preferred, and a route with less privacy leakage is preferred. Therefore, we measure the cost of the $k$-th route by the total length from the user's starting point to the charging station on the $k$-th recommended driving route; a smaller total length means a lower cost, so the route is more likely to be chosen. The sum of the Euclidean distances between all service request locations on the $k$-th recommended route and their nearest sensitive locations measures the user's privacy leakage on that route; a larger distance means less privacy leakage, so the route is more likely to be chosen by the user.
According to multiattribute decision theory [30,31], an attribute whose value is positively related to the likelihood of a solution being chosen is called a benefit attribute; conversely, an attribute whose value is inversely related to the likelihood of a solution being chosen is called a cost attribute. Of the two attributes above, the sum of the Euclidean distances between all service request locations on the $k$-th recommended driving route and their nearest sensitive locations is a benefit attribute, while the distance from the starting place to the charging station on the $k$-th recommended driving route is a cost attribute. The symbols involved are shown in Table 1.
Cost Attribute: The total length from the starting place to the charging station on the $k$-th recommended driving route is
$$d_{k1}^{o} = \sum_{i=1}^{n} \sqrt{\left(x_{i+1}^{k} - x_{i}^{k}\right)^2 + \left(y_{i+1}^{k} - y_{i}^{k}\right)^2}, \quad k = 1, 2, \ldots, m$$
Benefit Attribute: The sum of the Euclidean distances between all service request location points on the $k$-th recommended driving route and their nearest sensitive locations is
$$d_{k2}^{s} = \sum_{i=1}^{n} \sum_{j=1}^{N} \mu \, d_{ij}^{k}, \quad k = 1, 2, \ldots, m$$
where $d_{ij}^{k}$ denotes the distance from the $i$-th service request location to the $j$-th sensitive location on the $k$-th route; $\mu = 1$ if $L_i$ selects $S_j$ as its nearest sensitive location and $\mu = 0$ otherwise; $N$ is the total number of sensitive locations set by the user; and $m$ is the total number of routes recommended by the navigation system to the user.
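To make the two attributes concrete, here is a minimal sketch (function names ours) that computes the cost attribute $d_{k1}^{o}$ and the benefit attribute $d_{k2}^{s}$ for one route, given the route's request points and the user's sensitive locations as coordinate pairs:

```python
import math

def route_length(route):
    """Cost attribute d_k1^o: total Euclidean length of a polyline route,
    summed over consecutive coordinate pairs."""
    return sum(math.dist(route[i], route[i + 1]) for i in range(len(route) - 1))

def sensitive_distance_sum(route, sensitive):
    """Benefit attribute d_k2^s: for each request point, the distance to its
    nearest sensitive location (mu = 1 for that location, 0 otherwise),
    summed over all request points on the route."""
    return sum(min(math.dist(p, s) for s in sensitive) for p in route)
```

The inner `min` realizes the indicator $\mu$: only the nearest sensitive location contributes for each request point.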

4.4. Benefit Attributes

We consider the influence of two attributes on route selection: the sum $d_{k2}^{s}$ of the distances between all service request location points on the $k$-th recommended driving route and their nearest sensitive locations is the benefit attribute, and the total length $d_{k1}^{o}$ of the user's driving route from the source to the target point of interest on the $k$-th route is the cost attribute, so the set of attributes affecting the user's route choice is $\{d_{k1}^{o}, d_{k2}^{s}\}$. Since the navigation system recommends $m$ driving routes for the user, a multiattribute benefit decision matrix is established: the recommended driving routes are treated as $m$ solutions, each containing the two critical attributes, so the $m$ solutions form an $m \times 2$ multiattribute decision matrix of the form
$$D = \begin{pmatrix} d_{11}^{o} & d_{12}^{s} \\ d_{21}^{o} & d_{22}^{s} \\ \vdots & \vdots \\ d_{m1}^{o} & d_{m2}^{s} \end{pmatrix}$$
In the multiattribute decision matrix, the meanings and scales of each attribute are different and noncommensurable. At the same time, each attribute’s value affects the user’s final decision result. Therefore, to make the outcome of the decision matrix meet the user’s requirements, a standardization approach is used to eliminate the differences between the attributes. The normalization of the two attributes that affect the path selection can be expressed as
$$r_{k1} = \frac{\max\left\{d_{11}^{o}, d_{21}^{o}, \ldots, d_{m1}^{o}\right\} - d_{k1}^{o}}{\max\left\{d_{11}^{o}, d_{21}^{o}, \ldots, d_{m1}^{o}\right\} - \min\left\{d_{11}^{o}, d_{21}^{o}, \ldots, d_{m1}^{o}\right\}}, \quad k = 1, 2, \ldots, m$$
$$r_{k2} = \frac{d_{k2}^{s} - \min\left\{d_{12}^{s}, d_{22}^{s}, \ldots, d_{m2}^{s}\right\}}{\max\left\{d_{12}^{s}, d_{22}^{s}, \ldots, d_{m2}^{s}\right\} - \min\left\{d_{12}^{s}, d_{22}^{s}, \ldots, d_{m2}^{s}\right\}}, \quad k = 1, 2, \ldots, m$$
where $r_{k1}$ is the normalized cost attribute; $r_{k2}$ is the normalized benefit attribute; $m$ is the total number of paths planned by the navigation for the user; and $r_{kj} \in [0, 1]$, $k = 1, 2, \ldots, m$, $j = 1, 2$. After this dimensionless processing, the normalized matrix $R = (r_{kj})_{m \times 2}$ is obtained, and $r_{kj}$ is called the normalized attribute of the $k$-th solution for the $j$-th attribute value. Obviously, the larger the $r_{kj}$ value, the better.
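This min–max normalization can be sketched directly (function name ours; it assumes the $m$ routes do not all share the same value in an attribute, so no denominator is zero):

```python
def normalize(matrix):
    """Min-max normalize an m x 2 decision matrix: column 0 is the cost
    attribute (smaller is better), column 1 the benefit attribute
    (larger is better); every normalized entry lands in [0, 1]."""
    costs = [row[0] for row in matrix]
    bens = [row[1] for row in matrix]
    # assumes max != min in each column (routes differ in both attributes)
    return [((max(costs) - c) / (max(costs) - min(costs)),
             (b - min(bens)) / (max(bens) - min(bens)))
            for c, b in matrix]
```

After normalization, the best route in each single attribute scores 1 and the worst scores 0, making the two columns commensurable.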
The benefit value of each recommended driving route is calculated based on the multiattribute route selection benefit function, and the most efficient route is determined as the driving route using a ranking algorithm. The multiattribute route selection benefit function is calculated based on the user’s cost and benefit attributes and can be expressed as
$$z_k = w_1 r_{k1} + w_2 r_{k2}, \quad k = 1, 2, \ldots, m$$
where $w_1$ denotes the weight assigned to the cost attribute using information entropy theory and $r_{k1}$ denotes the cost attribute of the $k$-th recommended driving route; $w_2$ denotes the weight assigned to the benefit attribute and $r_{k2}$ denotes the benefit attribute of the $k$-th recommended driving route; and $w_1 + w_2 = 1$. After the weight values of the two attributes are determined, the utility value of each route can be calculated to determine which driving route the user chooses, and the privacy budget of that driving route can then be assigned.

4.5. Weight Assignment for Benefit Attributes

The choice of attribute weights directly affects the decision results, and the information entropy method is used to determine the weights of the decision matrix. For the decision matrix $R = (r_{kj})_{m \times 2}$, the evaluation of solution $i$ for attribute $j$, $p_{ij}$, is defined as
$$p_{ij} = \frac{r_{ij}}{\sum_{i=1}^{m} r_{ij}}, \quad \forall i, j$$
Substituting $p_{ij}$ into the information entropy formula gives the entropy of attribute $j$, $E_j = -k \sum_{i=1}^{m} p_{ij} \ln p_{ij}$ with $k = \frac{1}{\ln m}$, and the degree of information deviation is defined as $d_j = 1 - E_j$. In the decision-making process, if the user has particular preferences for specific attributes, a preference value $\lambda_l$ can be introduced to adjust the weights, which can be expressed as
$$w_l = \frac{\lambda_l d_l}{\sum_{l=1}^{2} \lambda_l d_l}, \quad l = 1, 2$$
The above equation satisfies $0 \le w_l \le 1$ and $w_1 + w_2 = 1$. Once the attribute weights are determined, the utility value of each path can be calculated to determine which path the user chooses, and the privacy budget can then be allocated. Algorithm 1 summarizes the information entropy-based multiattribute path benefit algorithm.
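The entropy weighting step can be sketched as follows (function name ours; zero entries are skipped in the entropy sum, following the convention $0 \ln 0 = 0$, and the preference values default to equal):

```python
import math

def entropy_weights(R, prefs=(1.0, 1.0)):
    """Information-entropy weights for an m x 2 normalized matrix R.

    E_j = -(1/ln m) * sum_i p_ij ln p_ij with p_ij = r_ij / sum_i r_ij;
    the deviation d_j = 1 - E_j is scaled by the user preference lambda_j
    and renormalized so that w_1 + w_2 = 1.
    """
    m = len(R)
    k = 1.0 / math.log(m)
    scores = []
    for j in range(2):
        col = [R[i][j] for i in range(m)]
        total = sum(col)
        p = [v / total for v in col]
        E = -k * sum(v * math.log(v) for v in p if v > 0)
        scores.append(prefs[j] * (1.0 - E))
    s = sum(scores)  # assumed nonzero: at least one attribute discriminates
    return [v / s for v in scores]
```

An attribute whose column is nearly uniform carries little information ($E_j$ close to 1) and thus receives a small weight, which is exactly the intended behavior of the deviation term $d_j = 1 - E_j$.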
Algorithm 1 Multiattribute-based path benefit algorithm
Input: The set of $m$ routes recommended by the navigation system for the user to reach the destination.
Output: Route with the highest value of the benefit function $R_{max}^{k}$.
1: Calculate the length of each driving route: $d_{k1}^{o} = \sum_{i=1}^{n} \sqrt{(x_{i+1}^{k} - x_{i}^{k})^2 + (y_{i+1}^{k} - y_{i}^{k})^2}$, $k = 1, 2, \ldots, m$
2: Calculate the sum of the distances between each service request location point and its nearest sensitive location point on each route.
3: Apply Equations (11) and (12) to normalize the decision matrix $R = (r_{kj})_{m \times 2}$.
4: Calculate the weight of each attribute: $w_l = \frac{\lambda_l d_l}{\sum \lambda_l d_l}$, $l = 1, 2$.
5: Establish the benefit function $z_k = w_1 r_{k1} + w_2 r_{k2}$, $k = 1, 2, \ldots, m$.
6: $index = 1$
7: for $k = 1, 2, \ldots, m - 1$ do
8:    if $z[index] < z[k+1]$ then
9:       $index = k + 1$
10:   end if
11: end for
12: return $index$, $z[index]$
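As an end-to-end sketch of Algorithm 1 (names ours, assuming route attributes $d_{k1}^{o}$ and $d_{k2}^{s}$ are precomputed and that the routes are not all identical in an attribute), the normalization, entropy weighting, and maximum selection compose as:

```python
import math

def best_route(attrs, prefs=(1.0, 1.0)):
    """Algorithm 1 sketch: given [(d_k1_o, d_k2_s)] for the m candidate
    routes, return the index of the route maximizing z_k = w1*r_k1 + w2*r_k2."""
    m = len(attrs)
    lens = [a[0] for a in attrs]
    dists = [a[1] for a in attrs]
    # steps 1-3: min-max normalization (shorter route and larger sensitive
    # distance are both better); assumes max != min in each column
    r = [((max(lens) - l) / (max(lens) - min(lens)),
          (d - min(dists)) / (max(dists) - min(dists)))
         for l, d in attrs]
    # step 4: information-entropy weights (0 ln 0 treated as 0)
    k = 1.0 / math.log(m)
    dev = []
    for j in range(2):
        col = [row[j] for row in r]
        tot = sum(col)
        p = [v / tot for v in col if v > 0]
        E = -k * sum(v * math.log(v) for v in p)
        dev.append(prefs[j] * (1.0 - E))
    w = [v / sum(dev) for v in dev]
    # steps 5-12: benefit function and argmax scan
    z = [w[0] * rk[0] + w[1] * rk[1] for rk in r]
    return max(range(m), key=z.__getitem__)
```

For instance, a route that is slightly longer but keeps requests much farther from sensitive locations can win once both attributes are normalized and weighted.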

5. Personalized Privacy Budget Allocation Algorithm

The existing privacy mechanisms do not consider users’ individual privacy needs. They cannot allocate appropriate privacy budgets to users according to their privacy needs at different locations. Hence, privacy protection is too much for some locations and not enough for others. Therefore, a personalized privacy budget allocation PPVC algorithm is proposed to meet users’ personalized service needs with reasonable privacy protection.
To meet the different privacy needs of users, the personalized privacy budget allocation PPVC algorithm quantifies the personalized privacy budget allocation model using the sensitive distance share. Implementing this algorithm requires setting a sensitive area range for each sensitive location point. If the user's actual location is outside the set sensitive circle, i.e., far from the sensitive location, the user's privacy is less likely to be compromised; in that case, the privacy budget allocation proceeds according to the original scheme, determined mainly by the Euclidean distance between the current location point and the sensitive location.
Our proposed allocation method achieves personalized privacy protection: the user adaptively chooses the privacy budget according to the privacy needs at different locations. However, when the location of the user's charging service request passes through a sensitive location, the distance-based privacy budget is close to 0, the amount of noise added to the user's location data tends to infinity, and the user receives an invalid service. For this particular case, we propose a privacy budget allocation scheme inside the sensitive circle: the remaining privacy budget is evenly distributed to all charging service request location points within the sensitive circle.
The privacy budget allocation model in this subsection is divided into the following parts: determination of the radius R of the sensitive circle, allocation of the privacy budget inside the sensitive area, and allocation of the privacy budget outside the sensitive area.

5.1. Radius of the Sensitive Circle

The radius of the sensitive circle corresponding to each sensitive position point is calculated using the planar Laplace noise mechanism. Suppose the current actual position of the user is $p_{0}=(x_{0},y_{0})$. Noise is added to the actual position using the planar Laplace mechanism to generate the false position $q=(x_{0}+r\cos\theta,\ y_{0}+r\sin\theta)$. The distortion distance between the actual position and the false position of the user can be expressed as
$r=d(p_{0},q)=\sqrt{\left(x-x_{0}\right)^{2}+\left(y-y_{0}\right)^{2}}$
According to geographic indistinguishability, when the planar Laplace mechanism is used to add noise to the actual location, $r$ satisfies the following relationship [37]:
$r=-\frac{1}{\varepsilon_{i}}\left(W_{-1}\left(\frac{t-1}{e}\right)+1\right),\quad t=\mathrm{rand}(0,1)$
The user sets their acceptable error value between the actual and false positions to Δ . If the generated false position satisfies the user’s requirements, the following relationship must be satisfied:
$r\le\Delta$
Based on Equations (15) and (16), we have the following relationship:
$-\frac{1}{\varepsilon_{i}}\left(W_{-1}\left(\frac{\tau-1}{e}\right)+1\right)\le\Delta\ \Rightarrow\ -\frac{Sum}{\varepsilon\,d_{i,j}}\left(W_{-1}\left(\frac{\tau-1}{e}\right)+1\right)\le\Delta,\quad Sum=\sum d_{i,j}$
where R denotes the radius of the sensitive circle; $Sum$ denotes the sum of the distances of all service locations to their nearest sensitive locations on the entire driving route; $\varepsilon$ denotes the total privacy budget selected by the user; $\Delta$ denotes the distance error threshold between the current service location and the generated false location, which is set by the user; $W_{-1}\left(\frac{\tau-1}{e}\right)\in[M,-1]$ is the Lambert function, where M is related to the value of $\tau$ [37]; and $\tau$ denotes a random number generated in [0, 1]. When the distance between the actual location and the sensitive location is less than R, the privacy budget allocation within the sensitive region is applied; otherwise, the privacy budget allocation outside the sensitive region is applied.
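The noise sampling of Equation (15) and the resulting circle radius can be sketched as follows. This is a sketch assuming SciPy's `lambertw` for the $W_{-1}$ branch; `planar_laplace_offset` and `sensitive_radius` are illustrative names, not the paper's code.

```python
import numpy as np
from scipy.special import lambertw  # W_{-1} branch: lambertw(x, k=-1)

def planar_laplace_offset(eps, rng):
    """Sample a polar offset (r, theta) for the planar Laplace mechanism.

    r follows Equation (15): r = -(1/eps) * (W_{-1}((t - 1)/e) + 1),
    the inverse CDF of the radial marginal, with t uniform on [0, 1).
    """
    theta = rng.uniform(0.0, 2.0 * np.pi)
    t = rng.uniform(0.0, 1.0)
    r = -(1.0 / eps) * (lambertw((t - 1.0) / np.e, k=-1).real + 1.0)
    return r, theta

def sensitive_radius(eps_total, dist_sum, delta, tau):
    """Radius of the sensitive circle from the closed form derived above:
    R = -(Sum / (eps * Delta)) * (W_{-1}((tau - 1)/e) + 1)."""
    w = lambertw((tau - 1.0) / np.e, k=-1).real
    return -(dist_sum / (eps_total * delta)) * (w + 1.0)
```

Because $W_{-1}$ takes values at most $-1$ on this argument range, both the sampled radius and the circle radius are nonnegative, and R scales linearly with $Sum$.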

5.2. Privacy Budget Allocation

5.2.1. Budget Allocation for Privacy Outside Sensitive Areas

When the user's actual location is outside the set sensitive area ($d_{i,j}>R$), the actual location is far away from the sensitive location and the user's privacy is less likely to be compromised. In this case, the privacy budget allocation scheme can perform the allocation process according to the original scheme, which is mainly determined by the Euclidean distance between the current charging service request location and the sensitive location, as shown in Figure 1. For locations A, C, D, and E outside the sensitive circle, the allocation model is shown below:
$\varepsilon_{i}^{(out)}=\frac{\mu\,d_{i,j}}{\sum_{i=1}^{n}\sum_{j=1}^{N}\mu\,d_{i,j}}\,\varepsilon$
where $\varepsilon_{i}^{(out)}$ denotes the privacy budget assigned to the $i$-th service location outside the sensitive circle; $\varepsilon$ denotes the total privacy budget selected by the user for privacy protection on the driving route; $\mu=1$ if $L_{i}$ selects $S_{j}$ as its nearest sensitive location and $\mu=0$ otherwise; $d_{i,j}$ denotes the distance from the $i$-th service location to the $j$-th sensitive location; n denotes the total number of service location points; and N denotes the total number of sensitive location points set by the user.
When the location of the user's charging service request passes right through a sensitive location point, the privacy budget value under this allocation method is close to 0, which means that the amount of noise added to the user's location data tends to infinity and the user receives an invalid service. Therefore, we propose a privacy budget allocation method for sensitive areas to handle this particular case.

5.2.2. Privacy Budget Allocation in Sensitive Areas

Suppose the user's actual location is inside the sensitive area set by the user. In that case, the location of the user's charging service request is close to the sensitive location, and the possibility of privacy leakage is very high. Figure 1 shows the sensitive area, with F as the center and R as the radius. When a charging service request is made at such a location, the budget available inside the circle is the total privacy budget minus the sum of the privacy budgets of all points outside the sensitive circle, and this remaining privacy budget is evenly distributed to each charging service request location inside the circle. If the entire route contains n points inside the sensitive circle, the privacy budget allocated to each such charging service request location is
$\varepsilon_{i}^{(in)}=\frac{\varepsilon-\sum\varepsilon_{i}^{(out)}}{n}$
where $\varepsilon$ denotes the total privacy budget set by the user on the driving route; $\varepsilon_{i}^{(out)}$ denotes the privacy budget assigned to the $i$-th location outside the sensitive circle on the current driving route; and n denotes the total number of locations within the sensitive circle on the current driving route. Algorithm 2 outlines the implementation of the PPVC algorithm for personalized privacy budget assignment.
Algorithm 2 PPVC algorithm
Input: $R_{\max}^{k}$; $S=\left\{\left(x_{1}^{s},y_{1}^{s}\right),\left(x_{2}^{s},y_{2}^{s}\right),\dots,\left(x_{n}^{s},y_{n}^{s}\right)\right\}$; $\Delta$; $\varepsilon$; $count=0$
Output: Privacy budget allocation result $\varepsilon_{i}$ for each service request location point.
1: Calculate the radius of the sensitive circle: $R=-\frac{Sum}{\varepsilon\Delta}\left(W_{-1}\left(\frac{\tau-1}{e}\right)+1\right)$
2: for $i=1,2,\dots,n$ do
3:    Determine whether the service request location is outside the sensitive circle.
4:    if $d_{i,j}>R$ then
5:      Budget allocation outside the sensitive circle: $\varepsilon_{i}^{(out)}=\frac{\mu d_{i,j}}{\sum_{i=1}^{n}\sum_{j=1}^{N}\mu d_{i,j}}\varepsilon$
6:      Accumulate the privacy budgets assigned outside the sensitive circles: $\varepsilon_{sum}=\varepsilon_{sum}+\varepsilon_{i}^{(out)}$
7:    else
8:      $count=count+1$
9:    end if
10: end for
11: Budget allocation inside the sensitive circles: $\varepsilon_{i}^{(in)}=\frac{\varepsilon-\varepsilon_{sum}}{count}$
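The two allocation rules can be combined into one pass. This is a minimal sketch with an illustrative name (`allocate_budgets`), assuming Euclidean nearest-sensitive distances and the distance-proportional denominator taken over all request points, so that the budget share of in-circle points is what remains after the out-of-circle allocation.

```python
import math

def allocate_budgets(eps_total, service_pts, sensitive_pts, radius):
    """Distance-proportional budgets outside the sensitive circle,
    even split of the remaining budget inside it (Sections 5.2.1-5.2.2)."""
    # nearest-sensitive distance d_{i,j} for each service request point
    nearest = [min(math.dist(p, s) for s in sensitive_pts) for p in service_pts]
    denom = sum(nearest)
    eps_out = {i: eps_total * d / denom
               for i, d in enumerate(nearest) if d > radius}
    inside = [i for i in range(len(service_pts)) if i not in eps_out]
    # remaining budget, split evenly over the in-circle points
    eps_in = (eps_total - sum(eps_out.values())) / len(inside) if inside else 0.0
    return [eps_out.get(i, eps_in) for i in range(len(service_pts))]
```

By construction the returned budgets sum to the user's total budget $\varepsilon$, and points near a sensitive location receive a small but nonzero budget instead of one that collapses to 0.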

5.3. Data Analytics for Utilities

Theorem 1
(($\varepsilon_{i},\delta$)-utility). D is a transactional dataset, and $\bar{D}$ is the result of the personalized location privacy protection scheme after protecting D. If the following relation holds, then this scheme satisfies ($\varepsilon_{i},\delta$)-utility:
$\Pr\left[\left|Q(\bar{D})-Q(D)\right|\le\varepsilon_{i}\right]>1-\delta$
Proof. 
Assume that the spatial extent query Q covers n items in the output domain, and that the exact query result of Q on the dataset is $Q(D)=\sum_{i=1}^{n}Q(S_{i})$, where $S_{i}$ denotes the items covered by the query Q. The query result on the noisy dataset $\bar{D}$ is $Q(\bar{D})=\sum_{i=1}^{n}\left(Q(S_{i})+N_{i}\right)$, where $N_{i}$ denotes the added noise. According to the definition of ($\varepsilon_{i},\delta$)-utility, it suffices to prove that $\Pr\left[\left|Q(\bar{D})-Q(D)\right|\le\varepsilon_{i}\right]>1-\delta$.
$\left|Q(\bar{D})-Q(D)\right|=\left|\sum_{i=1}^{n}\left(Q(S_{i})+N_{i}\right)-\sum_{i=1}^{n}Q(S_{i})\right|=\left|\sum_{i=1}^{n}N_{i}\right|\le\sum_{i=1}^{n}\left|N_{i}\right|$
where $N_{i}$ denotes the noise added to the original data, which follows the planar Laplace distribution. It therefore suffices to show that the $\left|N_{i}\right|$ satisfy the following relation:
$\Pr\left[\sum_{i=1}^{n}\left|N_{i}\right|\le\varepsilon_{i}\right]>1-\delta$
If $\left|N_{i}\right|>R$, the event is denoted as a failure, whose probability of occurrence is given by the following relation:
$\Pr\left[\mathrm{Failure}\right]=\int_{0}^{2\pi}\int_{R}^{\infty}\frac{\varepsilon_{i}^{2}}{2\pi}\,r\,e^{-\varepsilon_{i}r}\,dr\,d\theta=e^{-\varepsilon_{i}R}\left(1+\varepsilon_{i}R\right)$
Thus, the probability of the success event satisfies
$\Pr\left[\sum_{i=1}^{n}\left|N_{i}\right|\le\varepsilon_{i}\right]\ge\left(1-e^{-\varepsilon_{i}R}\left(1+\varepsilon_{i}R\right)\right)^{n}$
Since
$\left(1-e^{-\varepsilon_{i}R}\left(1+\varepsilon_{i}R\right)\right)^{n}\ge 1-n\,e^{-\varepsilon_{i}R}\left(1+\varepsilon_{i}R\right)$
by Bernoulli's inequality, choosing $\delta\ge n\,e^{-\varepsilon_{i}R}\left(1+\varepsilon_{i}R\right)$ shows that the scheme satisfies the definition of data utility. □
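The closed form of the failure probability can be spot-checked numerically. A small sketch (SciPy assumed; the parameter values are arbitrary test inputs) integrating the radial marginal $\varepsilon^{2}\,r\,e^{-\varepsilon r}$ of the planar Laplace distribution over the tail $r>R$:

```python
import math
from scipy.integrate import quad

eps, R = 1.0, 2.0
# tail mass of the radial marginal f(r) = eps^2 * r * exp(-eps * r) beyond R
tail, _ = quad(lambda r: eps**2 * r * math.exp(-eps * r), R, math.inf)
closed = math.exp(-eps * R) * (1.0 + eps * R)  # e^{-eps R}(1 + eps R)
assert abs(tail - closed) < 1e-8
```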

6. Performance Evaluation

In this section, we present our implementation and evaluation of PPVC, compare it with other works, and discuss the results.

6.1. Setup and Dataset

The experiments were run on a computer equipped with an Intel i7 processor, 16 GB of RAM, and the 64-bit Windows 10 operating system; all algorithms are implemented in Python.
Dataset Description. The cab dataset is highly similar to the data of EVs accessing charging posts. Therefore, we use the New York cab dataset provided by the Kaggle website as the public dataset and select the main urban area of New York City (latitude 40.34° to 40.45° north, longitude 73.94° to 74.00° west) as the experimental dataset [38]. The area is then divided into geographical grids at 0.01° latitude and longitude intervals. Each block is regarded as a power supply station area, and all the check-in locations within it can be regarded as the charging locations of charging posts. Since the dataset contains a large amount of sparse check-in data (a single location visited no more than twice), whereas actual charging pile locations are visited frequently, the dataset is preprocessed to retain only check-in locations with more than four check-in users and more than two visits as the experimental data.
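The grid division and check-in filtering can be sketched as follows. The function names and thresholds are illustrative (more than four users, more than two visits follows our reading of the preprocessing step), not the authors' preprocessing script.

```python
import math
from collections import Counter, defaultdict

CELL = 0.01  # latitude/longitude grid interval in degrees

def grid_cell(lat, lon):
    """Map a check-in to its 0.01-degree cell, treated as one power supply station area."""
    return math.floor(lat / CELL), math.floor(lon / CELL)

def filter_checkins(rows, min_users=4, min_visits=2):
    """Keep check-in locations with more than `min_users` distinct users
    and more than `min_visits` visits; sparse locations are discarded."""
    visits, users = Counter(), defaultdict(set)
    for user_id, lat, lon in rows:
        key = (round(lat, 4), round(lon, 4))  # treat nearby reports as one location
        visits[key] += 1
        users[key].add(user_id)
    return [k for k in visits
            if visits[k] > min_visits and len(users[k]) > min_users]
```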

6.2. Evaluation Metrics

We evaluate the performance of the scheme in terms of quality of service and privacy protection. The quality of service is affected by the deviation between the false location uploaded by the user to the charging service and the actual location. The $QoS_{loss}$ (quality of charging service loss) is used to quantify the user's quality of service in the experimental validation. PPVC calculates the average relative error between the result of a count query Q on the processed dataset $\bar{D}$ and the actual result on the original dataset D. The average relative error can be expressed as
$error\left(Q(\bar{D})\right)=\frac{\left|Q(\bar{D})-Q(D)\right|}{\max\left\{Q(D),\,s\right\}}$
where $error(Q(\bar{D}))$ is the average relative error of the noise-treated dataset $\bar{D}$ compared with the original dataset D; we use this quantity to define the service quality loss. $\bar{D}$ denotes the data after noise is added; D denotes the original data; and the parameter s is a threshold that prevents the denominator from being 0 when the query Q is too selective (the query selectivity is the percentage of the total number of records that satisfy the query condition). When a user obtains a service from a charging service provider, the actual location is replaced by a false location. The location of the charging service request sent by the user therefore deviates from the real location, and the service information provided by the charging service provider to the user is inaccurate. The $QoS$ provided by the charging service to the user is thus determined by the similarity between the false dataset $\bar{D}$ and the original dataset D, and the user's $QoS_{loss}$ is determined by the average relative error $error(Q(\bar{D}))$: the higher the similarity between the real dataset D and the noisy dataset $\bar{D}$, the higher the $QoS$; otherwise, the $QoS$ is worse.
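The metric above can be sketched for a batch of count queries; `avg_relative_error` is an illustrative name, with `s` playing the threshold role described above.

```python
def avg_relative_error(true_counts, noisy_counts, s=1.0):
    """Mean of |Q(D_bar) - Q(D)| / max(Q(D), s) over a set of count queries;
    s keeps the denominator positive when a query is highly selective."""
    errs = [abs(noisy - true) / max(true, s)
            for true, noisy in zip(true_counts, noisy_counts)]
    return sum(errs) / len(errs)
```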

6.3. Security Analysis

Given the user's actual location $q=(x_{0},y_{0})$, PPVC adds noise to this location according to the privacy budget assigned to it and then sends the generated false location p to the server. The personalized location privacy protection scheme proposed in this paper provides a privacy guarantee to the user; the privacy budget obtained by the scheme must satisfy the following relation:
$\frac{\Pr_{\varepsilon_{i}}(q)(p)}{\Pr_{\varepsilon_{i}}(q^{\prime})(p)}\le e^{\varepsilon_{i}r}$
Proof. 
The probability function of the planar Laplace mechanism is as follows:
$D_{\varepsilon}(p_{0})(p)=\frac{\varepsilon^{2}}{2\pi}\,e^{-\varepsilon\,d(p_{0},p)}$
and, for any two locations $q$ and $q^{\prime}$ with $d(q,q^{\prime})=r$, geographic indistinguishability follows:
$\frac{\Pr_{\varepsilon_{i}}(q)(p)}{\Pr_{\varepsilon_{i}}(q^{\prime})(p)}=e^{-\varepsilon_{i}\left[d(q,p)-d(q^{\prime},p)\right]}\le e^{\varepsilon_{i}\,d(q,q^{\prime})}=e^{\varepsilon_{i}r}$
It is proved that the proposed scheme satisfies the ε -differential privacy and can provide a reliable privacy guarantee for users. □
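The bound in the proof can be spot-checked numerically. A sketch evaluating the planar Laplace density at two candidate real locations; the coordinates and `planar_laplace_pdf` name are arbitrary test values, not part of the scheme.

```python
import math

def planar_laplace_pdf(eps, center, point):
    """Planar Laplace density D_eps(p0)(p) = eps^2 / (2*pi) * exp(-eps * d(p0, p))."""
    return eps**2 / (2.0 * math.pi) * math.exp(-eps * math.dist(center, point))

eps = 0.5
q, q2, p = (0.0, 0.0), (3.0, 4.0), (1.0, 1.0)
ratio = planar_laplace_pdf(eps, q, p) / planar_laplace_pdf(eps, q2, p)
# geo-indistinguishability: the density ratio is bounded by exp(eps * d(q, q2))
assert ratio <= math.exp(eps * math.dist(q, q2))
```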

6.4. Analysis of Service Quality

6.4.1. Impact of Path Length on Quality of Service

After fixing the level of privacy protection, we compared the relationship between the length of the user's driving route and the quality of service. Since the user must periodically submit service request location updates to the charging service system while driving along the selected route, PPVC uses the number of service request locations to represent the driving route length. As Figure 3 shows, PPVC outperforms both ATGD [14] and ShiftRoute [15] in terms of privacy protection up to 15 service request locations; beyond 15 service request locations, the average error of PPVC is smaller than that of the other two schemes.
The experimental results can be analyzed as follows. When the selected route is short (the number of service request locations is small), the number of sensitive locations set by the user is much larger than the number of nonsensitive locations, and the user's privacy protection is better; this verifies that the scheme applies suitable privacy protection measures to sensitive location points. When the route length increases while the number of sensitive locations stays the same, the number of nonsensitive locations grows beyond the three sensitive locations, and the service quality of PPVC is higher than that of the other two schemes. PPVC thus focuses on improving service quality for nonsensitive locations while protecting user privacy.
Therefore, our experiments show that PPVC provides good privacy protection for service request locations close to sensitive locations while improving the service quality of service request locations far from sensitive locations. PPVC thus meets users' personalized privacy preferences and improves service quality while protecting privacy.

6.4.2. Impact of Error Value Δ on Service Quality

We set the length of the entire route driven by the user to 20 service request locations and compare, under the same privacy budget, the quality of service for different values of the error distance Δ between the acceptable false position and the actual position set by the user. From the previous analysis, Δ is inversely related to the radius of the sensitive circle. Figure 4 shows that, at the same privacy level, the loss of service quality is inversely related to Δ. As Δ decreases, the radius R of the sensitive circle increases and the quality-of-service loss grows: every location inside the larger circle receives the same privacy treatment, so points with low privacy requirements are also given strong privacy protection, which causes a large loss of service quality. As Δ increases, the radius R of the sensitive circle decreases and the quality-of-service loss decreases: only the points with high privacy requirements near sensitive locations receive stronger privacy protection, while the other locations prioritize service quality improvement.
Comparing the error distance Δ and the quality of service yields the inverse relationship between the sensitive circle radius and the quality of service. This verifies that the proposed scheme provides good privacy protection for sensitive location points while clearly improving the quality of service at other location points, meeting users' personalized privacy preference requirements while protecting their privacy.

6.4.3. The Impact of Routing and Nonrouting on Service Quality

We compare the quality of service provided by the personalized privacy budget allocation PPVC algorithm when the user's driving route is selected using the multiattribute route selection benefit function z developed in this paper versus when it is not. As Figure 5 shows, for the same privacy budget, the quality-of-service loss on the route selected by PPVC is smaller than on an unselected route. Therefore, the multiattribute decision model based on information entropy provides better service utility and recommends the most efficient route for the user.
The above analysis of PPVC's privacy protection and service quality shows that PPVC satisfies ε-differential privacy and provides users with a basic privacy guarantee. The analysis of selected versus unselected routes verifies that the multiattribute decision model based on information entropy recommends the most efficient routes for users. Finally, comparing PPVC with other trajectory privacy protection solutions over varying route lengths verifies that PPVC meets users' privacy preference requirements and improves the quality of service while preserving privacy.

7. Conclusions and Future Work

We propose a personalized location privacy protection scheme based on differential privacy to meet users’ personalized location privacy protection needs. Taking into account the effects of journey length and distance to sensitive locations, we introduce multiattribute theory to construct a utility model. Then, users’ personalized service requirements are integrated into this model to select the most efficient driving route. A personalized privacy budget is assigned to the query location based on the Euclidean distance between the query location and its nearest sensitive location on the selected route. The experimental results show that compared with other existing location privacy protection schemes, PPVC can meet the personalized privacy needs of users while protecting their privacy and improving the quality of charging services. The next step will be to analyze the impact of other attributes of in-vehicle users on the privacy leakage problem, such as driving speed, vehicle acceleration, and driving direction, to further improve the location privacy protection model in V2G.

Author Contributions

Conceptualization, P.Q.; Methodology, P.Q.; Software, P.Q.; Data curation, L.W.; Writing—original draft, L.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported in part by the National Natural Science Foundation of China (62372334, 42071431); in part by the National Key Research and Development Program of China (No.2020YFB1805400); in part by the Provincial Key Research and Development Program of Hubei, China (No.2020BAB101).

Data Availability Statement

Data will be made available on request.


Conflicts of Interest

The authors declare that they have no known competing financial interest or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Su, W.; Rahimi-Eichi, H.; Zeng, W.; Chow, M.Y. A Survey on the Electrification of Transportation in a Smart Grid Environment. IEEE Trans. Ind. Inform. 2012, 8, 1–10. [Google Scholar] [CrossRef]
  2. Raghavan, U.; Albert, R.; Kumara, S. Near linear time algorithm to detect community structures in large-scale networks. Phys. Rev. E 2007, 76, 036106. [Google Scholar] [CrossRef] [PubMed]
  3. Asghar, M.R.; Dán, G.; Miorandi, D.; Chlamtac, I. Smart Meter Data Privacy: A Survey. IEEE Commun. Surv. Tutor. 2017, 19, 2820–2835. [Google Scholar] [CrossRef]
  4. Qian, J.; Li, X.; Zhang, C.; Chen, L. De-anonymizing social networks and inferring private attributes using knowledge graphs. In Proceedings of the IEEE INFOCOM 2016—The 35th Annual IEEE International Conference on Computer Communications, San Francisco, CA, USA, 10–14 April 2016; pp. 1–9. [Google Scholar]
  5. Fortunato, S. Community detection in graphs. Phys. Rep.-Rev. Sec. Phys. Lett. 2010, 486, 75–174. [Google Scholar] [CrossRef]
  6. Blondel, V.D.; Guillaume, J.L.; Lambiotte, R.; Lefebvre, E. Fast unfolding of communities in large networks. J. Stat. Mech.-Theory Exp. 2008, 2008, P10008. [Google Scholar] [CrossRef]
  7. Jiang, R.; Lu, R.; Lai, C.; Li, A. A Secure Communication Protocol with Privacy-Preserving Monitoring and Controllable Linkability for V2G. In Proceedings of the IEEE First International Conference on Data Science in Cyberspace, DSC 2016, Changsha, China, 13–16 June 2016; IEEE Computer Society: Washington, DC, USA, 2016; pp. 567–572. [Google Scholar] [CrossRef]
  8. Yang, D.; Fang, X.; Xue, G. Truthful incentive mechanisms for k-anonymity location privacy. In Proceedings of the 2013 Proceedings IEEE INFOCOM, Turin, Italy, 14–19 April 2013; pp. 2994–3002. [Google Scholar] [CrossRef]
  9. Liu, B.; Chen, L.; Zhu, X.; Zhang, Y.; Zhang, C.; Qiu, W. Protecting Location Privacy in Spatial Crowdsourcing using Encrypted Data. In Proceedings of the EDBT, Venice, Italy, 21–24 March 2017. [Google Scholar]
  10. Chen, Q.; Shi, S.; Li, X.; Qian, C.; Zhong, S. SDN-Based Privacy Preserving Cross Domain Routing. IEEE Trans. Dependable Secur. Comput. 2019, 16, 930–943. [Google Scholar] [CrossRef]
  11. Dwork, C. Differential Privacy. In Encyclopedia of Cryptography and Security; Springer: Boston, MA, USA, 2006. [Google Scholar]
  12. Dwork, C.; Roth, A. The Algorithmic Foundations of Differential Privacy. Found. Trends Theor. Comput. Sci. 2014, 9, 211–407. [Google Scholar] [CrossRef]
  13. Zhou, L.; Yu, L.; Du, S.; Zhu, H.; Chen, C. Achieving Differentially Private Location Privacy in Edge-Assistant Connected Vehicles. IEEE Internet Things J. 2019, 6, 4472–4481. [Google Scholar] [CrossRef]
  14. Zhang, P.; Hu, C.; Chen, D.; Li, H.; Li, Q. ShiftRoute: Achieving Location Privacy for Map Services on Smartphones. IEEE Trans. Veh. Technol. 2018, 67, 4527–4538. [Google Scholar] [CrossRef]
  15. Wei, J.; Lin, Y.; Yao, X.; Zhang, J. Differential Privacy-Based Location Protection in Spatial Crowdsourcing. IEEE Trans. Serv. Comput. 2019, 15, 45–58. [Google Scholar] [CrossRef]
  16. Andreoletti, D.; Rottondi, C.E.M.; Giordano, S.; Verticale, G.; Tornatore, M. An Open Privacy-Preserving and Scalable Protocol for a Network-Neutrality Compliant Caching. In Proceedings of the ICC 2019—2019 IEEE International Conference on Communications (ICC), Shanghai, China, 20–24 May 2019; pp. 1–6. [Google Scholar]
  17. Yin, C.; Xi, J.; Sun, R.; Wang, J. Location Privacy Protection Based on Differential Privacy Strategy for Big Data in Industrial Internet of Things. IEEE Trans. Ind. Inform. 2018, 14, 3628–3636. [Google Scholar] [CrossRef]
  18. Xiong, P.; Zhu, T.; Pan, L.; Niu, W.; Li, G. Privacy Preserving in Location Data Release: A Differential Privacy Approach. In Proceedings of the PRICAI, Gold Coast, Australia, 1–5 December 2014. [Google Scholar]
  19. Andrés, M.E.; Bordenabe, N.E.; Chatzikokolakis, K.; Palamidessi, C. Geo-indistinguishability: Differential privacy for location-based systems. In Proceedings of the 2013 ACM SIGSAC Conference on Computer & Communications Security, Berlin, Germany, 4–8 November 2013. [Google Scholar]
  20. Jiang, K.; Shao, D.; Bressan, S.; Kister, T.; Tan, K.L. Publishing trajectories with differential privacy guarantees. In Proceedings of the SSDBM, Baltimore, MD, USA, 29–31 July 2013. [Google Scholar]
  21. Takagi, S.; Cao, Y.; Asano, Y.; Yoshikawa, M. Geo-Graph-Indistinguishability: Protecting Location Privacy for LBS over Road Networks. In Proceedings of the Database Security, Charleston, SC, USA, 15–17 July 2019. [Google Scholar]
  22. Qiu, C.; Squicciarini, A.C.; Pang, C.; Wang, N.; Wu, B. Location Privacy Protection in Vehicle-Based Spatial Crowdsourcing via Geo-Indistinguishability. IEEE Trans. Mob. Comput. 2022, 21, 2436–2450. [Google Scholar]
  23. Wang, L.; Zhang, D.; Yang, D.; Lim, B.Y.; Han, X.; Ma, X. Sparse Mobile Crowdsensing With Differential and Distortion Location Privacy. IEEE Trans. Inf. Forensics Secur. 2020, 15, 2735–2749. [Google Scholar] [CrossRef]
  24. Li, W.; Niu, B.; Cao, J.; Luo, Y.; Li, H. A personalized range-sensitive privacy-preserving scheme in LBSs. Concurr. Comput. Pract. Exp. 2020, 32, e5462. [Google Scholar] [CrossRef]
  25. Li, S.; Ji, X.; You, W. A Personalized Differential Privacy Protection Method for Repeated Queries. In Proceedings of the 2019 IEEE 4th International Conference on Big Data Analytics (ICBDA), Suzhou, China, 15–18 March 2019; pp. 274–280. [Google Scholar]
  26. Xiong, J.; Ma, R.; Chen, L.; Tian, Y.; Li, Q.; Liu, X.; Yao, Z. A Personalized Privacy Protection Framework for Mobile Crowdsensing in IIoT. IEEE Trans. Ind. Inform. 2020, 16, 4231–4241. [Google Scholar] [CrossRef]
  27. Zhong, H.; Ni, J.; Cui, J.; Zhang, J.; Liu, L. Personalized Location Privacy Protection Based on Vehicle Movement Regularity in Vehicular Networks. IEEE Syst. J. 2021, 16, 755–766. [Google Scholar] [CrossRef]
  28. Qu, Y.; Yu, S.; Zhou, W.; Tian, Y. GAN-Driven Personalized Spatial-Temporal Private Data Sharing in Cyber-Physical Social Systems. IEEE Trans. Netw. Sci. Eng. 2020, 7, 2576–2586. [Google Scholar] [CrossRef]
  29. Qu, Y.; Cui, L.; Yu, S.; Zhou, W.; Wu, J. Improving Data Utility Through Game Theory in Personalized Differential Privacy. J. Comput. Sci. Technol. 2019, 34, 272–286. [Google Scholar]
  30. He, Y.; Zhang, J.; Shuai, L.; Luo, J.; Yang, X.; Sun, Q.T. A Personalized Secure Publishing Mechanism of the Sensing Location Data in Crowdsensing Location-Based Services. IEEE Sens. J. 2021, 21, 13628–13637. [Google Scholar] [CrossRef]
  31. Al-Dhubhani, R.; Cazalas, J.M. An adaptive geo-indistinguishability mechanism for continuous LBS queries. Wirel. Netw. 2018, 24, 3221–3239. [Google Scholar] [CrossRef]
  32. Kasiviswanathan, S.P.; Lee, H.K.; Nissim, K.; Raskhodnikova, S.; Smith, A.D. What Can We Learn Privately? In Proceedings of the 2008 49th Annual IEEE Symposium on Foundations of Computer Science, Philadelphia, PA, USA, 25–28 October 2008; pp. 531–540. [Google Scholar]
  33. Duchi, J.C.; Jordan, M.I.; Wainwright, M.J. Local privacy and statistical minimax rates. In Proceedings of the 2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton), Monticello, IL, USA, 2–4 October 2013; p. 1592. [Google Scholar]
  34. Kairouz, P.; Oh, S.; Viswanath, P. Extremal Mechanisms for Local Differential Privacy. J. Mach. Learn. Res. 2014, 17, 17:1–17:51. [Google Scholar]
  35. Warner, S.L. Randomized response: A survey technique for eliminating evasive answer bias. J. Am. Stat. Assoc. 1965, 60, 63–66. [Google Scholar] [CrossRef]
  36. Luo, L.; Han, Z.; Xu, C.; Zhao, G. A Geo-indistinguishable Location Privacy Preservation Scheme for Location-Based Services in Vehicular Networks. In Proceedings of the International Conference on Algorithms and Architectures for Parallel Processing, Melbourne, Australia, 9–11 December 2019. [Google Scholar]
  37. Xiong, X.; Liu, S.; Li, D.; Cai, Z.; Niu, X. A Comprehensive Survey on Local Differential Privacy. Secur. Commun. Netw. 2020, 2020, 8829523:1–8829523:29. [Google Scholar] [CrossRef]
  38. Newman, M.E.J. Network Data. 2013. Available online: http://www-personal.umich.edu/~mejn/netdata/ (accessed on 25 September 2023).
Figure 1. Electric vehicle charging system.
Figure 2. PPVC-Privacy protection program process.
Figure 3. Impact of path length on quality of service.
Figure 4. Effect of error value Δ.
Figure 5. Impact of routing and nonrouting on charging service quality.
Table 1. Symbol Definition.
$u$: user
$m$: total number of routes planned for the user by the in-vehicle navigation
$n$: number of locations on each path where the user sends service request messages
$N$: number of sensitive locations set by the user
$L_{i}=(x_{i},y_{i})$: coordinates of the user's $i$-th service request location, $i=1,2,\dots,n$
$S_{j}=(x_{j},y_{j})$: coordinates of the $j$-th sensitive location point set by the user, $j=1,2,\dots,N$
$d_{i,j}$: distance between $L_{i}$ and $S_{j}$
$d_{k1}^{o}$: total length of the $k$-th path
$d_{k2}^{s}$: sum of the distances of each requested location point on the $k$-th route
$R$: radius of the sensitive circle around sensitive location $S_{j}$
$\Delta$: user-allowed distance error between the real and false positions
$\varepsilon$: total privacy budget for the user-selected driving trajectory