Results 1 - 5 of 5
Prolonging the Hide-and-Seek Game: Optimal Trajectory Privacy for Location-Based Services
"... Human mobility is highly predictable. Individuals tend to only visit a few locations with high frequency, and to move among them in a certain sequence reflecting their habits and daily routine. This predictability has to be taken into account in the design of location privacy preserving mecha-nisms ..."
Abstract
Cited by 5 (2 self)
Human mobility is highly predictable. Individuals tend to visit only a few locations with high frequency, and to move among them in a certain sequence reflecting their habits and daily routine. This predictability has to be taken into account in the design of location privacy preserving mechanisms (LPPMs) in order to effectively protect users when they expose their whereabouts to location-based services (LBSs) continuously. In this paper, we describe a method for creating LPPMs tailored to a user's mobility profile, taking into account her privacy and quality-of-service requirements. By construction, our LPPMs take into account the sequential correlation across the user's exposed locations, providing the maximum possible trajectory privacy, i.e., privacy for the user's past, present, and expected future locations. Moreover, our LPPMs are optimal against a strategic adversary, i.e., an attacker that implements the strongest inference attack knowing both the LPPM operation and the user's mobility profile. The optimality of the LPPMs in the context of trajectory privacy is a novel contribution, and it is achieved by formulating the LPPM design problem as a Bayesian Stackelberg game between the user and the adversary. An additional benefit of our formal approach is that the design parameters of the LPPM are chosen by the optimization algorithm.
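The strategic adversary described here is the inner step of the Stackelberg formulation: knowing both the LPPM's obfuscation probabilities and the user's mobility profile, the attacker picks the Bayesian estimate that minimizes the user's privacy. A minimal sketch of that inner computation follows; the discrete location set, the prior `pi` standing in for the mobility profile, the obfuscation matrix `f[r][o]` = Pr[output o | true location r], and Euclidean estimation error as the privacy metric are all illustrative assumptions, not the paper's notation.

```python
import numpy as np

def adversary_best_response(pi, f, locations):
    """Expected privacy (estimation error) against the strongest Bayesian adversary.

    For each observable output o, the adversary forms the posterior Pr[r | o]
    and picks the estimate minimizing its expected error; the user's privacy
    is that error averaged over outputs.
    """
    n = len(locations)
    dist = np.array([[np.linalg.norm(np.subtract(locations[r], locations[s]))
                      for s in range(n)] for r in range(n)])
    expected_privacy = 0.0
    for o in range(f.shape[1]):
        joint = pi * f[:, o]            # Pr[r, o] for every true location r
        p_o = joint.sum()               # Pr[o]
        if p_o == 0.0:
            continue
        posterior = joint / p_o         # Pr[r | o]
        errors = dist.T @ posterior     # expected error of each candidate estimate
        expected_privacy += p_o * errors.min()
    return expected_privacy

# Example: three locations on a line, a skewed prior, and a mildly noisy LPPM
locations = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
pi = np.array([0.6, 0.3, 0.1])
f = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])
print(adversary_best_response(pi, f, locations))
```

The outer step, which the paper formulates as a Bayesian Stackelberg game, would then choose the obfuscation matrix to maximize this quantity subject to a quality-of-service constraint.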
Privacy Games: Optimal User-Centric Data Obfuscation
2015
"... Abstract: Consider users who share their data (e.g., location) with an untrusted service provider to obtain a personalized (e.g., location-based) service. Data obfuscation is a prevalent user-centric approach to protecting users' privacy in such systems: the untrusted entity only receives a no ..."
Abstract
Consider users who share their data (e.g., location) with an untrusted service provider to obtain a personalized (e.g., location-based) service. Data obfuscation is a prevalent user-centric approach to protecting users' privacy in such systems: the untrusted entity receives only a noisy version of the user's data. Perturbing data before sharing it, however, comes at the price of the user's utility (service quality), which is an inseparable design factor of obfuscation mechanisms. The entanglement of the utility loss and the privacy guarantee, in addition to the lack of a comprehensive notion of privacy, has led to obfuscation mechanisms that are suboptimal in terms of their utility loss, ignore the user's information leakage in the past, or are limited to very specific notions of privacy which, for example, do not protect against adaptive inference attacks or adversaries with arbitrary background knowledge. In this paper, we design user-centric obfuscation mechanisms that impose the minimum utility loss for guaranteeing the user's privacy. We optimize utility subject to a joint guarantee of differential privacy (indistinguishability) and distortion privacy (inference error). This double shield of protection limits the information leakage through the obfuscation mechanism as well as the posterior inference. We show that the privacy achieved through joint differential-distortion mechanisms against optimal attacks is as large as the maximum privacy that can be achieved by either of these mechanisms separately. Their utility cost is also not larger than what either the differential or the distortion mechanism imposes. We model the optimization problem as a leader-follower game between the designer of the obfuscation mechanism and the potential adversary, and design adaptive mechanisms that anticipate and protect against optimal inference algorithms. Thus, the obfuscation mechanism is optimal against any inference algorithm.
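The differential-privacy (indistinguishability) half of the joint guarantee bounds how much the output distribution may change across nearby inputs, and the utility-minimal mechanism under that constraint alone can be written as a small linear program. The sketch below is only a simplified slice of the paper's joint problem: the distortion-privacy side (the optimal adversary's expected inference error, as in the sketch after the previous entry) is not added as a constraint, and the variable layout, distance-based loss, and eps scaling are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def min_utility_loss_mechanism(pi, dist, eps):
    """Obfuscation matrix f[r, o] minimizing expected quality loss subject to
    eps-indistinguishability: f[r, o] <= exp(eps * d(r, s)) * f[s, o]."""
    n = len(pi)
    idx = lambda r, o: r * n + o    # flatten (true location, output) -> variable index
    # Objective: expected quality loss, using the distance between true and reported location
    c = np.array([pi[r] * dist[r, o] for r in range(n) for o in range(n)])
    # Each true location must map to a probability distribution over outputs
    A_eq = np.zeros((n, n * n))
    for r in range(n):
        for o in range(n):
            A_eq[r, idx(r, o)] = 1.0
    b_eq = np.ones(n)
    # Indistinguishability: f[r, o] - exp(eps * d(r, s)) * f[s, o] <= 0
    rows, b_ub = [], []
    for o in range(n):
        for r in range(n):
            for s in range(n):
                if r == s:
                    continue
                row = np.zeros(n * n)
                row[idx(r, o)] = 1.0
                row[idx(s, o)] = -np.exp(eps * dist[r, s])
                rows.append(row)
                b_ub.append(0.0)
    res = linprog(c, A_ub=np.array(rows), b_ub=np.array(b_ub),
                  A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    return res.x.reshape(n, n), res.fun

# Example: three locations one unit apart, a skewed prior, eps = 1 per unit distance
dist = np.array([[0.0, 1.0, 2.0], [1.0, 0.0, 1.0], [2.0, 1.0, 0.0]])
f, loss = min_utility_loss_mechanism(np.array([0.6, 0.3, 0.1]), dist, eps=1.0)
```

Adding the distortion constraint on top of this program, with the adversary responding optimally, is what turns the problem into the leader-follower game the abstract describes.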
Anatomization and Protection of Mobile Apps' Location Privacy Threats
"... Abstract Mobile users are becoming increasingly aware of the privacy threats resulting from apps' access of their location. Few of the solutions proposed thus far to mitigate these threats have been deployed as they require either app or platform modifications. Mobile operating systems (OSes) ..."
Abstract
Mobile users are becoming increasingly aware of the privacy threats resulting from apps' access to their location. Few of the solutions proposed thus far to mitigate these threats have been deployed, as they require either app or platform modifications. Mobile operating systems (OSes) also provide users with location access controls. In this paper, we analyze the efficacy of these controls in combating location-privacy threats. For this analysis, we conducted the first location measurement campaign of its kind, analyzing more than 1000 free apps from Google Play and collecting detailed location usage by more than 400 location-aware apps and 70 Advertisement and Analytics (A&A) libraries from more than 100 participants over periods ranging from 1 week to 1 year. Surprisingly, 70% of the apps and A&A libraries pose considerable profiling threats even when they only sporadically access the user's location. Existing OS controls are found to be ineffective and inefficient in mitigating these threats, calling for finer-grained location access control. To meet this need, we propose LP-Doctor, a light-weight user-level tool that allows Android users to effectively utilize the OS's location access controls while maintaining the apps' required functionality, as our user study (with 227 participants) shows.
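The finer-grained control argued for here amounts to deciding, per app or A&A library and per session, whether to pass the exact location, a coarsened one, or nothing at all. The fragment below is only a generic illustration of such a per-app policy, not LP-Doctor's actual mechanism; the app identifiers, policy values, and coarsening grid are assumptions.

```python
from typing import Optional, Tuple

# Per-app policy: "allow" passes the true fix, "coarsen" snaps it to a grid,
# "block" withholds the location entirely. Entries are illustrative only.
POLICY = {
    "com.example.navigation": "allow",
    "com.example.weather": "coarsen",
    "com.example.flashlight": "block",
}

def apply_policy(app_id: str, lat: float, lon: float,
                 grid_deg: float = 0.01) -> Optional[Tuple[float, float]]:
    """Return the location an app is permitted to see under the policy."""
    action = POLICY.get(app_id, "coarsen")    # default: coarsen unknown apps
    if action == "block":
        return None
    if action == "coarsen":
        # Snap to a roughly 1 km grid so repeated fixes reveal less about routine
        return (round(lat / grid_deg) * grid_deg,
                round(lon / grid_deg) * grid_deg)
    return (lat, lon)
```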
Geo-indistinguishability: A Principled Approach to Location Privacy
2015
"... HAL is a multi-disciplinary open access archive for the deposit and dissemination of sci-entific research documents, whether they are pub-lished or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers. L’archive ouverte p ..."
Abstract
- Add to MetaCart
(Show Context)
HAL is a multi-disciplinary open access archive for the deposit and dissemination of sci-entific research documents, whether they are pub-lished or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers. L’archive ouverte pluridisciplinaire HAL, est destinée au dépôt et a ̀ la diffusion de documents scientifiques de niveau recherche, publiés ou non, émanant des établissements d’enseignement et de recherche français ou étrangers, des laboratoires publics ou privés.
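Geo-indistinguishability, named in this title, adapts differential privacy to location data: reports generated from two points at distance d should have probabilities within a factor e^(eps * d) of each other. A common way to satisfy it is to add planar Laplace noise to the true coordinates; the sketch below samples that noise by drawing a uniform angle and inverting the radial CDF (a minimal sketch assuming planar coordinates measured in the same units as 1/eps; the function names are illustrative, not taken from the paper).

```python
import math
import random

from scipy.special import lambertw

def planar_laplace_noise(eps: float) -> tuple:
    """Sample a 2-D offset with density proportional to exp(-eps * ||offset||)."""
    theta = random.uniform(0.0, 2.0 * math.pi)   # direction: uniform angle
    p = random.random()                          # radius: invert C(r) = 1 - (1 + eps*r)*exp(-eps*r)
    r = -(1.0 / eps) * (lambertw((p - 1.0) / math.e, k=-1).real + 1.0)
    return (r * math.cos(theta), r * math.sin(theta))

def report_location(x: float, y: float, eps: float) -> tuple:
    """Perturb a true planar position before sending it to a location-based service."""
    dx, dy = planar_laplace_noise(eps)
    return (x + dx, y + dy)
```

Larger eps means less noise and weaker privacy; the same eps bounds what any adversary, regardless of background knowledge, can infer about which nearby point produced the report.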