Results 11 - 20 of 231
Middleware for Distributed Context-Aware Systems
- International Symposium on Distributed Objects and Applications (DOA), 2005
Cited by 62 (4 self)
Context-aware systems represent extremely complex and heterogeneous distributed systems, composed of sensors, actuators, application components, and a variety of context processing components that manage the flow of context information between the sensors/actuators and applications. The need for middleware to seamlessly bind these components together is well recognised. Numerous attempts to build middleware or infrastructure for context-aware systems have been made, but these have provided only partial solutions; for instance, most have not adequately addressed issues such as mobility, fault tolerance or privacy.
SmokeScreen: flexible privacy controls for presence-sharing
- In MobiSys, 2007
Cited by 57 (5 self)
Presence-sharing is an emerging platform for mobile applications, but presence-privacy remains a challenge. Privacy controls must be flexible enough to allow sharing between both trusted social relations and untrusted strangers. In this paper, we present a system called SmokeScreen that provides flexible and power-efficient mechanisms for privacy management. Broadcasting clique signals, which can only be interpreted by other trusted users, enables sharing between social relations; broadcasting opaque identifiers (OIDs), which can only be resolved to an identity by a trusted broker, enables sharing between strangers. Computing these messages is power-efficient since they can be precomputed with acceptable storage costs. In evaluating these mechanisms we first analyzed traces from an actual presence-sharing application. Four months of traces provide evidence of anonymous snooping, even among trusted users. We have also implemented our mechanisms on two devices and found the power demands of clique signals and OIDs to be reasonable. A mobile phone running our software can operate for several days on a single charge.
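The precomputed opaque-identifier mechanism summarized above might be sketched as follows. The HMAC-over-(user, epoch) construction, the epoch scheme, and all names here are illustrative assumptions for this listing, not SmokeScreen's actual protocol:

```python
import hmac
import hashlib

def precompute_oids(user_id, broker_key, epochs):
    """Precompute one opaque identifier (OID) per time epoch.

    Each OID is keyed with a secret shared with the trusted broker, so
    only the broker can map it back to an identity.  Precomputing the
    whole schedule trades modest storage for lower runtime power draw.
    """
    return [hmac.new(broker_key, ("%s|%d" % (user_id, e)).encode(),
                     hashlib.sha256).digest()
            for e in range(epochs)]

def broker_resolve(oid, epoch, users, broker_key):
    """The trusted broker resolves an OID by recomputing candidates
    for each known user at that epoch; strangers cannot."""
    for u in users:
        candidate = hmac.new(broker_key, ("%s|%d" % (u, epoch)).encode(),
                             hashlib.sha256).digest()
        if hmac.compare_digest(candidate, oid):
            return u
    return None
```

A device can broadcast `oids[e]` during epoch `e`; without `broker_key`, the broadcast reveals nothing.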
Siren: Context-aware computing for firefighting
- In Proceedings of Pervasive Computing, 2004
Cited by 51 (3 self)
Enabling Private Continuous Queries For Revealed User Locations
Cited by 49 (7 self)
Existing location-based services provide specialized services to their customers based on knowledge of their exact locations. With untrustworthy servers, location-based services may lead to several privacy threats, ranging from worries over employers snooping on their workers’ whereabouts to fears of tracking by potential stalkers. While several techniques exist to preserve location privacy in mobile environments, they are limited in that they do not distinguish between location privacy (i.e., a user wants to hide her location) and query privacy (i.e., a user can reveal her location but not her query). This distinction is crucial in many applications where the locations of mobile users are publicly known. In this paper, we go beyond the limitations of existing cloaking algorithms and propose a new robust spatial cloaking technique for snapshot and continuous location-based queries that clearly distinguishes between location privacy and query privacy. By this distinction, we achieve two main goals: (1) supporting private location-based services for customers with public locations, and (2) performing spatial cloaking on an on-demand basis only (i.e., when issuing queries) rather than exhaustively cloaking every single location update. Experimental results show that the robust spatial cloaking algorithm is scalable and efficient while providing anonymity for large numbers of continuous queries without hiding users’ locations.
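The on-demand idea, exact locations on plain updates but a coarse region attached only when a query is issued, can be sketched roughly. Grid-cell cloaking and all names below are illustrative assumptions, not the paper's algorithm:

```python
import math

def cloak_region(x, y, cell):
    """Snap an exact point to its enclosing grid cell, so a query is
    tied to a region rather than to an exact location."""
    cx, cy = math.floor(x / cell), math.floor(y / cell)
    return (cx * cell, cy * cell, (cx + 1) * cell, (cy + 1) * cell)

class Client:
    """Reports exact locations (which may be public), but cloaks only
    when a query is issued, not on every location update."""
    def __init__(self, cell=1.0):
        self.cell = cell
        self.location = None

    def update_location(self, x, y):
        self.location = (x, y)  # plain update: no cloaking work at all
        return self.location

    def query(self, text):
        x, y = self.location
        return (text, cloak_region(x, y, self.cell))  # cloak on demand
```

The point of the design is that cloaking cost is paid per query, not per location update.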
Putting people in their place: an anonymous and privacy-sensitive approach to collecting sensed data in location-based applications
- In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI), 2006
Cited by 41 (10 self)
The emergence of location-based computing promises new and compelling applications, but raises very real privacy risks. Existing approaches to privacy generally treat people as the entity of interest, often using a fidelity tradeoff to manage the costs and benefits of revealing a person’s location. However, these approaches cannot be applied in some applications, as a reduction in precision can render location information useless. This is true of a category of applications that use location data collected from multiple people to infer such information as whether there is a traffic jam on a bridge, whether there are seats available in a nearby coffee shop, when the next bus will arrive, or if a particular conference room is currently empty. We present hitchhiking, a new approach that treats locations as the primary entity of interest. Hitchhiking removes the fidelity tradeoff by preserving the anonymity of reports without reducing the precision of location disclosures. We can therefore support the full functionality of an interesting class of location-based applications without introducing the privacy concerns that would otherwise arise.
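A minimal sketch of the hitchhiking idea, treating the sensed place rather than the person as the entity of interest: the report keeps full location precision while reporter-identifying fields are dropped. The field names are hypothetical, not from the paper:

```python
# Hypothetical report schema: only place-centric fields survive.
ALLOWED_FIELDS = {"place_id", "observation", "coarse_time"}

def hitchhike(report):
    """Anonymize a sensed-data report: the sensed place and observation
    keep full precision, while reporter-identifying fields (user id,
    device address, fine-grained timestamp) are dropped entirely."""
    return {k: v for k, v in report.items() if k in ALLOWED_FIELDS}
```

Because precision is untouched, aggregate inferences (a full coffee shop, a free conference room) still work on the anonymized reports.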
You are what you say: Privacy risks of public mentions
- In Proc. 29th Annual ACM SIGIR Conference on Research and Development in Information Retrieval, 2006
Cited by 40 (4 self)
In today’s data-rich networked world, people express many aspects of their lives online. It is common to segregate different aspects in different places: you might write opinionated rants about movies in your blog under a pseudonym while participating in a forum or web site for scholarly discussion of medical ethics under your real name. However, it may be possible to link these separate identities, because the movies, journal articles, or authors you mention are drawn from a sparse relation space whose properties (e.g., many items related to by only a few users) allow re-identification. This re-identification violates people’s intentions to separate aspects of their life and can have negative consequences; it may also allow other privacy violations, such as obtaining a stronger identifier like name and address. This paper examines this general problem in a specific setting: re-identification of users from a public web movie forum in a private movie ratings dataset. We present three major results. First, we develop algorithms that can re-identify a large proportion of public users in a sparse relation space. Second, we evaluate whether private dataset owners can protect user privacy by hiding data; we show that this requires extensive and undesirable changes to the dataset, making it impractical. Third, we evaluate two methods for users in a public forum to protect their own privacy: suppression and misdirection. Suppression does not work here either. However, we show that a simple misdirection strategy works well: mention a few popular items that you haven’t rated.
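The core linking step can be sketched as scoring private profiles by rarity-weighted overlap with a public user's mentioned items: rare items carry the identifying signal. The scoring function below is an illustrative assumption, not the paper's algorithms:

```python
import math

def reidentify(mentions, private_profiles, popularity):
    """Score each private user by rarity-weighted overlap between a
    public user's mentioned items and that user's rated items.  Items
    related to by only a few users carry the most identifying weight
    (here: weight 1/log(1 + popularity))."""
    scores = {}
    for user, rated in private_profiles.items():
        scores[user] = sum(1.0 / math.log(1 + popularity.get(item, 1))
                           for item in mentions & rated)
    return max(scores, key=scores.get) if scores else None
```

This also shows why the misdirection countermeasure helps: mentioning popular unrated items adds little weight to the true profile but noise to the matching.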
CPOL: High-performance policy evaluation
- In Proceedings of the 12th ACM Conference on Computer and Communications Security (CCS), 2005
Cited by 35 (0 self)
Policy enforcement is an integral part of many applications. Policies are often used to control access to sensitive information. Current policy specification languages give users fine-grained control over when and how information can be accessed, and are flexible enough to be used in a variety of applications. Evaluation of these policies, however, is not optimized for performance. Emerging applications, such as real-time enforcement of privacy policies in a sensor network or location-aware computing environment, require high throughput. Our experiments indicate that current policy enforcement solutions are unable to deliver the level of performance needed for such systems, and limit their overall scalability. To address the need for high-throughput evaluation, we propose CPOL, a flexible C++ framework for policy evaluation. CPOL is designed to evaluate policies as efficiently as possible while still maintaining a level of expressiveness comparable to current policy languages. CPOL achieves its performance goals by efficiently evaluating policies and caching query results (while still preserving correctness). To evaluate CPOL, we ran a simulated workload of users making privacy queries in a location-sensing infrastructure. CPOL was able to handle policy evaluation requests two to six orders of magnitude faster than a MySQL implementation and an existing policy evaluation system. We present the design and implementation of CPOL, a high-performance policy evaluation engine, along with our testing methodology and experimental results.
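A toy sketch of policy evaluation with result caching in the spirit of CPOL. CPOL itself is a C++ framework; this Python sketch and all of its names are illustrative assumptions:

```python
class PolicyEngine:
    """Evaluate per-(requester, resource) policy rules, caching results
    keyed on the full evaluation context so repeated queries skip the
    rule entirely.  Correctness is preserved by flushing the cache on
    any policy change."""
    def __init__(self, policies):
        self.policies = policies  # {(requester, resource): rule function}
        self.cache = {}

    def evaluate(self, requester, resource, context):
        key = (requester, resource, tuple(sorted(context.items())))
        if key in self.cache:
            return self.cache[key]          # cached result, no rule run
        rule = self.policies.get((requester, resource))
        result = bool(rule and rule(context))
        self.cache[key] = result
        return result

    def invalidate(self):
        """Flush cached results whenever policies change."""
        self.cache.clear()
```

The cache key includes the context, so a context-sensitive rule ("only before 18:00") is never answered from a stale entry for a different context.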
Devices that tell on you: Privacy trends in consumer ubiquitous computing
- In Proc. 16th USENIX Security Symposium, 2007
Cited by 34 (2 self)
We analyze three new consumer electronic gadgets in order to gauge the privacy and security trends in mass-market UbiComp devices. Our study of the Slingbox Pro uncovers a new information leakage vector for encrypted streaming multimedia. By exploiting properties of variable bitrate encoding schemes, we show that a passive adversary can determine with high probability the movie that a user is watching via her Slingbox, even when the Slingbox uses encryption. We experimentally evaluated our method against a database of over 100 hours of network traces for 26 distinct movies. Despite an opportunity to provide significantly more location privacy than existing devices, like RFIDs, we find that an attacker can trivially exploit the Nike+iPod Sport Kit’s design to track users; we demonstrate this with a Google Maps-based distributed surveillance system. We also uncover security issues with the way Microsoft Zunes manage their social relationships. We show how these products’ designers could have significantly raised the bar against some of our attacks. We also use some of our attacks to motivate fundamental security and privacy challenges for future UbiComp devices.
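The variable-bitrate leakage amounts to matching an observed throughput trace (packet sizes are visible to a passive eavesdropper even when the payload is encrypted) against per-movie fingerprints. A minimal sketch, with the distance metric as an illustrative assumption rather than the paper's method:

```python
def best_match(observed, fingerprints):
    """Return the movie whose stored bitrate fingerprint is closest
    (in mean squared difference) to an observed throughput trace.
    Encryption hides content but not the size/timing side channel
    that variable bitrate encoding imprints on the traffic."""
    def dist(a, b):
        n = min(len(a), len(b))
        return sum((a[i] - b[i]) ** 2 for i in range(n)) / n
    return min(fingerprints,
               key=lambda movie: dist(observed, fingerprints[movie]))
```

Against a database of candidate movies, even a noisy trace tends to sit far closer to its own fingerprint than to any other.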
Access Control to Information in Pervasive Computing Environments
- In Proceedings of the 9th Workshop on Hot Topics in Operating Systems (HotOS IX), 2003
Cited by 32 (3 self)
Pervasive computing envisions a world in which our environment is full of embedded devices that gather and share vast amounts of information about people, such as their location, activity, or even their feelings. Some of this information is confidential and should not be released to just anyone. In this thesis, I show how existing solutions for controlling access to information are not sufficient for pervasive computing because of four challenges: First, there will be many information services, potentially offering the same information, run by different organizations, even in a single social environment. Second, there will be complex types of information, such as a person's calendar entry, which reveal other kinds of information, such as the person's current location. Third, there will be services that derive specific information, such as a person's activity, from raw information, such as a video stream, and that become attractive targets for intruders. Fourth, an individual's ability to access information could be constrained based on confidential information about the individual's context. This thesis presents a distributed access-control architecture for pervasive computing that supports complex and derived information and confidential context-sensitive constraints. In particular, the thesis makes the following contributions: First, I introduce a distributed access-control architecture, in which a client proves to a service that the client is authorized to access requested information. Second, I show how to incorporate the semantics of complex information as a first-class citizen into this architecture, based on information relationships. Third, I propose derivation-constrained access control, which reduces the influence of intruders by making a service prove that it is accessing information on behalf of an authorized client. Fourth, I study the kinds of information leaks that context-sensitive constraints can cause, and I introduce access-rights graphs and hidden constraints for avoiding these leaks. Fifth, I show how pervasive computing makes it difficult for a client to prove that it is authorized to access complex confidential information, and I propose a cryptographic solution based on an extension of hierarchical identity-based encryption. Sixth, as an alternative approach, I introduce an encryption-based access-control architecture for pervasive computing, in which a service gives information to any client, but only in encrypted form. I present a formal model for my contributions based on Lampson et al.'s theory of authentication. All of my contributions have been implemented in an actual pervasive computing environment. A performance analysis of my implementation demonstrates the feasibility of my design.
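Derivation-constrained access control, in which a service must show it is acting on behalf of an authorized client, can be sketched as a delegation-chain check. This is a simplified illustration, not the thesis's proof system:

```python
def authorized(chain, acl, delegations):
    """Check a request made through a chain of principals, e.g.
    ["alice", "activity-service"]: the originating client must be on
    the resource's ACL, and every hop must be an explicit delegation
    from the previous principal.  A compromised intermediate service
    thus cannot fetch raw data except on behalf of some authorized
    client that delegated to it."""
    if not chain or chain[0] not in acl:
        return False
    return all((chain[i], chain[i + 1]) in delegations
               for i in range(len(chain) - 1))
```

An activity-inference service that reads a video stream only passes this check while serving a client whose own access would have been granted.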
Protecting Moving Trajectories with Dummies
Cited by 30 (0 self)
Dummy-based anonymization techniques for protecting the location privacy of mobile users have been proposed in the literature. By generating dummies that move in human-like trajectories, [8] shows that the location privacy of mobile users can be preserved. However, by monitoring long-term movement patterns of users, the trajectories of mobile users can still be exposed. We argue that, once the trajectory of a user is identified, the locations of the user are exposed. Thus, it is critical to protect the moving trajectories of mobile users in order to preserve user location privacy. We propose two schemes that generate consistent movement patterns in the long run. Guided by three parameters in a user-specified privacy profile, namely short-term disclosure, long-term disclosure, and distance deviation, the proposed schemes derive movement trajectories for dummies. A preliminary performance study shows that our approach is more effective than existing work in protecting the moving trajectories of mobile users and their location privacy.
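One way to get a dummy with a consistent long-term pattern is to give it a fixed random offset from the real path, bounded by a distance-deviation parameter, plus small per-step jitter so it is not an exact translate. This sketch is an illustrative assumption, not either of the paper's two schemes:

```python
import random

def dummy_trajectory(real_path, max_dev, seed=0):
    """Derive a dummy from a real path: a fixed random offset (at most
    max_dev per axis) gives the dummy a stable long-term movement
    pattern, while small jitter (here up to 10% of max_dev per axis)
    keeps consecutive observations from revealing the exact offset."""
    rng = random.Random(seed)
    off_x = rng.uniform(-max_dev, max_dev)
    off_y = rng.uniform(-max_dev, max_dev)
    dummy = []
    for x, y in real_path:
        jitter_x = rng.uniform(-0.1, 0.1) * max_dev
        jitter_y = rng.uniform(-0.1, 0.1) * max_dev
        dummy.append((x + off_x + jitter_x, y + off_y + jitter_y))
    return dummy
```

Because the offset is fixed per dummy, a long-term observer sees a plausible, self-consistent trajectory rather than uncorrelated jumps.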