Shining light in dark places: Understanding the Tor network
In Proceedings of the 8th Privacy Enhancing Technologies Symposium, 2008
"... Abstract. To date, there has yet to be a study that characterizes the usage of a real deployed anonymity service. We present observations and analysis obtained by participating in the Tor network. Our primary goals are to better understand Tor as it is deployed and through this understanding, propos ..."
Abstract
-
Cited by 92 (19 self)
To date, there has yet to be a study that characterizes the usage of a real deployed anonymity service. We present observations and analysis obtained by participating in the Tor network. Our primary goals are to better understand Tor as it is deployed and, through this understanding, propose improvements. In particular, we are interested in answering the following questions: (1) How is Tor being used? (2) How is Tor being misused? (3) Who is using Tor? To sample the results, we show that web traffic makes up the majority of the connections and bandwidth, but non-interactive protocols consume a disproportionately large amount of bandwidth when compared to interactive protocols. We provide a survey of how Tor is being misused, both by clients and by Tor router operators. In particular, we develop a method for detecting exit router logging (in certain cases). Finally, we present evidence that Tor is used throughout the world, but router participation is limited to only a few countries.
Blacklistable anonymous credentials: Blocking misbehaving users without TTPs
In ACM Conference on Computer and Communications Security, ACM, 2007
"... Several credential systems have been proposed in which users can authenticate to services anonymously. Since anonymity can give users the license to misbehave, some variants allow the selective deanonymization (or linking) of misbehaving users upon a complaint to a trusted third party (TTP). The abi ..."
Abstract
-
Cited by 51 (6 self)
Several credential systems have been proposed in which users can authenticate to services anonymously. Since anonymity can give users the license to misbehave, some variants allow the selective deanonymization (or linking) of misbehaving users upon a complaint to a trusted third party (TTP). The ability of the TTP to revoke a user’s privacy at any time, however, is too strong a punishment for misbehavior. To limit the scope of deanonymization, systems such as “e-cash” have been proposed in which users are deanonymized under only certain types of well-defined misbehavior such as “double spending.” While useful in some applications, such techniques cannot be generalized to more subjective definitions of misbehavior. We present the first anonymous credential system in which services can “blacklist” misbehaving users without contacting a TTP. Since blacklisted users remain anonymous, misbehaviors can be judged subjectively without users fearing arbitrary deanonymization by a TTP.
Nymble: Blocking Misbehaving Users in Anonymizing Networks, 2008
"... Abstract-Anonymizing networks such as Tor allow users to access Internet services privately by using a series of routers to hide the client's IP address from the server. The success of such networks, however, has been limited by users employing this anonymity for abusive purposes such as defac ..."
Abstract
-
Cited by 22 (1 self)
Anonymizing networks such as Tor allow users to access Internet services privately by using a series of routers to hide the client's IP address from the server. The success of such networks, however, has been limited by users employing this anonymity for abusive purposes such as defacing popular websites. Website administrators routinely rely on IP-address blocking to disable access from misbehaving users, but blocking IP addresses is not practical if the abuser routes through an anonymizing network. As a result, administrators block all known exit nodes of anonymizing networks, denying anonymous access to misbehaving and behaving users alike. To address this problem, we present Nymble, a system in which servers can "blacklist" misbehaving users, thereby blocking users without compromising their anonymity. Our system is thus agnostic to different servers' definitions of misbehavior: servers can blacklist users for whatever reason, and the privacy of blacklisted users is maintained.
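The abstract states the guarantee rather than the mechanism; the Python sketch below is an illustrative, heavily simplified view of the hash-chain idea used in the Nymble paper, with made-up function names and a single-server perspective. Seeds evolve from one time period to the next with one hash function, and the ticket ("nymble") shown to the server is derived with another, so releasing the seed for the period of a complaint lets the server recognize that user's later tickets without linking the earlier ones or learning an identity.

import hashlib

def f(seed: bytes) -> bytes:
    # Evolves the seed from one time period to the next.
    return hashlib.sha256(b"evolve" + seed).digest()

def g(seed: bytes) -> bytes:
    # Derives the nymble (ticket) a user presents to the server in one time period.
    return hashlib.sha256(b"nymble" + seed).digest()

def nymbles_for_window(seed0: bytes, periods: int) -> list:
    # Tickets for one user at one server across a linkability window.
    tickets, s = [], seed0
    for _ in range(periods):
        tickets.append(g(s))
        s = f(s)
    return tickets

# On a complaint about period t, the manager releases seed_t to the server; the server
# recomputes g(seed_t), g(f(seed_t)), ... and refuses those tickets, while tickets from
# periods before t remain unlinkable.
demo = nymbles_for_window(b"seed bound to user, server, and window (illustrative)", 3)
print([d.hex()[:16] for d in demo])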
Reputation Systems for Anonymous Networks
"... Abstract. We present a reputation scheme for a pseudonymous peer-to-peer (P2P) system in an anonymous network. Misbehavior is one of the biggest problems in pseudonymous P2P systems, where there is little incentive for proper behavior. In our scheme, using ecash for reputation points, the reputation ..."
Abstract
-
Cited by 21 (0 self)
We present a reputation scheme for a pseudonymous peer-to-peer (P2P) system in an anonymous network. Misbehavior is one of the biggest problems in pseudonymous P2P systems, where there is little incentive for proper behavior. In our scheme, which uses ecash for reputation points, the reputation of each user is closely related to his real identity rather than to his current pseudonym. Thus, our scheme allows an honest user to switch to a new pseudonym while keeping his good reputation, yet hinders a malicious user from erasing his trail of evil deeds with a new pseudonym.
Making a Nymbler Nymble using VERBS, 2010
"... In this work, we propose a new platform to enable service providers, such as web site operators, on the Internet to block past abusive users of anonymizing networks (for example, Tor) from further misbehaviour, without compromising their privacy, and while preserving the privacy of all of the non- ..."
Abstract
-
Cited by 19 (6 self)
In this work, we propose a new platform that enables service providers on the Internet, such as web site operators, to block past abusive users of anonymizing networks (for example, Tor) from further misbehaviour, without compromising their privacy, and while preserving the privacy of all non-abusive users. Our system provides a privacy-preserving analog of IP address banning, and is modeled after the well-known Nymble system [29,47,48]. However, while we solve the same problem as the original Nymble scheme, we eliminate the troubling situation in which users must place their anonymity in the hands of a small number of trusted third parties. Unlike other approaches that have been considered in the literature [10,44,45,46], we avoid the use of trusted hardware devices or unrealistic assumptions about offline credential issuing authorities who are responsible for ensuring that no user is able to obtain multiple credentials. Thus, our scheme combines the strong privacy guarantees of [10,44,45,46] with a simple infrastructure as in [29,47,48]. To prevent malicious third parties from trivially colluding to reveal the identities of anonymous users, we make use of a number of standard zero-knowledge proofs, and to maintain efficiency we introduce a new cryptographic technique which we call verifier-efficient restricted blind signatures, or VERBS. Our approach allows users to perform all privacy-sensitive computations locally, and then prove in zero-knowledge that the computations were performed correctly in order to obtain efficiently verifiable signatures on the output — all without revealing either the result of the computation or any potentially identifying information to the signature-issuing authority. Signature verification in our proposed VERBS scheme is 1–2 orders of magnitude more efficient than verification in any known restricted blind signature scheme.
BLACR: TTP-Free Blacklistable Anonymous Credentials with Reputation
"... Anonymous authentication can give users the license to misbehave since there is no fear of retribution. As a deterrent or means to revocation, various schemes for accountable anonymity feature some kind of (possibly distributed) trusted third party (TTP) with the power to identify or link such misbe ..."
Abstract
-
Cited by 11 (2 self)
Anonymous authentication can give users the license to misbehave since there is no fear of retribution. As a deterrent or means of revocation, various schemes for accountable anonymity feature some kind of (possibly distributed) trusted third party (TTP) with the power to identify or link such misbehaving users. Recently, schemes such as BLAC, EPID, and PEREA showed how anonymous revocation can be achieved without such TTPs—anonymous users can be revoked if they misbehave, and yet nobody can identify or link such users cryptographically. Despite being the state of the art in anonymous revocation, BLAC, EPID, and PEREA allow only a basic form of revocation amounting to “revoke anybody on the blacklist”. Recently, BLAC was extended to support d-strikes-out policies, which revoke anybody who has d or more entries on the blacklist. In this paper we significantly advance this concept and make the first attempt to generalize reputation-based anonymous revocation through our proposed scheme, called BLACR. We show how various negative or positive scores can be assigned to anonymous sessions across various categories of misbehavior, resulting in users being blocked based on their reputation scores. We show how various relevant policies can be instantiated in BLACR and that the workload for authenticating users is reasonable for web services.
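As a rough illustration of the policy layer this abstract describes (not BLACR's cryptographic protocol, in which the user proves policy satisfaction in zero knowledge), the Python sketch below tallies signed scores per misbehavior category and applies per-category thresholds; the category names, scores, and thresholds are invented.

from collections import defaultdict

def reputation_by_category(scored_entries):
    # scored_entries: iterable of (category, score) pairs belonging to one user's sessions.
    totals = defaultdict(int)
    for category, score in scored_entries:
        totals[category] += score
    return totals

def satisfies_policy(totals, thresholds):
    # Block unless every category's total stays at or above its threshold.
    return all(totals.get(cat, 0) >= t for cat, t in thresholds.items())

# Example: two negative scores for defacement and one positive score for good conduct.
user_entries = [("defacement", -5), ("defacement", -3), ("conduct", +2)]
policy = {"defacement": -6, "conduct": 0}   # tolerate at most 6 points of defacement
print(satisfies_policy(reputation_by_category(user_entries), policy))  # False: user is blocked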
Jack: Scalable Accumulator-based Nymble System
"... Anonymous blacklisting schemes enable online service providers to block future accesses from abusive users behind anonymizing networks, such as Tor, while preserving the privacy of all users, both abusive and non-abusive. Several such schemes exist in the literature, but all suffer from one of sever ..."
Abstract
-
Cited by 7 (0 self)
Anonymous blacklisting schemes enable online service providers to block future accesses from abusive users behind anonymizing networks, such as Tor, while preserving the privacy of all users, both abusive and non-abusive. Several such schemes exist in the literature, but all suffer from one of several faults: they rely on trusted parties that can collude to de-anonymize users, they scale poorly with the number of blacklisted users, or they place a very high computational load on the trusted parties. We introduce Jack, an efficient, scalable anonymous blacklisting scheme based on cryptographic accumulators. Compared to the previous efficient schemes, Jack significantly reduces the communication and computation costs required of trusted parties while also weakening the trust placed in these parties. Compared with schemes with no trusted parties, Jack enjoys constant scaling with respect to the number of blacklisted users, imposing dramatically reduced computation and communication costs for service providers. Jack is provably secure in the random oracle model, and we demonstrate its efficiency both analytically and experimentally.
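The scalability claim rests on a standard property of cryptographic accumulators: the accumulator value and a witness are constant-size, so verification cost does not grow with the number of accumulated entries. The toy RSA-accumulator sketch below shows only that property, using plain membership checking in the clear rather than Jack's zero-knowledge non-membership proofs; the modulus is deliberately tiny and the integer elements stand in for prime representatives of blacklist entries.

from math import prod

N = 104723 * 104729      # product of two small primes; a real accumulator uses a large RSA modulus
g = 65537                # public base

def accumulate(elements):
    return pow(g, prod(elements), N)

def membership_witness(elements, x):
    rest = [e for e in elements if e != x]
    return pow(g, prod(rest), N)

def verify(acc, x, witness):
    # One modular exponentiation, independent of how many elements were accumulated.
    return pow(witness, x, N) == acc

blacklist = [3, 5, 11, 17]
acc = accumulate(blacklist)
w = membership_witness(blacklist, 11)
print(verify(acc, 11, w))   # True
print(verify(acc, 7, w))    # False: 7 was never accumulated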
Anon-Pass: Practical Anonymous Subscriptions
"... We present the design, security proof, and implementation of an anonymous subscription service. Users register for the service by providing some form of identity, which might or might not be linked to a real-world identity such as a credit card, a web login, or a public key. A user logs on to the sy ..."
Abstract
-
Cited by 6 (0 self)
We present the design, security proof, and implementation of an anonymous subscription service. Users register for the service by providing some form of identity, which might or might not be linked to a real-world identity such as a credit card, a web login, or a public key. A user logs on to the system by presenting a credential derived from information received at registration. Each credential allows only a single login in any authentication window, or epoch. Logins are anonymous in the sense that the service cannot distinguish which user is logging in any better than random guessing. This implies unlinkability of a user across different logins. We find that a central tension in an anonymous subscription service is the service provider’s desire for a long epoch (to reduce server-side computation) versus users’ desire for a short epoch (so they can repeatedly “re-anonymize” their sessions). We balance this tension by having short epochs, but adding an efficient operation for clients who do not need unlinkability to cheaply re-authenticate themselves for the next time period. We measure performance of a research prototype of our protocol that allows an independent service to offer anonymous access to existing services. We implement a music service, an Android-based subway-pass application, and a web proxy, and show that adding anonymity adds minimal client latency and only requires 33 KB of server memory per active user.
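A minimal way to picture the epoch mechanism, under the simplifying assumption that the per-epoch login token behaves like a PRF of the epoch number under a per-user secret (sketched here with HMAC-SHA256), is shown below. The real system additionally proves in zero knowledge that the token derives from a valid registration credential, which this sketch omits; the epoch length and key are illustrative.

import hmac, hashlib, time

EPOCH_SECONDS = 15 * 60                 # illustrative epoch length

def current_epoch(now=None) -> int:
    return int((now if now is not None else time.time()) // EPOCH_SECONDS)

def login_token(user_key: bytes, epoch: int) -> bytes:
    # Deterministic within an epoch, so a second login with the same credential is detectable;
    # random-looking across epochs to anyone without the key.
    return hmac.new(user_key, epoch.to_bytes(8, "big"), hashlib.sha256).digest()

class Service:
    def __init__(self):
        self.seen = {}                  # epoch -> set of tokens already logged in

    def login(self, token: bytes, epoch: int) -> bool:
        used = self.seen.setdefault(epoch, set())
        if token in used:               # same credential, same epoch: reject second login
            return False
        used.add(token)
        return True

svc = Service()
key = b"per-user secret from registration"
e = current_epoch()
print(svc.login(login_token(key, e), e))          # True  (first login this epoch)
print(svc.login(login_token(key, e), e))          # False (re-login in same epoch detected)
print(svc.login(login_token(key, e + 1), e + 1))  # True  (next epoch, fresh-looking token)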
PEREA: Practical TTP-Free Revocation of Repeatedly Misbehaving Anonymous Users, 2010
"... Several anonymous authentication schemes allow servers to revoke a misbehaving user’s ability to make future accesses. Traditionally, these schemes have relied on powerful TTPs capable of deanonymizing (or linking) users ’ connections. Recent schemes such as Blacklistable Anonymous Credentials (BLAC ..."
Abstract
-
Cited by 6 (1 self)
Several anonymous authentication schemes allow servers to revoke a misbehaving user’s ability to make future accesses. Traditionally, these schemes have relied on powerful TTPs capable of deanonymizing (or linking) users’ connections. Recent schemes such as Blacklistable Anonymous Credentials (BLAC) and Enhanced Privacy ID (EPID) support “privacy-enhanced revocation” — servers can revoke misbehaving users without a TTP’s involvement, and without learning the revoked users’ identities. In BLAC and EPID, however, the computation required for authentication at the server is linear in the size (L) of the revocation list. We propose PEREA, a new anonymous authentication scheme for which this bottleneck computation is independent of the size of the revocation list. Instead, the time complexity of authentication is linear in the size (K ≪ L) of a revocation window, the number of subsequent authentications before which a user’s misbehavior must be recognized if the user is to be revoked. We prove the security of our construction, and validate its efficiency as compared to BLAC analytically and quantitatively. In addition, we extend PEREA to support more…
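Stripped of the zero-knowledge accumulator machinery the actual scheme uses, PEREA's revocation-window bookkeeping can be pictured as in the sketch below: authentication inspects only the user's K most recent tickets, so the work is O(K) rather than O(L) in the blacklist size. The deque, ticket strings, and window size are illustrative, and the checks are done in the clear here rather than in zero knowledge.

from collections import deque

K = 5                                            # revocation window size (illustrative)

class UserState:
    def __init__(self):
        self.recent_tickets = deque(maxlen=K)    # tickets from the user's last K authentications

def authenticate(user: UserState, new_ticket: str, blacklist: set) -> bool:
    # Revoked if ANY of the last K tickets has since been blacklisted; set lookups are O(1),
    # so the total check is linear in K, independent of the blacklist size L.
    if any(t in blacklist for t in user.recent_tickets):
        return False
    user.recent_tickets.append(new_ticket)       # a ticket the server may later blacklist
    return True

user = UserState()
blacklist = set()
print(authenticate(user, "t1", blacklist))   # True
blacklist.add("t1")                          # the session behind t1 is later judged abusive
print(authenticate(user, "t2", blacklist))   # False: misbehavior caught within the window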
Anonygator: privacy and integrity preserving data aggregation
In Proc. ACM/IFIP/USENIX 11th International Conference on Middleware, 2010
"... Abstract. Data aggregation is a key aspect of many distributed applications, such as distributed sensing, performance monitoring, and distributed diagnostics. In such settings, user anonymity is a key concern of the participants. In the absence of an assurance of anonymity, users may be reluctant t ..."
Abstract
-
Cited by 5 (0 self)
Data aggregation is a key aspect of many distributed applications, such as distributed sensing, performance monitoring, and distributed diagnostics. In such settings, user anonymity is a key concern of the participants. In the absence of an assurance of anonymity, users may be reluctant to contribute data such as their location or configuration settings on their computer. In this paper, we present the design, analysis, implementation, and evaluation of Anonygator, an anonymity-preserving data aggregation service for large-scale distributed applications. Anonygator uses anonymous routing to provide user anonymity by disassociating messages from the hosts that generated them. It prevents malicious users from uploading disproportionate amounts of spurious data by using a light-weight accounting scheme. Finally, Anonygator maintains overall system scalability by employing a novel distributed tree-based data aggregation procedure that is robust to pollution attacks. All of these components are tuned by a customization tool, with a view to achieving specific anonymity, pollution resistance, and efficiency goals. We have implemented Anonygator as a service and have used it to prototype three applications, one of which we have evaluated on PlanetLab. The other two have been evaluated on a local testbed.
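As an independent toy illustration of two of the components listed above, per-contributor accounting and tree-based aggregation (not Anonygator's actual protocol), the sketch below caps how many reports a single anonymous token may contribute and then merges per-leaf aggregates at the root; the cap, tokens, and report values are invented.

from collections import Counter

MAX_REPORTS_PER_TOKEN = 3     # accounting cap per anonymous contributor token (illustrative)

def admit(reports):
    # Drop reports from tokens that exceed their quota, limiting pollution by any one contributor.
    quota, admitted = Counter(), []
    for token, value in reports:
        if quota[token] < MAX_REPORTS_PER_TOKEN:
            quota[token] += 1
            admitted.append(value)
    return admitted

def aggregate_tree(leaf_batches):
    # Each leaf aggregates locally; the root merges the compact per-leaf aggregates.
    root = Counter()
    for batch in leaf_batches:
        root.update(Counter(batch))
    return root

leaf1 = admit([("tokA", "slow_dns"), ("tokA", "slow_dns"), ("tokB", "ok")])
leaf2 = admit([("tokC", "slow_dns")] + [("tokD", "ok")] * 10)   # tokD is capped at 3 reports
print(aggregate_tree([leaf1, leaf2]))                           # Counter({'ok': 4, 'slow_dns': 3})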