Results 11–20 of 31
Finding Endogenously Formed Communities, 2012
Abstract

Cited by 11 (3 self)
A central problem in data mining and social network analysis is determining overlapping communities (clusters) among individuals or objects in the absence of external identification or tagging. We address this problem by introducing a framework that captures the notion of communities or clusters determined by the relative affinities among their members. To this end we define what we call an affinity system, which is a set of elements, each with a vector characterizing its preference for all other elements in the set. We define a natural notion of (potentially overlapping) communities in an affinity system, in which the members of a given community collectively prefer each other to anyone else outside the community. Thus these communities are endogenously formed in the affinity system and are “self-determined” or …
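The affinity-system definition above can be made concrete with a small sketch. The predicate below is one plausible reading of "collectively prefer each other to anyone else outside the community" (the exact aggregation rule in the paper may differ, and the function name is ours); `affinity[u][v]` holds u's preference for v.

```python
def is_community(affinity, C):
    # One plausible formalization: for every outsider w, each member v of C
    # receives at least as much total affinity from the other members of C
    # as w does.  `affinity` is a dict of dicts: affinity[u][v] = u's
    # preference for v.
    members = set(C)
    outsiders = set(affinity) - members

    def support(v):
        # total affinity directed at v by members of C (excluding v itself)
        return sum(affinity[u][v] for u in members if u != v)

    return all(support(v) >= support(w) for v in members for w in outsiders)
```

With this reading, a tightly mutually preferring pair forms a community, while a pair whose members prefer outsiders does not.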
On the Complexity of Approximating a Nash Equilibrium
Abstract

Cited by 9 (1 self)
We show that computing a relative—that is, multiplicative as opposed to additive—approximate Nash equilibrium in two-player games is PPAD-complete, even for constant values of the approximation. Our result is the first constant inapproximability result for the problem since the appearance of the original results on the complexity of the Nash equilibrium [8, 5, 7]. Moreover, it provides an apparent—assuming that PPAD ⊄ TIME(n^{O(log n)})—dichotomy between the complexities of additive and relative notions of approximation, since for constant values of additive approximation a quasi-polynomial-time algorithm is known [22]. Such a dichotomy does not arise for values of the approximation that scale with the size of the game, as both relative and additive approximations are PPAD-complete [7]. As a byproduct, our proof shows that the Lipton–Markakis–Mehta sampling lemma is not applicable to relative notions of constant approximation, answering in the negative direction a question posed to us by Shang-Hua Teng [26].
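The additive-versus-relative distinction discussed above can be illustrated with a small verifier under the standard definitions (function names are ours; the relative notion assumes nonnegative payoffs):

```python
def expected_per_action(payoffs, opp):
    # payoffs: matrix as list of lists; opp: opponent's mixed strategy.
    # Returns the player's expected payoff for each of their pure actions.
    return [sum(p * q for p, q in zip(row, opp)) for row in payoffs]


def is_additive_eps_ne(R, C, x, y, eps):
    # (x, y) is an additive eps-approximate Nash equilibrium if neither
    # player can gain more than eps by deviating to a pure best response.
    ry = expected_per_action(R, y)                          # row player
    cx = expected_per_action(list(map(list, zip(*C))), x)   # column player
    vx = sum(xi * ri for xi, ri in zip(x, ry))
    vy = sum(yj * cj for yj, cj in zip(y, cx))
    return vx >= max(ry) - eps and vy >= max(cx) - eps


def is_relative_eps_ne(R, C, x, y, eps):
    # Multiplicative (relative) notion: each player's payoff is at least
    # (1 - eps) times their best-response payoff.  Assumes payoffs >= 0.
    ry = expected_per_action(R, y)
    cx = expected_per_action(list(map(list, zip(*C))), x)
    vx = sum(xi * ri for xi, ri in zip(x, ry))
    vy = sum(yj * cj for yj, cj in zip(y, cx))
    return vx >= (1 - eps) * max(ry) and vy >= (1 - eps) * max(cx)
```

For example, uniform play in matching pennies is an exact (additive 0-approximate) equilibrium, while uniform play in a coordination game is an exact relative equilibrium.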
A Sharp PageRank Algorithm with Applications to Edge Ranking and Graph Sparsification, 2010
Abstract

Cited by 7 (5 self)
We give an improved algorithm for computing personalized PageRank vectors with tight error bounds, which can be as small as Ω(n^{−p}) for any fixed positive integer p. The improved PageRank algorithm is crucial for computing a quantitative ranking of edges in a given graph. We will use the edge ranking to examine two interrelated problems – graph sparsification and graph partitioning. We can combine the graph sparsification and the partitioning algorithms using PageRank vectors to derive an improved partitioning algorithm.
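For orientation, a personalized PageRank vector of the kind discussed above is the fixed point of p = α·e_seed + (1 − α)·Wᵀp, where W is the random-walk matrix; the basic power iteration below is a generic sketch (our naming), not the paper's sharper algorithm:

```python
def personalized_pagerank(adj, seed, alpha=0.15, tol=1e-10, max_iter=10000):
    # adj: dict node -> list of out-neighbours; seed: restart node.
    # Iterates p <- alpha * e_seed + (1 - alpha) * (random-walk step of p).
    nodes = list(adj)
    p = {v: 0.0 for v in nodes}
    p[seed] = 1.0
    for _ in range(max_iter):
        nxt = {v: 0.0 for v in nodes}
        nxt[seed] += alpha                        # restart mass
        for v in nodes:
            out = adj[v]
            if out:                               # spread walk mass
                share = (1 - alpha) * p[v] / len(out)
                for u in out:
                    nxt[u] += share
            else:                                 # dangling node: restart
                nxt[seed] += (1 - alpha) * p[v]
        if sum(abs(nxt[v] - p[v]) for v in nodes) < tol:
            return nxt
        p = nxt
    return p
```

On a directed 3-cycle seeded at node 0, the scores decay with distance from the seed, as expected.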
Small clique detection and approximate Nash equilibria
In 13th International Workshop on Randomization and Computation (RANDOM), 2009
Abstract

Cited by 6 (0 self)
Recently, Hazan and Krauthgamer showed [12] that if, for a fixed small ε, an ε-best ε-approximate Nash equilibrium can be found in polynomial time in two-player games, then it is also possible to find a planted clique in G(n, 1/2) of size C log n, where C is a large fixed constant independent of ε. In this paper, we extend their result to show that if an ε-best ε-approximate equilibrium can be efficiently found for arbitrarily small ε > 0, then one can detect the presence of a planted clique of size (2 + δ) log n in G(n, 1/2) in polynomial time for arbitrarily small δ > 0. Our result is optimal in the sense that graphs in G(n, 1/2) have cliques of size (2 − o(1)) log n with high probability.
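A planted clique instance of the kind used in this reduction is straightforward to generate; the sketch below samples G(n, 1/2) and overlays a clique on k random vertices (illustrative only; the function name is ours):

```python
import random


def planted_clique(n, k, seed=None):
    # Sample an Erdos-Renyi graph G(n, 1/2), then plant a clique on a
    # uniformly random k-subset of the vertices.  Edges are stored as
    # sorted pairs (i, j) with i < j.
    rng = random.Random(seed)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < 0.5:
                edges.add((i, j))
    clique = rng.sample(range(n), k)
    for a in range(k):
        for b in range(a + 1, k):
            i, j = sorted((clique[a], clique[b]))
            edges.add((i, j))
    return edges, sorted(clique)
```

The detection problem the abstract refers to is then to decide, given only `edges`, whether such a k-clique was planted.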
Statistical and computational tradeoffs in estimation of sparse principal components, 2014
Abstract

Cited by 6 (3 self)
In recent years, Sparse Principal Component Analysis has emerged as an extremely popular dimension reduction technique for high-dimensional data. The theoretical challenge, in the simplest case, is to estimate the leading eigenvector of a population covariance matrix under the assumption that this eigenvector is sparse. An impressive range of estimators has been proposed; some of these are fast to compute, while others are known to achieve the minimax optimal rate over certain Gaussian or sub-Gaussian classes. In this paper we show that, under a widely believed assumption from computational complexity theory, there is a fundamental trade-off between statistical and computational performance in this problem. More precisely, working with new, larger classes satisfying a Restricted Covariance Concentration condition, we show that no randomised polynomial-time algorithm can achieve the minimax optimal rate. On the other hand, we also study a (polynomial-time) variant of the well-known semidefinite relaxation estimator, and show that it attains essentially the optimal rate among all randomised polynomial-time algorithms.
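One of the fast estimators alluded to above is iterative hard thresholding; the sketch below implements a simple truncated power iteration for a k-sparse leading eigenvector (a generic method for intuition, not the semidefinite relaxation studied in the paper):

```python
import numpy as np


def truncated_power_iteration(sigma, k, iters=200):
    # Estimate a k-sparse leading eigenvector of the covariance matrix
    # `sigma` by power iteration with hard thresholding: after each
    # multiplication, keep only the k largest-magnitude coordinates.
    n = sigma.shape[0]
    idx = np.argsort(np.diag(sigma))[-k:]   # warm start: top-k diagonal
    v = np.zeros(n)
    v[idx] = 1.0 / np.sqrt(k)
    for _ in range(iters):
        w = sigma @ v
        keep = np.argsort(np.abs(w))[-k:]   # hard-threshold to k coords
        w_trunc = np.zeros(n)
        w_trunc[keep] = w[keep]
        v = w_trunc / np.linalg.norm(w_trunc)
    return v
```

On a spiked covariance Σ = I + θ·vvᵀ with a 3-sparse spike v, the iteration recovers the support of v exactly.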
On the Hardness of Signaling, 2014
Abstract

Cited by 3 (2 self)
There has been a recent surge of interest in the role of information in strategic interactions. Much of this work seeks to understand how the realized equilibrium of a game is influenced by uncertainty in the environment and the information available to players in the game. Lurking beneath this literature is a fundamental, yet largely unexplored, algorithmic question: how should a “market maker” who is privy to additional information, and equipped with a specified objective, inform the players in the game? This is an informational analogue of the mechanism design question, and views the information structure of a game as a mathematical object to be designed, rather than an exogenous variable. We initiate a complexity-theoretic examination of the design of optimal information structures in general Bayesian games, a task often referred to as signaling. We focus on one of the simplest instantiations of the signaling question: Bayesian zero-sum games, and a principal who must choose an information structure maximizing the equilibrium payoff of one of the players. In this setting, we show that optimal signaling is computationally intractable, and in some cases hard to approximate, assuming that it is hard to recover a planted clique from an Erdős–Rényi random graph. This is despite the fact that equilibria in these games are computable in polynomial time, and therefore suggests that the hardness of optimal signaling is a distinct phenomenon from the hardness of equilibrium computation. Necessitated by the non-local nature of information structures, en route to our results we prove an “amplification lemma” for the planted clique problem which may be of independent interest. Specifically, we show that even if we plant many cliques in an Erdős–Rényi random graph, so much so that most nodes in the graph are in some planted clique, recovering a constant fraction of the planted cliques is no easier than the traditional planted clique problem.
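The remark that equilibria of zero-sum games are computable in polynomial time can be illustrated with fictitious play, whose empirical frequencies converge to a minimax equilibrium in zero-sum games (Robinson's theorem); this is a generic illustration, not the paper's machinery:

```python
def fictitious_play(A, rounds=20000):
    # Zero-sum game with row player's payoff matrix A.  Each round, both
    # players best-respond to the opponent's empirical action frequencies;
    # the returned frequencies approximate a minimax equilibrium.
    m, n = len(A), len(A[0])
    row_counts, col_counts = [0] * m, [0] * n
    row_counts[0] += 1
    col_counts[0] += 1
    for _ in range(rounds):
        # row player maximizes expected payoff against column history
        br_row = max(range(m),
                     key=lambda i: sum(A[i][j] * col_counts[j] for j in range(n)))
        # column player minimizes the row player's payoff against row history
        br_col = min(range(n),
                     key=lambda j: sum(A[i][j] * row_counts[i] for i in range(m)))
        row_counts[br_row] += 1
        col_counts[br_col] += 1
    t = rounds + 1
    return [c / t for c in row_counts], [c / t for c in col_counts]
```

On matching pennies, both players' empirical strategies approach the uniform equilibrium.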
Robust convex relaxation for the planted clique and densest k-subgraph problems: additional proofs, 2013
Abstract

Cited by 3 (0 self)
We consider the problem of identifying the densest k-node subgraph in a given graph. We write this problem as an instance of rank-constrained cardinality minimization and then relax using the nuclear and ℓ1 norms. Although the original combinatorial problem is NP-hard, we show that the densest k-subgraph can be recovered from the solution of our convex relaxation for certain program inputs. In particular, we establish exact recovery in the case that the input graph contains a single planted clique plus noise in the form of corrupted adjacency relationships. We consider two constructions for this noise. In the first, noise is introduced by an adversary deterministically deleting edges within the planted clique and placing diversionary edges. In the second, these edge corruptions are performed at random. Analogous recovery guarantees for identifying the densest subgraph of fixed size in a bipartite graph are also established, and results of numerical simulations for randomly generated graphs are included to demonstrate the efficacy of our algorithm.
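For contrast with the convex relaxation, a common combinatorial baseline for densest-k-subgraph is greedy peeling, which repeatedly deletes a minimum-degree vertex until k remain; this is a standard heuristic, not the paper's method:

```python
def greedy_peel(adj, k):
    # adj: dict node -> set of neighbours (undirected graph).
    # Repeatedly delete a minimum-degree vertex until k vertices remain;
    # the survivors are a heuristic candidate for the densest k-subgraph.
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy
    while len(adj) > k:
        v = min(adj, key=lambda u: len(adj[u]))   # a min-degree vertex
        for u in adj[v]:
            adj[u].discard(v)
        del adj[v]
    return set(adj)
```

On a graph consisting of a 5-clique with a sparse path attached, peeling to k = 5 strips away the path and returns the clique.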
Hidden cliques and the certification of the restricted isometry property, submitted for publication, 2012. [Online] Available: http://arxiv.org/abs/1211.0665
Abstract

Cited by 2 (0 self)
Compressed sensing is a technique for finding sparse solutions to underdetermined linear systems. This technique relies on properties of the sensing matrix such as the restricted isometry property. Sensing matrices that satisfy this property with optimal parameters are mainly obtained via probabilistic arguments. Deciding whether a given matrix satisfies the restricted isometry property is a non-trivial computational problem. Indeed, we show in this paper that restricted isometry parameters cannot be approximated in polynomial time within any constant factor under the assumption that the hidden clique problem is hard. Moreover, on the positive side, we propose an improvement on the brute-force enumeration algorithm for checking the restricted isometry property.
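The brute-force enumeration mentioned above can be sketched directly: for a column-normalized matrix A, the order-k restricted isometry constant is the worst spectral deviation of any k-column Gram matrix from the identity. The enumeration is exponential in k, so this is viable only for tiny instances:

```python
import itertools

import numpy as np


def rip_constant(A, k):
    # Brute-force restricted isometry constant of order k:
    #   delta_k = max over k-column subsets S of || A_S^T A_S - I ||_2.
    # Assumes the columns of A are unit-norm.
    n = A.shape[1]
    delta = 0.0
    for S in itertools.combinations(range(n), k):
        gram = A[:, S].T @ A[:, S]
        eigs = np.linalg.eigvalsh(gram - np.eye(k))
        delta = max(delta, float(np.max(np.abs(eigs))))
    return delta
```

An orthonormal matrix has δ_k = 0, while a matrix with two identical unit columns has δ_2 = 1, matching the definition.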