Results 1–10 of 10
Eigenvalues, geometric expanders, sorting in rounds, and Ramsey theory
 Combinatorica
, 1986
Abstract

Cited by 58 (13 self)
Expanding graphs are relevant to theoretical computer science in several ways. Here we show that the points versus hyperplanes incidence graphs of finite geometries form highly (nonlinear) expanding graphs with essentially the smallest possible number of edges. The expansion properties of the graphs are proved using the eigenvalues of their adjacency matrices. These graphs enable us to improve previous results on a parallel sorting problem that arises in structural modeling, by describing an explicit algorithm to sort n elements in k time units using O(n^{α_k}) parallel processors, where, e.g., α_2 = 7/4, α_3 = 8/5, α_4 = 26/17 and α_5 = 22/15. Our approach also yields several applications to Ramsey Theory and other extremal problems in ...
Max Algorithms in Crowdsourcing Environments
Abstract

Cited by 15 (1 self)
Our work investigates the problem of retrieving the maximum item from a set in crowdsourcing environments. We first develop parameterized families of max algorithms, that take as input a set of items and output an item from the set that is believed to be the maximum. Such max algorithms could, for instance, select the best Facebook profile that matches a given person or the best photo that describes a given restaurant. Then, we propose strategies that select appropriate max algorithm parameters. Our framework supports various human error and cost models and we consider many of them for our experiments. We evaluate under many metrics, both analytically and via simulations, the tradeoff between three quantities: (1) quality, (2) monetary cost, and (3) execution time. Also, we provide insights on the effectiveness of the strategies in selecting appropriate max algorithm parameters and guidelines for choosing max algorithms and strategies for each application.
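The flavor of a parameterized max algorithm can be sketched with a generic tournament; this is an illustration, not the paper's algorithms, and the constant-error model, the names `noisy_vote` and `tournament_max`, and the single parameter `votes_per_pair` (trading monetary cost for quality) are my own assumptions:

```python
import random

def noisy_vote(x: float, y: float, p_err: float) -> bool:
    """Simulated crowd worker: answers 'x is better than y' correctly
    with probability 1 - p_err (a simple constant-error model)."""
    correct = x > y
    return correct if random.random() >= p_err else not correct

def tournament_max(items, votes_per_pair: int, p_err: float):
    """Single-elimination max algorithm: each pairwise duel is decided
    by a majority over `votes_per_pair` independent worker votes.
    `votes_per_pair` is the tunable parameter: more votes cost more
    but raise the chance of returning the true maximum."""
    remaining = list(items)
    while len(remaining) > 1:
        nxt = []
        for i in range(0, len(remaining) - 1, 2):
            x, y = remaining[i], remaining[i + 1]
            yes = sum(noisy_vote(x, y, p_err) for _ in range(votes_per_pair))
            nxt.append(x if 2 * yes > votes_per_pair else y)
        if len(remaining) % 2:        # odd one out gets a bye
            nxt.append(remaining[-1])
        remaining = nxt
    return remaining[0]
```

A strategy in the paper's sense would then pick `votes_per_pair` (and the tournament shape) given an error model and a budget.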
Tight Comparison Bounds On The Complexity Of Parallel Sorting
, 1987
Abstract

Cited by 15 (3 self)
The problem of sorting n elements using p processors in a parallel comparison model is considered. Lower and upper bounds are presented which imply that for p ≥ n, the time complexity of this problem is Θ(log n / log(1 + p/n)). This complements [AKS83] in settling the problem, since the AKS sorting network established that for p ≤ n the time complexity is Θ(n log n / p). To prove the lower bounds we show that to achieve parallel time k, Ω(n^{1+1/k}) processors are needed. 1. Introduction. Apparently, there is no problem in Computer Science which has received more attention than sorting. [Kn73], for instance, found that existing computers devote approximately a quarter of their time to sorting. The advent of parallel computers stimulated intensive research on sorting with respect to various models of parallel computation. Extensive lists of references recording this activity are given in [Ak85], [BHe86] and [Th83]. Most of the fastest serial and paral...
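The two regimes of this bound can be sketched numerically; `sort_time_bound` and the dropped Θ-constants are my own illustration, showing that p = n^{1+1/k} processors bring the first bound down to roughly k rounds:

```python
import math

def sort_time_bound(n: int, p: int) -> float:
    """Theta-bound (constants dropped) on parallel comparison-sorting time:
    p >= n: log(n) / log(1 + p/n)   -- the bound discussed above
    p <= n: n*log(n) / p            -- the AKS-network regime"""
    if p >= n:
        return math.log2(n) / math.log2(1 + p / n)
    return n * math.log2(n) / p

# With p = n^(1+1/k) processors, log(1 + p/n) ~ (log n)/k,
# so the time bound collapses to about k.
n = 1 << 20
for k in (2, 4):
    p = int(n ** (1 + 1 / k))
    print(k, sort_time_bound(n, p))
```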
Sorting and Selection with Imprecise Comparisons
Abstract

Cited by 9 (1 self)
Abstract. In experimental psychology, the method of paired comparisons was proposed as a means for ranking preferences amongst n elements of a human subject. The method requires performing all (n choose 2) comparisons, then sorting elements according to the number of wins. The large number of comparisons is performed to counter the potentially faulty decision-making of the human subject, who acts as an imprecise comparator. We consider a simple model of the imprecise comparisons: there exists some δ > 0 such that when a subject is given two elements to compare, if the values of those elements (as perceived by the subject) differ by at least δ, then the comparison will be made correctly; when the two elements have values that are within δ, the outcome of the comparison is unpredictable. This δ corresponds to the just noticeable difference (JND) or difference threshold in the psychophysics literature, but does not require the statistical assumptions used to define that value. In this model, the standard method of paired comparisons minimizes the errors introduced by the imprecise comparisons at the cost of (n choose 2) comparisons. We show that the same optimal guarantees can be achieved using 4n^{3/2} comparisons, and we prove the optimality of our method. We then explore the general tradeoff between the guarantees on the error that can be made and the number of comparisons for the problems of sorting, max-finding, and selection. Our results provide close-to-optimal solutions for each of these problems.
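The comparison model and the baseline method can be sketched as follows; this is a minimal illustration assuming a random outcome inside the δ band (the model only requires it to be unpredictable), and `rank_by_wins` is the standard (n choose 2) paired-comparisons baseline, not the paper's 4n^{3/2} algorithm:

```python
import random

def imprecise_cmp(x: float, y: float, delta: float) -> bool:
    """Return True iff x is judged larger than y.
    Correct when |x - y| >= delta; arbitrary (here: a coin flip)
    when the values are within the JND band delta."""
    if abs(x - y) >= delta:
        return x > y
    return random.random() < 0.5

def rank_by_wins(values, delta):
    """Standard method of paired comparisons: perform all (n choose 2)
    comparisons and sort indices by number of wins, descending."""
    n = len(values)
    wins = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if imprecise_cmp(values[i], values[j], delta):
                wins[i] += 1
            else:
                wins[j] += 1
    return sorted(range(n), key=lambda i: wins[i], reverse=True)
```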
Constant time parallel sorting: an empirical view
 J. Computer Syst. Sci.
, 2003
Abstract

Cited by 5 (0 self)
It is well known that sorting can be done with O(n log n) comparisons. It is also known that (in the comparison decision tree model) sorting requires Ω(n log n) comparisons. What happens if you allow massive parallelism? In the extreme case you
Highly Parallelizable Problems (Extended Abstract)
Abstract

Cited by 3 (0 self)
We establish that several problems are highly parallelizable. For each of these problems, we design an optimal O(log log n) time parallel algorithm on the Common CRCW PRAM model, which is the weakest among the CRCW PRAM models. These problems include: all nearest smaller values; preprocessing for answering range maxima queries; several problems in Computational Geometry; and string matching. Until recently, such algorithms were known only for finding the maximum and merging. A new lower bound technique is presented showing that some of the new O(log log n) upper bounds cannot be improved even when non-optimal algorithms are used. The technique extends Ramsey-like lower bound argumentation due to Meyer auf der Heide and Wigderson [MW85]. Its most interesting applications are for Computational Geometry problems for which no previous lower bounds are known.
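For readers unfamiliar with the first problem in the list, the classic sequential stack solution defines it concisely; this sketch only specifies the ANSV problem and is not the paper's O(log log n) PRAM algorithm:

```python
def all_nearest_smaller_values(a):
    """For each position i, return the index of the nearest element to
    the left of a[i] that is strictly smaller (or None if none exists).
    Classic O(n) sequential stack scan."""
    result, stack = [], []   # stack holds indices with increasing values
    for i, x in enumerate(a):
        while stack and a[stack[-1]] >= x:
            stack.pop()      # these can never be an answer again
        result.append(stack[-1] if stack else None)
        stack.append(i)
    return result
```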
A survey of constant time parallel sorting
 Bulletin of the European Association for Theoretical Computer Science
Abstract

Cited by 1 (0 self)
It is well known that sorting can be done with O(n log n) comparisons. It is also known that (in the comparison decision tree model) sorting requires Ω(n log n) comparisons. What happens if you allow massive parallelism? In the extreme case you
A Survey of Constant Time Parallel Sorting
Abstract
... this paper only work if n is "large." The algorithm in Theorem 4.15 needs much larger n than usual.
unknown title
Abstract
... this paper only work if n is "large." The algorithm in Theorem 4.15 needs much larger n than usual.