Results 1–10 of 3,805,196
EXPOSE: Inferring Worst-case Time Complexity by Automatic Empirical Study
"... Introduction to doubling. A useful understanding of an algorithm’s efficiency, the worst-case time complexity gives an upper bound on how an increase in the size of the input, denoted n, increases the execution time of the algorithm, or f(n). This relationship is often expressed in the ..."
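The doubling idea introduced above can be sketched in a few lines (an illustrative sketch, not EXPOSE's actual implementation): run the algorithm on inputs of size n, 2n, 4n, ..., and take log₂ of the ratio of successive costs to estimate the exponent b in f(n) ≈ n^b. Here the cost is a deterministic operation count rather than wall-clock time, so the estimate is reproducible.

```python
import math

def insertion_sort_cost(n):
    """Comparisons performed by insertion sort on its worst case
    (a reverse-sorted input of size n); this is n*(n-1)/2."""
    data = list(range(n, 0, -1))
    comparisons = 0
    for i in range(1, len(data)):
        key = data[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if data[j] > key:
                data[j + 1] = data[j]
                j -= 1
            else:
                break
        data[j + 1] = key
    return comparisons

def doubling_exponent(cost_fn, n0=256, rounds=4):
    """Estimate b in f(n) ~ n**b from the ratios f(2n) / f(n)."""
    prev = cost_fn(n0)
    n = n0
    estimate = 0.0
    for _ in range(rounds):
        n *= 2
        cur = cost_fn(n)
        estimate = math.log2(cur / prev)  # later ratios are more accurate
        prev = cur
    return estimate

print(round(doubling_exponent(insertion_sort_cost), 2))
```

For insertion sort the successive ratios approach 4, so the estimated exponent approaches 2, matching the known quadratic worst case.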
Worst-case time complexity of a lattice formation problem
"... Abstract — We consider a formation control problem for a robotic network with limited communication and controlled motion abilities. We propose a novel control structure that organizes the robots in concentric layers and that associates to each layer a local leader. Through a load balancing algorithm on the asynchronous network of layers we allocate the desired number of robots on each layer. A final uniform spreading algorithm leads the robots to a lattice-like formation. This novel distributed communication and control algorithm runs in linear time in the worst case. ..."
Worst-case equilibria
IN PROCEEDINGS OF THE 16TH ANNUAL SYMPOSIUM ON THEORETICAL ASPECTS OF COMPUTER SCIENCE, 1999
"... In a system in which noncooperative agents share a common resource, we propose the ratio between the worst possible Nash equilibrium and the social optimum as a measure of the effectiveness of the system. Deriving upper and lower bounds for this ratio in a model in which several agents share a ver ..."
Cited by 851 (17 self)
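The measure proposed above, the ratio between the worst Nash equilibrium and the social optimum, can be checked by brute force on a toy load-balancing instance (our illustration, not the paper's analysis): tasks of given weights choose one of two identical links, and each task's cost is the total load of its link. For weights {2, 2, 1, 1}, the assignment (2, 2 | 1, 1) is a pure Nash equilibrium with makespan 4, while the optimum is 3.

```python
from fractions import Fraction
from itertools import product

def pure_price_of_anarchy(weights, machines=2):
    """Ratio of the worst pure-Nash makespan to the optimal makespan
    for identical-machine load balancing."""
    worst_nash, best = None, None
    for assign in product(range(machines), repeat=len(weights)):
        loads = [0] * machines
        for w, m in zip(weights, assign):
            loads[m] += w
        makespan = max(loads)
        best = makespan if best is None else min(best, makespan)
        # Nash condition: no task strictly lowers its own cost by moving.
        is_nash = all(
            loads[m2] + w >= loads[m]
            for w, m in zip(weights, assign)
            for m2 in range(machines) if m2 != m
        )
        if is_nash:
            worst_nash = makespan if worst_nash is None else max(worst_nash, makespan)
    return Fraction(worst_nash, best)

print(pure_price_of_anarchy([2, 2, 1, 1]))  # 4/3
```

This ratio (later named the price of anarchy) is exactly what the paper bounds in more general settings, including mixed equilibria.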
A discrete subexponential algorithm for parity games
STACS ’03, 2003
"... We suggest a new randomized algorithm for solving parity games with worst case time complexity roughly ..."
Cited by 37 (8 self)
WF²Q: Worst-case Fair Weighted Fair Queueing
"... The Generalized Processor Sharing (GPS) discipline is proven to have two desirable properties: (a) it can provide an end-to-end bounded-delay service to a session whose traffic is constrained by a leaky bucket; (b) it can ensure fair allocation of bandwidth among all backlogged sessions regardless o ..."
Cited by 361 (11 self)
transmission time of that provided by GPS. In this paper, we will show that, contrary to pop...
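The packetized approximation of GPS that this line of work builds on can be sketched as follows. This is plain weighted fair queueing for the fully backlogged case, not WF²Q itself (whose contribution is a stronger, worst-case-fair eligibility test): each packet gets a virtual finish time F = F_prev + length / weight, and packets are served in F order.

```python
def wfq_order(flows):
    """All packets are backlogged at time 0. For each flow, assign
    virtual finish times F = F_prev + length / weight, then serve in
    increasing F order (ties broken by flow name)."""
    schedule = []
    for name, (weight, lengths) in sorted(flows.items()):
        f = 0.0
        for length in lengths:
            f += length / weight          # this packet's finish time
            schedule.append((f, name))
    schedule.sort()
    return [name for _, name in schedule]

# Flow A has twice the weight of flow B; all packets have length 1.
flows = {"A": (2, [1, 1, 1]), "B": (1, [1, 1, 1])}
print(wfq_order(flows))  # ['A', 'A', 'B', 'A', 'B', 'B']
```

Over the long run flow A receives twice flow B's service, mirroring the GPS weights; the papers in this area quantify how far such packetized schedules can drift from the ideal fluid GPS service.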
Complexity of finding embeddings in a k-tree
SIAM JOURNAL OF DISCRETE MATHEMATICS, 1987
"... A k-tree is a graph that can be reduced to the k-complete graph by a sequence of removals of a degree-k vertex with completely connected neighbors. We address the problem of determining whether a graph is a partial graph of a k-tree. This problem is motivated by the existence of polynomial time al ..."
Cited by 390 (1 self)
status of two problems related to finding the smallest number k such that a given graph is a partial k-tree. First, the corresponding decision problem is NP-complete. Second, for a fixed (predetermined) value of k, we present an algorithm with polynomially bounded (but exponential in k) worst case time
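The reduction in the k-tree definition quoted above translates directly into a recognition check (a minimal sketch under that definition, not the paper's algorithm): repeatedly delete a degree-k vertex whose neighbors are pairwise adjacent, and accept if the process ends at the complete graph on k + 1 vertices.

```python
def is_k_tree(adj, k):
    """Recognition sketch following the definition: repeatedly remove a
    degree-k vertex with completely connected neighbors; the input is a
    k-tree iff the process terminates at K_{k+1}."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # defensive copy
    while len(adj) > k + 1:
        candidate = next(
            (v for v, nbrs in adj.items()
             if len(nbrs) == k
             and all(b in adj[a] for a in nbrs for b in nbrs if a != b)),
            None,
        )
        if candidate is None:
            return False              # no removable vertex remains
        for u in adj[candidate]:
            adj[u].discard(candidate)
        del adj[candidate]
    # what remains must be the complete graph on k + 1 vertices
    return len(adj) == k + 1 and all(len(nbrs) == k for nbrs in adj.values())

# A 2-tree: the triangle a-b-c with d attached to the edge (a, b).
g = {"a": {"b", "c", "d"}, "b": {"a", "c", "d"},
     "c": {"a", "b"}, "d": {"a", "b"}}
print(is_k_tree(g, 2))  # True
```

The hard problem the paper studies is different: deciding whether a graph is a *partial* graph of a k-tree (i.e., has treewidth at most k), which is NP-complete when k is part of the input.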
Arbitrary-Precision Division
"... This paper presents an algorithm for arbitrary-precision division and shows its worst-case time complexity to be related by a constant factor to that of arbitrary-precision multiplication. The material is adapted from [1], pp. 264, 295–297, where S. A. Cook is credited for suggesting the basic idea. ..."
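The basic idea credited to Cook, reducing division to multiplication, is usually realized with Newton's iteration. Below is a floating-point sketch of that textbook reduction (not the paper's arbitrary-precision algorithm, which works on large integers with growing precision): each iteration uses only multiplication and subtraction, and the error roughly squares every step, so a constant number of full-precision multiplications suffices.

```python
import math

def reciprocal(d, iterations=6):
    """Approximate 1/d for d in [0.5, 1) with Newton's iteration
    x <- x * (2 - d * x); the error roughly squares each step."""
    assert 0.5 <= d < 1.0
    x = 2.914 - 2.0 * d          # crude linear initial guess
    for _ in range(iterations):
        x = x * (2.0 - d * x)    # one multiply-and-subtract step
    return x

def divide(a, b):
    """Compute a / b (b > 0) using only multiplication: scale b into
    [0.5, 1), approximate its reciprocal, then undo the scaling."""
    m, e = math.frexp(b)         # b = m * 2**e with 0.5 <= m < 1
    return a * reciprocal(m) * (2.0 ** -e)

print(divide(1.0, 3.0))
```

Because doubling the working precision costs only one more iteration, the total work is dominated by a constant number of multiplications at the target precision, which is the constant-factor relationship the abstract states.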
A Mildly Exponential Approximation Algorithm for the Permanent
© 1996 Springer-Verlag New York Inc.
"... Abstract. A new approximation algorithm for the permanent of an n × n 0,1-matrix is presented. The algorithm is shown to have worst-case time complexity exp(O(n^(1/2) log^2 n)). Asymptotically, this represents a considerable improvement over the best existing algorithm, which has worst-case time compl ..."