Results 1–10 of 831
Self-adjusting binary search trees
, 1985
"... The splay tree, a self-adjusting form of binary search tree, is developed and analyzed. The binary search tree is a data structure for representing tables and lists so that accessing, inserting, and deleting items is easy. On an n-node splay tree, all the standard search tree operations have an amortized ..."
Abstract

Cited by 432 (18 self)
an amortized time bound of O(log n) per operation, where by “amortized time” is meant the time per operation averaged over a worst-case sequence of operations. Thus splay trees are as efficient as balanced trees when total running time is the measure of interest. In addition, for sufficiently long access
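The restructuring the abstract refers to can be sketched compactly. The recursive formulation below is a common simplification (not Sleator and Tarjan's top-down splaying), and all names are mine:

```python
# Minimal recursive splay tree sketch (illustrative only): after every
# access or insert, the accessed key is rotated to the root, which is what
# yields the O(log n) amortized bound in the paper's analysis.

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def rotate_right(x):
    y = x.left
    x.left = y.right
    y.right = x
    return y

def rotate_left(x):
    y = x.right
    x.right = y.left
    y.left = x
    return y

def splay(root, key):
    """Return a tree whose root is the node closest to `key`."""
    if root is None or root.key == key:
        return root
    if key < root.key:
        if root.left is None:
            return root
        if key < root.left.key:                          # zig-zig
            root.left.left = splay(root.left.left, key)
            root = rotate_right(root)
        elif key > root.left.key:                        # zig-zag
            root.left.right = splay(root.left.right, key)
            if root.left.right:
                root.left = rotate_left(root.left)
        return root if root.left is None else rotate_right(root)
    else:
        if root.right is None:
            return root
        if key > root.right.key:                         # zig-zig
            root.right.right = splay(root.right.right, key)
            root = rotate_left(root)
        elif key < root.right.key:                       # zig-zag
            root.right.left = splay(root.right.left, key)
            if root.right.left:
                root.right = rotate_right(root.right)
        return root if root.right is None else rotate_left(root)

def insert(root, key):
    """Splay-based insert: splay the closest node up, then split under a new root."""
    if root is None:
        return Node(key)
    root = splay(root, key)
    if root.key == key:
        return root
    node = Node(key)
    if key < root.key:
        node.right = root
        node.left = root.left
        root.left = None
    else:
        node.left = root
        node.right = root.right
        root.right = None
    return node
```

Note that no balance information is stored in the nodes; the amortized guarantee comes purely from the splaying discipline.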
A NEW POLYNOMIAL-TIME ALGORITHM FOR LINEAR PROGRAMMING
 COMBINATORICA
, 1984
"... We present a new polynomial-time algorithm for linear programming. In the worst case, the algorithm requires O(n^3.5 L) arithmetic operations on O(L)-bit numbers, where n is the number of variables and L is the number of bits in the input. The running time of this algorithm is better than the ellipsoid ..."
Abstract

Cited by 860 (3 self)
We present a new polynomial-time algorithm for linear programming. In the worst case, the algorithm requires O(n^3.5 L) arithmetic operations on O(L)-bit numbers, where n is the number of variables and L is the number of bits in the input. The running time of this algorithm is better than
On Generating Worst-Cases for the Insertion Sort
"... this paper, we shall answer these questions with an algorithm that can systematically construct sequences with which the number of comparisons required by the insertion sort is between O(n) and O(n ..."
Abstract
this paper, we shall answer these questions with an algorithm that can systematically construct sequences with which the number of comparisons required by the insertion sort is between O(n) and O(n
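The two endpoints of the range the snippet mentions are easy to exhibit directly: an already-sorted input costs n−1 comparisons, a reverse-sorted one costs n(n−1)/2. The counting sketch below shows only those extremes, not the paper's systematic construction of intermediate sequences; all names are mine:

```python
# Insertion sort instrumented to count key comparisons (illustrative sketch).

def insertion_sort_comparisons(a):
    """Sort a copy of `a` by insertion sort; return (sorted_list, comparisons)."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1              # one key comparison per inner-loop test
            if a[j] > key:
                a[j + 1] = a[j]           # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

n = 100
_, best = insertion_sort_comparisons(range(n))          # sorted input: n-1 = 99
_, worst = insertion_sort_comparisons(range(n, 0, -1))  # reversed input: n(n-1)/2 = 4950
```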
On Delivery Guarantees and Worst-Case Forwarding . . .
, 2010
"... In this paper, we provide a thorough theoretical study on delivery guarantees, loop-free operation, and worst-case behavior of face and combined greedy-face routing. We show that under specific planar topology control schemes, recovery from a greedy routing failure is always possible without changing ..."
Abstract

Cited by 6 (2 self)
In this paper, we provide a thorough theoretical study on delivery guarantees, loop-free operation, and worst-case behavior of face and combined greedy-face routing. We show that under specific planar topology control schemes, recovery from a greedy routing failure is always possible without
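For context, the "greedy" half of greedy-face routing can be sketched in a few lines: forward each packet to the neighbor geographically closest to the destination, and declare failure at a local minimum, which is exactly where face routing over a planarized graph (omitted here) would take over. All names below are mine:

```python
import math

def greedy_forward(positions, adj, src, dst):
    """Greedy geographic forwarding sketch: hop to the neighbor closest to
    dst; return the path, or None on a greedy failure (local minimum)."""
    def d(u, v):
        (x1, y1), (x2, y2) = positions[u], positions[v]
        return math.hypot(x1 - x2, y1 - y2)

    path = [src]
    cur = src
    while cur != dst:
        nxt = min(adj[cur], key=lambda n: d(n, dst))
        if d(nxt, dst) >= d(cur, dst):
            return None       # no neighbor is closer: face routing would start here
        path.append(nxt)      # distance strictly decreases, so no routing loops
        cur = nxt
    return path
```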
Worst-Case Analysis of Selective Sampling for Linear Classification
 JOURNAL OF MACHINE LEARNING RESEARCH
, 2006
"... A selective sampling algorithm is a learning algorithm for classification that, based on the past observed data, decides whether to ask the label of each new instance to be classified. In this paper, we introduce a general technique for turning linear-threshold classification algorithms from the ..."
Abstract

Cited by 52 (6 self)
the general additive family into randomized selective sampling algorithms. For the most popular algorithms in this family we derive mistake bounds that hold for individual sequences of examples. These bounds
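The general recipe described above can be illustrated with a perceptron wrapped in a margin-based sampling rule: query the label with high probability when the current margin is small (uncertain) and low probability when it is large. The rule b/(b + |margin|) and all names here are illustrative choices of mine, not necessarily the paper's exact algorithm or constants:

```python
import random

def selective_perceptron(stream, b=1.0, seed=0):
    """Selective-sampling perceptron sketch over (x, y) pairs, y in {-1, +1}.
    Returns (weights, labels_queried, mistakes_on_queried_examples)."""
    rng = random.Random(seed)
    w = None
    queries = mistakes = 0
    for x, y in stream:
        if w is None:
            w = [0.0] * len(x)
        margin = sum(wi * xi for wi, xi in zip(w, x))
        # Ask for the label with probability b / (b + |margin|): certain
        # predictions (large |margin|) are rarely queried.
        if rng.random() < b / (b + abs(margin)):
            queries += 1
            if y * margin <= 0:            # queried and wrong: perceptron update
                mistakes += 1
                w = [wi + y * xi for wi, xi in zip(w, x)]
    return w, queries, mistakes
```

The point of the worst-case analysis in the paper is that mistake bounds of this flavor hold for individual sequences, with no stochastic assumption on the stream.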
Worst-Case Efficient External-Memory Priority Queues
 In Proc. Scandinavian Workshop on Algorithms Theory, LNCS 1432
, 1998
"... A priority queue Q is a data structure that maintains a collection of elements, each element having an associated priority drawn from a totally ordered universe, under the operations Insert, which inserts an element into Q, and DeleteMin, which deletes an element with the minimum priority from ..."
Abstract

Cited by 35 (3 self)
in elements. The developed data structure handles any intermixed sequence of Insert and DeleteMin operations such that in every disjoint interval of B consecutive priority-queue operations at most c log_{M/B}(N/M) I/Os are performed, for some positive constant c. These I/Os are divided evenly among
How to Use Expert Advice
 JOURNAL OF THE ASSOCIATION FOR COMPUTING MACHINERY
, 1997
"... We analyze algorithms that predict a binary value by combining the predictions of several prediction strategies, called experts. Our analysis is for worstcase situations, i.e., we make no assumptions about the way the sequence of bits to be predicted is generated. We measure the performance of the ..."
Abstract

Cited by 377 (79 self)
We analyze algorithms that predict a binary value by combining the predictions of several prediction strategies, called experts. Our analysis is for worstcase situations, i.e., we make no assumptions about the way the sequence of bits to be predicted is generated. We measure the performance
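The canonical master algorithm in this expert-advice framework is deterministic Weighted Majority: each expert's weight is multiplied by a factor β each time it errs, and the master predicts by weighted vote. The sketch below is illustrative (β and the names are my choices, not necessarily the paper's tuning):

```python
# Deterministic Weighted Majority sketch: worst-case mistake bounds for this
# family hold with no assumption on how the bit sequence is generated.

def weighted_majority(expert_predictions, outcomes, beta=0.5):
    """expert_predictions[t] is a tuple of {0,1} predictions, one per expert;
    outcomes[t] is the true bit. Returns (master_mistakes, final_weights)."""
    n = len(expert_predictions[0])
    weights = [1.0] * n
    master_mistakes = 0
    for preds, y in zip(expert_predictions, outcomes):
        vote1 = sum(w for w, p in zip(weights, preds) if p == 1)
        vote0 = sum(w for w, p in zip(weights, preds) if p == 0)
        guess = 1 if vote1 >= vote0 else 0       # weighted majority vote
        if guess != y:
            master_mistakes += 1
        # Penalize every expert that was wrong this round.
        weights = [w * (beta if p != y else 1.0) for w, p in zip(weights, preds)]
    return master_mistakes, weights
```

Because wrong experts lose weight geometrically, the master's mistake count is within a constant factor (plus a log-of-n term) of the best expert's, for every sequence.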
Worst-Case Execution Time Analysis on Modern Processors
 in ACM SIGPLAN 1995 Workshop on Languages, Compilers, and Tools for Real-Time Systems
, 1995
"... Many of the trends that have dominated recent evolution and advancement within the computer architecture community have complicated the analysis of task execution times. Most of the difficulties result from two particular emphases: (1) Instruction-level parallelism, and (2) Optimization of average-case ..."
Abstract

Cited by 23 (1 self)
behavior rather than worst-case latencies. Both of these trends have resulted in increased nondeterminism in the time required to execute particular code sequences. And since the analysis required to determine worst-case task execution times on modern processors is so complicated, it is not practical
Worst-Case Analysis of the Perceptron and Exponentiated Update Algorithms
 Artificial Intelligence
, 1998
"... The absolute loss is the absolute difference between the desired and predicted outcome. This paper demonstrates worst-case upper bounds on the absolute loss for the Perceptron learning algorithm and the Exponentiated Update learning algorithm, which is related to the Weighted Majority algorithm. The ..."
Abstract

Cited by 10 (1 self)
The bounds characterize the behavior of the algorithms over any sequence of trials, where each trial consists of an example and a desired outcome interval (any value in the interval is an acceptable outcome). The worst-case absolute loss of both algorithms is bounded by: the absolute loss of the best linear
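The two update families being compared differ only in how the error moves the weights: additively along the example, or multiplicatively with renormalization. A side-by-side sketch (the learning rate eta and the normalization are illustrative choices of mine):

```python
import math

def additive_update(w, x, y, eta=0.1):
    """Perceptron-style additive update: w <- w + eta * error * x."""
    err = y - sum(wi * xi for wi, xi in zip(w, x))
    return [wi + eta * err * xi for wi, xi in zip(w, x)]

def exponentiated_update(w, x, y, eta=0.1):
    """Exponentiated (multiplicative) update: scale each weight by
    exp(eta * error * x_i), then renormalize so the weights sum to 1."""
    err = y - sum(wi * xi for wi, xi in zip(w, x))
    w = [wi * math.exp(eta * err * xi) for wi, xi in zip(w, x)]
    z = sum(w)
    return [wi / z for wi in w]

# Repeated updates on a single (x, y) pair drive the prediction toward y.
w_add = w_exp = [0.5, 0.5]
x, y = [1.0, 0.0], 1.0
for _ in range(50):
    w_add = additive_update(w_add, x, y)
for _ in range(500):
    w_exp = exponentiated_update(w_exp, x, y)
```

The multiplicative family tends to win when the best linear predictor is sparse, the additive family when it is dense; the worst-case bounds in the paper make that trade-off precise.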
Complexity of finding embeddings in a k-tree
 SIAM JOURNAL OF DISCRETE MATHEMATICS
, 1987
"... A k-tree is a graph that can be reduced to the k-complete graph by a sequence of removals of a degree-k vertex with completely connected neighbors. We address the problem of determining whether a graph is a partial graph of a k-tree. This problem is motivated by the existence of polynomial time algorithms ..."
Abstract

Cited by 386 (1 self)
status of two problems related to finding the smallest number k such that a given graph is a partial k-tree. First, the corresponding decision problem is NP-complete. Second, for a fixed (predetermined) value of k, we present an algorithm with polynomially bounded (but exponential in k) worst-case time
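The k-tree definition quoted above can be checked directly for small graphs: repeatedly delete a degree-k vertex whose neighbors are pairwise adjacent, and accept if the k-complete graph remains. This greedy sketch (my names; not the paper's partial-k-tree algorithm) works because removing any such vertex from a k-tree leaves a smaller k-tree:

```python
from itertools import combinations

def is_k_tree(adj, k):
    """Check whether the graph (dict: vertex -> neighbor set) is a k-tree by
    greedy elimination of degree-k vertices with completely connected neighbors."""
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy
    while len(adj) > k:
        for v in list(adj):
            ns = adj[v]
            if len(ns) == k and all(b in adj[a] for a, b in combinations(ns, 2)):
                for u in ns:                      # delete v from the graph
                    adj[u].discard(v)
                del adj[v]
                break
        else:
            return False                          # no removable vertex exists
    # Whatever remains must be the k-complete graph.
    return all(len(ns) == k - 1 for ns in adj.values())
```

The harder problem the abstract addresses — deciding whether a graph is a *partial* graph of a k-tree, i.e. has treewidth at most k — does not reduce to this simple elimination check.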