Results 1–10 of 16
Data Streams: Algorithms and Applications
, 2005
Abstract

Cited by 538 (22 self)
In the data stream scenario, input arrives very rapidly and there is limited memory to store the input. Algorithms have to work with one or a few passes over the data, space less than linear in the input size, or time significantly less than the input size. In the past few years, a new theory has emerged for reasoning about algorithms that work within these constraints on space, time, and number of passes. Some of the methods rely on metric embeddings, pseudorandom computations, sparse approximation theory and communication complexity. The applications for this scenario include IP network traffic analysis, mining text message streams and processing massive data sets in general. Researchers in Theoretical Computer Science, Databases, IP Networking and Computer Systems are working on the data stream challenges. This article is an overview and survey of data stream algorithmics and is an updated version of [175].
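A classic illustration of these constraints is approximate frequency counting in one pass with sublinear memory. The sketch below is the standard Misra–Gries counter (chosen here as an illustrative example; it is one of many techniques covered by surveys of this kind, not code from this article):

```python
def misra_gries(stream, k):
    """One-pass frequent-items sketch using at most k counters,
    regardless of stream length. Any item occurring more than
    len(stream)/(k+1) times is guaranteed to survive in the
    counter set."""
    counters = {}
    for item in stream:
        if item in counters:
            counters[item] += 1
        elif len(counters) < k:
            counters[item] = 1
        else:
            # Decrement all counters; drop those that reach zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

# A stream of 12 items in which 'a' occurs 6 times (> 12/3):
stream = ['a', 'b', 'a', 'c', 'a', 'd', 'a', 'e', 'a', 'f', 'a', 'g']
print(misra_gries(stream, 2))  # 'a' is guaranteed to be among the keys
```

The space used is O(k) counters, independent of how long the stream runs, which is exactly the "space less than linear in the input size" regime the abstract describes.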
A Survey on Combinatorial Group Testing Algorithms with Applications to DNA Library Screening
, 2000
Abstract

Cited by 47 (5 self)
In this paper, we give an overview of Combinatorial Group Testing algorithms which are applicable to DNA Library Screening. Our survey focuses on several classes of constructions not discussed in previous surveys, provides a general view on pooling design constructions and poses several open questions arising from this view.
Sorting and Searching in the Presence of Memory Faults (without Redundancy)
 Proc. 36th ACM Symposium on Theory of Computing (STOC’04)
, 2004
Abstract

Cited by 21 (4 self)
We investigate the design of algorithms resilient to memory faults, i.e., algorithms that, despite the corruption of some memory values during their execution, are able to produce a correct output on the set of uncorrupted values. In this framework, we consider two fundamental problems: sorting and searching. In particular, we prove that any O(n log n) comparison-based sorting algorithm can tolerate at most O((n log n)^{1/2}) memory faults. Furthermore, we present one comparison-based sorting algorithm with optimal space and running time that is resilient to O((n log n)^{1/3}) faults. We also prove polylogarithmic lower and upper bounds on fault-tolerant searching.
New constructions of non-adaptive and error-tolerance pooling designs
, 2000
Abstract

Cited by 17 (3 self)
We propose two new classes of non-adaptive pooling designs. The first one is guaranteed to be d-error-detecting and thus ⌊d/2⌋-error-correcting given any positive integer d. Also, this construction induces a construction of a binary code with minimum Hamming distance at least 2d+2. The second design is the q-analogue of a known construction on d-disjunct matrices.
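The link between minimum Hamming distance and error tolerance invoked here is the standard coding-theory fact: a code with minimum distance D can detect up to D−1 errors and correct up to ⌊(D−1)/2⌋. A minimal check of the distance computation (a generic illustration, not the paper's construction):

```python
from itertools import combinations

def min_hamming_distance(code):
    """Minimum pairwise Hamming distance of a code, given as a list
    of equal-length tuples of symbols."""
    return min(sum(a != b for a, b in zip(u, v))
               for u, v in combinations(code, 2))

# With d = 2 the abstract promises distance at least 2d + 2 = 6.
# A length-6 two-word code with exactly that distance: a code of
# distance 6 detects up to 5 errors and corrects up to 2 = d.
code = [(0,) * 6, (1,) * 6]
print(min_hamming_distance(code))  # prints 6
```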
Resilient search trees
 In Proceedings of 18th ACM-SIAM SODA
, 2007
Abstract

Cited by 15 (5 self)
We investigate the problem of computing in a reliable fashion in the presence of faults that may arbitrarily corrupt memory locations. In this framework, we focus on the design of resilient data structures, i.e., data structures that, despite the corruption of some memory values during their lifetime, are nevertheless able to operate correctly (at least) on the set of uncorrupted values. In particular, we present resilient search trees which achieve optimal time and space bounds while tolerating up to O(√(log n)) memory faults, where n is the current number of items in the search tree. In more detail, our resilient search trees are able to insert, delete and search for a key in O(log n + δ^2) amortized time, where δ is an upper bound on the total number of faults. The space required is O(n + δ).
Designing Reliable Algorithms in Unreliable Memories
Abstract

Cited by 11 (3 self)
Some of today’s applications run on computer platforms with large and inexpensive memories, which are also errorprone. Unfortunately, the appearance of even very few memory faults may jeopardize the correctness of the computational results. An algorithm is resilient to memory faults if, despite the corruption of some memory values before or during its execution, it is nevertheless able to get a correct output at least on the set of uncorrupted values. In this paper we will survey some recent work on reliable computation in the presence of memory faults.
Lower Bounds for Identifying Subset Members with Subset Queries
, 1995
Abstract

Cited by 9 (0 self)
An instance of a group testing problem is a set of objects O and an unknown subset P of O. The task is to determine P by using queries of the type "does P intersect Q", where Q is a subset of O. This problem occurs in areas such as fault detection, multiaccess communications, optimal search, blood testing and chromosome mapping. Consider the two stage algorithm for solving a group testing problem. In the first stage a predetermined set of queries are asked in parallel and in the second stage, P is determined by testing individual objects. Let n = |O|. Suppose that P is generated by independently adding each x ∈ O to P with probability p/n. Let q1 (q2) be the number of queries asked in the first (second) stage of this algorithm. We show that if q1 = o(log(n) log(n)/log log(n)), then Exp(q2) = n^{1−o(1)}, while there exist algorithms with q1 = O(log(n) log(n)/log log(n)) and Exp(q2) = o(1). The proof involves a relaxation technique which can be used with arbitrary distributio...
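The two-stage scheme described above can be simulated directly. The sketch below uses uniformly random pools for stage 1, which is a simple illustrative choice and not the paper's (optimal) query design; every name in it is hypothetical:

```python
import random

def two_stage_group_test(objects, positives, num_pools, seed=0):
    """Two-stage group testing: stage 1 asks a predetermined set of
    pool queries 'does P intersect Q?' in parallel; stage 2 tests
    individually every object not cleared by stage 1."""
    rng = random.Random(seed)
    pools = [{x for x in objects if rng.random() < 0.5}
             for _ in range(num_pools)]
    # Stage 1: one bit per pool -- does the pool contain a positive?
    outcomes = [bool(pool & positives) for pool in pools]
    # An object is cleared if some pool containing it tested negative.
    candidates = set(objects)
    for pool, hit in zip(pools, outcomes):
        if not hit:
            candidates -= pool
    # Stage 2: test the surviving candidates one by one.
    found = {x for x in candidates if x in positives}
    return found, len(candidates)  # recovered set, #individual tests

objects = set(range(100))
positives = {3, 41, 77}
found, q2 = two_stage_group_test(objects, positives, num_pools=30)
assert found == positives  # positives never appear in a negative pool
```

Every true positive survives stage 1 (a pool containing it can never test negative), so the recovered set is always exactly P; the quantity the paper bounds is the expected number of stage-2 tests, `q2`, as a function of the number of stage-1 pools, `q1`.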
Locating Information with Uncertainty in Fully Interconnected Networks with Applications to World Wide Web Information Retrieval
, 2001
Abstract

Cited by 7 (1 self)
In this paper we examine the problem of searching for some information item in the nodes of a fully interconnected computer network, where each node contains information relevant to some topic as well as links to other network nodes that also contain information, not necessarily related to the locally kept information. These links help Internet users and mobile software agents locate specific pieces of information.
Resilient dictionaries
 ACM Transactions on Algorithms
Abstract

Cited by 7 (3 self)
We address the problem of designing data structures in the presence of faults that may arbitrarily corrupt memory locations. More precisely, we assume that an adaptive adversary can arbitrarily overwrite the content of up to δ memory locations, that corrupted locations cannot be detected, and that only O(1) memory locations are safe. In this framework, we call a data structure resilient if it is able to operate correctly (at least) on the set of uncorrupted values. We present a resilient dictionary, implementing search, insert and delete operations. Our dictionary has O(log n + δ) expected amortized time per operation, and O(n) space complexity, where n denotes the current number of keys in the dictionary. We also describe a deterministic resilient dictionary, with the same amortized cost per operation over a sequence of at least δ^ε operations, where ε > 0 is an arbitrary constant. Finally, we show that any resilient comparison-based dictionary must take Ω(log n + δ) expected time per search. Our results are achieved by means of simple, new techniques, which might be of independent interest for the design of other resilient algorithms.
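In this model corrupted locations cannot be detected, so a standard building block in this line of work is to make a single critical variable reliable by storing 2δ+1 copies and reading by majority vote: the true value always holds a strict majority if at most δ copies are corrupted. A minimal sketch of that primitive (illustrative only, not the dictionary of the paper):

```python
from collections import Counter

DELTA = 2  # upper bound on the number of corruptions tolerated

def reliable_write(value, delta=DELTA):
    # Store 2*delta + 1 copies so a majority survives any delta corruptions.
    return [value] * (2 * delta + 1)

def reliable_read(copies):
    # Majority vote: with at most delta corrupted copies, the true value
    # still occupies a strict majority of the 2*delta + 1 slots.
    return Counter(copies).most_common(1)[0][0]

copies = reliable_write(42)
copies[0] = 7   # an adversary corrupts up to DELTA copies...
copies[3] = 99
assert reliable_read(copies) == 42  # ...but the majority is intact
```

Replicating every key this way would blow up space and time by a Θ(δ) factor; the point of resilient dictionaries is to reserve this treatment for a few critical variables (the O(1) safe words plus replicated structural data) while keeping the bulk of the keys unreplicated.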