Results 1 - 10 of 4,648

SCRIBE: A large-scale and decentralized application-level multicast infrastructure

by Miguel Castro, Peter Druschel, Anne-Marie Kermarrec, Antony Rowstron - IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS (JSAC), 2002
"... This paper presents Scribe, a scalable application-level multicast infrastructure. Scribe supports large numbers of groups, with a potentially large number of members per group. Scribe is built on top of Pastry, a generic peer-to-peer object location and routing substrate overlayed on the Internet, and leverages Pastry's reliability, self-organization, and locality properties. Pastry is used to create and manage groups and to build efficient multicast trees for the dissemination of messages to each group. Scribe provides best-effort reliability guarantees, but we outline how an application can extend ..."
Cited by 658 (29 self)
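
A toy sketch of the group-management idea the abstract describes: join requests travel hop by hop toward a group's root, each node on the route records the previous hop as a child, and multicast messages then flow down the resulting tree. This is illustrative Python only; the ScribeLikeGroup class and the next_hop function are stand-ins for Pastry's prefix routing and are not the paper's API.

    # A toy reverse-path multicast tree: joins are routed hop by hop toward the
    # group's root, and every node along the way records the previous hop as a
    # child.  next_hop stands in for Pastry routing, which is not implemented here.
    class ScribeLikeGroup:
        def __init__(self, root, next_hop):
            self.root = root
            self.next_hop = next_hop      # next_hop(node) -> a node closer to the root
            self.children = {}            # forwarder -> set of child nodes

        def join(self, member):
            node = member
            while node != self.root:
                parent = self.next_hop(node)
                kids = self.children.setdefault(parent, set())
                if node in kids:          # the rest of the path is already on the tree
                    return
                kids.add(node)
                node = parent

        def multicast(self, payload):
            delivered, stack = [], [self.root]
            while stack:                  # walk the tree from the root downwards
                node = stack.pop()
                delivered.append((node, payload))
                stack.extend(self.children.get(node, ()))
            return delivered

    # Hypothetical overlay: each node's next hop halves its id, so node 0 is the root.
    group = ScribeLikeGroup(root=0, next_hop=lambda n: n // 2)
    for member in (5, 6, 7):
        group.join(member)
    print(group.multicast("hello"))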

The pyramid match kernel: Discriminative classification with sets of image features

by Kristen Grauman, Trevor Darrell - In ICCV, 2005
"... Discriminative learning is challenging when examples are sets of features, and the sets vary in cardinality and lack any sort of meaningful ordering. Kernel-based classification methods can learn complex decision boundaries, but a kernel over unordered set inputs must somehow solve for correspondences – generally a computationally expensive task that becomes impractical for large set sizes. We present a new fast kernel function which maps unordered feature sets to multi-resolution histograms and computes a weighted histogram intersection in this space. This “pyramid match” computation is linear ..."
Cited by 544 (29 self)
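
A rough Python sketch of the matching scheme the abstract outlines: bin two unordered feature sets into histograms at several resolutions, take histogram intersections, and give matches that first appear at finer levels more weight. The 1-D features, the bin widths, and the feature_range parameter are simplifying assumptions for illustration, not values from the paper.

    import numpy as np

    def pyramid_match(x, y, num_levels=5, feature_range=(0.0, 32.0)):
        """Pyramid-match style score between two unordered 1-D feature sets:
        intersect histograms at several resolutions and weight matches that
        first appear at level i by 1/2**i (finer matches count more)."""
        lo, hi = feature_range
        prev_intersection = 0.0
        score = 0.0
        for level in range(num_levels):
            width = 2.0 ** level                        # bin side doubles each level
            edges = np.arange(lo, hi + width, width)
            hx, _ = np.histogram(x, bins=edges)
            hy, _ = np.histogram(y, bins=edges)
            intersection = np.minimum(hx, hy).sum()     # histogram intersection
            score += (intersection - prev_intersection) / 2.0 ** level
            prev_intersection = intersection
        return score

    a = np.random.uniform(0, 32, size=40)               # feature "bags" of unequal size
    b = np.random.uniform(0, 32, size=25)
    print(pyramid_match(a, b))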

Ideal spatial adaptation by wavelet shrinkage

by David L. Donoho, Iain M. Johnstone - Biometrika, 1994
"... With ideal spatial adaptation, an oracle furnishes information about how best to adapt a spatially variable estimator, whether piecewise constant, piecewise polynomial, variable knot spline, or variable bandwidth kernel, to the unknown function. Estimation with the aid of an oracle offers dramatic ad ..."
Cited by 1269 (5 self)
"... is the sample size. Moreover no estimator can give a better guarantee than this. Within the class of spatially adaptive procedures, RiskShrink is essentially optimal. Relying only on the data, it comes within a factor log^2 n of the performance of piecewise polynomial and variable-knot spline methods equipped ..."
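
A minimal sketch of wavelet soft-thresholding in the spirit of this abstract, assuming the PyWavelets package is available. The universal threshold sigma*sqrt(2 log n) used below is a simple illustrative choice; the paper's RiskShrink estimator is built around minimax-optimal thresholds.

    import numpy as np
    import pywt   # PyWavelets, assumed available

    def wavelet_denoise(y, wavelet="db4"):
        """Soft-threshold the wavelet detail coefficients of a noisy signal.
        The universal threshold sigma*sqrt(2*log n) is used for simplicity."""
        n = len(y)
        coeffs = pywt.wavedec(y, wavelet)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # robust noise estimate (MAD)
        thresh = sigma * np.sqrt(2.0 * np.log(n))
        shrunk = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(shrunk, wavelet)[:n]

    t = np.linspace(0, 1, 512)
    clean = np.piecewise(t, [t < 0.3, (t >= 0.3) & (t < 0.7), t >= 0.7], [0.0, 1.0, -0.5])
    noisy = clean + 0.1 * np.random.randn(512)
    estimate = wavelet_denoise(noisy)
    print(np.sqrt(np.mean((estimate - clean) ** 2)))     # RMSE of the estimate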

The geometry of graphs and some of its algorithmic applications

by Nathan Linial, Eran London, Yuri Rabinovich - COMBINATORICA, 1995
"... In this paper we explore some implications of viewing graphs as geometric objects. This approach offers a new perspective on a number of graph-theoretic and algorithmic problems. There are several ways to model graphs geometrically and our main concern here is with geometric representations that respect the metric of the (possibly weighted) graph. Given a graph G we map its vertices to a normed space in an attempt to (i) Keep down the dimension of the host space and (ii) Guarantee a small distortion, i.e., make sure that distances between vertices in G closely match the distances between ..."
Cited by 524 (19 self)
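
A small numpy sketch of the notion of distortion the abstract refers to: embed a graph's shortest-path metric into the plane (here via classical multidimensional scaling, which is not the paper's Bourgain-style construction) and measure how much pairwise distances expand and contract. The 6-cycle example is illustrative.

    import numpy as np

    def shortest_paths(W):
        """All-pairs shortest paths (Floyd-Warshall); np.inf marks missing edges."""
        D = W.copy()
        for k in range(len(D)):
            D = np.minimum(D, D[:, [k]] + D[[k], :])
        return D

    def distortion(D_graph, points):
        """(max expansion) * (max contraction) over all vertex pairs."""
        ratios = [np.linalg.norm(points[i] - points[j]) / D_graph[i, j]
                  for i in range(len(points)) for j in range(i + 1, len(points))]
        return max(ratios) / min(ratios)

    # Embed the shortest-path metric of a 6-cycle into the plane via classical MDS.
    n = 6
    W = np.full((n, n), np.inf)
    np.fill_diagonal(W, 0.0)
    for i in range(n):
        W[i, (i + 1) % n] = W[(i + 1) % n, i] = 1.0
    D = shortest_paths(W)

    J = np.eye(n) - np.ones((n, n)) / n                  # double-centering matrix
    B = -0.5 * J @ (D ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    pts = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0.0))
    print("distortion of the planar embedding:", distortion(D, pts))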

Loopy belief propagation for approximate inference: An empirical study

by Kevin P. Murphy, Yair Weiss, Michael I. Jordan - In Proceedings of Uncertainty in AI, 1999
"... Abstract Recently, researchers have demonstrated that "loopy belief propagation" -the use of Pearl's polytree algorithm in a Bayesian network with loops -can perform well in the context of error-correcting codes. The most dramatic instance of this is the near Shannon-limit performanc ..."
Abstract - Cited by 676 (15 self) - Add to MetaCart
the convergence the more exact the approximation. • If the hidden nodes are binary, then thresholding the loopy beliefs is guaranteed to give the most probable assignment, even though the numerical value of the beliefs may be incorrect. This result only holds for nodes in the loop. In the max-product (or "
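
A compact sketch of loopy belief propagation on a toy pairwise model with a cycle, to make the iterative message passing concrete. The three-node graph and the potentials phi and psi are made up for illustration and are not taken from the paper's experiments.

    import numpy as np

    # Three binary variables on a cycle: unary potentials phi and a shared
    # pairwise potential psi that favours agreement (values are made up).
    nodes = [0, 1, 2]
    edges = [(0, 1), (1, 2), (2, 0)]
    phi = np.array([[0.7, 0.3],
                    [0.4, 0.6],
                    [0.5, 0.5]])
    psi = np.array([[1.0, 0.5],
                    [0.5, 1.0]])

    neighbors = {i: set() for i in nodes}
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)

    # messages[(i, j)] is the message from node i to node j, a distribution over x_j.
    messages = {(i, j): np.ones(2) / 2 for i in nodes for j in neighbors[i]}

    for _ in range(50):                                  # fixed-point iteration
        updated = {}
        for (i, j) in messages:
            incoming = np.ones(2)
            for k in neighbors[i] - {j}:                 # every neighbour except the target
                incoming *= messages[(k, i)]
            m = psi.T @ (phi[i] * incoming)              # sum over x_i of phi*psi*incoming
            updated[(i, j)] = m / m.sum()                # normalise for stability
        messages = updated

    for i in nodes:                                      # loopy (approximate) beliefs
        belief = phi[i].copy()
        for k in neighbors[i]:
            belief *= messages[(k, i)]
        print(f"node {i}:", belief / belief.sum())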

Approximate Frequency Counts over Data Streams

by Gurmeet Singh Manku, Rajeev Motwani - VLDB, 2002
"... We present algorithms for computing frequency counts exceeding a user-specified threshold over data streams. Our algorithms are simple and have provably small memory footprints. Although the output is approximate, the error is guaranteed not to exceed a user-specified parameter. Our algorithms can e ..."
Abstract - Cited by 418 (1 self) - Add to MetaCart
We present algorithms for computing frequency counts exceeding a user-specified threshold over data streams. Our algorithms are simple and have provably small memory footprints. Although the output is approximate, the error is guaranteed not to exceed a user-specified parameter. Our algorithms can
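
A simplified sketch in the spirit of the paper's counting scheme: keep per-item counters with an error allowance, prune small counters at bucket boundaries, and report items whose stored count exceeds (s - epsilon)*N. The class name LossyCounter and the parameter choices are illustrative, not the paper's implementation.

    from math import ceil

    class LossyCounter:
        """Count-and-prune sketch: stored counts undercount true counts by at
        most epsilon*N, so reporting items with stored count >= (s - epsilon)*N
        misses no item whose true frequency is at least s."""
        def __init__(self, epsilon=0.01):
            self.eps = epsilon
            self.width = ceil(1.0 / epsilon)     # bucket width
            self.n = 0                           # stream length so far
            self.table = {}                      # item -> (count, max possible undercount)

        def add(self, item):
            self.n += 1
            bucket = ceil(self.n / self.width)
            if item in self.table:
                count, delta = self.table[item]
                self.table[item] = (count + 1, delta)
            else:
                self.table[item] = (1, bucket - 1)
            if self.n % self.width == 0:         # prune at every bucket boundary
                self.table = {k: (c, d) for k, (c, d) in self.table.items()
                              if c + d > bucket}

        def frequent(self, s):
            """Items whose true frequency (fraction of the stream) may reach s."""
            return [k for k, (c, _) in self.table.items()
                    if c >= (s - self.eps) * self.n]

    counter = LossyCounter(epsilon=0.01)
    for ch in "abracadabra" * 1000:
        counter.add(ch)
    print(counter.frequent(s=0.2))               # only 'a' occurs in >= 20% of the stream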

A new database on financial development and structure, Working paper

by Thorsten Beck, Aslı Demirgüç-Kunt, Ross Levine - Journal of Financial Economics, 1999
"... The findings, interpretations, and conclusions expressed in this paper are entirely those of the author(s) and should not be attributed in any manner to the World Bank, to its affiliated organizations, or to members of its Board of Executive Directors or the countries they represent. The World Bank ..."
Abstract - Cited by 414 (21 self) - Add to MetaCart
does not guarantee the accuracy of the data included in this publication and accepts no responsibility whatsoever for any consequence of their use. A New Database on Financial Development and Structure iii Summary This paper introduces a new database of indicators of financial development and structure

GSAT and Dynamic Backtracking

by Matthew L. Ginsberg, David A. McAllester - Journal of Artificial Intelligence Research, 1994
"... There has been substantial recent interest in two new families of search techniques. One family consists of nonsystematic methods such as gsat; the other contains systematic approaches that use a polynomial amount of justification information to prune the search space. This paper introduces a new te ..."
Abstract - Cited by 386 (15 self) - Add to MetaCart
that guarantee that this database will be polynomial in the size of the problem in question. 1 INTRODUCTION The past few years have seen rapid progress in the development of algorithms for solving constraintsatisfaction problems, or csps. Csps arise naturally in subfields of AI from planning to vision
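
A minimal GSAT-style local search sketch to make the abstract's nonsystematic family concrete: random restarts plus greedy flips of the variable that satisfies the most clauses. The DIMACS-style clause encoding and the small formula below are illustrative choices, and none of the paper's dynamic-backtracking machinery is included.

    import random

    def gsat(clauses, num_vars, max_flips=1000, max_tries=10):
        """Greedy local search for SAT: clauses are lists of non-zero ints in
        DIMACS style (positive = variable, negative = its negation)."""
        def num_satisfied(assign):
            return sum(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses)

        for _ in range(max_tries):               # random restarts
            assign = {v: random.choice([True, False]) for v in range(1, num_vars + 1)}
            for _ in range(max_flips):
                if num_satisfied(assign) == len(clauses):
                    return assign                # found a satisfying assignment
                best_vars, best_score = [], -1
                for v in assign:                 # score every single-variable flip
                    assign[v] = not assign[v]
                    score = num_satisfied(assign)
                    assign[v] = not assign[v]
                    if score > best_score:
                        best_vars, best_score = [v], score
                    elif score == best_score:
                        best_vars.append(v)
                v = random.choice(best_vars)     # greedy flip, ties broken at random
                assign[v] = not assign[v]
        return None                              # give up after max_tries restarts

    cnf = [[1, 2], [-1, 3], [-2, -3], [1, -3]]   # a small satisfiable formula
    print(gsat(cnf, num_vars=3))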

An optimal graph theoretic approach to data clustering: Theory and its application to image segmentation

by Zhenyu Wu, Richard Leahy - IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 1993
"... A novel graph theoretic approach for data clustering is presented and its application to the image segmentation problem is demonstrated. The data to be clustered are represented by an undirected adjacency graph G with arc capacities assigned to reflect the similarity between the linked vertices. Cl ..."
Abstract - Cited by 360 (0 self) - Add to MetaCart
. Clustering is achieved by removing arcs of G to form mutually exclusive subgraphs such that the largest inter-subgraph maximum flow is minimized. For graphs of moderate size (- 2000 vertices), the optimal solution is obtained through partitioning a flow and cut equivalent tree of 6, which can be efficiently
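
A short sketch of the partitioning idea in the abstract, assuming the networkx library: build the Gomory-Hu (flow-equivalent) tree of the similarity graph and delete its lightest edges so the remaining components become clusters. The toy two-triangle graph and the function name are illustrative.

    import networkx as nx

    def flow_based_clusters(G, k, capacity="weight"):
        """Split G into k clusters by deleting the k-1 lightest edges of its
        Gomory-Hu (flow-equivalent) tree; each component is one cluster."""
        T = nx.gomory_hu_tree(G, capacity=capacity)
        lightest = sorted(T.edges(data="weight"), key=lambda e: e[2])[: k - 1]
        T.remove_edges_from([(u, v) for u, v, _ in lightest])
        return [set(c) for c in nx.connected_components(T)]

    # Two dense triangles joined by a single weak arc (weights act as similarities).
    G = nx.Graph()
    G.add_weighted_edges_from([(0, 1, 5), (1, 2, 5), (0, 2, 5),
                               (3, 4, 5), (4, 5, 5), (3, 5, 5),
                               (2, 3, 1)])
    print(flow_based_clusters(G, k=2))           # expect {0, 1, 2} and {3, 4, 5}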

Iterative hard thresholding for compressed sensing

by Thomas Blumensath, Mike E. Davies - Appl. Comp. Harm. Anal
"... Compressed sensing is a technique to sample compressible signals below the Nyquist rate, whilst still allowing near optimal reconstruction of the signal. In this paper we present a theoretical analysis of the iterative hard thresholding algorithm when applied to the compressed sensing recovery probl ..."
Abstract - Cited by 329 (18 self) - Add to MetaCart
problem. We show that the algorithm has the following properties (made more precise in the main text of the paper) • It gives near-optimal error guarantees. • It is robust to observation noise. • It succeeds with a minimum number of observations. • It can be used with any sampling operator for which
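
A small numpy sketch of the iteration the abstract analyses, x <- H_k(x + Phi^T(y - Phi x)), where H_k keeps the k largest-magnitude entries. The Gaussian measurement matrix, its rescaling so that ||Phi|| < 1, and the problem sizes are illustrative assumptions, not the paper's setup.

    import numpy as np

    def iht(y, Phi, k, num_iters=300):
        """Iterative hard thresholding: x <- H_k(x + Phi^T (y - Phi x)),
        where H_k keeps only the k largest-magnitude entries of its argument."""
        x = np.zeros(Phi.shape[1])
        for _ in range(num_iters):
            x = x + Phi.T @ (y - Phi @ x)            # Landweber / gradient step
            keep = np.argsort(np.abs(x))[-k:]        # support of the k largest entries
            mask = np.zeros_like(x, dtype=bool)
            mask[keep] = True
            x = np.where(mask, x, 0.0)               # hard threshold
        return x

    rng = np.random.default_rng(0)
    m, n, k = 100, 200, 5
    Phi = rng.standard_normal((m, n))
    Phi /= 1.01 * np.linalg.norm(Phi, 2)             # rescale so ||Phi|| < 1
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.choice([-1.0, 1.0], size=k)
    y = Phi @ x_true                                 # noiseless measurements
    print(np.linalg.norm(iht(y, Phi, k) - x_true))   # error (near zero when recovery succeeds)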