Results 1 – 10 of 1,500
Learning with Non-uniform Class and Cost Distributions: Effects and a Distributed Multi-classifier Approach
In Workshop Notes, KDD-98 Workshop on Distributed Data Mining, 1998
"... Many factors influence a learning process and the performance of a learned classifier. In this paper we investigate the effects of class distribution in the training set on performance. We also study different methods of measuring performance based on cost models and the performance effects of train ..."
Cited by 8 (0 self)
of training class distribution with respect to the different cost models. Observations from these effects help us devise a distributed multi-classifier meta-learning approach to learn in domains with skewed class distributions, non-uniform cost per error, and large amounts of data. One such domain is credit
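The cost-model point in this snippet can be made concrete with a small sketch. The confusion matrices, cost values, and both hypothetical classifiers below are assumptions made for illustration, not data from the paper: under plain error rate the conservative classifier looks better, while under a non-uniform cost model the classifier that catches the rare, expensive class wins.

```python
import numpy as np

# Illustrative comparison of two hypothetical classifiers on skewed data:
# class 0 = common/cheap (990 cases), class 1 = rare/expensive (10 cases).
# Confusion matrices: rows = true class, columns = predicted class.
cm_a = np.array([[985, 5],
                 [  8, 2]])   # classifier A: rarely flags the rare class
cm_b = np.array([[940, 50],
                 [  1, 9]])   # classifier B: flags more, catches the rare class

# Assumed cost matrix: missing a rare positive (true 1, predicted 0)
# costs 100, a false alarm costs 1, correct decisions cost 0.
cost = np.array([[  0, 1],
                 [100, 0]])

def error_rate(cm):
    return 1 - np.trace(cm) / cm.sum()

def total_cost(cm):
    return int((cm * cost).sum())

# A has the lower error rate, but B has the far lower total cost.
print("A:", error_rate(cm_a), total_cost(cm_a))
print("B:", error_rate(cm_b), total_cost(cm_b))
```

The two metrics rank the classifiers in opposite order, which is the effect the abstract studies.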
Universal Learning of Classes From Sparse and Non-Uniform Evidence
, 2000
"... We present a framework for the Inductive Functional Logic Programming paradigm from positive data and from nonuniform classes. It extends the ..."
Random Languages for Non-Uniform Complexity Classes
Journal of Complexity, 1991
"... A language A is considered to be random for a class C if for every language B in C the fraction of the strings where A and B coincide is approximately 1/2. We show that there exist languages in DSPACE(f(n)) which are random for the nonuniform class DSPACE(g(n))=h(n), where n, g(n) and h(n) are in ..."
Cited by 2 (0 self)
Loopy belief propagation for approximate inference: An empirical study
In Proceedings of Uncertainty in AI, 1999
"... Abstract Recently, researchers have demonstrated that "loopy belief propagation" the use of Pearl's polytree algorithm in a Bayesian network with loops can perform well in the context of errorcorrecting codes. The most dramatic instance of this is the near Shannonlimit performanc ..."
Cited by 676 (15 self)
nothing directly to do with coding or decoding will show that in some sense belief propagation "converges with high probability to a near-optimum value" of the desired belief on a class of loopy DAGs. Progress in the analysis of loopy belief propagation has been made for the case of networks
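A minimal sketch of what "loopy belief propagation" means here: running the sum-product message updates on a graph that contains a cycle and comparing the resulting beliefs with exact marginals. The 3-node pairwise MRF and its potentials below are assumptions made for illustration, not an example from the paper.

```python
import itertools
import numpy as np

nodes = [0, 1, 2]
edges = [(0, 1), (1, 2), (0, 2)]                               # a single 3-cycle
phi = {i: np.array([0.6, 0.4]) for i in nodes}                 # unary potentials
psi = {e: np.array([[1.2, 0.8], [0.8, 1.2]]) for e in edges}   # pairwise potentials

def nbrs(i):
    return [b if a == i else a for (a, b) in edges if i in (a, b)]

def pairpot(i, j):
    # orient psi so that rows index x_i and columns index x_j
    return psi[(i, j)] if (i, j) in psi else psi[(j, i)].T

# m[(i, j)] is the message from node i to node j, initialized uniform
m = {(i, j): np.array([0.5, 0.5])
     for (a, b) in edges for (i, j) in ((a, b), (b, a))}

for _ in range(100):  # synchronous updates; convergence is not guaranteed on loops
    new = {}
    for (i, j) in m:
        prod = phi[i].copy()
        for k in nbrs(i):
            if k != j:
                prod = prod * m[(k, i)]
        msg = prod @ pairpot(i, j)        # marginalize out x_i
        new[(i, j)] = msg / msg.sum()
    m = new

def belief(i):
    b = phi[i].copy()
    for k in nbrs(i):
        b = b * m[(k, i)]
    return b / b.sum()

def exact(i):
    # exact marginal by brute-force summation over all 2^3 joint states
    p = np.zeros(2)
    for x in itertools.product([0, 1], repeat=len(nodes)):
        w = np.prod([phi[n][x[n]] for n in nodes]) * \
            np.prod([psi[(a, b)][x[a], x[b]] for (a, b) in edges])
        p[x[i]] += w
    return p / p.sum()

for i in nodes:
    print(i, belief(i), exact(i))  # beliefs should land close to the exact marginals
```

With these weak potentials the loopy beliefs track the exact marginals closely, which is the kind of empirical behavior the paper studies.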
A first-order primal-dual algorithm for convex problems with applications to imaging
, 2010
"... In this paper we study a firstorder primaldual algorithm for convex optimization problems with known saddlepoint structure. We prove convergence to a saddlepoint with rate O(1/N) in finite dimensions, which is optimal for the complete class of nonsmooth problems we are considering in this paper ..."
Cited by 436 (20 self)
Non-uniform deblurring for shaken images
In Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2010
"... Blur from camera shake is mostly due to the 3D rotation of the camera, resulting in a blur kernel that can be significantly nonuniform across the image. However, most current deblurring methods model the observed image as a convolution of a sharp image with a uniform blur kernel. We propose a new p ..."
Cited by 75 (4 self)
Randomness and non-uniformity
, 2006
"... In the first part, we introduce randomized algorithms as a new notion of efficient algorithms for decision problems. We classify randomized algorithms according to their error probabilities, and define appropriate complexity classes. (RP, coRP, ZPP, BPP, PP). We discuss which classes are realistic p ..."
... and Turing machines that take advice. We demonstrate the power of non-uniform complexity classes. We show the relevance of non-uniform polynomial time for complexity theory, especially the P =? NP
The Non-Uniformity of Degree Achievements
, 1998
"... this paper I argue that the aspectual class verbs derived from gradable adjectives by en (e.g. widen) and by 0 (e.g. cool) can be classified into two aspectual classes, which differ in their telicity. The classification of each verb depends crucially on characteristics of the specific adjective fr ..."
Non-Uniform Reductions
, 2007
"... Reductions and completeness notions form the heart of computational complexity theory. Recently nonuniform reductions have been naturally introduced in a variety of settings concerning completeness notions for NP and other classes. We follow up on these results by strengthening some of them. In par ..."
Cited by 1 (0 self)
Sampling signals with finite rate of innovation
IEEE Transactions on Signal Processing, 2002
"... Abstract—Consider classes of signals that have a finite number of degrees of freedom per unit of time and call this number the rate of innovation. Examples of signals with a finite rate of innovation include streams of Diracs (e.g., the Poisson process), nonuniform splines, and piecewise polynomials ..."
Cited by 350 (67 self)
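One way to make the "finite rate of innovation" idea concrete is the classical annihilating-filter recovery of a stream of Diracs: K Diracs carry 2K degrees of freedom (locations and amplitudes), and 2K + 1 Fourier coefficients suffice to recover them. The two-Dirac signal below is an assumed illustrative instance, not an example from the paper.

```python
import numpy as np

K = 2
t = np.array([0.2, 0.6])        # true Dirac locations in [0, 1)
a = np.array([1.0, 0.5])        # true amplitudes
M = 2 * K + 1                   # number of Fourier coefficients used

u = np.exp(-2j * np.pi * t)                              # u_k = e^{-2j*pi*t_k}
s = np.array([(a * u ** m).sum() for m in range(M)])     # Fourier coefficients

# The annihilating filter h (length K + 1) satisfies (h * s)[m] = 0 for
# m >= K; stack those equations and take the SVD null vector.
A = np.array([[s[m - i] for i in range(K + 1)] for m in range(K, M)])
h = np.linalg.svd(A)[2][-1].conj()

# The roots of H(z) = h[0] z^K + ... + h[K] are exactly the u_k,
# so the locations fall out of their phases.
roots = np.roots(h)
t_hat = np.sort(np.mod(-np.angle(roots) / (2 * np.pi), 1))

# Amplitudes follow from a K x K Vandermonde system on the first K coefficients.
u_hat = np.exp(-2j * np.pi * t_hat)
V = np.vander(u_hat, K, increasing=True).T               # V[m, k] = u_hat[k] ** m
a_hat = np.real(np.linalg.solve(V, s[:K]))

print(t_hat, a_hat)  # should recover locations [0.2, 0.6] and amplitudes [1.0, 0.5]
```

On noiseless data the recovery is exact up to numerical precision; the paper's contribution is developing this kind of parametric sampling theory for broader signal classes.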