Results 1–10 of 12
Estimating the Support of a High-Dimensional Distribution
, 1999
"... Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a "simple" subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified between 0 and 1. We propo ..."
Abstract

Cited by 766 (29 self)
Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a "simple" subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified ν between 0 and 1. We propose a method to approach this problem by trying to estimate a function f which is positive on S and negative on the complement. The functional form of f is given by a kernel expansion in terms of a potentially small subset of the training data; it is regularized by controlling the length of the weight vector in an associated feature space. The expansion coefficients are found by solving a quadratic programming problem, which we do by carrying out sequential optimization over pairs of input patterns. We also provide a preliminary theoretical analysis of the statistical performance of our algorithm. The algorithm is a natural extension of the support vector algorithm to the case of unlabelled data.
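The kernel expansion described above has the form f(x) = Σᵢ αᵢ k(xᵢ, x) − ρ. As a hedged illustration only — not the paper's QP-based algorithm — the special case of uniform coefficients αᵢ = 1/n reduces to a thresholded Parzen-window estimate. The sketch below assumes a Gaussian kernel and picks ρ so that roughly a ν-fraction of the training points fall outside the estimated region; all function names are hypothetical.

```python
import numpy as np

def gaussian_kernel(a, b, gamma=0.5):
    # k(a, b) = exp(-gamma * ||a - b||^2), computed pairwise
    d = a[:, None, :] - b[None, :, :]
    return np.exp(-gamma * np.sum(d * d, axis=-1))

def fit_support_estimator(X, nu=0.1, gamma=0.5):
    """Special case of the expansion f(x) = sum_i alpha_i k(x_i, x) - rho
    with uniform alpha_i = 1/n (a Parzen-window estimate, not the paper's
    QP solution).  rho is chosen so that roughly a nu-fraction of the
    training points fall outside the region {x : f(x) >= 0}."""
    scores = gaussian_kernel(X, X, gamma).mean(axis=1)
    rho = np.quantile(scores, nu)
    return lambda Z: gaussian_kernel(Z, X, gamma).mean(axis=1) - rho

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))          # sample from the unknown distribution P
f = fit_support_estimator(X, nu=0.1)
inside = f(X) >= 0                     # f is positive on the estimated set S
print(round(float(inside.mean()), 2))  # ~0.9 of training points lie in S
```

Note that the paper's actual coefficients come from a quadratic program and make f sparse in the training points; the uniform-weight sketch keeps only the geometry of the decision function.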
SV Estimation of a Distribution's Support
, 1999
"... Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified 0 < 1. We propose an algorithm which appro ..."
Abstract

Cited by 37 (2 self)
Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified ν, 0 < ν < 1. We propose an algorithm which approaches this problem by trying to estimate a function f which is positive on S and negative on the complement. The functional form of f is given by a kernel expansion in terms of a potentially small subset of the training data; it is regularized by controlling the length of the weight vector in an associated feature space. The algorithm is a natural extension of the support vector algorithm to the case of unlabelled data.
Novelty Detection in Learning Systems
 Neural Comp. Surveys
, 2003
"... Novelty detection is concerned with recognising inputs that differ in some way from those that are usually seen. It is a useful technique in cases where an important class of data is underrepresented in the training set. This means that the performance of the network will be poor for those classes. ..."
Abstract

Cited by 27 (4 self)
Novelty detection is concerned with recognising inputs that differ in some way from those that are usually seen. It is a useful technique in cases where an important class of data is underrepresented in the training set, which means that the performance of the network will be poor for that class. In some circumstances, such as medical data and fault detection, it is often precisely the underrepresented class, the disease or potential fault, that the network should detect. In novelty detection systems the network is trained only on the negative examples where that class is not present, and then detects inputs that do not fit into the model that it has acquired, that is, members of the novel class. This paper reviews the literature on novelty detection in neural networks and other machine learning techniques, as well as providing brief overviews of the related topics of statistical outlier detection and novelty detection in biological organisms.
ONLINE NOVELTY DETECTION THROUGH SELF-ORGANISATION, WITH APPLICATION TO INSPECTION ROBOTICS
, 2001
"... ..."
On The Estimation Of A Support Curve Of Indeterminate Sharpness
 J. Multivariate Anal
, 1997
"... . We propose nonparametric methods for estimating the support curve of a bivariate density, when the density decreases at a rate which might vary along the curve. Attention is focussed on cases where the rate of decrease is relatively fast, this being the most difficult setting. It demands the use o ..."
Abstract

Cited by 17 (2 self)
We propose nonparametric methods for estimating the support curve of a bivariate density, when the density decreases at a rate which might vary along the curve. Attention is focussed on cases where the rate of decrease is relatively fast, this being the most difficult setting. It demands the use of a relatively large number of bivariate order statistics. By way of comparison, support curve estimation in the context of slow rates of decrease of the density may be addressed using methods that use only a relatively small number of order statistics at the extremities of the point cloud. In this paper we suggest a new type of estimator, based on projecting onto an axis those data values lying within a thin rectangular strip. Adaptive univariate methods are then applied to the problem of estimating an endpoint of the distribution on the axis. The new method is shown to have theoretically optimal performance in a range of settings. Its numerical properties are explored in a simulation study...
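The strip-projection idea above can be sketched as follows. This is an illustrative simplification: the paper applies adaptive univariate endpoint estimators to the projected sample, whereas the sketch uses the crudest stand-in, the sample maximum; the data-generating curve y = √(1 − x²) is hypothetical.

```python
import numpy as np

def support_curve_at(points, x0, h=0.05):
    """Estimate the support curve height at x0: project onto the y-axis those
    points whose x-coordinate lies in the thin strip [x0 - h, x0 + h], then
    estimate the endpoint of the projected univariate sample (here simply
    its maximum)."""
    ys = points[np.abs(points[:, 0] - x0) <= h, 1]
    if ys.size == 0:
        raise ValueError("empty strip; widen h")
    return float(ys.max())

# Hypothetical bivariate sample whose support curve is y = sqrt(1 - x^2)
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=5000)
y = rng.uniform(0.0, 1.0, size=5000) * np.sqrt(1.0 - x**2)
pts = np.column_stack([x, y])
est = support_curve_at(pts, 0.0)
print(round(est, 2))  # close to the true height 1.0 at x0 = 0
```

The bias of the maximum shrinks as the strip fills with points, which is why fast density decay near the curve (fewer points near the boundary) is the hard case the paper targets.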
D-optimality for minimum volume ellipsoid with outliers
 In Proceedings of the Seventh International Conference on Signal/Image Processing and Pattern Recognition (UkrOBRAZ’2004)
, 2004
"... A family of oneclass classification methods is extended by the determinant maximization novelty detection (DMND) model based on the Doptimum experimental design approach for the ellipsoid estimation. Similar to the oneclass classification methods based on the support vector machine or the socall ..."
Abstract

Cited by 5 (3 self)
A family of one-class classification methods is extended by the determinant maximization novelty detection (DMND) model, based on the D-optimum experimental design approach to ellipsoid estimation. Like the one-class classification methods based on the support vector machine or the so-called support vector data description (SVDD) approach, DMND fits a geometrical object around the training data. However, in contrast to SVDD, DMND finds the smallest-volume hyperellipsoid covering the target objects, which may contain outliers, by maximizing the determinant of an information matrix. Simulation results are presented for the case when the training data are contaminated by compactly located outliers.
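The determinant-maximization idea — put weights on the data points and maximize the determinant of the weighted information matrix — can be sketched with the classical multiplicative-update (Khachiyan / Titterington style) algorithm for the minimum-volume enclosing ellipsoid. This is a plain MVE sketch without DMND's outlier handling; names and tolerances are illustrative.

```python
import numpy as np

def mvee(P, tol=1e-4):
    """Minimum-volume enclosing ellipsoid {x : (x-c)^T A (x-c) <= 1} via
    multiplicative weight updates: each step shifts weight toward the point
    with the largest leverage, which increases the determinant of the
    weighted information matrix."""
    n, d = P.shape
    Q = np.column_stack([P, np.ones(n)])   # lift to homogeneous coordinates
    u = np.full(n, 1.0 / n)
    while True:
        Xmat = Q.T @ (Q * u[:, None])      # weighted information matrix
        M = np.einsum('ij,jk,ik->i', Q, np.linalg.inv(Xmat), Q)
        j = int(np.argmax(M))              # point with largest leverage
        step = (M[j] - d - 1.0) / ((d + 1.0) * (M[j] - 1.0))
        if step < tol:
            break
        u *= 1.0 - step
        u[j] += step
    c = u @ P                              # ellipsoid centre
    cov = (P - c).T @ ((P - c) * u[:, None])
    A = np.linalg.inv(cov) / d             # shape matrix of the ellipsoid
    return A, c

rng = np.random.default_rng(2)
pts = rng.normal(size=(100, 2))
A, c = mvee(pts)
r2 = np.einsum('ij,jk,ik->i', pts - c, A, pts - c)
print(bool(r2.max() <= 1.001))  # True: every point is (numerically) enclosed
```

At convergence only a few points carry non-negligible weight — exactly the contact points of the ellipsoid — which mirrors the support-vector sparsity mentioned for SVDD.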
Set estimation from reflected Brownian motion
"... We study the problem of estimating a compact set S ⊂ Rd from a trajectory of a reflected Brownian motion in S with reflections on the boundary of S. We establish consistency and rates of convergence for various estimators of S and its boundary. This problem has relevant applications in ecology in es ..."
Abstract
We study the problem of estimating a compact set S ⊂ R^d from a trajectory of a reflected Brownian motion in S with reflections on the boundary of S. We establish consistency and rates of convergence for various estimators of S and its boundary. This problem has relevant applications in ecology in estimating the home range of an animal based on tracking data. There are a variety of studies on the habitat of animals that employ the notion of home range. This paper offers theoretical foundations for a new methodology that, under fairly unrestrictive shape assumptions, allows one to find flexible regions close to reality. The theoretical findings are illustrated on simulated and real data examples.
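A minimal numerical sketch of the setting, under stated assumptions: S is taken to be the unit disc, the reflected trajectory is simulated with a crude radial fold-back rule, and S is then estimated by a Devroye-Wise-type union of balls around trajectory points. The paper's estimators and rate analysis are more refined; everything below is illustrative.

```python
import numpy as np

def reflected_bm_unit_disc(n_steps=20000, dt=1e-3, seed=4):
    """Euler scheme for Brownian motion reflected in the unit disc: any step
    that exits is folded back across the boundary radially."""
    rng = np.random.default_rng(seed)
    x = np.zeros(2)
    path = np.empty((n_steps, 2))
    for t in range(n_steps):
        x = x + rng.normal(scale=np.sqrt(dt), size=2)
        r = np.linalg.norm(x)
        if r > 1.0:
            x *= (2.0 - r) / r        # reflect across the circle r = 1
        path[t] = x
    return path

def union_of_balls_area(path, eps=0.15, thin=50, grid_n=87):
    """Devroye-Wise-type estimate of S: the union of eps-balls centred at
    (thinned) trajectory points; its area is measured on a regular grid."""
    g = np.linspace(-1.3, 1.3, grid_n)
    gx, gy = np.meshgrid(g, g)
    grid = np.column_stack([gx.ravel(), gy.ravel()])
    d2 = ((grid[:, None, :] - path[None, ::thin, :]) ** 2).sum(axis=-1)
    covered = d2.min(axis=1) <= eps ** 2
    return float(covered.sum()) * (g[1] - g[0]) ** 2

path = reflected_bm_unit_disc()
area = union_of_balls_area(path)
print(f"estimated area of S: {area:.2f}")  # true area of the disc is pi ~ 3.14
```

As the trajectory gets longer and eps shrinks at a suitable rate, the union of balls fills S — the home-range analogy is that tracking fixes of an animal gradually carve out its habitat.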
FACULTY OF ENGINEERING, SCIENCE AND MATHEMATICS
, 2005
"... Ellipsoid estimation is an issue of primary importance in many practical areas such as control, system identification, visual/audio tracking, experimental design, data mining, robust statistics and novelty/outlier detection. This paper presents a new method of kernel information matrix ellipsoid est ..."
Abstract
Ellipsoid estimation is an issue of primary importance in many practical areas such as control, system identification, visual/audio tracking, experimental design, data mining, robust statistics and novelty/outlier detection. This paper presents a new method of kernel information matrix ellipsoid estimation (KIMEE) that finds an ellipsoid in a kernel-defined feature space based on a centered information matrix. Although the method is very general and can be applied to many of the aforementioned problems, the main focus in this paper is the problem of novelty or outlier detection associated with fault detection. A simple iterative algorithm based on Titterington’s minimum volume ellipsoid method is proposed for practical implementation. The KIMEE method demonstrates very good performance on a set of real-life and simulated datasets compared with support vector machine methods.
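The feature-space-ellipsoid idea can be illustrated with an explicit quadratic feature map standing in for the kernel-defined feature space: fit the centred information matrix (mean and covariance) of the features and score test points by Mahalanobis distance. This is a sketch of the general idea only, not the KIMEE algorithm; the feature map and data are hypothetical.

```python
import numpy as np

def quad_features(X):
    """Explicit quadratic feature map, a stand-in for a kernel-defined
    feature space."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([x1, x2, x1 * x1, x2 * x2, x1 * x2])

def feature_ellipsoid_score(X_train, X_test, ridge=1e-6):
    """Fit an ellipsoid in feature space from the centred covariance
    (information) matrix of the training features and score test points by
    squared Mahalanobis distance; large scores flag novelty."""
    F = quad_features(X_train)
    mu = F.mean(axis=0)
    C = np.cov(F.T) + ridge * np.eye(F.shape[1])
    Ci = np.linalg.inv(C)
    D = quad_features(X_test) - mu
    return np.einsum('ij,jk,ik->i', D, Ci, D)

# Training data on a noisy unit circle: an ellipsoid in quadratic feature
# space captures this ring, which no input-space ellipsoid could.
rng = np.random.default_rng(5)
ring = rng.normal(size=(300, 2))
ring /= np.linalg.norm(ring, axis=1, keepdims=True)
ring += 0.05 * rng.normal(size=ring.shape)
s_in = feature_ellipsoid_score(ring, np.array([[0.0, 1.0]]))[0]   # on the ring
s_out = feature_ellipsoid_score(ring, np.array([[0.0, 0.0]]))[0]  # centre: novel
print(bool(s_out > s_in))  # True
```

The centre of the ring is novel even though it lies inside the data's convex hull — the point of working in a feature space rather than fitting an ellipsoid in the input space directly.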