Results 1 - 8 of 8
Multiscale methods for data on graphs and irregular multidimensional situations
, 2008
"... Summary. For regularly spaced 1D data, wavelet shrinkage has proven to be a compelling method for nonparametric function estimation. We create three new multiscale methods that provide wavelet-like transforms for both data arising on graphs and for irregularly spaced spatial data in more than 1D. T ..."
Abstract - Cited by 20 (0 self)
Summary. For regularly spaced 1D data, wavelet shrinkage has proven to be a compelling method for nonparametric function estimation. We create three new multiscale methods that provide wavelet-like transforms both for data arising on graphs and for irregularly spaced spatial data in more than one dimension. The concept of scale still exists within these transforms, but as a continuous quantity rather than dyadic levels. Further, we adapt recent empirical Bayesian shrinkage techniques to perform multiscale shrinkage for function estimation both on graphs and for irregular spatial data. We demonstrate that our methods perform very well when compared with several other methods for spatial regression on both real and simulated data. Although our article concentrates on multiscale shrinkage (regression), we present our new 'wavelet transforms' as generic tools intended to be the basis of methods that might benefit from a multiscale representation of data, either on graphs or for irregular spatial data.
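The transform described above treats scale as a continuous quantity. As a rough illustration of the idea only (not the authors' published algorithm), the Python sketch below lifts one graph vertex at a time: the vertex with the smallest accumulated 'integral' (its continuous scale) is removed, its value is predicted from the average of its active neighbours, the prediction error is kept as a detail coefficient, and the removed vertex's integral is redistributed to its neighbours. All function and variable names are illustrative.

```python
import numpy as np

def graph_lifting_forward(values, adjacency):
    """Simplified one-coefficient-at-a-time lifting on a graph.

    values    : 1D array of function values, one per vertex.
    adjacency : dict mapping vertex index -> set of neighbour indices.
    Returns (details, coarse): details are (vertex, detail, scale) triples,
    recorded finest scale first; coarse holds the remaining coefficients.
    """
    c = np.asarray(values, dtype=float).copy()
    nbrs = {v: set(ns) for v, ns in adjacency.items()}
    integral = {v: 1.0 for v in nbrs}           # continuous 'scale' of each vertex
    active = set(nbrs)
    details = []

    while len(active) > 2:                      # keep a couple of coarse coefficients
        # remove the vertex with the smallest integral (finest scale first)
        v = min(active, key=lambda k: integral[k])
        neighbours = [u for u in nbrs[v] if u in active]
        if not neighbours:                      # isolated vertex: store it unchanged
            details.append((v, c[v], integral[v]))
            active.remove(v)
            continue
        prediction = c[neighbours].mean()       # predict from neighbour average
        detail = c[v] - prediction              # wavelet-like detail coefficient
        details.append((v, detail, integral[v]))
        # redistribute the removed vertex's integral: neighbour scales grow continuously
        share = integral[v] / len(neighbours)
        for u in neighbours:
            integral[u] += share
            nbrs[u] |= (nbrs[v] & active) - {u} # keep the reduced graph connected
        active.remove(v)
    coarse = [(v, c[v], integral[v]) for v in active]
    return details, coarse

# toy example: a 5-vertex path graph with a jump between vertices 2 and 3
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
y = np.array([1.0, 1.1, 0.9, 5.0, 5.2])
details, coarse = graph_lifting_forward(y, adj)
print(details)   # small details on the smooth parts, a large one at the jump
```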
Multivariate nonparametric regression using lifting
, 2004
"... Summary For regularly spaced one-dimensional data wavelet shrinkage has proven to be a compelling method for nonparametric function estimation. We argue that this is not the case for irregularly spaced data in two or higher dimensions. This article develops three methods for the multiscale analysis ..."
Abstract - Cited by 10 (5 self)
Summary. For regularly spaced one-dimensional data, wavelet shrinkage has proven to be a compelling method for nonparametric function estimation. We argue that this is not the case for irregularly spaced data in two or higher dimensions. This article develops three methods for the multiscale analysis of irregularly spaced data, based on the recently developed lifting paradigm of "lifting one coefficient at a time". The concept of scale still exists within these transforms, but as a continuous quantity rather than dyadic levels. We develop empirical Bayes methods that take account of the continuous nature of the scale. We apply our new methods to the problems of estimating krill density and rail arrival delays. We demonstrate good performance in a simulation study on new two-dimensional analogues of the well-known Blocks, Bumps, Doppler and Heavisine test functions, and on a new piecewise linear function called maartenfunc.
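A minimal sketch of the "lifting one coefficient at a time" idea on an irregular one-dimensional design follows. It is prediction-only (the published algorithm also uses update steps), and a plain soft threshold stands in for the empirical Bayes shrinkage; the names, toy test function and threshold constant are assumptions for illustration.

```python
import numpy as np

def lift_forward(x, y, n_keep=2):
    """Prediction-only 'one coefficient at a time' lifting for irregular 1D data.

    At each step the interior point spanning the smallest interval is removed,
    its value is predicted by linear interpolation from its two active
    neighbours, and the prediction error replaces it as a detail coefficient.
    """
    order = np.argsort(x)
    x = np.asarray(x, float)[order]
    c = np.asarray(y, float)[order].copy()
    active = list(range(len(x)))
    steps = []                                    # (index, (left, w_l), (right, w_r))
    while len(active) > n_keep:
        # interior active point whose neighbours are closest together
        k = min(range(1, len(active) - 1),
                key=lambda k: x[active[k + 1]] - x[active[k - 1]])
        il, i, ir = active[k - 1], active[k], active[k + 1]
        t = (x[i] - x[il]) / (x[ir] - x[il])
        w_l, w_r = 1.0 - t, t                     # linear interpolation weights
        c[i] -= w_l * c[il] + w_r * c[ir]         # detail coefficient
        steps.append((i, (il, w_l), (ir, w_r)))
        active.pop(k)
    return x, c, steps

def lift_inverse(c, steps):
    """Undo the prediction steps in reverse order (exact reconstruction)."""
    c = c.copy()
    for i, (il, w_l), (ir, w_r) in reversed(steps):
        c[i] += w_l * c[il] + w_r * c[ir]
    return c

# toy example on an irregular design with a single jump
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 200))
signal = np.where(x < 0.5, 1.0, 4.0)
y = signal + rng.normal(scale=0.3, size=x.size)

xs, coeffs, steps = lift_forward(x, y)
assert np.allclose(lift_inverse(coeffs, steps), y)        # perfect reconstruction

idx = [s[0] for s in steps]                                # detail positions
thr = 0.3 * np.sqrt(2.0 * np.log(len(idx)))                # crude universal threshold
coeffs[idx] = np.sign(coeffs[idx]) * np.maximum(np.abs(coeffs[idx]) - thr, 0.0)
estimate = lift_inverse(coeffs, steps)                     # shrink, then invert
print("RMSE, noisy data:", np.sqrt(np.mean((y - signal) ** 2)))
print("RMSE, estimate  :", np.sqrt(np.mean((estimate - signal) ** 2)))
```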
Improving prediction of hydrophobic segments along a transmembrane protein sequence using adaptive multiscale lifting
, 2004
"... Motivation: Established methods for transmembrane protein segment prediction are often based upon hydrophobicity analysis. Classical wavelet multiscale methods have proved successful in the prediction task. However, they implicitly model protein chain residues as being equally spaced. Our main motiv ..."
Abstract - Cited by 2 (1 self)
Motivation: Established methods for transmembrane protein segment prediction are often based upon hydrophobicity analysis. Classical wavelet multiscale methods have proved successful in the prediction task; however, they implicitly model protein chain residues as being equally spaced. Our main motivation is to challenge this assumption by developing a new multiscale ‘lifting’ technique that utilizes irregularly spaced residues, where the spacing is derived from resolved 3D information obtained from similar aligned proteins. For different protein families we calculate asymmetrical dissimilarity matrices of order 20 that estimate the ‘distance’ between residue types. Results: We then use our new adaptive lifting technique to regress the Kyte and Doolittle hydrophobicity index upon residues (now irregularly spaced using information from the distance matrices) and use the regression to predict transmembrane segments. Overall we show that incorporating 3D resolved structure by introducing irregular distances improves prediction, both in terms of the existence of predicted segments compared with experimentally determined ones and in the proportion of correctly predicted segments.
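As a rough sketch of the irregular-spacing idea (not the authors' adaptive lifting), residues can be placed at cumulative positions derived from a residue-type dissimilarity lookup and the Kyte and Doolittle index smoothed over those positions. Here a plain Nadaraya-Watson kernel smoother stands in for the lifting regression, and the dissimilarity values, example sequence and bandwidth are illustrative assumptions.

```python
import numpy as np

# Kyte & Doolittle (1982) hydropathy index
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
      'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
      'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
      'Y': -1.3, 'V': 4.2}

def irregular_positions(sequence, dissimilarity, default=1.0):
    """Cumulative residue positions: the gap between consecutive residues is
    their type-to-type 'distance' (here a toy dissimilarity lookup)."""
    pos = [0.0]
    for a, b in zip(sequence[:-1], sequence[1:]):
        pos.append(pos[-1] + dissimilarity.get((a, b), default))
    return np.array(pos)

def kernel_smooth(x, y, bandwidth):
    """Nadaraya-Watson smoother with a Gaussian kernel (stand-in for lifting)."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w @ y) / w.sum(axis=1)

seq = "MKTLLVVAVLFLTGSQA"                 # hypothetical example sequence
hydro = np.array([KD[r] for r in seq])
toy_dissim = {('L', 'V'): 0.3, ('V', 'L'): 0.4, ('K', 'T'): 1.6}   # illustrative only
x = irregular_positions(seq, toy_dissim)
smooth = kernel_smooth(x, hydro, bandwidth=2.0)
predicted_tm = smooth > 1.6               # commonly quoted hydropathy cut-off
print(list(zip(seq, np.round(smooth, 2), predicted_tm)))
```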
Analysis of long period variable stars with nonparametric tests for trend detection
- Journal of the American Statistical Association
, 2011
"... Abstract In astronomy the study of variable stars -i.e., stars characterized by showing significant variation in their brightness over time -has made crucial contributions to our understanding of many fields, from stellar birth and evolution to the calibration of the extragalactic distance scale. I ..."
Abstract - Cited by 1 (1 self)
In astronomy the study of variable stars, i.e. stars characterized by significant variation in their brightness over time, has made crucial contributions to our understanding of many fields, from stellar birth and evolution to the calibration of the extragalactic distance scale. In this paper, we develop a method for analyzing multiple (pseudo-)periodic time series with the goal of detecting temporal trends in their periods. We allow for non-stationary noise and for clustering among the various time series. We apply this method to the long-standing astronomical problem of identifying variable stars whose regular brightness fluctuations have periods that change over time. The results of our analysis show that such changes can be substantial, raising the possibility that astronomers' estimates of galactic distances can be refined. Two significant contributions of our approach, relative to existing methods for this problem, are as follows: 1. The method is nonparametric, making minimal assumptions about both the temporal trends themselves and the covariance structure of the non-stationary noise. 2. Our proposed test has higher power than existing methods. The test is based on inference for a high-dimensional Normal mean, with control of the False Discovery Rate to account for multiplicity. We present theory and simulations to demonstrate the performance of our method relative to existing approaches. We also analyze data from the American Association of Variable Star Observers and find a monotone relationship between mean period and strength of trend similar to that identified by
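The multiplicity control mentioned above can be illustrated with the standard Benjamini-Hochberg procedure applied to per-star trend statistics. This is a generic sketch of False Discovery Rate control for a high-dimensional Normal mean, not the paper's exact test; the simulation settings and names are made up.

```python
import numpy as np
from scipy import stats

def bh_fdr(pvalues, q=0.05):
    """Benjamini-Hochberg: boolean mask of hypotheses rejected at FDR level q."""
    p = np.asarray(pvalues)
    m = p.size
    order = np.argsort(p)
    thresholds = q * np.arange(1, m + 1) / m     # step-up thresholds i*q/m
    passed = p[order] <= thresholds
    rejected = np.zeros(m, dtype=bool)
    if passed.any():
        k = np.nonzero(passed)[0].max()          # largest rank meeting the criterion
        rejected[order[:k + 1]] = True
    return rejected

# simulated example: 1000 stars, the first 50 with a genuine period trend
rng = np.random.default_rng(2)
z = rng.normal(size=1000)                        # null: standardised trend estimates
z[:50] += 4.0                                    # stars with a real trend
pvals = 2 * stats.norm.sf(np.abs(z))             # two-sided p-values
hits = bh_fdr(pvals, q=0.05)
print("flagged:", hits.sum(), "false discoveries:", hits[50:].sum())
```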
Improving Prediction of Hydrophobic Segments along a Transmembrane Protein Sequence using Adaptive Multiscale Lifting
, 2005
"... Abstract: Established methods for transmembrane protein segment prediction are often based upon hydrophobicity analysis. Classical wavelet multiscale methods have proved successful in the prediction task. However, they implicitly model protein chain residues as being equally spaced. Our main motivat ..."
Abstract - Cited by 1 (0 self)
Established methods for transmembrane protein segment prediction are often based upon hydrophobicity analysis. Classical wavelet multiscale methods have proved successful in the prediction task; however, they implicitly model protein chain residues as being equally spaced. Our main motivation is to challenge this assumption by developing a new multiscale ‘lifting’ technique that utilizes irregularly spaced residues, where the spacing is derived from resolved 3D information obtained from similar aligned proteins. For different protein families we calculate asymmetrical dissimilarity matrices of order 20 that estimate the ‘distance’ between residue types. We use our new adaptive lifting technique to regress the Kyte and Doolittle hydrophobicity index upon residues (now irregularly spaced using information from the distance matrices) and use the regression to predict transmembrane segments. We compare the results obtained with our method against those obtained using classical wavelets, and show that incorporating 3D resolved structure improves overall prediction, both in terms of the existence of predicted segments compared with experimentally determined ones and in the proportion of correctly predicted segments. The software is available from
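The two evaluation criteria mentioned (whether predicted segments exist for experimentally determined ones, and the proportion correctly predicted) can be computed from interval overlaps. The overlap rule and the minimum-overlap parameter below are assumptions for illustration, not the paper's definition.

```python
def segments_overlap(a, b, min_overlap=5):
    """True if intervals a=(s1, e1) and b=(s2, e2) share at least min_overlap residues."""
    return min(a[1], b[1]) - max(a[0], b[0]) + 1 >= min_overlap

def segment_recall(predicted, experimental, min_overlap=5):
    """Fraction of experimentally determined segments matched by some prediction."""
    matched = sum(any(segments_overlap(p, e, min_overlap) for p in predicted)
                  for e in experimental)
    return matched / len(experimental) if experimental else 0.0

# toy example with residue-index intervals (inclusive)
experimental = [(12, 33), (48, 70), (95, 117)]
predicted = [(10, 30), (50, 68)]
print(segment_recall(predicted, experimental))   # 2 of 3 segments recovered -> 0.667
```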
A ‘nondecimated’ lifting transform
, 2008
"... Classical nondecimated wavelet transforms are attractive for many applications. When the data comes from complex or irregular designs, the use of second generation wavelets in nonparametric regression has proved superior to that of classical wavelets. However, the construction of a nondecimated seco ..."
Abstract
Classical nondecimated wavelet transforms are attractive for many applications. When the data come from complex or irregular designs, the use of second generation wavelets in nonparametric regression has proved superior to that of classical wavelets. However, the construction of a nondecimated second generation wavelet transform is not obvious. In this paper we propose a new ‘nondecimated’ lifting transform, based on the lifting algorithm which removes one coefficient at a time, and explore its behaviour. Our approach also allows adaptivity to be embedded in the transform, i.e. wavelet functions can be constructed such that their smoothness adjusts to the local properties of the signal. We address the problem of nonparametric regression and propose an (averaged) estimator obtained by using our nondecimated lifting technique teamed with empirical Bayes shrinkage. Simulations show that our proposed method outperforms competing techniques that can handle irregular data. Our construction also opens avenues for generating a ‘best’ representation, which we shall explore.
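A much-simplified sketch of the averaging idea follows: run a prediction-only lifting with a random removal order, shrink the detail coefficients, invert, and average the resulting estimates over many random orders. The paper's transform, its adaptivity and the empirical Bayes shrinkage are more sophisticated; the threshold, test signal and names below are illustrative assumptions.

```python
import numpy as np

def random_order_denoise(x, y, thr, rng):
    """One trajectory: prediction-only lifting in a random removal order,
    soft-threshold each detail coefficient, then invert."""
    c = y.copy()
    active = list(range(len(x)))
    steps = []
    while len(active) > 2:
        k = rng.integers(1, len(active) - 1)            # random interior point
        il, i, ir = active[k - 1], active[k], active[k + 1]
        t = (x[i] - x[il]) / (x[ir] - x[il])
        c[i] -= (1 - t) * c[il] + t * c[ir]             # detail coefficient
        c[i] = np.sign(c[i]) * max(abs(c[i]) - thr, 0.0)  # shrink it
        steps.append((i, il, ir, t))
        active.pop(k)
    for i, il, ir, t in reversed(steps):                # invert
        c[i] += (1 - t) * c[il] + t * c[ir]
    return c

# average over many random trajectories ('nondecimated' flavour)
rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 1, 200))
signal = np.sin(2 * np.pi * x) + np.where(x > 0.6, 2.0, 0.0)
y = signal + rng.normal(scale=0.25, size=x.size)
estimates = [random_order_denoise(x, y, thr=0.6, rng=rng) for _ in range(50)]
average = np.mean(estimates, axis=0)
print("RMSE, single trajectory:", np.sqrt(np.mean((estimates[0] - signal) ** 2)))
print("RMSE, averaged         :", np.sqrt(np.mean((average - signal) ** 2)))
```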
Multiscale kernel smoothing using a lifting scheme
"... This paper discusses the idea of a lifting scheme for multiscale implementation of kernel estimation procedures used in statistical estimation. The resulting decomposition is related to the Burt-Adelson pyramid, but the design of the filters is well adapted to nonequispaced samples. The proposed dec ..."
Abstract
This paper discusses the idea of a lifting scheme for the multiscale implementation of kernel estimation procedures used in statistical estimation. The resulting decomposition is related to the Burt-Adelson pyramid, but the design of the filters is well adapted to nonequispaced samples. The proposed decomposition has an oversampling rate of 2, where the oversampling can be seen as an alternative to primal lifting steps (update steps) as a tool for stabilising and anti-aliasing. We then propose an adaptive version of this multiscale kernel estimation with truncated kernels. Truncated kernels allow sharp representations of jumps. Illustrations show that our method is numerically well conditioned, suffers less from visual effects due to false detections, and indeed allows sharp transitions when equipped with an adaptive choice among truncated kernels. All variants of the proposed method have linear computational complexity. Key words: wavelet; lifting; kernel; adaptive; smoothing; thresholding
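A very rough sketch of the multiscale kernel idea (not the paper's scheme with its lifting steps and oversampling analysis): smooth the nonequispaced samples with Nadaraya-Watson estimates at a sequence of doubling bandwidths and keep the between-scale differences as redundant detail layers, loosely in the spirit of a Burt-Adelson pyramid. The bandwidths, test signal and names are illustrative assumptions.

```python
import numpy as np

def nw_smooth(x, y, bandwidth):
    """Nadaraya-Watson estimate evaluated at the sample points themselves."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w @ y) / w.sum(axis=1)

def kernel_pyramid(x, y, h0, n_levels):
    """Redundant multiscale decomposition: details are differences between
    successive kernel smooths of the data with doubling bandwidths."""
    approx = [np.asarray(y, float)]
    for j in range(n_levels):
        approx.append(nw_smooth(x, approx[0], h0 * 2 ** j))
    details = [approx[j] - approx[j + 1] for j in range(n_levels)]
    return details, approx[-1]            # detail layers + coarsest approximation

# nonequispaced toy data
rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 1, 300))
y = np.sin(4 * np.pi * x) + rng.normal(scale=0.2, size=x.size)
details, coarse = kernel_pyramid(x, y, h0=0.01, n_levels=4)
# the decomposition is redundant and telescoping: details plus coarse recover the data
assert np.allclose(sum(details) + coarse, y)
```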