Results 1–10 of 51,186
Sparse signal detection from incoherent projections
in IEEE Int. Conf. on Acoustics, Speech and Signal Processing (ICASSP), III, 2006
Cited by 56 (14 self)
Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating ...
Hyperspectral target detection from incoherent projections: nonequiprobable targets and inhomogeneous SNR
Accepted to the International Conference on Image Processing (ICIP), 2010
Cited by 6 (3 self)
This paper studies the detection of spectral targets corrupted by a colored Gaussian background from noisy, incoherent projection measurements. Unlike many detection methods designed for incoherent projections, the proposed approach a) is computationally efficient, b) allows for spectral backgrounds ...
Fast Reconstruction from Random Incoherent Projections
Cited by 1 (1 self)
The Compressed Sensing paradigm consists of recovering signals that can be sparsely represented in a given basis from a small set of projections onto random vectors; the problem is typically solved using an adapted Basis Pursuit algorithm. We show that the recovery of the signal is equivalent to det ...
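The Basis Pursuit recovery mentioned in this abstract can be posed as a linear program: minimize the ℓ1 norm of the candidate signal subject to its projections matching the measurements. The sketch below is my own minimal illustration, not code from the paper; the dimensions, the Gaussian projection matrix, and the use of SciPy's `linprog` are all assumptions.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Hypothetical sizes: N-dimensional signal with k nonzeros, M random projections.
N, M, k = 64, 28, 3

# A k-sparse signal and a Gaussian matrix whose rows are the random vectors.
x_true = np.zeros(N)
x_true[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((M, N)) / np.sqrt(M)
y = A @ x_true

# Basis Pursuit: min ||x||_1  s.t.  A x = y.
# Standard LP reformulation: split x = u - v with u, v >= 0, so that
# ||x||_1 = sum(u + v) and the equality constraint becomes A(u - v) = y.
c = np.ones(2 * N)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:N] - res.x[N:]
```

With this many measurements relative to the sparsity, the ℓ1 solution typically coincides with the true sparse signal, which is the phenomenon the abstract's "recovery is equivalent to ..." claim is analyzing.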
Spectral Target and Anomaly Detection from Incoherent Projections
This paper describes computationally efficient approaches and associated theoretical performance guarantees for the detection of known spectral targets and spectral anomalies from few incoherent projections. The proposed approaches accommodate targets of different signal strengths contamina ...
Fast Reconstruction of Piecewise Smooth Signals from Incoherent Projections
The Compressed Sensing framework aims to recover a sparse signal from a small set of projections onto random vectors; the problem reduces to searching for a sparse approximation of this measurement vector. Conventional solutions involve linear programming or greedy algorithms and can be c ...
Near Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
2004
Cited by 1513 (20 self)
Suppose we are given a vector f in R^N. How many linear measurements do we need to make about f to be able to recover f to within precision ε in the Euclidean (ℓ2) metric? Or more exactly, suppose we are interested in a class F of such objects (discrete digital signals, images, etc.); how many linear measurements do we need to recover objects from this class to within accuracy ε? This paper shows that if the objects of interest are sparse or compressible, in the sense that the reordered entries of a signal f ∈ F decay like a power law (or if the coefficient sequence of f in a fixed basis decays like a power law), then it is possible to reconstruct f to within very high accuracy from a small number of random measurements. A typical result is as follows: we rearrange the entries of f (or its coefficients in a fixed basis) in decreasing order of magnitude |f|_(1) ≥ |f|_(2) ≥ ... ≥ |f|_(N), and define the weak-ℓp ball as the class F of those elements whose entries obey the power decay law |f|_(n) ≤ C · n^(−1/p). We take measurements ⟨f, X_k⟩, k = 1, ..., K, where the X_k are N-dimensional Gaussian ...
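The measurement model in this abstract — inner products of f against K Gaussian vectors X_k — is easy to simulate. The sketch below pairs it with a simple greedy decoder (Orthogonal Matching Pursuit) rather than the paper's own reconstruction procedure, and uses an exactly sparse f as the extreme case of power-law decay; the dimensions, sparsity, and seed are arbitrary assumptions of mine.

```python
import numpy as np

def omp(A, y, sparsity):
    """Orthogonal Matching Pursuit: greedily pick the column of A most
    correlated with the current residual, then refit by least squares
    on the selected support."""
    support = []
    residual = y.copy()
    for _ in range(sparsity):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    f_hat = np.zeros(A.shape[1])
    f_hat[support] = coef
    return f_hat

rng = np.random.default_rng(7)
N, K, s = 128, 50, 3                  # signal length, measurements, sparsity

f = np.zeros(N)                       # an exactly s-sparse signal
f[rng.choice(N, s, replace=False)] = rng.standard_normal(s)

A = rng.standard_normal((K, N))       # row k of A is the Gaussian vector X_k
y = A @ f                             # the measurements <f, X_k>, k = 1..K

f_hat = omp(A, y, s)
```

In the noiseless, exactly sparse regime, K on the order of a few multiples of s·log N Gaussian measurements typically suffices for exact recovery, which is the qualitative message of the abstract's K-versus-N accounting.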
Compressive sampling
2006
Cited by 1427 (15 self)
Conventional wisdom and common practice in acquisition and reconstruction of images from frequency data follow the basic principle of the Nyquist density sampling theory. This principle states that to reconstruct an image, the number of Fourier samples we need to acquire must match the desired resolution of the image, i.e. the number of pixels in the image. This paper surveys an emerging theory which goes by the name of “compressive sampling” or “compressed sensing,” and which says that this conventional wisdom is inaccurate. Perhaps surprisingly, it is possible to reconstruct images or signals of scientific interest accurately and sometimes even exactly from a number of samples which is far smaller than the desired resolution of the image/signal, e.g. the number of pixels in the image. It is believed that compressive sampling has far-reaching implications. For example, it suggests the possibility of new data acquisition protocols that translate analog information into digital form with fewer sensors than what was considered necessary. This new sampling theory may come to underlie procedures for sampling and compressing data simultaneously. In this short survey, we provide some of the key mathematical insights underlying this new theory, and explain some of the interactions between compressive sampling and other fields such as statistics, information theory, coding theory, and theoretical computer science.
Global analyses of sea surface temperature, sea ice, and night marine air temperature since the late Nineteenth Century
J. Geophysical Research, 2003
Cited by 517 (3 self)
... data set, HadISST1, and the nighttime marine air temperature (NMAT) data set, HadMAT1. HadISST1 replaces the global sea ice and sea surface temperature (GISST) data sets and is a unique combination of monthly globally complete fields of SST and sea ice concentration on a 1° latitude-longitude grid from 1871. The companion HadMAT1 runs monthly from 1856 on a 5° latitude-longitude grid and incorporates new corrections for the effect on NMAT of increasing deck (and hence measurement) heights. HadISST1 and HadMAT1 temperatures are reconstructed using a two-stage reduced-space optimal interpolation procedure, followed by superposition of quality-improved gridded observations onto the reconstructions to restore local detail. The sea ice fields are made more homogeneous by compensating satellite microwave-based sea ice concentrations for the impact of surface melt effects on retrievals in the Arctic and for algorithm deficiencies in the Antarctic, and by making the historical in situ concentrations consistent with the satellite data. SSTs near sea ice are estimated using statistical relationships between SST and sea ice concentration. HadISST1 compares well with other published analyses, capturing trends in global, hemispheric, and regional SST well, ...
Data Integration: A Theoretical Perspective
Symposium on Principles of Database Systems, 2002
Cited by 944 (45 self)
Data integration is the problem of combining data residing at different sources, and providing the user with a unified view of these data. The problem of designing data integration systems is important in current real-world applications, and is characterized by a number of issues that are interesting from a theoretical point of view. This document presents an overview of the material to be presented in a tutorial on data integration. The tutorial is focused on some of the theoretical issues that are relevant for data integration. Special attention will be devoted to the following aspects: modeling a data integration application, processing queries in data integration, dealing with inconsistent data sources, and reasoning on queries.
A Comparative Analysis of Methodologies for Database Schema Integration
ACM Computing Surveys, 1986
Cited by 642 (10 self)
One of the fundamental principles of the database approach is that a database allows a nonredundant, unified representation of all data managed in an organization. This is achieved only when methodologies are available to support integration across organizational and application boundaries. Methodologies for database design usually perform the design activity by separately producing several schemas, representing parts of the application, which are subsequently merged. Database schema integration is the activity of integrating the schemas of existing or proposed databases into a global, unified schema. The aim of the paper is to provide first a unifying framework for the problem of schema integration, then a comparative review of the work done thus far in this area. Such a framework, with the associated analysis of the existing approaches, provides a basis for identifying strengths and weaknesses of individual methodologies, as well as general guidelines for future improvements and extensions.