Results 1 - 10 of 2,803
Regularization and variable selection via the Elastic Net.
- J. R. Stat. Soc. Ser. B, 2005
"... Abstract We propose the elastic net, a new regularization and variable selection method. Real world data and a simulation study show that the elastic net often outperforms the lasso, while enjoying a similar sparsity of representation. In addition, the elastic net encourages a grouping effect, wher ..."
Cited by 973 (11 self)
We propose the elastic net, a new regularization and variable selection method. Real-world data and a simulation study show that the elastic net often outperforms the lasso, while enjoying a similar sparsity of representation. In addition, the elastic net encourages a grouping effect, where strongly correlated predictors tend to be in or out of the model together. The elastic net is particularly useful when the number of predictors (p) is much bigger than the number of observations (n). By contrast, the lasso is not a very satisfactory variable selection method in the p ≫ n case. An efficient algorithm called LARS-EN is proposed for computing elastic net regularization paths, much as the LARS algorithm does for the lasso.
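For readers who want to try the method: a minimal sketch of an elastic net fit on a p ≫ n problem with correlated predictors, using scikit-learn's coordinate-descent solver rather than the paper's LARS-EN algorithm; the data-generation details below are illustrative assumptions only.

```python
# Sketch: elastic net vs. lasso on a p >> n problem with a correlated group.
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso

rng = np.random.default_rng(0)
n, p = 50, 200                                   # far more predictors than observations
z = rng.normal(size=(n, 1))
X = np.hstack([np.tile(z, (1, 10)) + 0.05 * rng.normal(size=(n, 10)),  # 10 strongly correlated columns
               rng.normal(size=(n, p - 10))])                           # irrelevant noise columns
y = X[:, :10].sum(axis=1) + rng.normal(size=n)

enet = ElasticNet(alpha=0.5, l1_ratio=0.5).fit(X, y)   # mix of L1 and L2 penalties
lasso = Lasso(alpha=0.5).fit(X, y)                      # pure L1 penalty

# The elastic net tends to keep correlated predictors in or out of the model
# together, while the lasso usually selects a single representative.
print("elastic net nonzeros:", np.count_nonzero(enet.coef_))
print("lasso nonzeros:      ", np.count_nonzero(lasso.coef_))
```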
Blind Beamforming for Non Gaussian Signals
- IEE Proceedings-F, 1993
"... This paper considers an application of blind identification to beamforming. The key point is to use estimates of directional vectors rather than resorting to their hypothesized value. By using estimates of the directional vectors obtained via blind identification i.e. without knowing the arrray mani ..."
Cited by 719 (31 self)
This paper considers an application of blind identification to beamforming. The key point is to use estimates of directional vectors rather than resorting to their hypothesized values. By using estimates of the directional vectors obtained via blind identification, i.e. without knowing the array manifold, beamforming is made robust with respect to array deformations, distortion of the wave front, pointing errors, etc., so that neither array calibration nor physical modeling is necessary. Rather surprisingly, 'blind beamformers' may outperform 'informed beamformers' in a plausible range of parameters, even when the array is perfectly known to the informed beamformer. The key assumption blind identification relies on is the statistical independence of the sources, which we exploit using fourth-order cumulants. A computationally efficient technique is presented for the blind estimation of directional vectors, based on joint diagonalization of fourth-order cumulant matrices.
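The paper's JADE-style approach jointly diagonalizes several fourth-order cumulant matrices; the sketch below uses the simpler FOBI variant as a loose stand-in. It also relies on fourth-order statistics but only eigendecomposes a single norm-weighted covariance, and it assumes real-valued sources with distinct kurtoses. The toy sources and mixing matrix are assumptions for illustration.

```python
# Sketch: fourth-order-statistics blind identification (FOBI variant).
import numpy as np

rng = np.random.default_rng(1)
T = 20000
sources = np.vstack([np.sign(rng.normal(size=T)),   # sub-Gaussian (binary)
                     rng.laplace(size=T),            # super-Gaussian
                     rng.uniform(-1, 1, size=T)])    # sub-Gaussian (uniform)
A = rng.normal(size=(3, 3))                          # unknown mixing (stand-in for the array response)
x = A @ sources

# Whitening: remove second-order structure.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = (E @ np.diag(d ** -0.5) @ E.T) @ x

# FOBI: eigenvectors of the norm-weighted covariance E[||z||^2 z z^T]
# recover the remaining rotation (up to sign and permutation).
Q = (z * (z ** 2).sum(axis=0)) @ z.T / T
_, U = np.linalg.eigh(Q)
estimated_sources = U.T @ z
```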
A Signal Processing Approach To Fair Surface Design
1995
"... In this paper we describe a new tool for interactive free-form fair surface design. By generalizing classical discrete Fourier analysis to two-dimensional discrete surface signals -- functions defined on polyhedral surfaces of arbitrary topology --, we reduce the problem of surface smoothing, or fai ..."
Cited by 654 (15 self)
In this paper we describe a new tool for interactive free-form fair surface design. By generalizing classical discrete Fourier analysis to two-dimensional discrete surface signals -- functions defined on polyhedral surfaces of arbitrary topology --, we reduce the problem of surface smoothing, or fairing, to low-pass filtering. We describe a very simple surface signal low-pass filter algorithm that applies to surfaces of arbitrary topology. As opposed to other existing optimization-based fairing methods, which are computationally more expensive, this is a linear time and space complexity algorithm. With this algorithm, fairing very large surfaces, such as those obtained from volumetric medical data, becomes affordable. By combining this algorithm with surface subdivision methods we obtain a very effective fair surface design technique. We then extend the analysis, and modify the algorithm accordingly, to accommodate different types of constraints. Some constraints can be imposed without any modification of the algorithm, while others require the solution of a small associated linear system of equations. In particular, vertex location constraints, vertex normal constraints, and surface normal discontinuities across curves embedded in the surface, can be imposed with this technique.
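As an illustration of the low-pass filtering idea, here is a minimal sketch of the alternating shrink/expand (lambda|mu) smoothing pass on a triangle mesh; the mesh layout (an (n,3) vertex array plus an (m,3) face index array) and the parameter values are assumptions, and the constrained variants discussed in the abstract are not handled.

```python
# Sketch: lambda|mu surface smoothing with the uniform-umbrella Laplacian.
import numpy as np

def taubin_smooth(vertices, faces, iterations=20, lam=0.5, mu=-0.53):
    n = len(vertices)
    # Build vertex adjacency from the triangle faces.
    neighbors = [set() for _ in range(n)]
    for a, b, c in faces:
        neighbors[a].update((b, c)); neighbors[b].update((a, c)); neighbors[c].update((a, b))
    neighbors = [np.fromiter(s, dtype=int) for s in neighbors]

    v = vertices.astype(float).copy()
    for _ in range(iterations):
        for step in (lam, mu):                       # shrink step, then re-inflate step
            delta = np.zeros_like(v)
            for i, nbrs in enumerate(neighbors):
                if len(nbrs):
                    # Umbrella operator: average of neighbors minus the vertex itself.
                    delta[i] = v[nbrs].mean(axis=0) - v[i]
            v = v + step * delta
    return v
```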
V-BLAST: An architecture for realizing very high data rates over the rich-scattering wireless channel.
- In Proc. Intl. Symp. on Signals, Systems, and Electronics, 1998
"... ..."
(Show Context)
Power and centrality: A family of measures.
- American Journal of Sociology, 1987
"... JSTOR is a not-for-profit service that helps scholars, researchers, and students discover, use, and build upon a wide range of content in a trusted digital archive. We use information technology and tools to increase productivity and facilitate new forms of scholarship. For more information about J ..."
Cited by 595 (3 self)
Although network centrality is generally assumed to produce power, recent research shows that this is not the case in exchange networks. This paper proposes a generalization of the concept of centrality that accounts for both the usual positive relationship between power and centrality and Cook et al.'s recent exceptional results. I propose a family of centrality measures c(α, β) generated by two parameters, α and β. The parameter β reflects the degree to which an individual's status is a function of the statuses of those to whom he or she is connected. If β is positive, c(α, β) is a conventional centrality measure in which each unit's status is a positive function of the statuses of those with which it is in contact. In a communication network, for example, ...
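The family described above has the standard closed form c(α, β) = α(I − βA)⁻¹A·1 for an adjacency matrix A, defined when |β| is smaller than the reciprocal of A's largest eigenvalue magnitude. A small numpy sketch follows; the toy graph is an assumption for illustration.

```python
# Sketch: Bonacich-style power centrality c(alpha, beta) = alpha * (I - beta*A)^{-1} A 1.
import numpy as np

def bonacich_centrality(A, alpha=1.0, beta=0.1):
    n = A.shape[0]
    return alpha * np.linalg.solve(np.eye(n) - beta * A, A @ np.ones(n))

# A 4-node line graph: 0 - 1 - 2 - 3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

print("positive beta (status rises with well-connected neighbors):",
      bonacich_centrality(A, beta=0.3))
print("negative beta (power comes from dependent, poorly connected neighbors):",
      bonacich_centrality(A, beta=-0.3))
```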
Large steps in cloth simulation
- SIGGRAPH 98 Conference Proceedings, 1998
"... The bottle-neck in most cloth simulation systems is that time steps must be small to avoid numerical instability. This paper describes a cloth simulation system that can stably take large time steps. The simulation system couples a new technique for enforcing constraints on individual cloth particle ..."
Cited by 576 (5 self)
The bottleneck in most cloth simulation systems is that time steps must be small to avoid numerical instability. This paper describes a cloth simulation system that can stably take large time steps. The simulation system couples a new technique for enforcing constraints on individual cloth particles with an implicit integration method. The simulator models cloth as a triangular mesh, with internal cloth forces derived using a simple continuum formulation that supports modeling operations such as local anisotropic stretch or compression; a unified treatment of damping forces is included as well. The implicit integration method generates a large, unbanded sparse linear system at each time step, which is solved using a modified conjugate gradient method that simultaneously enforces particles' constraints. The constraints are always maintained exactly, independent of the number of conjugate gradient iterations, which is typically small. The resulting simulation system is significantly faster than previous accounts of cloth simulation systems in the literature. Keywords: cloth, simulation, constraints, implicit integration, physically-based modeling.
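To make the linear system concrete: a hedged sketch of one backward-Euler step of the form (M − h ∂f/∂v − h² ∂f/∂x)Δv = h(f₀ + h (∂f/∂x)v₀), solved with plain conjugate gradients. The force model is left as a caller-supplied placeholder, and the paper's constraint-filtering modified CG is not reproduced.

```python
# Sketch: one implicit (backward-Euler) time step for a particle system.
import numpy as np
from scipy.sparse.linalg import cg

def implicit_step(x0, v0, masses, h, force_fn):
    """x0, v0: flattened (3n,) positions/velocities; force_fn returns (f0, dfdx, dfdv)."""
    f0, dfdx, dfdv = force_fn(x0, v0)          # forces and their Jacobians at the step start
    M = np.diag(np.repeat(masses, 3))          # lumped (diagonal) mass matrix
    A = M - h * dfdv - h * h * dfdx            # system matrix of the implicit step
    b = h * (f0 + h * (dfdx @ v0))
    dv, _ = cg(A, b)                           # iterative solve (assumes A symmetric)
    v1 = v0 + dv
    x1 = x0 + h * v1
    return x1, v1
```

Here force_fn stands in for whatever internal cloth force model is used; any symmetric stiffness approximation keeps the conjugate gradient solve well posed.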
Consistency of spectral clustering
2004
"... Consistency is a key property of statistical algorithms, when the data is drawn from some underlying probability distribution. Surprisingly, despite decades of work, little is known about consistency of most clustering algorithms. In this paper we investigate consistency of a popular family of spe ..."
Cited by 572 (15 self)
Consistency is a key property of statistical algorithms when the data is drawn from some underlying probability distribution. Surprisingly, despite decades of work, little is known about the consistency of most clustering algorithms. In this paper we investigate the consistency of a popular family of spectral clustering algorithms, which cluster the data with the help of eigenvectors of graph Laplacian matrices. We show that one of the two major classes of spectral clustering (normalized clustering) converges under some very general conditions, while the other (unnormalized) is only consistent under strong additional assumptions which, as we demonstrate, are not always satisfied in real data. We conclude that our analysis provides strong evidence for the superiority of normalized spectral clustering in practical applications. We believe that the methods used in our analysis will provide a basis for future exploration of Laplacian-based methods in a statistical setting.
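For concreteness, a minimal sketch of the normalized variant the abstract favors: a Gaussian affinity graph, the symmetric normalized Laplacian, the eigenvectors of the k smallest eigenvalues with row normalization, then k-means. The bandwidth sigma and the use of scikit-learn's KMeans are assumptions of this sketch.

```python
# Sketch: normalized spectral clustering on an (n, d) data array.
import numpy as np
from sklearn.cluster import KMeans

def normalized_spectral_clustering(X, k, sigma=1.0):
    # Gaussian similarity graph on the data points.
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq_dists / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Symmetric normalized Laplacian L = I - D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
    L = np.eye(len(X)) - (d_inv_sqrt[:, None] * W) * d_inv_sqrt[None, :]

    # Embed with the eigenvectors of the k smallest eigenvalues,
    # normalize rows, and cluster the embedding with k-means.
    _, vecs = np.linalg.eigh(L)
    U = vecs[:, :k]
    U = U / np.linalg.norm(U, axis=1, keepdims=True)
    return KMeans(n_clusters=k, n_init=10).fit_predict(U)
```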
Lambertian Reflectance and Linear Subspaces
2000
"... We prove that the set of all reflectance functions (the mapping from surface normals to intensities) produced by Lambertian objects under distant, isotropic lighting lies close to a 9D linear subspace. This implies that, in general, the set of images of a convex Lambertian object obtained under a wi ..."
Cited by 526 (20 self)
We prove that the set of all reflectance functions (the mapping from surface normals to intensities) produced by Lambertian objects under distant, isotropic lighting lies close to a 9D linear subspace. This implies that, in general, the set of images of a convex Lambertian object obtained under a wide variety of lighting conditions can be approximated accurately by a low-dimensional linear subspace, explaining prior empirical results. We also provide a simple analytic characterization of this linear space. We obtain these results by representing lighting using spherical harmonics and describing the effects of Lambertian materials as the analog of a convolution. These results allow us to construct algorithms for object recognition based on linear methods as well as algorithms that use convex optimization to enforce non-negative lighting functions. Finally, we show a simple way to enforce non-negative lighting when the images of an object lie near a 4D linear space.
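A hedged sketch of the subspace idea: build the nine low-order spherical-harmonic basis images from per-pixel albedo and unit normals, then approximate an observed image by least-squares projection onto their span. The inputs are placeholders, and the normalization constants follow the usual real-harmonic convention, though any fixed scaling spans the same 9D space.

```python
# Sketch: nine harmonic basis images and projection of an image onto their span.
import numpy as np

def harmonic_basis_images(normals, albedo):
    """normals: (npix, 3) unit normals; albedo: (npix,). Returns an (npix, 9) basis."""
    x, y, z = normals[:, 0], normals[:, 1], normals[:, 2]
    basis = np.stack([
        0.2821 * np.ones_like(x),                        # l = 0
        0.4886 * y, 0.4886 * z, 0.4886 * x,              # l = 1
        1.0925 * x * y, 1.0925 * y * z,                  # l = 2
        0.3154 * (3 * z ** 2 - 1),
        1.0925 * x * z,
        0.5463 * (x ** 2 - y ** 2),
    ], axis=1)
    return albedo[:, None] * basis

def project_onto_subspace(image, basis):
    # Least-squares fit of the nine lighting coefficients, then re-render.
    coeffs, *_ = np.linalg.lstsq(basis, image, rcond=None)
    return basis @ coeffs
```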