Results 1 – 10 of 85
The Dantzig selector: statistical estimation when p is much larger than n
, 2005
"... In many important statistical applications, the number of variables or parameters p is much larger than the number of observations n. Suppose then that we have observations y = Ax + z, where x ∈ ℝ^p is a parameter vector of interest, A is a data matrix with possibly far fewer rows than columns, n ≪ p, and the zᵢ's are i.i.d. N(0, σ²). Is it possible to estimate x reliably based on the noisy data y? To estimate x, we introduce a new estimator, called the Dantzig selector, which is the solution to the ℓ₁-regularization problem min_{x̃ ∈ ℝ^p} ‖x̃‖ℓ₁ subject to ‖Aᵀr‖ℓ∞ ≤ (1 + t⁻¹) √(2 log p) · σ ..."
Cited by 879 (14 self)
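The Dantzig selector in the snippet is a linear program. A minimal sketch on a made-up toy instance (the data, dimensions, and the simplified threshold λ = 1.5·√(2 log p)·σ, standing in for the paper's (1 + t⁻¹) factor, are all assumptions for illustration), using scipy.optimize.linprog:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy instance; the paper's regime is n << p.
rng = np.random.default_rng(1)
n, p, sigma = 40, 80, 0.05
A = rng.standard_normal((n, p)) / np.sqrt(n)   # roughly unit-norm columns
x_true = np.zeros(p)
x_true[[3, 17, 42]] = [2.0, -1.5, 1.0]         # sparse ground truth
y = A @ x_true + sigma * rng.standard_normal(n)

lam = 1.5 * np.sqrt(2 * np.log(p)) * sigma     # residual-correlation threshold

# LP over z = [x, u]: minimize sum(u) subject to -u <= x <= u
# and |A^T (y - A x)| <= lam (the Dantzig selector constraint).
G = A.T @ A
I = np.eye(p)
Z = np.zeros((p, p))
c = np.concatenate([np.zeros(p), np.ones(p)])
A_ub = np.block([[ I, -I],    #  x - u <= 0
                 [-I, -I],    # -x - u <= 0
                 [ G,  Z],    #  A^T A x <= lam + A^T y
                 [-G,  Z]])   # -A^T A x <= lam - A^T y
b_ub = np.concatenate([np.zeros(p), np.zeros(p),
                       lam + A.T @ y, lam - A.T @ y])
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * p + [(0, None)] * p, method="highs")
x_hat = res.x[:p]
```

The recovered x_hat is the minimum-ℓ₁ vector whose residual correlations all stay below the threshold.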
Stochastic global optimization
, 2008
"... Stochastic global optimization methods are methods for solving a global optimization problem that incorporate probabilistic (stochastic) elements, either in the problem data (the objective function, the constraints, etc.), in the algorithm itself, or in both. Global optimization is a very important ..."
Cited by 289 (6 self)
"... optimization we refer to the 'Journal of Global Optimization' and two volumes of the 'Handbook of Global Optimization' [1,2]. If the objective function is given as a 'black box' computer code, the optimization problem is especially difficult. Stochastic approaches can often deal with problems of this kind ..."
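As a minimal illustration of the 'black box' setting the snippet describes, here is a pure-random-search sketch, the simplest stochastic global method; the multimodal test function and bounds are invented for the example:

```python
import numpy as np

def random_search(f, bounds, n_iter=2000, seed=0):
    """Pure random search: sample uniformly in the box and keep
    the best point seen. Treats f strictly as a black box."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    best_x, best_f = None, np.inf
    for _ in range(n_iter):
        cand = rng.uniform(lo, hi)
        fc = f(cand)
        if fc < best_f:
            best_x, best_f = cand, fc
    return best_x, best_f

# Rastrigin function: many local minima, global minimum 0 at the origin.
f = lambda v: 10 * len(v) + np.sum(v**2 - 10 * np.cos(2 * np.pi * v))
x, fx = random_search(f, [(-5.12, 5.12)] * 2)
```

With enough samples the best value drops into one of the good basins; more refined stochastic methods (multistart, simulated annealing) trade samples for structure.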
A superlinearly convergent trust region bundle method
, 1998
"... Bundle methods for the minimization of nonsmooth functions have been around for almost 20 years. Numerous variations have been proposed, but until very recently they all suffered from the drawback of only linear convergence. The aim of this paper is to show how exploiting an analogy with SQP gives rise to a superlinearly convergent bundle method. Our algorithm features a trust-region philosophy and is expected to converge superlinearly even for nonconvex problems. 1 Introduction: Our aim will be to present a superlinearly convergent algorithm to solve the problem min_{x ∈ ℝⁿ} f(x) (1) ..."
Cited by 2 (2 self)
Quadratic Minimisation Problems in Statistics
"... Abstract. We consider the problem minx(x−t)′A(x−t) subject to x′Bx+2b′x = k where A is positive definite or positive semidefinite. Commonly occurring statistical variants of this problem are discussed within the framework of a general unifying methodology. These include nontrivial considerations ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
Abstract. We consider the problem minx(x−t)′A(x−t) subject to x′Bx+2b′x = k where A is positive definite or positive semidefinite. Commonly occurring statistical variants of this problem are discussed within the framework of a general unifying methodology. These include nontrivial considerations
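A quick numerical sketch of the snippet's problem on an invented toy instance (A, B, b, t, and k are all assumptions), treating the quadric constraint with a generic SLSQP solver rather than the paper's unifying methodology:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical instance of: min (x-t)' A (x-t)  s.t.  x' B x + 2 b' x = k
rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # positive definite
B = np.eye(n)                        # quadric reduces to ||x||^2 = k
b = np.zeros(n)
t = rng.standard_normal(n)
k = 1.0

obj = lambda v: (v - t) @ A @ (v - t)
con = {"type": "eq", "fun": lambda v: v @ B @ v + 2 * b @ v - k}
x0 = np.ones(n) / np.sqrt(n)         # feasible start: ||x0||^2 = 1
res = minimize(obj, x0=x0, constraints=[con], method="SLSQP")
```

For this choice of B and b the constraint is the unit sphere, so the solver projects t's A-weighted nearest point onto it.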
A fast algorithm for sparse reconstruction based on shrinkage, subspace optimization and continuation
SIAM Journal on Scientific Computing, 2010
"... We propose a fast algorithm for solving the ℓ₁-regularized minimization problem min_{x ∈ ℝⁿ} μ‖x‖₁ + ‖Ax − b‖₂² for recovering sparse solutions to an underdetermined system of linear equations Ax = b. The algorithm is divided into two stages that are performed repeatedly. In the first stage a ..."
Cited by 54 (8 self)
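In its simplest form, the shrinkage stage the abstract refers to is iterative soft-thresholding. A hedged sketch on made-up data (this is plain ISTA, not the paper's full two-stage method with subspace optimization and continuation):

```python
import numpy as np

def ista(A, b, mu, n_iter=500):
    """Iterative shrinkage for min_x mu*||x||_1 + ||Ax - b||_2^2:
    a gradient step on the smooth term, then soft-thresholding."""
    step = 1.0 / (2 * np.linalg.norm(A, 2) ** 2)   # 1 / Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - step * (2 * A.T @ (A @ x - b))     # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - step * mu, 0.0)  # shrink
    return x

# Toy underdetermined instance with a 3-sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x0 = np.zeros(50)
x0[[5, 25, 40]] = [1.0, -2.0, 1.5]
b = A @ x0
x_hat = ista(A, b, mu=0.1)
```

The soft-threshold sets small coordinates exactly to zero, which is what produces sparse iterates.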
Solve the Brachistochrone Problem with Numerical Optimization
, 2008
"... Incorporate your univariate search algorithm inside an algorithm to solve min_x F(x). You will use this routine to solve the brachistochrone problem. If you type this into Google you will see a wealth of information on the problem and its history. There is an analytical solution to the problem. ..."
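A discretized sketch of the numerical approach the assignment asks for (endpoints, units, and the L-BFGS-B solver standing in for the assignment's own univariate-search routine are all assumptions):

```python
import numpy as np
from scipy.optimize import minimize

# Bead slides under gravity from (0, 0) to (1, 1), with y measured downward.
# Minimize total travel time over the interior heights of the curve.
g = 9.81
N = 30
x = np.linspace(0.0, 1.0, N + 1)

def travel_time(y_interior):
    y = np.concatenate([[0.0], y_interior, [1.0]])     # endpoints fixed
    v = np.sqrt(2 * g * np.maximum(y, 1e-9))           # speed from energy conservation
    ds = np.hypot(np.diff(x), np.diff(y))              # segment lengths
    return np.sum(ds / ((v[:-1] + v[1:]) / 2 + 1e-9))  # time = length / mean speed

y0 = np.linspace(0.0, 1.0, N + 1)[1:-1]                # straight ramp as a start
res = minimize(travel_time, y0, method="L-BFGS-B")
```

The optimized profile dips below the straight ramp, approximating the cycloid that the analytical solution gives.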
Minimum Area Enclosing Ellipsoidal Cylinder Problem
"... Given an arbitrary set A ∈ IRn, we know that there exists an ellipsoid E which provides an nrounding of the set A, i.e. n−1E ⊆ conv(A) ⊆ E. The minimumvolume ellipsoid that encloses the set A provides such an ellipsoid and will be denoted as MVEE(A). Finding good approximations to this ellipsoid ..."
Abstract
 Add to MetaCart
Given an arbitrary set A ∈ IRn, we know that there exists an ellipsoid E which provides an nrounding of the set A, i.e. n−1E ⊆ conv(A) ⊆ E. The minimumvolume ellipsoid that encloses the set A provides such an ellipsoid and will be denoted as MVEE(A). Finding good approximations to this ellipsoid
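MVEE(A) is commonly approximated with Khachiyan's barycentric-coordinate ascent. A compact sketch on a made-up 2-D point cloud (tolerances and data are assumptions):

```python
import numpy as np

def mvee(P, tol=1e-7, max_iter=20000):
    """Khachiyan's algorithm (sketch): returns center c and matrix M with
    (p - c)' M (p - c) <= 1 (approximately) for every column p of P."""
    d, m = P.shape
    Q = np.vstack([P, np.ones(m)])            # lift points to dimension d+1
    u = np.full(m, 1.0 / m)                   # weights on the points
    for _ in range(max_iter):
        X = (Q * u) @ Q.T                     # Q diag(u) Q^T
        w = np.einsum("ij,ji->i", Q.T, np.linalg.solve(X, Q))  # lifted scores
        j = int(np.argmax(w))
        step = (w[j] - d - 1) / ((d + 1) * (w[j] - 1))
        new_u = (1 - step) * u
        new_u[j] += step                      # shift weight to the worst point
        if np.linalg.norm(new_u - u) < tol:
            u = new_u
            break
        u = new_u
    c = P @ u                                 # center
    M = np.linalg.inv((P * u) @ P.T - np.outer(c, c)) / d
    return c, M

pts = np.random.default_rng(0).standard_normal((2, 40))   # toy point cloud
c, M = mvee(pts)
```

Each iteration moves weight toward the point that sticks out most, shrinking the candidate ellipsoid around the rest.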
Gamma-convergence and its Applications to Some Problems in the Calculus of Variations
, 1993
"... ≤ e^{Kt} for all h ∈ ℕ. The main properties of Γ-convergence are (see the book of Dal Maso [Birkhäuser]):
• if (F_h) is equicoercive and F_h →Γ F, then min_X F = lim_h (inf_X F_h);
• if F_h →Γ F, x_h is a minimizer of F_h, and x_h → x, then x is a minimizer of F;
• if F_h →Γ F and x_h is a minimizer of F_h ..."
Cited by 1 (0 self)
On the Dirichlet Problem for the Reaction-Diffusion Equations in Nonsmooth Domains
"... We study the Dirichlet problem for the parabolic equation ut = ∆um − buβ, m> 0, β> 0, b ∈ IR in a bounded, noncylindrical and nonsmooth domain Ω ⊂ IRN+1, N ≥ 2. Existence and boundary regularity results are established. We introduce a notion of parabolic modulus of leftlower (or leftupper) ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
We study the Dirichlet problem for the parabolic equation ut = ∆um − buβ, m> 0, β> 0, b ∈ IR in a bounded, noncylindrical and nonsmooth domain Ω ⊂ IRN+1, N ≥ 2. Existence and boundary regularity results are established. We introduce a notion of parabolic modulus of leftlower (or left
Projection methods for the linear split feasibility problems
"... Let C ⊂ IRn be a closed convex subset, A an n × m real matrix and b ∈ IRm. Consider the following linear split feasibility problem (LSFP) find x ∈ C such that ATx ≤ b, if such x exist. The problem has lot of applications, e.g., the problem of computed tomography or the problem of intensity modulat ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
Let C ⊂ IRn be a closed convex subset, A an n × m real matrix and b ∈ IRm. Consider the following linear split feasibility problem (LSFP) find x ∈ C such that ATx ≤ b, if such x exist. The problem has lot of applications, e.g., the problem of computed tomography or the problem of intensity
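One standard projection scheme for such problems is a CQ-type projected gradient iteration on the constraint-violation function. A hedged sketch (the box C, step size, and toy data are assumptions; this is the generic CQ pattern, not necessarily the paper's exact method):

```python
import numpy as np

def lsfp_projection(A, b, lo, hi, n_iter=5000):
    """Find x in C = [lo, hi]^n with A^T x <= b (if possible) by projected
    gradient descent on f(x) = 0.5 * ||(A^T x - b)_+||^2."""
    n = A.shape[0]
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2       # step 1/L, L = ||A||^2
    x = np.zeros(n)
    for _ in range(n_iter):
        r = np.maximum(A.T @ x - b, 0.0)          # violation of A^T x <= b
        x = np.clip(x - gamma * (A @ r), lo, hi)  # gradient step, project on C
    return x

# Toy feasible instance: pick x_star in the box, add slack so it satisfies A^T x <= b.
rng = np.random.default_rng(2)
A = rng.standard_normal((5, 8))
x_star = rng.uniform(-0.5, 0.5, 5)
b = A.T @ x_star + 0.1
x = lsfp_projection(A, b, lo=-1.0, hi=1.0)
```

Because the problem was built feasible, the violation term shrinks toward zero while every iterate stays inside the box C.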