Results 1 – 10 of 4,813
Model consistency of partly smooth regularizers
, 2014
"... This paper studies least-square regression penalized with partly smooth convex regularizers. This class of functions is very large and versatile, allowing one to promote solutions conforming to some notion of low complexity. Indeed, they force solutions of variational problems to belong to a low-dimensi ..."
Cited by 5 (4 self)
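The entry above concerns partly smooth regularizers that force solutions onto low-dimensional model sets. As an illustration only (not this paper's own algorithm), here is a minimal proximal-gradient (ISTA) sketch for the canonical partly smooth regularizer, the ℓ1 norm, whose solutions are sparse; all names and parameter values below are ours:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    # Proximal gradient (ISTA) for min_x 0.5 * ||A x - y||^2 + lam * ||x||_1.
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy problem: recover a sparse vector from a few noisy random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 42]] = [1.5, -2.0, 1.0]
y = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = ista(A, y, lam=0.1)
# Most coordinates of x_hat end up exactly zero: the solution belongs to a
# low-dimensional model set, as the abstract describes.
```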
Simple square smoothing regularization operators
 Electron. Trans. Numer. Anal
"... Abstract. Tikhonov regularization of linear discrete ill-posed problems often is applied with a finite difference regularization operator that approximates a low-order derivative. These operators generally are represented by banded rectangular matrices with fewer rows than columns. They therefore ca ..."
Cited by 2 (2 self)
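As a concrete illustration of such a rectangular regularization operator (a sketch under our own naming, not the paper's code), the first-order forward-difference operator has n-1 rows and n columns, and enters Tikhonov regularization through the normal equations:

```python
import numpy as np

def first_difference(n):
    # Banded (n-1) x n forward-difference operator: one fewer row than
    # columns, i.e. a rectangular matrix as discussed in the abstract.
    L = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    L[idx, idx] = -1.0
    L[idx, idx + 1] = 1.0
    return L

def tikhonov(A, b, L, mu):
    # Solve min_x ||A x - b||^2 + mu * ||L x||^2 via the normal equations
    # (A^T A + mu * L^T L) x = A^T b.
    return np.linalg.solve(A.T @ A + mu * L.T @ L, A.T @ b)

# Denoising example: A = I, b = noisy samples of a smooth signal.
n = 50
rng = np.random.default_rng(1)
b = np.sin(np.linspace(0.0, np.pi, n)) + 0.1 * rng.standard_normal(n)
x = tikhonov(np.eye(n), b, first_difference(n), mu=10.0)
# Because x minimizes the penalized objective, its difference seminorm
# ||L x|| can never exceed ||L b||: the reconstruction is smoother than the data.
```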
Smoothing Regularizers for Projective Basis Function Networks
, 1996
"... Smoothing regularizers for radial basis functions have been studied extensively, but no general smoothing regularizers for projective basis functions (PBFs), such as the widely used sigmoidal PBFs, have heretofore been proposed. We derive new classes of algebraically-simple mth-order smoothing reg ..."
Cited by 13 (1 self)
A Smoothing Regularizer for Feedforward and Recurrent Neural Networks
, 1996
"... We derive a smoothing regularizer for dynamic network models by requiring robustness in prediction performance to perturbations of the training data. The regularizer can be viewed as a generalization of the first-order Tikhonov stabilizer to dynamic models. For two-layer networks with recurrent conn ..."
Cited by 14 (1 self)
A Smoothing Regularizer for Recurrent Neural Networks
 Advances in Neural Information Processing Systems 8, Proceedings of the 1995 Conference
, 1995
"... We derive a smoothing regularizer for recurrent network models by requiring robustness in prediction performance to perturbations of the training data. The regularizer can be viewed as a generalization of the first-order Tikhonov stabilizer to dynamic models. The closed-form expression of the regul ..."
Cited by 1 (0 self)
Generation of anisotropic-smoothness regularization filters for EIT
 IEEE Transactions on Medical Imaging
, 2002
"... Abstract—In the inverse conductivity problem, as in any ill-posed inverse problem, regularization techniques are necessary in order to stabilize inversion. A common way to implement regularization in electrical impedance tomography is to use Tikhonov regularization. The inverse problem is formulated as a minimization of two terms: the mismatch of the measurements against the model, and the regularization functional. Most commonly, differential operators are used as regularization functionals, leading to smooth solutions. Whenever the imaged region presents discontinuities in the conductivity ..."
Cited by 9 (1 self)
Smooth regularization of bang-bang optimal control problems
 IEEE Trans. Automat. Control
"... Consider the minimal time control problem for a single-input control-affine system ẋ = X(x) + u_1 Y_1(x) in ℝ^n, where the scalar control u_1(·) satisfies the constraint |u_1(·)| ≤ 1. When applying a shooting method for solving this kind of optimal control problem, one may encounter numerical problems due to the fact that the shooting function is not smooth whenever the control is bang-bang. In this article we propose the following smoothing procedure. For ε > 0 small, we consider the minimal time problem for the control system ẋ = X(x) + u_1^ε Y_1(x) + ε Σ_{i=2}^m u_i^ε Y_i(x), with controls u_i^ε(·), i = 1, …, m ..."
Cited by 8 (2 self)
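The formula in the snippet is garbled by text extraction; our best reconstruction of the smoothed control system (the constraint on the controls is our assumption, inferred from the snippet rather than quoted from the paper) is:

```latex
\dot{x} = X(x) + u_1^{\varepsilon}\, Y_1(x)
        + \varepsilon \sum_{i=2}^{m} u_i^{\varepsilon}\, Y_i(x),
\qquad
\sum_{i=1}^{m} \left( u_i^{\varepsilon} \right)^{2} \le 1 .
```

For ε > 0 the extremal controls, and hence the shooting function, become smooth, and the original bang-bang problem is recovered in the limit ε → 0.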
VariableSmoothing Regularization Methods For Inverse Problems
"... this paper we will investigate the degree to which these variable parameters can be used to control smoothing via the examination of several numerical examples. For fixed r = r(t) we will also discuss selection of the optimal regularization parameter α = α(t). This discussion will take place in ..."
Cited by 6 (5 self)
The "weight smoothing" regularization of MLP for Jacobian stabilization
 IEEE Trans. Neural Networks
, 1999
"... Abstract—In an approximation problem with a neural network, a low output root mean square (rms) error is not always a universal criterion. In this paper, we investigate problems where the Jacobians—first derivative of an output value with respect to an input value—of the approximation model are ne ..."
"... solutions can be chosen by the learning process. We propose to introduce the smoothness of Jacobian profiles as a priori information via a regularization technique and develop a new and efficient learning algorithm, called "weight smoothing." We assess the robustness of the weight smoothing algorithm ..."
Cited by 4 (0 self)
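The abstract describes encouraging smooth Jacobian profiles through a penalty on the weights. As a hypothetical sketch of the idea (the function name and the exact first-difference form are our guess, not the paper's formulation), a roughness penalty on the input weights of a layer whose inputs are ordered could look like:

```python
import numpy as np

def weight_smoothing_penalty(W):
    # W: (hidden_units, ordered_inputs) weight matrix of the first layer.
    # Penalize squared differences between weights attached to adjacent
    # inputs, encouraging a smooth Jacobian profile along the input axis.
    return float(np.sum((W[:, 1:] - W[:, :-1]) ** 2))

# A constant weight profile is perfectly smooth; a jagged one is penalized.
smooth = np.ones((4, 10))
jagged = np.ones((4, 10))
jagged[:, ::2] = -1.0  # alternate +1 / -1 across the ordered inputs
```

In training, this term would be added to the data-misfit loss with its own regularization weight, in the usual penalized-objective fashion.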
Surface Mesh Smoothing, Regularization and Feature Detection
, 2008
"... We describe a hybrid algorithm that is designed to reconstruct a piecewise smooth surface mesh from noisy input. While denoising, our method simultaneously regularizes triangle meshes on flat regions for further mesh processing and preserves crease sharpness for faithful reconstruction. A clustering ..."
Cited by 3 (2 self)