Results 1 - 10 of 8,366
SECOND-ORDER DERIVATIVES AND REARRANGEMENTS
 Duke Mathematical Journal, Vol. 105, No. 3, 2000
Second Order Derivatives for Network Pruning: Optimal Brain Surgeon
 Advances in Neural Information Processing Systems 5, 1993
Cited by 204 (2 self)
Abstract: We investigate the use of information from all second-order derivatives of the error function to perform network pruning (i.e., removing unimportant weights from a trained network) in order to improve generalization and increase the speed of further training.
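The Optimal Brain Surgeon idea above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a *diagonal* Hessian H of the error surface (so H^{-1} is trivial), and the function name `obs_prune_one` is an illustrative choice. OBS assigns each weight q a saliency L_q = w_q^2 / (2 [H^{-1}]_qq), deletes the least-salient weight, and adjusts the rest via delta_w = -(w_q / [H^{-1}]_qq) H^{-1} e_q.

```python
# Hedged sketch of Optimal Brain Surgeon (OBS) pruning, assuming a
# DIAGONAL Hessian of the error surface so its inverse is elementwise.
# Saliency of weight q:   L_q = w_q^2 / (2 * [H^-1]_qq)
# Update after pruning q: delta_w = -(w_q / [H^-1]_qq) * H^-1 e_q,
# which in the diagonal case simply zeroes the pruned weight.

def obs_prune_one(weights, hessian_diag):
    """Prune the single least-salient weight; return (new_weights, index)."""
    inv_diag = [1.0 / h for h in hessian_diag]                # [H^-1]_qq
    saliency = [w * w / (2.0 * d) for w, d in zip(weights, inv_diag)]
    q = min(range(len(weights)), key=saliency.__getitem__)    # least salient
    new_w = list(weights)
    new_w[q] -= (weights[q] / inv_diag[q]) * inv_diag[q]      # goes to 0.0
    return new_w, q

# Saliencies here are 1.0 and 0.5, so index 1 is pruned to zero.
w, q = obs_prune_one([1.0, 0.5], [2.0, 4.0])
print(q, w)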
On Discretization of Second-order Derivatives in Smoothed Particle Hydrodynamics
Cited by 1 (0 self)
Abstract: Discretization of spatial derivatives is an important issue in meshfree methods, especially when the derivative terms contain nonlinear coefficients. In this paper, various methods used for discretization of second-order spatial derivatives are investigated in the context of Smoothed Particle Hydrodynamics. ...
On discontinuous Galerkin discretizations of second-order derivatives
Cited by 1 (0 self)
Abstract: Some properties of a Local Discontinuous Galerkin (LDG) algorithm are demonstrated for the problem of evaluating a second derivative g = f_xx for a given f. (This is a somewhat unusual problem, but it is useful for understanding the initial transient response of an algorithm for diffusion equations.)
ClassicCurvature: A Second Order Derivatives Based Approach
Abstract: Given signal data in either one or multiple dimensions, and given a model function fitted to the signal data, it is possible to calculate the curvature through the computation of all of the second-order derivatives of the model function using rigorous calculus methodologies. ...
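The curvature-from-derivatives idea in the entry above has a standard one-dimensional form, kappa = f'' / (1 + f'^2)^(3/2). The sketch below is only an illustration of that classical formula, not the paper's method: the model f(x) = x^2 and the name `curvature` are assumptions introduced here.

```python
# Minimal 1-D sketch: signed curvature of a fitted model function f from
# its first and second derivatives, kappa = f'' / (1 + f'^2)**1.5.

def curvature(fp, fpp):
    """Signed curvature from first derivative fp and second derivative fpp."""
    return fpp / (1.0 + fp * fp) ** 1.5

def f(x):   return x * x      # example fitted model
def fp(x):  return 2.0 * x    # analytic first derivative
def fpp(x): return 2.0        # analytic second derivative

# At the vertex x = 0 the slope vanishes, so kappa equals f'' there.
print(curvature(fp(0.0), fpp(0.0)))   # 2.0
```

For multi-dimensional signals the same role is played by the full matrix of second-order partials (the Hessian), as the abstract indicates.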
On the second order derivatives of convex functions on the Heisenberg group
 Ann. Scuola Norm. Sup. Pisa Cl. Sci.
Cited by 14 (0 self)
Abstract: A classical result of Aleksandrov asserts that convex functions in R^n are twice differentiable a.e., and a first step to prove it is to show that these functions have second-order distributional derivatives which are measures; see [3, pp. 239-245]. On the Heisenberg group, and more generally in Carnot groups, ...
On second order derivatives of convex functions on infinite dimensional . . .
 2004
Cited by 1 (1 self)
Abstract: We consider convex functions on infinite-dimensional spaces equipped with measures. Our main results give some estimates of the first and second derivatives of a convex function, where second derivatives are considered from two different points of view: as point functions and as measures.
NUMERICAL DIFFERENTIATION FOR THE SECOND ORDER DERIVATIVE OF FUNCTIONS WITH SEVERAL VARIABLES
Cited by 3 (0 self)
Abstract: We propose a regularized optimization problem for computing numerical differentiation for the second-order derivative for functions with two variables from the noisy values of the function at scattered points, and give a proof of the existence and uniqueness of the solution of this problem. ...
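The entry above concerns the hard case of noisy, scattered two-variable data, which needs regularization. The noise-free, one-variable special case behind it is the classical central-difference quotient, sketched here purely as background; the function name `second_derivative` and the step size are illustrative choices, not anything from the paper.

```python
# Central-difference approximation of the second derivative:
#   f''(x)  ~=  (f(x + h) - 2 f(x) + f(x - h)) / h**2
# Exact for polynomials up to degree 3; for noisy data this quotient
# amplifies the noise by 1/h**2, which is why regularization is needed.

def second_derivative(f, x, h=1e-4):
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / (h * h)

# d^2/dx^2 (3x^2 + x) = 6; the result is 6 up to floating-point roundoff.
print(second_derivative(lambda x: 3.0 * x * x + x, 1.0))
```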
Jump detection in regression surfaces using both first-order and second-order derivatives
 Journal of Computational and Graphical Statistics, 2007
Cited by 8 (8 self)
Abstract: We consider the problem of detecting jump location curves of regression surfaces. In the literature, most existing methods detect jumps in regression surfaces based on estimation of either the first-order or the second-order derivatives of regression surfaces. Methods based on the first-order derivatives ...