Results 1–4 of 4
ℓ1-regularized Neural Networks are Improperly Learnable in Polynomial Time
Abstract

Cited by 1 (0 self)
We study the improper learning of multilayer neural networks. Suppose that the neural network to be learned has k hidden layers and that the ℓ1-norm of the incoming weights of any neuron is bounded by L. We present a kernel-based method such that, with probability at least 1 − δ, it learns a predictor whose generalization error is at most ε worse than that of the neural network. The sample complexity and the time complexity of the presented method are polynomial in the input dimension and in (1/ε, log(1/δ), F(k, L)), where F(k, L) is a function depending on (k, L) and on the activation function, independent of the number of neurons. The algorithm applies to both sigmoid-like and ReLU-like activation functions. It implies that any sufficiently sparse neural network is learnable in polynomial time.
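The kernel-based approach this abstract describes can be illustrated with a small sketch. This is not the paper's exact construction; it assumes the inverse-polynomial kernel K(x, y) = 1/(2 − ⟨x, y⟩) on the unit ball, a kernel used in related work on learning ℓ1-bounded networks, plugged into an off-the-shelf SVM; all variable names are illustrative:

```python
import numpy as np
from sklearn.svm import SVC

def inverse_poly_kernel(X, Y):
    """K(x, y) = 1 / (2 - <x, y>) for inputs in the unit ball.
    Its feature expansion contains all monomials, which is what lets a
    kernel predictor compete with l1-bounded networks (illustrative;
    not necessarily the paper's exact kernel)."""
    G = X @ Y.T  # Gram matrix of inner products, each in [-1, 1]
    return 1.0 / (2.0 - G)

# Scale inputs into the unit ball so the kernel is well defined.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 4))
X /= np.maximum(np.linalg.norm(X, axis=1, keepdims=True), 1.0)
y = (X[:, 0] + X[:, 1] > 0).astype(int)

K = inverse_poly_kernel(X, X)          # train-vs-train Gram matrix
clf = SVC(kernel="precomputed").fit(K, y)
train_acc = clf.score(K, y)
```

The learned predictor is an SVM, not a neural network, which is what makes the learning "improper": the hypothesis returned lies outside the class being competed against.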
Improper Deep Kernels
Abstract
Neural networks have recently re-emerged as a powerful hypothesis class, yielding impressive classification accuracy in multiple domains. However, their training is a non-convex optimization problem which poses theoretical and practical challenges. Here we address this difficulty by turning to "improper" learning of neural nets. In other words, we learn a classifier that is not a neural net but is competitive with the best neural-net model, given a sufficient number of training examples. Our approach relies on a novel kernel construction scheme in which the kernel is the result of integration over the set of all possible instantiations of neural models. It turns out that the corresponding integral can be evaluated in closed form via a simple recursion. Thus we translate the non-convex learning problem of a neural net into an SVM with an appropriate kernel. We also provide sample-complexity results which depend on the stability of the optimal neural net.
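A minimal sketch of the "SVM with a recursively defined deep kernel" idea in this abstract. The recursion below is the degree-1 arc-cosine kernel of Cho & Saul, standing in for the paper's own closed-form integral (which differs); each level of the recursion mimics one layer of ReLU-like hidden units:

```python
import numpy as np
from sklearn.svm import SVC

def deep_kernel(X, Y, depth=2):
    """Apply `depth` levels of the degree-1 arc-cosine update,
    a stand-in for the paper's closed-form recursion."""
    Kxy = X @ Y.T
    kx = np.sum(X * X, axis=1)  # K(x, x) diagonal entries
    ky = np.sum(Y * Y, axis=1)  # K(y, y) diagonal entries
    for _ in range(depth):
        norm = np.sqrt(np.outer(kx, ky))
        cos = np.clip(Kxy / np.maximum(norm, 1e-12), -1.0, 1.0)
        theta = np.arccos(cos)
        Kxy = norm / np.pi * (np.sin(theta) + (np.pi - theta) * np.cos(theta))
        # The degree-1 update leaves the diagonal unchanged (theta = 0
        # gives K(x, x) back), so kx and ky carry over to the next level.
    return Kxy

# The non-convex net-training problem becomes a convex SVM fit.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 5))
y = (np.sin(X[:, 0]) * X[:, 1] > 0).astype(int)

K = deep_kernel(X, X)
clf = SVC(kernel="precomputed").fit(K, y)
train_acc = clf.score(K, y)
```

Because the kernel is fixed in advance, the only training is the SVM's convex quadratic program; depth enters solely through how many times the recursion is unrolled when the Gram matrix is computed.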
Dictionary Learning and Sparse Coding for Third-order Super-symmetric Tensors
, 2015
HAL is a multidisciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.
Local Convolutional Features with Unsupervised Training for Image Retrieval
, 2015