### Tensor-Variate Restricted Boltzmann Machines

Abstract

Restricted Boltzmann Machines (RBMs) are an important class of latent variable models for representing vector data. An under-explored area is multimode data, where each data point is a matrix or a tensor. Applying standard RBMs to such data would require vectorizing matrices and tensors, thus resulting in unnecessarily high dimensionality and, at the same time, destroying the inherent higher-order interaction structures. This paper introduces Tensor-variate Restricted Boltzmann Machines (TvRBMs), which generalize RBMs to capture the multiplicative interaction between data modes and the latent variables. TvRBMs are highly compact in that the number of free parameters grows only linearly with the number of modes. We demonstrate the capacity of TvRBMs on three real-world applications: handwritten digit classification, face recognition and EEG-based alcoholic diagnosis. The learnt features of the model are more discriminative than those of the rivals, resulting in better classification performance.
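The abstract's compactness claim can be illustrated with a rough parameter count. The sketch below assumes a CP-style rank-F factorization of the weight tensor (one factor matrix per data mode plus one for the hidden layer); the paper's exact parameterization may differ, and the dimensions chosen are purely illustrative:

```python
import math

def flat_rbm_weights(dims, hidden):
    """Standard RBM on the vectorized tensor: one weight per
    (visible unit, hidden unit) pair, so the count is the product
    of all mode sizes times the hidden size."""
    return math.prod(dims) * hidden  # bias terms omitted

def tvrbm_weights(dims, hidden, rank):
    """Factored tensor-variate weights: one rank-F factor matrix per
    data mode plus one for the hidden layer, so the count grows
    linearly (a sum, not a product) with the number of modes."""
    return rank * (sum(dims) + hidden)  # bias terms omitted

dims = (64, 64, 32)        # e.g. a 3-mode input: channels x time x frequency
hidden, rank = 500, 50

print(flat_rbm_weights(dims, hidden))     # 65536000
print(tvrbm_weights(dims, hidden, rank))  # 33000
```

On this hypothetical 3-mode input, the vectorized RBM needs roughly 65M weights while the factored form needs 33K, which is the kind of saving the "linear in the number of modes" claim refers to.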
