Succinct Sampling from Discrete Distributions
Abstract

Cited by 3 (1 self)
We revisit the classic problem of sampling from a discrete distribution: Given n nonnegative w-bit integers x1, ..., xn, the task is to build a data structure that allows sampling i with probability proportional to xi. The classic solution is Walker's alias method, which takes, when implemented on a Word RAM, O(n) preprocessing time, O(1) expected query time for one sample, and n(w + 2 lg n + o(1)) bits of space. Using the terminology of succinct data structures, this solution has redundancy 2n lg n + o(n) bits, i.e., it uses 2n lg n + o(n) bits in addition to the information-theoretic minimum required for storing the input. In this paper, we study whether this space usage can be improved. In the systematic case, in which the input is read-only, we present a novel data structure using r + O(w) redundant ...
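As a sketch of the classic solution the abstract refers to, here is a minimal Python implementation of Walker's alias method: O(n) table construction and O(1) queries. Function names are illustrative; this is the textbook scheme, not the paper's succinct variant, and it spends the full floating-point table space the paper seeks to reduce.

```python
import random

def build_alias_table(weights):
    """Build Walker's alias table in O(n) time.

    Each of the n slots holds a probability prob[i] and an alias[i];
    sampling draws a uniform slot, then flips a biased coin to choose
    between the slot's own index and its alias.
    """
    n = len(weights)
    total = sum(weights)
    # Scale weights so the average slot mass is exactly 1.
    scaled = [w * n / total for w in weights]
    prob = [1.0] * n
    alias = [0] * n
    small = [i for i, p in enumerate(scaled) if p < 1.0]
    large = [i for i, p in enumerate(scaled) if p >= 1.0]
    while small and large:
        s = small.pop()
        l = large.pop()
        prob[s] = scaled[s]
        alias[s] = l
        # Move the leftover mass of l back into the worklists.
        scaled[l] -= 1.0 - scaled[s]
        (small if scaled[l] < 1.0 else large).append(l)
    # Any leftovers (numerical round-off) get probability 1.
    for i in large + small:
        prob[i] = 1.0
    return prob, alias

def sample(prob, alias, rng=random):
    """Draw one index i with probability proportional to weights[i]."""
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]
```

The table stores one real number and one index per slot, which is exactly the n(w + 2 lg n + o(1))-bit footprint the abstract attributes to the alias method.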
CMAES with Optimal Covariance Update and Storage Complexity
Abstract
The covariance matrix adaptation evolution strategy (CMA-ES) is arguably one of the most powerful real-valued derivative-free optimization algorithms, finding many applications in machine learning. The CMA-ES is a Monte Carlo method, sampling from a sequence of multivariate Gaussian distributions. Given the function values at the sampled points, updating and storing the covariance matrix dominates the time and space complexity in each iteration of the algorithm. We propose a numerically stable quadratic-time covariance matrix update scheme with minimal memory requirements based on maintaining triangular Cholesky factors. This requires a modification of the cumulative step-size adaptation (CSA) mechanism in the CMA-ES, in which we replace the inverse of the square root of the covariance matrix by the inverse of the triangular Cholesky factor. Because the triangular Cholesky factor changes smoothly with the matrix square root, this modification does not change the behavior of the CMA-ES in terms of required objective function evaluations, as verified empirically. Thus, the described algorithm can and should replace the standard CMA-ES if updating and storing the covariance matrix matters.
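The quadratic-time scheme described above rests on updating a triangular Cholesky factor directly instead of refactoring the covariance matrix. A minimal sketch of the core primitive, a standard O(d^2) rank-one Cholesky update (`chol_rank_one_update` is an illustrative name; this shows only the update for beta >= 0 and omits the paper's CSA modification):

```python
import numpy as np

def chol_rank_one_update(L, beta, v):
    """Return lower-triangular L' with L' L'^T = L L^T + beta * v v^T.

    Runs in O(d^2) time and O(1) extra memory beyond the copies,
    avoiding the O(d^3) cost of refactoring the full matrix.
    Assumes L is lower triangular with positive diagonal and beta >= 0.
    """
    L = np.array(L, dtype=float)
    v = np.array(v, dtype=float)
    d = len(v)
    b = 1.0
    for j in range(d):
        Ljj = L[j, j]                                  # old diagonal entry
        new_Ljj = np.sqrt(Ljj**2 + (beta / b) * v[j]**2)
        gamma = b * Ljj**2 + beta * v[j]**2
        for k in range(j + 1, d):
            # Eliminate column j from v, then rotate the column of L.
            v[k] -= (v[j] / Ljj) * L[k, j]
            L[k, j] = (new_Ljj / Ljj) * L[k, j] + (new_Ljj * beta * v[j] / gamma) * v[k]
        L[j, j] = new_Ljj
        b += beta * v[j]**2 / Ljj**2
    return L
```

With the covariance maintained only through such factor updates, the inverse square root C^{-1/2} used by CSA is no longer available, which is why the abstract replaces it by the inverse triangular factor (applicable via a cheap triangular solve).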