Results 1–10 of 48
Parallel white noise generation on a GPU via cryptographic hash
In Proceedings of SI3D, 2008
Abstract

Cited by 30 (2 self)
A good random number generator is essential for many graphics applications. As more such applications move onto parallel processing, it is vital that a good parallel random number generator be used. Unfortunately, most random number generators today are still sequential, exposing performance bottlenecks and denying random accessibility for parallel computations. Furthermore, popular parallel random number generators are still based on sequential methods and can exhibit statistical bias. In this paper, we propose a random number generator that maps well onto a parallel processor while possessing white noise distribution. Our generator is based on cryptographic hash functions whose statistical robustness has been examined under heavy scrutiny by cryptologists. We implement our generator as a GPU pixel program, allowing us to compute random numbers in parallel just like ordinary texture fetches: given a texture coordinate per pixel, instead of returning a texel as in ordinary texture fetches, our pixel program computes a random noise value based on this given texture coordinate. We demonstrate that our approach features the best quality, speed, and random accessibility for graphics applications.
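The core idea, a stateless hash of each coordinate replacing sequential generator state, can be sketched on the CPU in a few lines. This is a minimal illustration using Python's hashlib in place of the GPU-friendly hashes the paper evaluates; the function name and byte-packing scheme here are illustrative choices, not the paper's exact construction.

```python
import hashlib
import struct

def hash_noise(x, y, seed=0):
    """Map an integer texture coordinate to a float in [0, 1) by hashing it.

    Because each output depends only on its own coordinate (and a seed),
    every pixel can be evaluated independently and in any order. SHA-256
    stands in here for the GPU-oriented hashes studied in the paper.
    """
    digest = hashlib.sha256(struct.pack("<iii", x, y, seed)).digest()
    # Take the first 8 bytes as an unsigned integer and normalize to [0, 1).
    return int.from_bytes(digest[:8], "little") / 2**64
```

Since the output depends only on (x, y, seed), any subset of pixels can be computed in parallel, which is exactly the random accessibility the abstract emphasizes.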
Efficient Maximal Poisson-Disk Sampling
2011
Abstract

Cited by 26 (10 self)
We solve the problem of generating a uniform Poisson-disk sampling that is both maximal and unbiased over bounded non-convex domains. To our knowledge this is the first provably correct algorithm with time and space dependent only on the number of points produced. Our method has two phases, both based on classical dart-throwing. The first phase uses a background grid of square cells to rapidly create an unbiased, near-maximal covering of the domain. The second phase completes the maximal covering by calculating the connected components of the remaining uncovered voids, and by using their geometry to efficiently place unbiased samples that cover them. The second phase converges quickly, overcoming a common difficulty in dart-throwing methods. The deterministic memory is O(n) and the expected running time is O(n log n), where n is the output size, the number of points in the final sample. Our serial implementation verifies that the log n dependence is minor, and nearly O(n) performance for both time and memory is achieved in practice. We also present a parallel implementation on GPUs to demonstrate the parallel-friendly nature of our method, which achieves 2.4× the performance of our serial version.
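For context, here is the classical dart-throwing baseline that both phases of the method accelerate: propose uniform random points and accept each one only if it keeps the minimum-distance invariant. This sketch is O(n) per dart (no background grid) and is not guaranteed maximal; the paper's grid and void-tracking phases address exactly those two weaknesses. The function name and domain are illustrative.

```python
import random

def dart_throwing(r, domain=1.0, attempts=5000, seed=1):
    """Naive dart throwing in a square: accept a proposed point only if it
    is at least r away from every previously accepted point."""
    rng = random.Random(seed)
    points = []
    for _ in range(attempts):
        p = (rng.uniform(0, domain), rng.uniform(0, domain))
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= r * r
               for q in points):
            points.append(p)
    return points
```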
An alternative for Wang tiles: colored edges versus colored corners
ACM Trans. Graphics, 2006
Abstract

Cited by 26 (4 self)
In this article we revisit the concept of Wang tiles and introduce corner tiles, square tiles with colored corners. In recent years, Wang tiles have become a valuable tool in computer graphics. Important applications of Wang tiles include texture synthesis, tile-based texture mapping, and generating Poisson disk distributions. Through their colored edges, Wang tiles enforce continuity with their direct neighbors. However, Wang tiles do not directly constrain their diagonal neighbors. This leads to continuity problems near tile corners, a problem commonly known as the corner problem. Corner tiles, on the other hand, do impose restrictions on their diagonal neighbors, and thus are not subject to the corner problem. In this article we show that previous applications of Wang tiles can also be done using corner tiles, but that corner tiles have distinct advantages for each of these applications. Compared to Wang tiles, corner tiles are easier to tile, textures synthesized with corner tiles contain more samples from the original texture, corner tiles reduce the required texture memory by a factor of two for tile-based texture mapping, and Poisson disk distributions generated with corner tiles have better spectral properties. Corner tiles result in cleaner, simpler, and more efficient applications.
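The structural advantage of corner tiles, that diagonal neighbors share a grid vertex, can be made concrete with a tiny tiling sketch. The helper names and the base-`colors` tile indexing below are one common convention chosen for illustration, not necessarily the article's own encoding.

```python
import random

def corner_tiling(width, height, colors=2, seed=0):
    """Tile a grid with corner tiles by coloring grid vertices.

    A corner-tile tiling is fully determined by assigning a color to every
    vertex of the grid: the tile in cell (i, j) is the one whose four corner
    colors match the surrounding vertices. Diagonal neighbors share a vertex,
    so the corner problem of Wang tiles cannot occur by construction.
    """
    rng = random.Random(seed)
    vertex = [[rng.randrange(colors) for _ in range(width + 1)]
              for _ in range(height + 1)]

    def tile_index(i, j):
        # Encode the four corner colors (NE, NW, SW, SE) as one base-`colors`
        # number, giving colors**4 distinct tiles.
        ne = vertex[j + 1][i + 1]
        nw = vertex[j + 1][i]
        sw = vertex[j][i]
        se = vertex[j][i + 1]
        return ((ne * colors + nw) * colors + sw) * colors + se

    return [[tile_index(i, j) for i in range(width)] for j in range(height)]
```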
Variational Blue Noise Sampling
2011
Abstract

Cited by 12 (1 self)
Blue noise point sampling is one of the core algorithms in computer graphics. In this paper we present a new and versatile variational framework for generating point distributions with high-quality blue noise characteristics while precisely adapting to given density functions. Different from previous approaches based on discrete settings of capacity-constrained Voronoi tessellation, we cast blue noise sample generation as a variational problem with continuous settings. Based on an accurate evaluation of the gradient of an energy function, an efficient optimization is developed which delivers significantly faster performance than the previous optimization-based methods. Our framework can easily be extended to generating blue noise point samples on manifold surfaces and for multi-class sampling. The optimization formulation also allows us to naturally deal with dynamic domains, such as deformable surfaces, and to yield blue noise samplings with temporal coherence. We present experimental results to validate the efficacy of our variational framework. Finally, we show a variety of applications of the proposed methods, including non-photorealistic image stippling, color stippling, and blue noise sampling on deformable surfaces.
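For intuition about optimization-based blue noise, here is classical Lloyd relaxation, the discrete baseline that variational approaches of this kind refine and outperform. The Monte Carlo centroid approximation below is a simplification for illustration, not the paper's continuous formulation.

```python
import random

def lloyd_relax(points, iterations=5, n_samples=2000, seed=0):
    """Lloyd relaxation in the unit square: each iteration moves every point
    to the centroid of its Voronoi cell. Cells are approximated by assigning
    a dense cloud of random samples to their nearest point, a Monte Carlo
    stand-in for exact geometric computation."""
    rng = random.Random(seed)
    samples = [(rng.random(), rng.random()) for _ in range(n_samples)]
    pts = list(points)
    for _ in range(iterations):
        sums = [[0.0, 0.0, 0] for _ in pts]
        for s in samples:
            k = min(range(len(pts)),
                    key=lambda i: (s[0] - pts[i][0]) ** 2
                                  + (s[1] - pts[i][1]) ** 2)
            sums[k][0] += s[0]; sums[k][1] += s[1]; sums[k][2] += 1
        pts = [(sx / n, sy / n) if n else p
               for (sx, sy, n), p in zip(sums, pts)]
    return pts
```

Running this on a clustered initial set spreads the points out, which is the basic blue-noise effect the variational framework achieves far more efficiently and with exact density adaptation.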
Blue Noise through Optimal Transport
2012
Abstract

Cited by 12 (2 self)
We present a fast, scalable algorithm to generate high-quality blue noise point distributions of arbitrary density functions. At its core is a novel formulation of the recently introduced concept of capacity-constrained Voronoi tessellation as an optimal transport problem. This insight leads to a continuous formulation able to enforce the capacity constraints exactly, unlike previous work. We exploit the variational nature of this formulation to design an efficient optimization technique of point distributions via constrained minimization in the space of power diagrams. Our mathematical, algorithmic, and practical contributions lead to high-quality blue noise point sets with improved spectral and spatial properties.
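In 1D the exact-capacity idea is easy to see: equal capacities force each point's cell to carry exactly 1/n of the total mass. The sketch below places point i at the inverse CDF of (i + 0.5)/n, a simple equal-mass proxy for any density on [0, 1]; the full method additionally optimizes positions within cells and works with 2D power diagrams, and the function name here is illustrative.

```python
def equal_capacity_points_1d(density, n, resolution=4096):
    """Place n points on [0, 1] so that each point's cell carries exactly
    1/n of the mass of `density`, by inverting a numerically integrated CDF
    at the capacity midpoints (i + 0.5)/n."""
    xs = [(k + 0.5) / resolution for k in range(resolution)]
    weights = [density(x) for x in xs]
    total = sum(weights)
    cdf, acc = [], 0.0
    for w in weights:
        acc += w
        cdf.append(acc / total)
    points, k = [], 0
    for i in range(n):
        target = (i + 0.5) / n
        while cdf[k] < target:
            k += 1
        points.append(xs[k])
    return points
```

For a uniform density this recovers the evenly spaced points (i + 0.5)/n exactly, while a ramp density shifts points toward the high-mass end, illustrating density adaptation under exact capacities.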
Differential Domain Analysis for Non-uniform Sampling
Abstract

Cited by 9 (1 self)
Sampling is a core component for many graphics applications including rendering, imaging, animation, and geometry processing. The efficacy of these applications often crucially depends upon the distribution quality of the underlying samples. While uniform sampling can be analyzed by using existing spatial and spectral methods, these cannot be easily extended to general non-uniform settings, such as adaptive, anisotropic, or non-Euclidean domains. We present new methods for analyzing non-uniform sample distributions. Our key insight is that standard Fourier analysis, which depends on samples' spatial locations, can be reformulated into an equivalent form that depends only on the distribution of their location differentials. We call this differential domain analysis. The main benefit of this reformulation is that it reveals the fundamental connection between the samples' spatial statistics and their spectral properties. In addition, it allows us to generalize our method with different computation kernels and differential measurements. Using this analysis, we can quantitatively measure the spatial and spectral properties of various non-uniform sample distributions, including adaptive, anisotropic, and non-Euclidean domains.
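The reformulation is easy to prototype: instead of transforming sample locations, tabulate the distribution of their pairwise differentials. Below is a minimal isotropic version that bins pairwise distances of a 2D point set, essentially an unnormalized radial distribution function; the helper name is an illustrative choice.

```python
def radial_histogram(points, n_bins=8, r_max=0.5):
    """Bin the lengths of pairwise location differentials p_i - p_j.

    The differential distribution, rather than the locations themselves, is
    the key object of differential domain analysis; its radial profile is
    the simplest isotropic summary. A blue-noise-like set shows an empty
    region at small distances (the minimum-separation gap)."""
    counts = [0] * n_bins
    for i, (x1, y1) in enumerate(points):
        for x2, y2 in points[i + 1:]:
            d = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
            if d < r_max:
                counts[int(d / r_max * n_bins)] += 1
    return counts
```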
Efficient and Flexible Sampling with Blue Noise Properties of Triangular Meshes
Abstract

Cited by 8 (0 self)
This paper deals with the problem of taking random samples over the surface of a 3D mesh, describing and evaluating efficient algorithms for generating different distributions. We first discuss the problem of generating a Monte Carlo distribution in an efficient and practical way, avoiding common pitfalls. Then, we propose Constrained Poisson-disk sampling, a new Poisson-disk sampling scheme for polygonal meshes which can easily be tweaked to generate customized sets of points, such as importance sampling or distributions with generic geometric constraints. In particular, two algorithms based on this approach are presented. An in-depth analysis of the frequency characterization and performance of the proposed algorithms is also presented and discussed.
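The Monte Carlo building block mentioned first, uniform area-weighted sampling of a triangle mesh, can be sketched as follows. This is the standard textbook construction, including the square-root barycentric trick that avoids the common pitfall of samples clustering near one vertex; it is not the paper's Constrained Poisson-disk scheme itself.

```python
import random

def sample_mesh(triangles, n, seed=0):
    """Draw n uniform surface samples from a list of 3D triangles.

    Pick each triangle with probability proportional to its area, then draw
    a uniform point inside it with sqrt-based barycentric coordinates.
    Triangles are ((ax, ay, az), (bx, by, bz), (cx, cy, cz))."""
    rng = random.Random(seed)

    def sub(u, v): return tuple(a - b for a, b in zip(u, v))
    def cross(u, v):
        return (u[1]*v[2] - u[2]*v[1],
                u[2]*v[0] - u[0]*v[2],
                u[0]*v[1] - u[1]*v[0])
    def area(t):
        n_vec = cross(sub(t[1], t[0]), sub(t[2], t[0]))
        return 0.5 * sum(c * c for c in n_vec) ** 0.5

    areas = [area(t) for t in triangles]
    samples = []
    for _ in range(n):
        a, b, c = rng.choices(triangles, weights=areas)[0]
        r1, r2 = rng.random(), rng.random()
        s = r1 ** 0.5
        u, v, w = 1 - s, s * (1 - r2), s * r2   # uniform barycentric coords
        samples.append(tuple(u*pa + v*pb + w*pc
                             for pa, pb, pc in zip(a, b, c)))
    return samples
```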
Blue noise sampling with controlled aliasing
ACM Trans. on Graphics, 2013
Abstract

Cited by 6 (1 self)
In this article we revisit the problem of blue noise sampling with a strong focus on the spectral properties of the sampling patterns. Starting from the observation that oscillations in the power spectrum of a sampling pattern can cause aliasing artifacts in the resulting images, we synthesize two new types of blue noise patterns: step blue noise with a power spectrum in the form of a step function and single-peak blue noise with a wide zero-region and no oscillations except for a single peak. We study the mathematical relationship of the radial power spectrum to a spatial statistic known as the radial distribution function to determine which power spectra can actually be realized and to construct the corresponding point sets. Finally, we show that both proposed sampling patterns effectively prevent structured aliasing at low sampling rates and perform well at high sampling rates.
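The radial power spectrum that the abstract reasons about can be estimated directly from a point set. Below is a minimal sketch: brute-force Fourier sums at integer frequency vectors on the unit torus, averaged over rings of equal |f| and normalized by the point count so that ideal white noise is near 1 everywhere; the function name is illustrative.

```python
import cmath

def radial_power_spectrum(points, max_freq=8):
    """Radially averaged power spectrum of a 2D point set on the unit torus.

    For each nonzero integer frequency vector f, compute the sampling
    spectrum S(f) = sum_j exp(-2*pi*i f.x_j), take |S(f)|^2 / N, and
    average over rings of (rounded) equal |f|."""
    n = len(points)
    rings = {}
    for fx in range(-max_freq, max_freq + 1):
        for fy in range(-max_freq, max_freq + 1):
            if fx == 0 and fy == 0:
                continue
            s = sum(cmath.exp(-2j * cmath.pi * (fx * x + fy * y))
                    for x, y in points)
            k = round((fx * fx + fy * fy) ** 0.5)
            rings.setdefault(k, []).append(abs(s) ** 2 / n)
    return {k: sum(v) / len(v) for k, v in sorted(rings.items())}
```

A regular grid shows exactly the structure the paper warns about: a wide zero region followed by sharp peaks at the grid's base frequency, which is what causes structured aliasing.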
Perception of average value in multiclass scatterplots
IEEE Transactions on Visualization and Computer Graphics
Abstract

Cited by 6 (3 self)
Fig. 1. Summary of results: viewers can efficiently make comparative mean judgements, choosing the class with the highest average position in multiclass scatterplots across a wide variety of conditions and encodings. (a) Larger differences between means lead to improved performance. (b) As the number of points per class increases, performance remains good (in fact it may improve). (c) Stronger cues (color) outperform weaker ones (shape), although participants performed well even with weak cues. (d) Combining cues redundantly does not improve performance. (e) Irrelevant cues do not degrade performance: here, class is shown by color, and the random shape does not degrade performance. (f) Adding irrelevant additional classes to the scatterplot does not degrade performance.
Abstract—The visual system can make highly efficient aggregate judgements about a set of objects, with speed roughly independent of the number of objects considered. While there is a rich literature on these mechanisms and their ramifications for visual summarization tasks, this prior work rarely considers more complex tasks requiring multiple judgements over long periods of time, and has not considered certain critical aggregation types, such as the localization of the mean value of a set of points. In this paper, we explore these questions using a common visualization task as a case study: relative mean value judgements within multiclass scatterplots. We describe how the perception literature provides a set of expected constraints on the task, and evaluate these predictions with a large-scale perceptual study with crowdsourced participants. Judgements are no harder when each set contains more points; redundant and conflicting encodings, as well as additional sets, do not strongly affect performance; and judgements are harder when using less salient encodings. These results have concrete ramifications for the design of scatterplots.
Index Terms—Psychophysics, Information Visualization, Perceptual Study
Fourier Analysis of Stochastic Sampling Strategies for Assessing Bias and Variance in Integration
Abstract

Cited by 5 (0 self)
Each pixel in a photorealistic, computer-generated picture is calculated by approximately integrating all the light arriving at the pixel from the virtual scene. A common strategy to calculate these high-dimensional integrals is to average the estimates at stochastically sampled locations. The strategy with which the sampled locations are chosen is of utmost importance in deciding the quality of the approximation, and hence of the rendered image. We derive connections between the spectral properties of stochastic sampling patterns and the first- and second-order statistics of estimates of integration using the samples. Our equations provide insight into the assessment of stochastic sampling strategies for integration. We show that the amplitude of the expected Fourier spectrum of sampling patterns is a useful indicator of the bias when used in numerical integration. We deduce that estimator variance is directly dependent on the variance of the sampling spectrum over multiple realizations of the sampling pattern. We then analyse Gaussian jittered sampling, a simple variant of jittered sampling, that allows a smooth trade-off of bias for variance in uniform (regular grid) sampling. We verify our predictions using spectral measurements, quantitative integration experiments, and qualitative comparisons of rendered images.
Keywords: sampling
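Gaussian jittered sampling itself is simple to write down. A minimal sketch, assuming the unit square with grid-cell centers perturbed by an isotropic Gaussian (parameter names are illustrative):

```python
import random

def gaussian_jittered(n_per_axis, sigma, seed=0):
    """Gaussian jittered sampling: perturb each cell center of a regular
    n x n grid over the unit square by an isotropic Gaussian of standard
    deviation sigma. sigma = 0 reproduces regular grid sampling (low
    variance, structured bias); increasing sigma trades bias for variance."""
    rng = random.Random(seed)
    step = 1.0 / n_per_axis
    return [((i + 0.5) * step + rng.gauss(0, sigma),
             (j + 0.5) * step + rng.gauss(0, sigma))
            for i in range(n_per_axis) for j in range(n_per_axis)]
```

Sweeping sigma from 0 upward interpolates between the regular grid and increasingly unstructured point sets, which is the smooth bias-for-variance trade-off the abstract describes.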