Results 1–10 of 14
Efficient Monte Carlo simulation via the generalized splitting method. Statistics and Computing, 2011
Abstract
Cited by 24 (10 self)
We describe a new Monte Carlo algorithm for the consistent and unbiased estimation of multidimensional integrals and the efficient sampling from multidimensional densities. The algorithm is inspired by the classical splitting method and can be applied to general static simulation models. We provide examples from rare-event probability estimation, counting, and sampling, demonstrating that the proposed method can outperform existing Markov chain sampling methods in terms of convergence speed and accuracy.
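As an illustration of the splitting idea in the abstract above, here is a minimal sketch of fixed-level multilevel splitting (not the authors' generalized splitting algorithm): the rare event {S(x) >= gamma} is reached through a sequence of intermediate levels, survivors are resampled, and a Metropolis move keeps replicas distributed inside the current level set. The score function, levels, and all parameter values are illustrative choices for estimating P(X1 + ... + Xd >= gamma) with i.i.d. N(0,1) components.

```python
import numpy as np

rng = np.random.default_rng(0)

def score(x):
    # importance function S(x): here simply the coordinate sum
    return x.sum()

def mh_move(x, level, sigma=0.5, steps=5):
    # random-walk Metropolis targeting N(0, I) restricted to {S(x) >= level}
    for _ in range(steps):
        prop = x + sigma * rng.standard_normal(x.shape)
        if score(prop) >= level:
            log_acc = -0.5 * (prop @ prop - x @ x)
            if np.log(rng.random()) < log_acc:
                x = prop
    return x

def splitting_estimate(d=10, gamma=9.0, levels=(3.0, 5.0, 7.0, 9.0), n=2000):
    xs = rng.standard_normal((n, d))            # crude Monte Carlo at level 0
    p = 1.0
    for lev in levels:
        hits = [x for x in xs if score(x) >= lev]
        p *= len(hits) / len(xs)                # conditional crossing rate
        if not hits:
            return 0.0
        idx = rng.integers(0, len(hits), size=n)  # resample the survivors
        xs = np.array([mh_move(hits[i].copy(), lev) for i in idx])
    return p
```

For d = 10 and gamma = 9 the exact value is 1 - Phi(9/sqrt(10)), about 2.2e-3, which the sketch should reproduce up to Monte Carlo and mixing error.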
Xiu: An efficient surrogate-based method for computing rare failure probability. Journal of Computational Physics, 2011
Abstract
Cited by 6 (0 self)
In this paper, we present an efficient numerical method for evaluating rare failure probability. The method is based on a recently developed surrogate-based method from Li and Xiu [J. Li, D. Xiu, Evaluation of failure probability via surrogate models, J. Comput. Phys. 229 (2010) 8966–8980] for failure probability computation. The method by Li and Xiu is of hybrid nature, in the sense that samples of both the surrogate model and the true physical model are used, and its efficiency gain relies on using only very few samples of the true model. Here we extend the capability of the method to rare probability computation by using the idea of importance sampling (IS). In particular, we employ the cross-entropy (CE) method, which is an effective method to determine the biasing distribution in IS. We demonstrate that, by combining with the CE method, a surrogate-based IS algorithm can be constructed that is highly efficient for rare failure probability computation: it incurs much reduced simulation effort compared to the traditional CE-IS method. In many cases, the new method is capable of capturing failure probabilities as small as 10^-12 to 10^-6 with only several hundred samples.
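The CE-driven importance sampling that the abstract builds on can be sketched for a scalar toy problem (without the surrogate-model component, which is the paper's actual contribution): the CE method adaptively raises an intermediate level and re-centres a Gaussian biasing density until the failure level gamma is reached, after which a standard IS estimate is computed. All parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def phi_ratio(x, mu):
    # likelihood ratio N(0,1) / N(mu,1) evaluated at x
    return np.exp(-0.5 * x**2 + 0.5 * (x - mu)**2)

def ce_importance_sampling(gamma=4.0, n=2000, rho=0.1, iters=20):
    mu = 0.0
    for _ in range(iters):
        x = mu + rng.standard_normal(n)
        # adaptive level: the (1 - rho) sample quantile, capped at gamma
        level = min(np.quantile(x, 1 - rho), gamma)
        elite = x[x >= level]
        w = phi_ratio(elite, mu)
        mu = np.sum(w * elite) / np.sum(w)   # CE update of the biasing mean
        if level >= gamma:
            break
    # final importance-sampling estimate of P(X >= gamma) under N(mu, 1)
    x = mu + rng.standard_normal(n)
    return float(np.mean((x >= gamma) * phi_ratio(x, mu)))
```

For gamma = 4 the exact value is 1 - Phi(4), about 3.2e-5; a few CE iterations move the biasing mean close to gamma, after which the IS estimator has low relative error.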
Generalized Cross-Entropy Methods
Abstract
Cited by 2 (0 self)
The cross-entropy and minimum cross-entropy methods are well-known Monte Carlo simulation techniques for rare-event probability estimation and optimization. In this paper we investigate how these methods can be extended to provide a general nonparametric cross-entropy framework based on φ-divergence distance measures. We show how the χ² distance in particular yields a viable alternative to the Kullback-Leibler distance. The theory is illustrated with various examples from density estimation, rare-event simulation, and continuous multi-extremal optimization.
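The two distance measures mentioned above are easy to compare numerically. Both are φ-divergences, with φ(t) = t log t giving Kullback-Leibler and φ(t) = (t - 1)² giving χ²; a small sketch for discrete distributions:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler distance D(p || q) = sum_i p_i * log(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

def chi2(p, q):
    """Pearson chi-square distance sum_i (p_i - q_i)**2 / q_i."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum((p - q) ** 2 / q))

p = [0.5, 0.3, 0.2]   # illustrative distributions
q = [1 / 3] * 3
```

Both vanish exactly when p = q and grow as p moves away from q, but they weight discrepancies differently, which is what makes the χ² distance a distinct and sometimes preferable member of the family.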
State-dependent importance sampling schemes via minimum cross-entropy, 2008
Abstract
Cited by 1 (1 self)
We present a method to obtain state- and time-dependent importance sampling estimators by repeatedly solving a minimum cross-entropy (MCE) program as the simulation progresses. This MCE-based approach lends a foundation to the natural notion to stop changing the measure when it is no longer needed. We use this method to obtain a state- and time-dependent estimator for the one-tailed probability of a light-tailed i.i.d. sum that is logarithmically efficient in general and strongly efficient when the jumps are Gaussian. We go on to construct an estimator for the two-tailed problem which is shown to be similarly efficient. We consider minor variants of the algorithm obtained via …
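For the one-tailed light-tailed sum mentioned above, the classical state-independent building block is exponential tilting; the following is a minimal sketch of that baseline (not the state- and time-dependent MCE scheme of the paper) for i.i.d. N(0,1) increments, where the tilted measure shifts each increment mean to theta = a/n and the likelihood ratio for a path is exp(-theta * S_n + n * theta**2 / 2).

```python
import numpy as np

rng = np.random.default_rng(2)

def tilted_estimate(n=10, a=10.0, m=5000):
    """IS estimate of P(X1 + ... + Xn >= a) for i.i.d. N(0,1) increments,
    sampling each increment from the exponentially tilted law N(theta, 1)."""
    theta = a / n                          # tilt so the tilted mean path hits a
    x = theta + rng.standard_normal((m, n))
    s = x.sum(axis=1)
    # likelihood ratio dP/dQ = exp(-theta * S_n + n * theta**2 / 2)
    w = np.exp(-theta * s + n * theta**2 / 2)
    return float(np.mean((s >= a) * w))
```

For n = 10 and a = 10 the exact value is 1 - Phi(10/sqrt(10)), about 7.8e-4; this tilted estimator is the logarithmically efficient benchmark that state-dependent schemes improve on.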
Evaluation of Minimal Data Size by Using Entropy, in a HMM Maintenance Manufacturing Use. Author manuscript, published in MIM'2013 Manufacturing Modelling, Management and Control, Saint Petersburg, Russian Federation, 2013
Abstract
In this paper, we wish to find a minimal data size in order to better conceptualize industrial maintenance activities. We based our study on data given by a synthetic Hidden Markov Model. This synthetic model is intended to produce realistic industrial maintenance observations (or "symbols"), with a corresponding degradation indicator. These time-series events are shown as Markov chains, also called "signatures". The production of symbols is generated by using a uniform and a normal distribution. The evaluation is made by applying Shannon entropy to the HMM parameters. The results show a minimal number of data for each distribution studied. After a discussion about the use of a new "Sliding Window" of symbols usable in a Computerized Maintenance Management System, we developed two industrial applications and compared them with the best optimized "signature" previously found.
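The entropy-evaluation step can be sketched as follows, assuming a hypothetical 2-state emission matrix over 4 symbols (the matrix values are illustrative, not the paper's): the Shannon entropy of each row of an HMM parameter matrix measures how informative that row's symbol distribution is, with a uniform row attaining the maximum of log2(K) bits.

```python
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy in bits of a discrete distribution (row sums to 1)."""
    p = np.asarray(p, float)
    return float(-np.sum(p * np.log2(p + eps)))  # eps guards against log(0)

# hypothetical emission matrix B of a 2-state HMM over 4 symbols
B = np.array([[0.70, 0.10, 0.10, 0.10],
              [0.25, 0.25, 0.25, 0.25]])
row_entropies = [shannon_entropy(row) for row in B]
```

A peaked row (state 0) yields a lower entropy than the uniform row (state 1), whose entropy is log2(4) = 2 bits; tracking such entropies as the dataset grows is one way to decide when enough data has been collected.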
Data-Driven Anomaly Detection Based on a Bias Change
Abstract
This paper proposes offline and online data-driven approaches to anomaly detection based on generalized likelihood ratio tests for a bias change. The procedure is divided into two steps. Assuming availability of a nominal dataset, a nonparametric density estimate is obtained in the first step, prior to the test. Second, the unknown bias change is estimated from test data. Based on the expectation-maximization (EM) algorithm, batch and sequential maximum likelihood estimators of the bias change are derived for the case where the density estimate is given by a Gaussian mixture. Approximate asymptotic expressions for the probabilities of error are suggested based on available results. Real-world experiments illustrate the approach.
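As a much-simplified sketch of such a bias-change test (a known-variance Gaussian GLR for a mean shift, not the Gaussian-mixture EM estimator derived in the paper): the bias MLE is the sample mean, the statistic 2 log Λ = n * mean² / sigma² follows χ²(1) under the no-change hypothesis, and it is compared with, e.g., the 3.84 threshold at the 5% level.

```python
import numpy as np

def glr_bias_change(x, sigma=1.0):
    """GLR statistic for a bias change in i.i.d. N(bias, sigma^2) data:
    H0: bias = 0 vs H1: bias = b with MLE b_hat = sample mean.
    Returns (2 * log-likelihood ratio, b_hat)."""
    x = np.asarray(x, float)
    n = x.size
    b_hat = x.mean()
    stat = n * b_hat**2 / sigma**2   # ~ chi-square(1) under H0
    return stat, b_hat
```

A nominal batch with zero mean gives a statistic near zero, while a persistent bias of 0.5 over 100 samples gives 100 * 0.25 = 25, far beyond the 3.84 threshold.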
Marginal Likelihood Estimation with the Cross-Entropy Method, 2012
Abstract
We consider an adaptive importance sampling approach to estimating the marginal likelihood, a quantity that is fundamental in Bayesian model comparison and Bayesian model averaging. This approach is motivated by the difficulty of obtaining an accurate estimate through existing algorithms that use Markov chain Monte Carlo (MCMC) draws, where the draws are typically costly to obtain and highly correlated in high-dimensional settings. In contrast, we use the cross-entropy (CE) method, a versatile adaptive Monte Carlo algorithm originally developed for rare-event simulation. The main advantage of the importance sampling approach is that random samples can be obtained from some convenient density at little additional cost. As we are generating independent draws instead of correlated MCMC draws, the increase in simulation effort is much smaller should one wish to reduce the numerical standard error of the estimator. Moreover, the importance density derived via the CE method is grounded in information theory and is therefore optimal in a well-defined sense. We demonstrate the utility of the proposed approach in two empirical applications involving women's labor market participation and U.S. macroeconomic time series. In both applications the proposed CE method compares favorably to existing estimators.
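The importance-sampling identity behind the approach, m(y) = E_g[p(y | theta) p(theta) / g(theta)], can be sketched for a toy conjugate model with a single observation y ~ N(theta, 1) and prior theta ~ N(0, 1), where the marginal likelihood m(y) = N(y; 0, 2) is known in closed form. The proposal below is a fixed convenient density, not the CE-optimized importance density of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def npdf(x, mu, var):
    # density of N(mu, var) at x
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def marginal_likelihood_is(y, m=20000):
    """IS estimate of m(y) = integral of N(y; theta, 1) * N(theta; 0, 1) dtheta,
    drawing theta from the crude proposal g = N(y/2, 1)."""
    theta = 0.5 * y + rng.standard_normal(m)
    w = npdf(y, theta, 1.0) * npdf(theta, 0.0, 1.0) / npdf(theta, 0.5 * y, 1.0)
    return float(np.mean(w))
```

For y = 1 the exact value is N(1; 0, 2), about 0.2197, and the independent IS draws make the standard error shrink at the usual 1/sqrt(m) rate with no MCMC autocorrelation penalty.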
Soft-Input Soft-Output King Decoder for Coded MIMO Wireless Communications
Abstract
© 2011 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Suboptimal Importance Sampling for Fast Simulation of Linear Block Codes over BSC
Abstract
© 2011 IEEE.
IMPORTANCE SAMPLING FOR PARAMETRIC ESTIMATION
Abstract
We consider a class of parametric estimation problems where the goal is efficient estimation of a quantity of interest for many instances that differ in some model or decision parameters. We have proposed an approach, called DataBase Monte Carlo (DBMC), that uses variance reduction techniques in a "constructive" way in this setting: information is gathered through sampling at a set of parameter values and is used to construct effective variance-reducing algorithms when estimating at other parameters. We have used DBMC along with the variance reduction techniques of stratification and control variates. In this paper we present results for the application of DBMC in conjunction with importance sampling. We use the optimal sampling measure at a nominal parameter as a sampling measure at neighboring parameters and analyze the variance of the resulting importance sampling estimator. Experimental results for this implementation are provided.
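The reuse of a nominal sampling measure at neighboring parameters can be sketched for a scalar toy problem: P(X > gamma) with X ~ N(theta, 1) is estimated for several values of theta from a single sample drawn under the IS density N(mu, 1) fitted at the nominal parameter, by reweighting with the likelihood ratio for each target theta. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def npdf(x, mu):
    # standard-variance normal density N(mu, 1) at x
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)

def estimate_many(thetas, gamma=3.0, m=20000, mu_is=3.0):
    """Estimate P(X > gamma) with X ~ N(theta, 1) for every theta in thetas,
    reusing one sample drawn from the nominal IS density N(mu_is, 1)."""
    x = mu_is + rng.standard_normal(m)
    hit = x > gamma
    # per-theta likelihood ratio: target density over sampling density
    return {t: float(np.mean(hit * npdf(x, t) / npdf(x, mu_is)))
            for t in thetas}
```

The appeal is exactly the DBMC trade-off described above: one expensive sampling effort at the nominal parameter serves many neighboring parameters, at the cost of likelihood-ratio variance that grows as the target parameter moves away from the nominal one.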