Results 1–6 of 6
Optimal control of multiscale systems using reduced-order models, submitted, 2014
Abstract

Cited by 2 (1 self)
We study optimal control of diffusions with slow and fast variables and address a question raised by practitioners: is it possible to first eliminate the fast variables before solving the optimal control problem and then use the optimal control computed from the reduced-order model to control the original, high-dimensional system? The strategy “first reduce, then optimize”—rather than “first optimize, then reduce”—is motivated by the fact that solving optimal control problems for high-dimensional multiscale systems is numerically challenging and often computationally prohibitive. We state sufficient and necessary conditions under which the “first reduce, then control” strategy can be employed and discuss when it should be avoided. We further give numerical examples that illustrate the “first reduce, then optimize” approach and discuss possible pitfalls.
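The “first reduce, then optimize” strategy can be sketched on a toy linear-quadratic problem. Everything below (the coefficients `a`, `c`, `eps`, the cost weights `q`, `r`) is hypothetical and chosen only for illustration; it is not the paper's example:

```python
import math

# Toy slow-fast system (all parameters hypothetical, for illustration only):
#   dx = (a*x + y + u) dt         (slow, controlled)
#   dy = -(y - c*x)/eps dt        (fast; relaxes quickly to y ~ c*x)
# "First reduce, then optimize": eliminate y by setting y = c*x, then solve
# the scalar LQR problem for the reduced dynamics dx = ((a + c)*x + u) dt.
a, c, eps = -1.0, 0.5, 1e-3
q, r = 1.0, 1.0                       # state and control cost weights

A_red, B = a + c, 1.0                 # reduced drift and control gain

# Scalar algebraic Riccati equation 2*A*P - (B*P)**2/r + q = 0:
# take the positive root of (B**2/r)*P**2 - 2*A*P - q = 0.
disc = (2 * A_red) ** 2 + 4 * (B ** 2 / r) * q
P = (2 * A_red + math.sqrt(disc)) / (2 * B ** 2 / r)
K = B * P / r                         # optimal feedback u = -K*x

# Apply the reduced-model feedback to the full slow-fast system (Euler steps).
x, y, dt = 1.0, 0.0, 1e-5
for _ in range(500_000):
    u = -K * x
    x += (a * x + y + u) * dt
    y += -(y - c * x) / eps * dt

print(abs(x) < 1e-2)                  # slow variable is driven near the origin
```

Whether the feedback computed from the reduced model stabilizes the full system is exactly the kind of question the paper's sufficient and necessary conditions address; in this benign linear example it does.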
Linear Theory for Filtering Nonlinear Multiscale Systems with Model Error
, 2014
Abstract

Cited by 2 (2 self)
In this paper, we study filtering of multiscale dynamical systems with model error arising from limitations in resolving the smaller-scale processes. In particular, the analysis assumes the availability of continuous-time noisy observations of all components of the slow variables. Mathematically, this paper presents new results on higher-order asymptotic expansion of the first two moments of a conditional measure. Specifically, we are interested in the application of filtering multiscale problems in which the conditional distribution is defined over the slow variables, given noisy observations of the slow variables alone. From the mathematical analysis, we learn that for a continuous-time linear model with Gaussian noise, there exists a unique choice of parameters in a linear reduced model for the slow variables which gives the optimal filtering when only the slow variables are observed. Moreover, these parameters simultaneously give the optimal equilibrium statistical estimates of the underlying system, and as a consequence they can be estimated offline from the equilibrium statistics of the true signal. By examining a nonlinear test model, we show that the linear theory extends to this non-Gaussian, nonlinear configuration as long as we know the optimal stochastic parameterization and the correct observation model. However, when the stochastic parameterization model is inappropriate, parameters chosen for good filter performance may give poor equilibrium statistical estimates and vice versa; this finding is based on analytical and numerical results.
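As a rough illustration of the linear-Gaussian setting, the sketch below runs a scalar Kalman filter built from a reduced model for the slow variable against data generated by a full slow-fast system. All coefficients (`eps`, `a_red`, `s_red`, the observation noise) are hypothetical, and the reduced parameters are simply plugged in rather than matched to the equilibrium statistics as the paper prescribes:

```python
import math
import random

random.seed(1)

# Full slow-fast model (hypothetical coefficients, illustration only):
#   dx = (-x + y) dt + 0.5 dW1            (slow, observed)
#   dy = -(y/eps) dt + sqrt(2/eps) dW2    (fast, unresolved by the filter)
# The filter only knows a reduced scalar model dx = a_red*x dt + s_red dW.
eps, dt, n = 1e-2, 1e-3, 20_000
a_red, s_red = -1.0, 0.5
R = 0.5 ** 2                          # discrete observation noise variance

x, y = 1.0, 0.0                       # truth
m, P = 0.0, 1.0                       # filter mean and variance
err_f = err_o = 0.0
for _ in range(n):
    # Truth: Euler-Maruyama step of the full system.
    x += (-x + y) * dt + 0.5 * math.sqrt(dt) * random.gauss(0, 1)
    y += -(y / eps) * dt + math.sqrt(2 / eps) * math.sqrt(dt) * random.gauss(0, 1)
    z = x + math.sqrt(R) * random.gauss(0, 1)   # noisy obs of the slow variable
    # Kalman predict/update using only the reduced model.
    F = 1 + a_red * dt
    m, P = F * m, F * F * P + s_red ** 2 * dt
    K = P / (P + R)
    m, P = m + K * (z - m), (1 - K) * P
    err_f += (m - x) ** 2
    err_o += (z - x) ** 2

print(err_f < err_o)   # reduced-model filter beats the raw observations
```

The paper's point is sharper than this sketch shows: in the linear-Gaussian case there is a unique choice of `a_red`, `s_red` that is simultaneously optimal for filtering and for equilibrium statistics, so it can be fit offline.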
Filtering the maximum likelihood for multiscale problems
Abstract
Filtering and parameter estimation under partial information for multiscale problems is studied in this paper. After proving mean-square convergence of the nonlinear filter to a filter of reduced dimension, we establish that the conditional (on the observations) log-likelihood process has a correction term given by a type of central limit theorem. To achieve this, we assume that the operator of the (hidden) fast process has a discrete spectrum and an orthonormal basis of eigenfunctions. Based on these results, we then propose to estimate the unknown parameters of the model based on the limiting log-likelihood, which is an easier function to optimize because it is of reduced dimension. We also establish consistency and asymptotic normality of the maximum likelihood estimator based on the reduced log-likelihood. Simulation results illustrate our theoretical findings.
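A minimal sketch of the idea of estimating a drift parameter from a reduced log-likelihood, under simplifying assumptions not taken from the paper (data generated directly from a reduced Ornstein-Uhlenbeck model rather than a multiscale system; Euler discretization of the Girsanov log-likelihood; `theta_true`, `s`, `dt` hypothetical):

```python
import math
import random

random.seed(2)

# Reduced model (hypothetical): dx = -theta * x dt + s dW.
# For continuously observed paths, maximizing the Girsanov log-likelihood
#   l(theta) = -(1/s**2) * sum(theta * x * dx) - (1/(2*s**2)) * sum(theta**2 * x**2 * dt)
# in theta gives the closed-form estimator
#   theta_hat = -sum(x_k * (x_{k+1} - x_k)) / (dt * sum(x_k**2)).
theta_true, s, dt, n = 2.0, 1.0, 1e-3, 500_000

x, num, den = 0.5, 0.0, 0.0
for _ in range(n):
    x_new = x - theta_true * x * dt + s * math.sqrt(dt) * random.gauss(0, 1)
    num += x * (x_new - x)          # sum of x dx
    den += x * x * dt               # sum of x^2 dt
    x = x_new

theta_hat = -num / den
print(abs(theta_hat - theta_true) < 0.4)   # estimator recovers the drift
```

The paper's setting is harder: the data come from the full multiscale system and the log-likelihood itself must first be shown to converge (with a CLT-type correction) before consistency and asymptotic normality of the reduced-likelihood MLE can be established.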
Notes on: Data Assimilation with Model Error from Unresolved Scales
, 2014