Results 11–20 of 161
OBSTACLES TO HIGH-DIMENSIONAL PARTICLE FILTERING
Cited by 93 (5 self)
Abstract:
Particle filters are ensemble-based assimilation schemes that, unlike the ensemble Kalman filter, employ a fully nonlinear and non-Gaussian analysis step to compute the probability distribution function (pdf) of a system’s state conditioned on a set of observations. Evidence is provided that the ensemble size required for a successful particle filter scales exponentially with the problem size. For the simple example in which each component of the state vector is independent, Gaussian and of unit variance, and the observations are of each state component separately with independent, Gaussian errors, simulations indicate that the required ensemble size scales exponentially with the state dimension. In this example, the particle filter requires at least 10^11 members when applied to a 200-dimensional state. Asymptotic results, following the work of Bengtsson, Bickel and collaborators, are provided for two cases: one in which each prior state component is independent and identically distributed, and one in which both the prior pdf and the observation errors are Gaussian. The asymptotic theory reveals that, in both cases, the required ensemble size scales exponentially with the variance of the observation log-likelihood, rather than with the state dimension per se.
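The degeneracy described in this abstract is easy to reproduce: draw prior particles from N(0, I_d), weight them against componentwise Gaussian observations, and watch the largest normalized weight approach one as the dimension grows. This is only an illustrative sketch (the ensemble size, dimensions and seed below are arbitrary choices, not the paper's experiment):

```python
import numpy as np

rng = np.random.default_rng(0)

def max_weight(dim, n_particles=1000, n_trials=20):
    """Average largest normalized particle weight for a dim-dimensional
    N(0, I) prior with unit-variance Gaussian observations of each component."""
    out = []
    for _ in range(n_trials):
        truth = rng.standard_normal(dim)
        y = truth + rng.standard_normal(dim)                 # noisy observations
        particles = rng.standard_normal((n_particles, dim))  # prior ensemble
        logw = -0.5 * np.sum((y - particles) ** 2, axis=1)   # log-likelihoods
        logw -= logw.max()                                   # stabilize the exp
        w = np.exp(logw)
        w /= w.sum()
        out.append(w.max())
    return float(np.mean(out))

for d in (1, 5, 20, 50, 100):
    print(d, max_weight(d))
```

The largest weight climbs toward one as the dimension grows: the filter collapses onto a single particle unless the ensemble size grows exponentially, consistent with the scaling argument above.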
A Three-Dimensional Variational Data Assimilation System for MM5: Implementation and Initial Results. Mon. Wea. Rev., 2004
Cited by 71 (7 self)
Abstract:
A limited-area three-dimensional variational data assimilation (3DVAR) system applicable to both synoptic and mesoscale numerical weather prediction is described. The system is designed for use in time-critical real-time applications and is freely available to the data assimilation community for general research. The unique features of this implementation of 3DVAR include (a) an analysis space represented by recursive filters and truncated eigenmodes of the background error covariance matrix, (b) the inclusion of a cyclostrophic term in 3DVAR’s explicit mass–wind balance equation, and (c) the use of the software architecture of the Weather Research and Forecast (WRF) model to permit efficient performance on distributed-memory platforms. The 3DVAR system is applied to a multiresolution, nested-domain forecast system. Resolution- and seasonal-dependent background error statistics are presented. A typhoon bogusing case study is performed to illustrate the 3DVAR response to a single surface pressure observation and its subsequent impact on numerical forecasts of the fifth-generation Pennsylvania State University–National Center for Atmospheric Research Mesoscale Model (MM5). Results are also presented from an initial real-time MM5-based application of 3DVAR.
Which is better, an ensemble of positive–negative pairs or a centered spherical simplex ensemble? Mon. Wea. Rev., 2004
Cited by 62 (1 self)
Abstract:
New methods to center the initial ensemble perturbations on the analysis are introduced and compared with the commonly used centering method of positive–negative paired perturbations. In the new method, one linearly dependent perturbation is added to a set of linearly independent initial perturbations to ensure that the sum of the new initial perturbations equals zero; the covariance calculated from the new initial perturbations is equal to the analysis error covariance estimated by the independent initial perturbations, and all of the new initial perturbations are equally likely. The new method is illustrated by applying it to the ensemble transform Kalman filter (ETKF) ensemble forecast scheme, and the resulting ensemble is called the spherical simplex ETKF ensemble. It is shown from a multidimensional Taylor expansion that the symmetric positive–negative paired centering would yield a more accurate forecast ensemble mean and covariance than the spherical simplex centering if the ensemble were large enough to span all initial uncertain directions and thus the analysis error covariance was modeled precisely. However, when the number of uncertain directions is larger than the ensemble size, the spherical simplex centering has the advantage of allowing almost twice as many uncertain directions to be spanned as the symmetric positive–negative paired centering. The performances of the spherical simplex ETKF and symmetric positive–negative paired ETKF ensembles are compared by using the Community Climate Model
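The centering idea can be sketched numerically: any (k+1)×k matrix W with orthonormal columns orthogonal to the ones vector maps k independent perturbations to k+1 zero-sum perturbations with the same sample covariance, and the rows of such a W automatically have equal squared norm k/(k+1). This is a minimal illustration of the zero-sum and covariance-preserving properties only, not the paper's ETKF implementation; building W by QR of a random matrix is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(1)

def spherical_simplex(A):
    """Map k independent perturbations (rows of A, shape k x d) to k+1
    perturbations that sum to zero and keep the same sample covariance."""
    k = A.shape[0]
    # Orthonormal basis of the subspace of R^{k+1} orthogonal to the ones
    # vector; every row of W then has squared norm k / (k + 1), so the new
    # perturbations are equal in magnitude in the coefficient space.
    M = np.column_stack([np.ones(k + 1), rng.standard_normal((k + 1, k))])
    Q, _ = np.linalg.qr(M)
    W = Q[:, 1:]
    return W @ A

k, d = 4, 3
A = rng.standard_normal((k, d))
P = spherical_simplex(A)
print(np.abs(P.sum(axis=0)).max())         # perturbations sum to ~0
print(np.abs(P.T @ P - A.T @ A).max())     # covariance is preserved
```

Because W has orthonormal columns, P^T P = A^T W^T W A = A^T A, while 1^T W = 0 forces the new perturbations to sum to zero.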
Exploiting local low dimensionality of the atmospheric dynamics ... Phys. Rev. Lett., 2002
Cited by 51 (17 self)
Abstract:
Recent studies (Patil et al. 2001, 2002) have shown that, when the Earth’s surface is divided up into local regions of moderate size, vectors of the forecast uncertainties in such regions tend to lie in a subspace of much lower dimension than that of the full atmospheric state vector. In this paper we show how this finding can be exploited to formulate a potentially accurate and efficient data assimilation technique. The basic idea is that, since the expected forecast errors lie in a locally low-dimensional subspace, the analysis resulting from the data assimilation should also lie in this subspace. This implies that operations only on relatively low-dimensional matrices are required. The data assimilation analysis is done locally in a manner allowing massively parallel computation to be exploited. The local analyses are then used to construct global states for advancement to the next forecast time. Potential advantages of the method are discussed.
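The premise that local errors occupy a low-dimensional subspace can be checked on synthetic data: fields with a modest spatial correlation length need many singular modes to explain 99% of the variance globally, but only a few in a small local window. The grid size, window and correlation length below are arbitrary illustrative choices, not the atmospheric setting of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def smooth_samples(n_samples, n_grid, length):
    """Draw random fields with a Gaussian spatial correlation of scale `length`."""
    x = np.arange(n_grid)
    C = np.exp(-0.5 * ((x[:, None] - x[None, :]) / length) ** 2)
    L = np.linalg.cholesky(C + 1e-8 * np.eye(n_grid))  # jitter for stability
    return rng.standard_normal((n_samples, n_grid)) @ L.T

def effective_dim(samples, frac=0.99):
    """Number of leading singular modes needed to capture `frac` of the variance."""
    s = np.linalg.svd(samples - samples.mean(axis=0), compute_uv=False)
    var = s ** 2 / np.sum(s ** 2)
    return int(np.searchsorted(np.cumsum(var), frac) + 1)

X = smooth_samples(500, 200, length=10.0)
print("global modes:", effective_dim(X))             # many modes needed
print("local modes :", effective_dim(X[:, 80:100]))  # 20-point window: few
```

A local analysis can therefore work with matrices whose size is set by the few local modes rather than by the full state dimension.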
4DVar or Ensemble Kalman Filter? Tellus, 2007
Cited by 42 (5 self)
Abstract:
We consider the relative advantages of two advanced data assimilation systems, 4DVar and ensemble Kalman filter (EnKF), currently in use or under consideration for operational implementation. With the Lorenz model, we explore the impact of tuning assimilation parameters such as the assimilation window length and background error covariance in 4DVar, variance inflation in EnKF, and the effect of model errors and reduced observation coverage. For short assimilation windows EnKF gives more accurate analyses. Both systems reach similar levels of accuracy if long windows are used for 4DVar. For infrequent observations, when ensemble perturbations grow nonlinearly and become non-Gaussian, 4DVar attains lower errors than EnKF. If the model is imperfect, 4DVar with long windows requires weak constraint. Similar results are obtained with a quasi-geostrophic channel model. EnKF experiments made with the primitive-equation SPEEDY model provide comparisons with 3DVar and guidance on model error and ‘observation localization’. Results obtained using operational models and both simulated and real observations indicate that currently EnKF is becoming competitive with 4DVar, and that the experience acquired with each of these methods can be used to improve the other. A table summarizes the pros and cons of the two methods.
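A minimal sketch of the EnKF side of such a comparison, assuming the Lorenz-63 model, all three components observed, perturbed observations, and simple multiplicative variance inflation (the paper's experiments and tuning are considerably more elaborate):

```python
import numpy as np

rng = np.random.default_rng(3)

def lorenz63(x, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz-63 state one step with fourth-order Runge-Kutta."""
    def f(v):
        return np.array([sigma * (v[1] - v[0]),
                         v[0] * (rho - v[2]) - v[1],
                         v[0] * v[1] - beta * v[2]])
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def run_enkf(n_ens=20, n_cycles=300, steps=8, obs_err=1.0, inflation=1.05):
    """Perturbed-observation EnKF; returns mean analysis RMSE after spin-up."""
    truth = np.array([1.0, 1.0, 1.0])
    for _ in range(500):                          # settle onto the attractor
        truth = lorenz63(truth)
    ens = truth + rng.standard_normal((n_ens, 3))
    R = obs_err ** 2 * np.eye(3)
    errs = []
    for _ in range(n_cycles):
        for _ in range(steps):                    # forecast truth and ensemble
            truth = lorenz63(truth)
            ens = np.apply_along_axis(lorenz63, 1, ens)
        y = truth + obs_err * rng.standard_normal(3)
        mean = ens.mean(axis=0)
        A = (ens - mean) * inflation              # multiplicative inflation
        ens = mean + A
        P = A.T @ A / (n_ens - 1)                 # sample forecast covariance
        K = P @ np.linalg.inv(P + R)              # Kalman gain, H = identity
        yp = y + obs_err * rng.standard_normal((n_ens, 3))  # perturbed obs
        ens = ens + (yp - ens) @ K.T              # analysis update
        errs.append(np.linalg.norm(ens.mean(axis=0) - truth) / np.sqrt(3))
    return float(np.mean(errs[50:]))

print("mean analysis RMSE:", run_enkf())
```

With frequent observations of error standard deviation 1, the analysis RMSE settles well below the observation error, illustrating the short-window regime in which the abstract reports EnKF doing well.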
A method for assimilation of Lagrangian data. Weather Rev.
Cited by 30 (2 self)
Abstract:
Difficulties in the assimilation of Lagrangian data arise because the state of the prognostic model is generally described in terms of Eulerian variables computed on a fixed grid in space; as a result, there is no direct connection between the model variables and Lagrangian observations that carry time-integrated information. A method is presented for assimilating Lagrangian tracer positions, observed at discrete times, directly into the model. The idea is to augment the model with tracer advection equations and to track the correlations between the flow and the tracers via the extended Kalman filter. The augmented model state vector includes tracer coordinates and is updated through the correlations to the observed tracers. The technique is tested for point vortex flows: an NF point vortex system with a Gaussian noise term is modeled by its deterministic counterpart. Positions of ND tracer particles are observed at regular time intervals and assimilated into the model. Numerical experiments demonstrate successful system tracking for (NF, ND) = (2, 1), (4, 2), provided the observations are reasonably frequent and accurate and the system noise level is not too high. The performance of the filter strongly depends on initial tracer positions (drifter launch locations). Analysis of this dependence shows that the good launch locations are separated from the bad ones by Lagrangian flow structures (separatrices or invariant manifolds of the velocity field). The method is compared to an alternative indirect approach, where the flow velocity, estimated from two (or more) consecutive drifter observations, is assimilated directly into the model.
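The state-augmentation idea can be shown with a deliberately simplified linear stand-in for the point-vortex system: a scalar flow variable u advects a tracer, only the tracer position is observed, and the filter recovers u purely through the flow-tracer cross-covariance in the augmented state. All numbers below are illustrative, and an ordinary Kalman filter suffices here because the toy model is linear (the paper needs the extended Kalman filter for its nonlinear dynamics).

```python
import numpy as np

rng = np.random.default_rng(4)

dt, obs_err = 0.1, 0.05
u_true, x_true = 1.3, 0.0      # unknown flow speed; true tracer position

z = np.array([0.0, 0.0])       # augmented state z = [u, tracer position]
P = np.diag([4.0, 0.1])        # initial uncertainty: u poorly known
F = np.array([[1.0, 0.0],      # u persists
              [dt, 1.0]])      # tracer is advected: x <- x + u * dt
Q = 1e-6 * np.eye(2)           # small model noise
H = np.array([[0.0, 1.0]])     # only the tracer position is observed
R = np.array([[obs_err ** 2]])

for _ in range(100):
    x_true += u_true * dt
    y = x_true + obs_err * rng.standard_normal()
    # Forecast step
    z = F @ z
    P = F @ P @ F.T + Q
    # Analysis step: the u-x cross-covariance in P carries observation
    # information from the tracer back to the unobserved flow variable.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    z = z + K @ (np.array([y]) - H @ z)
    P = (np.eye(2) - K @ H) @ P

print("estimated flow speed:", z[0])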
2008b), ‘Data assimilation: Mathematical and statistical perspectives
 Internat. J. Numer. Methods Fluids
"... The bulk of this paper contains a concise mathematical overview of the subject of data assimilation, highlighting three primary ideas: (i) the standard optimization approaches of 3DVAR, 4DVAR and weak constraint 4DVAR are described and their interrelations explained; (ii) statistical analogues of th ..."
Abstract

Cited by 26 (8 self)
 Add to MetaCart
(Show Context)
The bulk of this paper contains a concise mathematical overview of the subject of data assimilation, highlighting three primary ideas: (i) the standard optimization approaches of 3DVAR, 4DVAR and weak constraint 4DVAR are described and their interrelations explained; (ii) statistical analogues of these approaches are then introduced, leading to filtering (generalizing 3DVAR) and a form of smoothing (generalizing 4DVAR and weak constraint 4DVAR) and the optimization methods are shown to be maximum a posteriori estimators for the probability distributions implied by these statistical approaches; and (iii) by taking a general dynamical systems perspective on the subject it is shown that the incorporation of Lagrangian data can be handled by a straightforward extension of the preceding concepts. We argue that the smoothing approach to data assimilation, based on statistical analogues of 4DVAR and weak constraint 4DVAR, provides the optimal solution to the assimilation of space–time distributed data into a model. The optimal solution obtained is a probability distribution on the relevant class of functions (initial conditions or timedependent solutions). The approach is a useful one in the first instance because it clarifies the notion of what is the optimal solution, thereby providing a benchmark against which existing approaches can be evaluated. In the longer term it also provides the potential for new methods to create ensembles of solutions to the model, incorporating the available data in an optimal
Evolving the subspace of the threedimensional multiscale ocean variability: Massachusetts Bay
 J. Marine Systems
, 2001
"... A data and dynamics driven approach to estimate, decompose, organize and analyze the evolving threedimensional variability of ocean fields is outlined. Variability refers here to the statistics of the differences between ocean states and a reference state. In general, these statistics evolve in tim ..."
Abstract

Cited by 26 (7 self)
 Add to MetaCart
A data and dynamics driven approach to estimate, decompose, organize and analyze the evolving threedimensional variability of ocean fields is outlined. Variability refers here to the statistics of the differences between ocean states and a reference state. In general, these statistics evolve in time and space. For a first endeavor, the variability subspace defined by the dominant eigendecomposition of a normalized form of the variability covariance is evolved. A multiscale methodology for its initialization and forecast is outlined. It combines data and primitive equation dynamics within a MonteCarlo approach. The methodology is applied to part of a multidisciplinary experiment that occurred in Massachusetts Bay in late summer and early fall of 1998. For a 4day time period, the threedimensional and multivariate properties of the variability standard deviations and dominant eigenvectors are studied. Two variability patterns are discussed in detail. One relates to a displacement of the Gulf of Maine coastal current offshore from Cape Ann, with the creation of adjacent mesoscale recirculation cells. The other relates to a Baywide coastal upwelling mode from Barnstable Harbor to Gloucester in response to strong southerly winds. Snapshots and tendencies of physical fields and trajectories of simulated Lagrangian drifters are employed to diagnose and illustrate the use of the dominant variability covariance. The variability subspace is shown to guide the dynamical analysis of the physical fields. For the stratified conditions, it is found that strong wind events can alter the structures of the buoyancy flow and that circulation features are more variable than previously described, on multiple
A random map implementation of implicit filters
"... Implicit particle filters for data assimilation generate highprobability samples by representing each particle location as a separate function of a common reference variable. This representation requires that a certain underdetermined equation be solved for each particle and at each time an observa ..."
Abstract

Cited by 22 (10 self)
 Add to MetaCart
(Show Context)
Implicit particle filters for data assimilation generate highprobability samples by representing each particle location as a separate function of a common reference variable. This representation requires that a certain underdetermined equation be solved for each particle and at each time an observation becomes available. We present a new implementation of implicit filters in which we find the solution of the equation via a random map. As examples, we assimilate data for a stochastically driven Lorenz system with sparse observations and for a stochastic KuramotoSivashinski equation with observations that are sparse in both space and time.