Results 1-10 of 280
Adaptive Sampling With the Ensemble Transform ...
2001
Cited by 321 (19 self)
Abstract
A suboptimal Kalman filter called the ensemble transform Kalman filter (ET KF) is introduced. Like other Kalman filters, it provides a framework for assimilating observations and also for estimating the effect of observations on forecast error covariance. It differs from other ensemble Kalman filters in that it uses ensemble transformation and a normalization to rapidly obtain the prediction error covariance matrix associated with a particular deployment of observational resources. This rapidity enables it to quickly assess the ability of a large number of future feasible sequences of observational networks to reduce forecast error variance. The ET KF was used by the National Centers for Environmental Prediction in the Winter Storm Reconnaissance missions of 1999 and 2000 to determine where aircraft should deploy dropwindsondes in order to improve 24-72-h forecasts over the continental United States. The ET KF may be applied to any well-constructed set of ensemble perturbations. The ET KF ...
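The transform-and-normalization step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the dimensions, the observation operator `H`, and the observation error variances `r` are made up, and the eigendecomposition of the small m x m matrix is what makes assessing a hypothetical observation deployment fast.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, p = 50, 10, 5                       # state dim, ensemble size, number of obs

Xf = rng.standard_normal((n, m))
Xf -= Xf.mean(axis=1, keepdims=True)      # forecast perturbations about the mean
H = rng.standard_normal((p, n))           # hypothetical obs operator (illustrative)
r = 0.5 * np.ones(p)                      # obs error variances (diagonal R, assumed)

# Work in the m-dimensional ensemble space, never forming the full n x n
# covariance: S holds R^(-1/2)-normalized obs-space perturbations.
S = (H @ Xf) / np.sqrt(r)[:, None] / np.sqrt(m - 1)
gamma, C = np.linalg.eigh(S.T @ S)        # eigenpairs of the small m x m matrix
T = C @ np.diag(1.0 / np.sqrt(gamma + 1.0))
Xa = Xf @ T                               # transformed (post-deployment) perturbations
Pa = Xa @ Xa.T / (m - 1)                  # predicted analysis error covariance
```

Because only an m x m eigenproblem is solved per candidate network, many feasible deployments can be scored quickly by their predicted reduction of error variance.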
An Ensemble Adjustment Kalman Filter for Data Assimilation
2001
Cited by 283 (12 self)
Abstract
A theory for estimating the probability distribution of the state of a model given a set of observations exists. This nonlinear ...
Ensemble forecasting at NCEP and the breeding method
Mon. Wea. Rev., 1997
Cited by 193 (16 self)
Abstract
The breeding method has been used to generate perturbations for ensemble forecasting at the National Centers for Environmental Prediction (formerly known as the National Meteorological Center) since December 1992. At that time a single breeding cycle with a pair of bred forecasts was implemented. In March 1994, the ensemble was expanded to seven independent breeding cycles on the Cray C90 supercomputer, and the forecasts were extended to 16 days. This provides 17 independent global forecasts valid for two weeks every day. For efficient ensemble forecasting, the initial perturbations to the control analysis should adequately sample the space of possible analysis errors. It is shown that the analysis cycle is like a breeding cycle: it acts as a nonlinear perturbation model upon the evolution of the real atmosphere. The perturbation (i.e., the analysis error), carried forward in the first-guess forecasts, is "scaled down" at regular intervals by the use of observations. Because of this, growing errors associated with the evolving state of the atmosphere develop within the analysis cycle and dominate subsequent forecast error growth. The breeding method simulates the development of growing errors in the analysis cycle. A difference field between two nonlinear forecasts is carried forward (and scaled down at regular intervals) upon the evolving atmospheric analysis fields. By construction, the bred vectors are superpositions of the leading local (time-dependent) ...
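The breeding cycle described here (run two nonlinear forecasts, take their difference, rescale it to a fixed amplitude, repeat) can be sketched with a toy model. The logistic-style map below is a hypothetical stand-in for the forecast model, and the amplitude `amp` is arbitrary; the point is only the carry-forward-and-rescale loop.

```python
import numpy as np

def model(x):
    # toy nonlinear "forecast model": a chaotic logistic-type map,
    # an illustrative stand-in for the real NWP model
    return 3.9 * x * (1.0 - x)

rng = np.random.default_rng(1)
control = rng.uniform(0.2, 0.8, size=8)          # control analysis state
amp = 1e-3                                       # rescaling amplitude
pert = control + amp * rng.standard_normal(8)    # perturbed twin forecast

for _ in range(100):                             # breeding cycle
    control, pert = model(control), model(pert)
    d = pert - control                           # bred difference field
    d *= amp / np.linalg.norm(d)                 # "scale down" at regular intervals
    pert = control + d

bred_vector = (pert - control) / amp             # unit-amplitude bred vector
```

After many cycles the rescaled difference aligns with the fastest-growing, flow-dependent error directions, which is what the bred perturbations are meant to capture.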
Distance-Dependent Filtering of Background Error Covariance Estimates in an Ensemble Kalman Filter
2001
Cited by 189 (31 self)
Abstract
The usefulness of a distance-dependent reduction of background error covariance estimates in an ensemble Kalman filter is demonstrated. Covariances are reduced by performing an element-wise multiplication of the background error covariance matrix with a correlation function with local support. This reduces noisiness and results in an improved background error covariance estimate, which generates a reduced-error ensemble of model initial conditions. The benefits ...
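The element-wise multiplication described above (covariance localization via a Schur product) is easy to sketch. This is illustrative only: the grid, ensemble, and localization length `L` are made up, and a simple triangular taper stands in for the compactly supported (Gaspari-Cohn-type) correlation function used in practice.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 40, 10                                    # grid points, ensemble members
X = rng.standard_normal((n, m))
X -= X.mean(axis=1, keepdims=True)
P = X @ X.T / (m - 1)                            # noisy sample background covariance

# Correlation function with local support: 1 at zero separation,
# decaying to exactly 0 beyond L grid points (triangular taper here).
i = np.arange(n)
dist = np.abs(i[:, None] - i[None, :])
L = 10.0
rho = np.clip(1.0 - dist / L, 0.0, None)

P_loc = rho * P                                  # element-wise (Schur) product
```

Distant entries of the sample covariance, which are dominated by sampling noise at small ensemble sizes, are damped or zeroed, while variances on the diagonal are untouched.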
A Hybrid Ensemble Kalman Filter / 3D-Variational Analysis Scheme
Cited by 123 (18 self)
Abstract
A hybrid three-dimensional variational (3DVar) / ensemble Kalman filter analysis scheme is demonstrated using a quasigeostrophic model under perfect-model assumptions. Four networks with differing observational densities are tested, including one network with a data void. The hybrid scheme operates by computing a set of parallel data assimilation cycles, with each member of the set receiving unique perturbed observations. The perturbed observations are generated by adding random noise consistent with observation error statistics to the control set of observations. Background error statistics for the data assimilation are estimated from a linear combination of time-invariant 3DVar covariances and flow-dependent covariances developed from the ensemble of short-range forecasts. The hybrid scheme allows the user to weight the relative contributions of the 3DVar and ensemble-based background covariances. The analysis scheme was cycled for 90 days, with new observations assimilated every 12 h...
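The linear combination of static and flow-dependent background covariances described above can be written in a couple of lines. The matrices and the weight `alpha` below are illustrative (a diagonal matrix stands in for the real 3DVar covariance model); only the blending itself reflects the scheme.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 30, 12                                 # state dim, ensemble size (illustrative)

# time-invariant 3DVar-style background covariance (identity as a stand-in)
B_3dvar = np.eye(n)

# flow-dependent covariance from an ensemble of short-range forecasts
X = rng.standard_normal((n, m))
X -= X.mean(axis=1, keepdims=True)
P_ens = X @ X.T / (m - 1)

alpha = 0.5                                   # user-chosen weight on the ensemble part
B_hybrid = (1.0 - alpha) * B_3dvar + alpha * P_ens
```

Setting `alpha` to 0 recovers pure 3DVar statistics and 1 recovers the pure ensemble covariance, so the user can tune how much flow dependence enters the analysis.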
Interpretation Of Rank Histograms For Verifying Ensemble Forecasts
2000
Cited by 109 (7 self)
Abstract
Rank histograms are a tool for evaluating ensemble forecasts. They are useful for determining the reliability of ensemble forecasts and for diagnosing errors in their mean and spread. Rank histograms are generated by repeatedly tallying the rank of the verification (usually, an observation) relative to values from an ensemble sorted from lowest to highest. However, an uncritical use of the rank histogram can lead to misinterpretations of the qualities of that ensemble. For example, a flat rank histogram, usually taken as a sign of reliability, can still be generated from unreliable ensembles. Similarly, a U-shaped rank histogram, commonly understood as indicating a lack of variability in the ensemble, can also be a sign of conditional bias. It is also shown that flat rank histograms can be generated for some model variables if the variance of the ensemble is correctly specified, yet if covariances between model grid points are improperly specified, rank histograms for combinations of mo...
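The tallying procedure described above is short to sketch. Here both ensemble and observations are drawn from the same distribution purely for illustration, so the resulting histogram should be approximately flat; all sizes are made up.

```python
import numpy as np

rng = np.random.default_rng(4)
m, cases = 10, 5000                          # ensemble size, verification cases

ens = rng.standard_normal((cases, m))        # ensemble forecasts
obs = rng.standard_normal(cases)             # verifying observations (same
                                             # distribution here, so expect ~flat)

# rank of the verification among the sorted ensemble values: 0..m, giving m+1 bins
ranks = (ens < obs[:, None]).sum(axis=1)
hist = np.bincount(ranks, minlength=m + 1)
```

A verification that is too often smallest or largest (piling up in the end bins) produces the U shape the abstract warns can reflect either underdispersion or conditional bias.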
Data assimilation via error subspace statistical estimation. Part I: Theory and schemes
1999
Cited by 96 (15 self)
Abstract
A rational approach is used to identify efficient schemes for data assimilation in nonlinear ocean-atmosphere models. The conditional mean, a minimum of several cost functionals, is chosen for an optimal estimate. After stating the present goals and describing some of the existing schemes, the constraints and issues particular to ocean-atmosphere data assimilation are emphasized. An approximation to the optimal criterion satisfying the goals and addressing the issues is obtained using heuristic characteristics of geophysical measurements and models. This leads to the notion of an evolving error subspace, of variable size, that spans and tracks the scales and processes where the dominant errors occur. The concept of error subspace statistical estimation (ESSE) is defined. In the present minimum error variance approach, the suboptimal criterion is based on a continued and energetically optimal reduction of the dimension of error covariance matrices. The evolving error subspace is characterized by error singular vectors and values, or in other words, the error principal components and coefficients. Schemes for filtering and smoothing via ESSE are derived. The data-forecast melding minimizes variance in the error subspace. Nonlinear Monte Carlo forecasts integrate the error subspace in time. The smoothing is based on a statistical approximation approach. Comparisons with existing filtering and smoothing procedures ...
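The characterization of the error subspace by singular vectors and values can be illustrated with a rank truncation. This is only a sketch under simplifying assumptions: random vectors stand in for Monte Carlo error samples, and the rank `r` is fixed rather than adapted as in ESSE.

```python
import numpy as np

rng = np.random.default_rng(6)
n, m, r = 100, 30, 5             # state dim, Monte Carlo samples, subspace rank

E = rng.standard_normal((n, m))  # ensemble of error samples (illustrative)
E -= E.mean(axis=1, keepdims=True)

# Dominant error subspace: leading singular vectors/values of the error samples
U, s, _ = np.linalg.svd(E, full_matrices=False)
U_r, s_r = U[:, :r], s[:r]       # error principal components and coefficients

# Reduced-rank error covariance of the kind tracked by subspace schemes
P_r = (U_r * (s_r**2 / (m - 1))) @ U_r.T
```

Keeping only the leading `r` directions is the "energetically optimal" reduction in the sense that, among all rank-`r` approximations, the truncated SVD retains the most error variance.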
Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation
Monthly Weather Review, 2005
Cited by 78 (14 self)
Abstract
Ensemble prediction systems typically show positive spread-error correlation, but they are subject to forecast bias and dispersion errors, and are therefore uncalibrated. This work proposes the use of ensemble model output statistics (EMOS), an easy-to-implement postprocessing technique that addresses both forecast bias and underdispersion and takes into account the spread-skill relationship. The technique is based on multiple linear regression and is akin to the superensemble approach that has traditionally been used for deterministic-style forecasts. The EMOS technique yields probabilistic forecasts that take the form of Gaussian predictive probability density functions (PDFs) for continuous weather variables and can be applied to gridded model output. The EMOS predictive mean is a bias-corrected weighted average of the ensemble member forecasts, with coefficients that can be interpreted in terms of the relative contributions of the member models to the ensemble, and provides a highly competitive deterministic-style forecast. The EMOS predictive variance is a linear function of the ensemble variance. For fitting the EMOS coefficients, the method of minimum continuous ranked probability score (CRPS) estimation is introduced. This technique finds the coefficient values that optimize the CRPS for the training data. The EMOS technique was applied to 48-h forecasts of sea level pressure and surface temperature over the North American Pacific Northwest in spring 2000, using the University of Washington mesoscale ensemble. When compared to the bias-corrected ensemble, deterministic-style EMOS forecasts of sea level pressure had root-mean-square error 9% less and mean absolute error 7% less. The EMOS predictive PDFs were sharp, and much better calibrated than the raw ensemble or the bias-corrected ensemble.
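The Gaussian predictive PDF described above, and the score used to fit it, can be sketched directly. The closed-form CRPS for a Gaussian is standard; the ensemble values and the coefficients `a`, `b`, `c`, `d` below are illustrative placeholders, since in practice they are found by minimizing the mean CRPS over a training set.

```python
import numpy as np
from math import erf, exp, pi, sqrt

def crps_gaussian(mu, sigma, y):
    # closed-form CRPS of a Gaussian predictive distribution N(mu, sigma^2)
    # evaluated at the verifying value y
    z = (y - mu) / sigma
    Phi = 0.5 * (1.0 + erf(z / sqrt(2.0)))        # standard normal CDF
    phi = exp(-0.5 * z * z) / sqrt(2.0 * pi)      # standard normal PDF
    return sigma * (z * (2.0 * Phi - 1.0) + 2.0 * phi - 1.0 / sqrt(pi))

# EMOS predictive distribution from one ensemble forecast (illustrative numbers)
x = np.array([19.8, 20.4, 21.1, 20.0, 20.7])      # ensemble member forecasts
a, b = 0.2, np.full(5, 0.2)                       # bias term and member weights
c, d = 0.3, 1.0                                   # variance coefficients

mu = a + b @ x                                    # bias-corrected weighted mean
var = c + d * x.var()                             # linear in the ensemble variance
score = crps_gaussian(mu, sqrt(var), 20.5)        # CRPS against the observation
```

Minimum CRPS estimation simply searches for the `a`, `b`, `c`, `d` that minimize the average of `score` over all training cases.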
Which is better, an ensemble of positive-negative pairs or a centered spherical simplex ensemble?
Mon. Wea. Rev., 2004
Cited by 62 (1 self)
Abstract
New methods to center the initial ensemble perturbations on the analysis are introduced and compared with the commonly used centering method of positive-negative paired perturbations. In the new method, one linearly dependent perturbation is added to a set of linearly independent initial perturbations to ensure that the sum of the new initial perturbations equals zero; the covariance calculated from the new initial perturbations is equal to the analysis error covariance estimated by the independent initial perturbations, and all of the new initial perturbations are equally likely. The new method is illustrated by applying it to the ensemble transform Kalman filter (ETKF) ensemble forecast scheme, and the resulting ensemble is called the spherical simplex ETKF ensemble. It is shown from a multidimensional Taylor expansion that the symmetric positive-negative paired centering would yield a more accurate forecast ensemble mean and covariance than the spherical simplex centering if the ensemble were large enough to span all initial uncertain directions and thus the analysis error covariance was modeled precisely. However, when the number of uncertain directions is larger than the ensemble size, the spherical simplex centering has the advantage of allowing almost twice as many uncertain directions to be spanned as the symmetric positive-negative paired centering. The performances of the spherical simplex ETKF and symmetric positive-negative paired ETKF ensembles are compared by using the Community Climate Model ...
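One standard way to realize the centering described above (k independent perturbations mapped to k+1 perturbations that sum to zero while preserving their covariance) uses an orthogonal matrix whose first column is the normalized ones vector. This sketch shows that construction only; the dimensions are made up, and it is not claimed to reproduce the paper's full spherical simplex ETKF scheme.

```python
import numpy as np

rng = np.random.default_rng(5)
n, k = 20, 6                     # state dim, number of independent perturbations

Z = rng.standard_normal((n, k))  # linearly independent initial perturbations

# Complete-QR of the ones vector: first column of Q is ones/sqrt(k+1), and the
# remaining k columns are orthonormal and each orthogonal to it, i.e. sum to zero.
Q, _ = np.linalg.qr(np.ones((k + 1, 1)), mode="complete")
W = Q[:, 1:]                     # (k+1) x k

X = Z @ W.T                      # k+1 centered simplex perturbations
```

Because `W.T @ ones = 0` the new perturbations sum to zero (they are centered on the analysis), and because `W.T @ W = I` their outer-product covariance equals that of the original independent perturbations.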