Results 1 to 10 of 945
Random Early Detection Gateways for Congestion Avoidance
 IEEE/ACM TRANSACTIONS ON NETWORKING
, 1993
Abstract

Cited by 2687 (31 self)
This paper presents Random Early Detection (RED) gateways for congestion avoidance in packet-switched networks. The gateway detects incipient congestion by computing the average queue size. The gateway could notify connections of congestion either by dropping packets arriving at the gateway or by setting a bit in packet headers. When the average queue size exceeds a preset threshold, the gateway drops or marks each arriving packet with a certain probability, where the exact probability is a function of the average queue size. RED gateways keep the average queue size low while allowing occasional bursts of packets in the queue. During congestion, the probability that the gateway notifies a particular connection to reduce its window is roughly proportional to that connection's share of the bandwidth through the gateway. RED gateways are designed to accompany a transport-layer congestion control protocol such as TCP. The RED gateway has no bias against bursty traffic and avoids the global synchronization of many connections decreasing their window at the same time. Simulations of a TCP/IP network are used to illustrate the performance of RED gateways.
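The marking rule described in this abstract can be sketched in a few lines. The thresholds, averaging weight, and maximum probability below are illustrative placeholders, not the paper's recommended settings:

```python
import random

def red_drop(avg_queue, min_th=5.0, max_th=15.0, max_p=0.02):
    """Decide whether an arriving packet should be dropped/marked.

    Sketch of the RED rule: below min_th never mark; above max_th
    always mark; in between, mark with a probability that rises
    linearly with the average queue size.
    """
    if avg_queue < min_th:
        return False
    if avg_queue >= max_th:
        return True
    p = max_p * (avg_queue - min_th) / (max_th - min_th)
    return random.random() < p

def update_avg(avg_queue, instantaneous_q, w=0.002):
    # Exponentially weighted moving average of the queue size,
    # which is what lets RED absorb short bursts without marking.
    return (1.0 - w) * avg_queue + w * instantaneous_q
```

The moving average, rather than the instantaneous queue length, is what gives RED its tolerance for occasional bursts.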
Sequential Monte Carlo Methods for Dynamic Systems
 Journal of the American Statistical Association
, 1998
Abstract

Cited by 649 (12 self)
A general framework for using Monte Carlo methods in dynamic systems is provided and its wide applications indicated. Under this framework, several currently available techniques are studied and generalized to accommodate more complex features. All of these methods are partial combinations of three ingredients: importance sampling and resampling, rejection sampling, and Markov chain iterations. We deliver a guideline on how they should be used and under what circumstances each method is most suitable. Through the analysis of differences and connections, we consolidate these methods into a generic algorithm by combining desirable features. In addition, we propose a general use of Rao-Blackwellization to improve performance. Examples from econometrics and engineering are presented to demonstrate the importance of Rao-Blackwellization and to compare different Monte Carlo procedures. Keywords: Blind deconvolution; Bootstrap filter; Gibbs sampling; Hidden Markov model; Kalman filter; Markov...
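As a rough illustration of the first ingredient pair (importance sampling and resampling), here is a minimal bootstrap-style filter for an assumed random-walk state observed in Gaussian noise; the model and all parameters are invented for the example, not taken from the paper:

```python
import random, math

def bootstrap_filter(observations, n_particles=500, obs_sd=1.0):
    """Minimal bootstrap particle filter for the illustrative model
        x_t = x_{t-1} + N(0, 1)   (state, sampled by particles)
        y_t = x_t + N(0, obs_sd)  (observation)
    Each step: propagate, weight by the observation likelihood
    (importance sampling), then resample proportionally to weights.
    """
    particles = [random.gauss(0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        particles = [x + random.gauss(0, 1.0) for x in particles]   # propagate
        weights = [math.exp(-0.5 * ((y - x) / obs_sd) ** 2)          # likelihood
                   for x in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        particles = random.choices(particles, weights=weights, k=n_particles)
    return estimates
```

Rejection sampling and Markov chain moves, the other two ingredients the abstract lists, would slot in as alternative or additional steps inside this loop.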
Stochastic Trends and Economic Fluctuations
 American Economic Review
, 1991
Abstract

Cited by 244 (6 self)
Are business cycles mainly the result of permanent shocks to productivity? This paper uses a long-run restriction implied by a large class of real-business-cycle models (identifying permanent productivity shocks as shocks to the common stochastic trend in output, consumption, and investment) to provide new evidence on this question. Econometric tests indicate that this common-stochastic-trend/cointegration implication is consistent with postwar U.S. data. However, in systems with nominal variables, the estimates of this common stochastic trend indicate that permanent productivity shocks typically explain less than half of the business-cycle variability in output, consumption, and investment. (JEL E32, C32) A central, surprising, and controversial result of some current research on real business cycles is the claim that a common stochastic trend (the cumulative effect of permanent shocks to productivity) underlies the bulk of economic fluctuations. If confirmed, this finding would imply that many other forces have been relatively unimportant over historical business cycles, including the monetary and fiscal policy shocks stressed in traditional macroeconomic analysis. This paper shows that the hypothesis of a common stochastic productivity trend has a set of econometric implications that allows us to test for its presence, measure its importance, and extract estimates of its realized value. Applying these procedures to consumption, investment, and output for the postwar United States, we find results that both support and contradict this claim in the real-business-cycle literature. The U.S. data are consis...
Formulating, Identifying and Estimating the Technology of Cognitive and Noncognitive Skill Formation
Abstract

Cited by 202 (39 self)
This paper estimates models of the evolution of cognitive and noncognitive skills and explores the role of family environments in shaping these skills at different stages of the life cycle of the child. Central to this analysis is identification of the technology of skill formation. We estimate a dynamic factor model to solve the problem of endogeneity of inputs and multiplicity of inputs relative to instruments. We identify the scale of the factors by estimating their effects on adult outcomes. In this fashion we avoid reliance on test scores and changes in test scores that have no natural metric. Parental investments are generally more effective in raising noncognitive skills. Noncognitive skills promote the formation of cognitive skills but, in most specifications of our model, cognitive skills do not promote the formation of noncognitive skills. Parental inputs have different effects at different stages of the child’s life cycle with cognitive skills affected more at early ages and noncognitive skills affected more at later ages.
Measuring the Natural Rate of Interest
 Review of Economics and Statistics
, 2003
Abstract

Cited by 148 (21 self)
A key variable for the conduct of monetary policy is the natural rate of interest: the real interest rate consistent with output equaling potential and stable inflation. Economic theory implies that the natural rate of interest varies over time and depends on the trend growth rate of output. In this paper we apply the Kalman filter to jointly estimate the natural rate of interest, potential output, and its trend growth rate, and examine the empirical relationship between these estimated unobserved series. We find substantial variation in the natural rate of interest over the past four decades in the United States. Our natural rate estimates vary about one-for-one with changes in the trend growth rate. We show that policymakers' mismeasurement of the natural rate of interest can cause a significant deterioration in macroeconomic stabilization.
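The estimation machinery this abstract relies on can be illustrated with a minimal scalar Kalman filter for an unobserved random-walk level; this is a one-state sketch under assumed variances, not the paper's multivariate specification:

```python
def kalman_level(ys, q=0.1, r=1.0):
    """Local-level Kalman filter: treat the unobserved quantity
    (e.g. a natural rate) as a random walk observed with noise.

    q: variance of the state innovation (how fast the level drifts)
    r: variance of the observation noise
    Returns the filtered estimate of the level at each step.
    """
    x, p = 0.0, 1.0                    # state estimate and its variance
    out = []
    for y in ys:
        p = p + q                      # predict: the level may have drifted
        k = p / (p + r)                # Kalman gain
        x = x + k * (y - x)            # update toward the new observation
        p = (1.0 - k) * p
        out.append(x)
    return out
```

The gain `k` balances the two variances: a larger `q` relative to `r` makes the filter track observations more aggressively.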
Indicator Variables for Optimal Policy
, 2000
Abstract

Cited by 142 (20 self)
The optimal weights on indicators in models with partial information about the state of the economy and forward-looking variables are derived and interpreted, both for equilibria under discretion and under commitment. An example of optimal monetary policy with a partially observable potential output and a forward-looking indicator is examined. The optimal response to the optimal estimate of potential output displays certainty equivalence, whereas the optimal response to the imperfect observation of output depends on the noise in this observation.
Power and performance management of virtualized computing environments via lookahead control
 in Proc. IEEE Intl. Conf. on Autonomic Computing (ICAC
, 2008
Abstract

Cited by 132 (5 self)
There is growing incentive to reduce the power consumed by large-scale data centers that host online services such as banking, retail commerce, and gaming. Virtualization is a promising approach to consolidating multiple online services onto a smaller number of computing resources. A virtualized server environment allows computing resources to be shared among multiple performance-isolated platforms called virtual machines. By dynamically provisioning virtual machines, consolidating the workload, and turning servers on and off as needed, data center operators can maintain the desired quality-of-service (QoS) while achieving higher server utilization and energy efficiency. We implement and validate a dynamic resource provisioning framework for virtualized server environments wherein the provisioning problem is posed as one of sequential optimization under uncertainty and solved using a lookahead control scheme. The proposed approach accounts for the switching costs incurred while provisioning virtual machines and explicitly encodes the corresponding risk in the optimization problem. Experiments using the Trade6 enterprise application show that a server cluster managed by the controller conserves, on average, 22% of the power required by a system without dynamic...
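A toy version of the lookahead idea (receding-horizon planning that trades off power, switching cost, and service quality) might look as follows; every cost, capacity, and bound here is an invented illustration, not the paper's controller:

```python
from itertools import product

def lookahead_plan(current_servers, forecast, horizon=3,
                   cap=10.0, power_per_server=1.0, switch_cost=2.0,
                   sla_penalty=50.0):
    """Enumerate candidate server counts over a short horizon,
    charge power + switching costs + an SLA penalty whenever the
    forecast load exceeds capacity, and return only the first action
    (the plan is recomputed every step, receding-horizon style).
    """
    steps = min(horizon, len(forecast))
    best_cost, best_first = float("inf"), current_servers
    for plan in product(range(1, 6), repeat=steps):   # toy search space
        cost, prev = 0.0, current_servers
        for n, load in zip(plan, forecast):
            cost += n * power_per_server              # energy cost
            cost += switch_cost * abs(n - prev)       # provisioning cost
            if load > n * cap:                        # QoS violation
                cost += sla_penalty
            prev = n
        if cost < best_cost:
            best_cost, best_first = cost, plan[0]
    return best_first
```

The brute-force enumeration stands in for the sequential optimization under uncertainty described in the abstract; a real controller would also weight the forecast by its uncertainty.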
Output Gap Uncertainty: Does it Matter for the Taylor Rule
 Monetary Policy Under Uncertainty
, 1999
Abstract

Cited by 124 (1 self)
International Settlements, and from time to time by other economists, and are published by the Bank. The papers are on subjects of topical interest and are technical in character. The views expressed in them are those of their authors and not necessarily the views of the BIS. Copies of publications are available from:
Marginalized particle filters for mixed linear/nonlinear statespace models
 IEEE Transactions on Signal Processing
, 2005
Abstract

Cited by 112 (33 self)
The particle filter offers a general numerical tool to approximate the posterior density function for the state in nonlinear and non-Gaussian filtering problems. While the particle filter is fairly easy to implement and tune, its main drawback is that it is quite computer intensive, with the computational complexity increasing quickly with the state dimension. One remedy to this problem is to marginalize out the states appearing linearly in the dynamics. The result is that one Kalman filter is associated with each particle. The main contribution in this paper is the derivation of the details for the marginalized particle filter for a general nonlinear state-space model. Several important special cases occurring in typical signal processing applications will also be discussed. The marginalized particle filter is applied to an integrated navigation system for aircraft. It is demonstrated that the complete high-dimensional system can be based on a particle filter using marginalization for all but three states. Excellent performance on real flight data is reported. Index Terms: Kalman filter, marginalization, navigation systems, nonlinear systems, particle filter, state estimation.
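The "one Kalman filter per particle" structure can be sketched for a toy mixed linear/nonlinear model; the model and its parameters are invented for illustration and do not reproduce the paper's general formulation:

```python
import random, math

def rbpf(ys, n=300, a=0.9, q_nl=0.5, q_l=0.1, r=0.5):
    """Toy marginalized (Rao-Blackwellized) particle filter for
        s_t = s_{t-1} + N(0, q_nl)   (nonlinear part, sampled)
        z_t = a*z_{t-1} + N(0, q_l)  (linear part, closed form)
        y_t = s_t + z_t + N(0, r)
    Each particle carries a scalar Kalman filter (m, P) for z_t;
    only s_t is handled by sampling.
    """
    parts = [[random.gauss(0, 1), 0.0, 1.0] for _ in range(n)]  # [s, m, P]
    ests = []
    for y in ys:
        weights = []
        for p in parts:
            p[0] += random.gauss(0, math.sqrt(q_nl))    # sample nonlinear s
            m_pred = a * p[1]                           # Kalman predict for z
            P_pred = a * a * p[2] + q_l
            S = P_pred + r                              # innovation variance
            innov = y - p[0] - m_pred
            # weight = marginal likelihood of y given this particle's s
            weights.append(math.exp(-0.5 * innov * innov / S) / math.sqrt(S))
            K = P_pred / S                              # Kalman update for z
            p[1] = m_pred + K * innov
            p[2] = (1 - K) * P_pred
        total = sum(weights) or 1.0
        w = [wi / total for wi in weights]
        ests.append(sum(wi * (p[0] + p[1]) for wi, p in zip(w, parts)))
        parts = [list(p) for p in random.choices(parts, weights=w, k=n)]
    return ests
```

Because the linear substate is integrated out analytically, the particle cloud only has to cover the nonlinear dimension, which is the variance reduction the marginalization buys.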
Estimation of stochastic volatility models via Monte Carlo Maximum Likelihood
, 1998
Abstract

Cited by 110 (10 self)
This paper discusses the Monte Carlo maximum likelihood method of estimating stochastic volatility (SV) models. The basic SV model can be expressed as a linear state space model with log chi-square disturbances. The likelihood function can be approximated arbitrarily accurately by decomposing it into a Gaussian part, constructed by the Kalman filter, and a remainder function, whose expectation is evaluated by simulation. No modifications of this estimation procedure are required when the basic SV model is extended in a number of directions likely to arise in applied empirical research. This compares favorably with alternative approaches. The finite sample performance of the new estimator is shown to be comparable to the Markov chain Monte Carlo (MCMC) method.
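The log chi-square linearization mentioned in the abstract follows from squaring and taking logs of the SV observation equation y_t = exp(h_t/2) * eps_t, which gives log(y_t^2) = h_t + log(eps_t^2). A minimal sketch, with a small guard term added as an implementation assumption to avoid log(0):

```python
import math

def linearize_sv(returns, eps=1e-12):
    """Transform returns into the linear state space observation
    log(y_t^2) = h_t + log(eps_t^2), recentering by the mean of the
    log chi-square(1) disturbance, E[log eps_t^2] = -1.2704 for
    standard normal eps_t. The non-Gaussian residual is what the
    paper's simulated remainder function accounts for.
    """
    offset = -1.2704  # E[log chi^2_1] = digamma(1/2) + log 2
    return [math.log(y * y + eps) - offset for y in returns]
```

After this transform, a standard Kalman filter (treating the disturbance as Gaussian) provides the approximating Gaussian part of the likelihood decomposition described above.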