Results 1 – 10 of 189,183
Empirical exchange rate models of the Seventies: do they fit out of sample?
 JOURNAL OF INTERNATIONAL ECONOMICS
, 1983
"... This study compares the outofsample forecasting accuracy of various structural and time series exchange rate models. We find that a random walk model performs as well as any estimated model at one to twelve month horizons for the dollar/pound, dollar/mark, dollar/yen and tradeweighted dollar exch ..."
Cited by 854 (12 self)
Static Scheduling of Synchronous Data Flow Programs for Digital Signal Processing
 IEEE TRANSACTIONS ON COMPUTERS
, 1987
"... Large grain data flow (LGDF) programming is natural and convenient for describing digital signal processing (DSP) systems, but its runtime overhead is costly in real time or costsensitive applications. In some situations, designers are not willing to squander computing resources for the sake of pro ..."
Cited by 598 (37 self)
Synchronous data flow (SDF) differs from traditional data flow in that the amount of data produced and consumed by a data flow node is specified a priori for each input and output. This is equivalent to specifying the relative sample rates in a signal processing system. This means that the scheduling of SDF nodes need ...
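The a-priori token rates the snippet mentions are exactly what makes static scheduling possible: each arc contributes a balance equation (tokens produced per firing times source firings equals tokens consumed times destination firings), and the smallest positive integer solution gives each actor's repetition count. A minimal sketch on a hypothetical three-actor chain (actor names and rates are made up for illustration):

```python
from fractions import Fraction
from math import lcm

# Hypothetical chain A -> B -> C.  Per firing, A produces 2 tokens on an arc
# from which B consumes 3; B produces 1 token from which C consumes 2.
# Each arc gives a balance equation: produced * r_src == consumed * r_dst.
arcs = [("A", 2, "B", 3), ("B", 1, "C", 2)]

# Propagate relative firing rates from a reference actor (arcs are listed
# in an order where each source's rate is already known).
rates = {"A": Fraction(1)}
for src, produced, dst, consumed in arcs:
    rates[dst] = rates[src] * Fraction(produced, consumed)

# Scale to the smallest positive integer repetition vector.
scale = lcm(*(r.denominator for r in rates.values()))
repetitions = {actor: int(r * scale) for actor, r in rates.items()}
print(repetitions)  # → {'A': 3, 'B': 2, 'C': 1}
```

Firing A three times, B twice and C once returns every buffer to its initial state, so this schedule can repeat forever with bounded memory and no run-time scheduling decisions.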
Minimum Error Rate Training in Statistical Machine Translation
, 2003
"... Often, the training procedure for statistical machine translation models is based on maximum likelihood or related criteria. A general problem of this approach is that there is only a loose relation to the final translation quality on unseen text. In this paper, we analyze various training cri ..."
Cited by 757 (7 self)
Evaluating the Accuracy of Sampling-Based Approaches to the Calculation of Posterior Moments
 IN BAYESIAN STATISTICS
, 1992
"... Data augmentation and Gibbs sampling are two closely related, samplingbased approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and numerical accurac ..."
Cited by 604 (12 self)
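The convergence question the abstract raises is commonly checked with Geweke's diagnostic, which compares the mean of an early segment of the chain against the mean of a late segment via a z-score. A simplified sketch (plain sample variances are used here for illustration; the published diagnostic replaces them with spectral-density estimates precisely because the draws are autocorrelated):

```python
import numpy as np

def geweke_z(chain, first=0.1, last=0.5):
    """Z-score comparing the mean of the first 10% of a chain with the
    mean of the last 50%.  Large |z| suggests the chain has not settled."""
    n = len(chain)
    a = chain[: int(first * n)]
    b = chain[-int(last * n):]
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return (a.mean() - b.mean()) / se

rng = np.random.default_rng(0)
stationary = rng.normal(size=10_000)               # well-mixed draws
drifting = stationary + np.linspace(0, 5, 10_000)  # chain still trending
print(abs(geweke_z(stationary)), abs(geweke_z(drifting)))  # small vs. large
```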
Improved Boosting Algorithms Using Confidence-rated Predictions
 MACHINE LEARNING
, 1999
"... We describe several improvements to Freund and Schapire’s AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions. We give a simplified analysis of AdaBoost in this setting, and we show how this analysis can be used to find impr ..."
Abstract

Cited by 940 (26 self)
 Add to MetaCart
improved parameter settings as well as a refined criterion for training weak hypotheses. We give a specific method for assigning confidences to the predictions of decision trees, a method closely related to one used by Quinlan. This method also suggests a technique for growing decision trees which turns
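In the confidence-rated setting the abstract describes, a weak hypothesis outputs a real number whose sign is its label guess and whose magnitude is its confidence, and the distribution over examples is reweighted by exp(−y·h(x)). A minimal sketch of one reweighting round (names are illustrative; as in Schapire and Singer's formulation, the step size α is folded into the hypothesis itself):

```python
import numpy as np

def reweight(weights, margins):
    """One confidence-rated boosting round.
    margins[i] = y_i * h(x_i); mistakes have negative margin."""
    new = weights * np.exp(-margins)
    return new / new.sum()          # normalization plays the role of Z_t

w = np.full(4, 0.25)                        # uniform initial distribution
margins = np.array([0.9, 0.9, -0.5, 0.2])   # example at index 2 is misclassified
w = reweight(w, margins)
print(w.argmax())  # → 2: the misclassified example now carries the most weight
```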
Analysis of relative gene expression data using real-time quantitative PCR and the 2^(−ΔΔCT) method
 METHODS 25
, 2001
"... of the target gene relative to some reference group The two most commonly used methods to analyze data from realtime, quantitative PCR experiments are absolute quantificasuch as an untreated control or a sample at time zero tion and relative quantification. Absolute quantification deter in a time ..."
Cited by 2666 (6 self)
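The 2^(−ΔΔCT) method of the title reduces to two subtractions and one exponentiation, assuming roughly 100 % amplification efficiency for both genes. A minimal sketch with made-up Ct values (not data from the paper):

```python
# Fold change by the 2^(-ΔΔCT) method, assuming ~100 % amplification
# efficiency for both the target and the reference gene.
def fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    d_ct_sample = ct_target_sample - ct_ref_sample    # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control                # normalize to control group
    return 2 ** -dd_ct

# Target crosses threshold 2 cycles earlier (relative to the reference gene)
# in the treated sample: a 4-fold increase in expression.
print(fold_change(24.0, 18.0, 26.0, 18.0))  # → 4.0
```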
Changes in relative wages, 1963–1987: Supply and demand factors
 Quarterly Journal of Economics
, 1992
"... A simple supply and demand framework is used to analyze changes in the U. S. wage structure from 1963 to 1987. Rapid secular growth in the demand for moreeducated workers, "moreskilled " workers, and females appears to be the driving force behind observed changes in the wage structure. M ..."
Cited by 1109 (23 self)
... Measured changes in the allocation of labor between industries and occupations strongly favored college graduates and females throughout the period. Movements in the college wage premium over this period appear to be strongly related to fluctuations in the rate of growth of the supply of college graduates ...
A new mathematical model for relative quantification in real-time RT-PCR
 NUCLEIC ACIDS RES
"... Use of the realtime polymerase chain reaction (PCR) to amplify cDNA products reverse transcribed from mRNA is on the way to becoming a routine tool in molecular biology to study low abundance gene expression. Realtime PCR is easy to perform, provides the necessary accuracy and produces reliable as ..."
Cited by 1088 (4 self)
in comparison to a reference gene transcript. Therefore, a new mathematical model is presented. The relative expression ratio is calculated only from the real-time PCR efficiencies and the crossing point deviation of an unknown sample versus a control. This model needs no calibration curve. Control levels were ...
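The model the snippet outlines computes a ratio from the measured amplification efficiencies and the crossing-point deviations alone, with no calibration curve. A sketch of that efficiency-corrected ratio with hypothetical numbers (E = 2.0 means the product exactly doubles each cycle):

```python
# Efficiency-corrected expression ratio in the spirit of the model the
# abstract describes; all inputs are illustrative, not data from the paper.
def expression_ratio(e_target, e_ref, dcp_target, dcp_ref):
    """dcp_* = Cp(control) - Cp(sample) for the target and reference genes."""
    return e_target ** dcp_target / e_ref ** dcp_ref

# Target crosses 3 cycles earlier in the sample while the reference gene is
# unchanged: an 8-fold up-regulation at perfect efficiency.
print(expression_ratio(2.0, 2.0, 3.0, 0.0))  # → 8.0
```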
Inflation and Growth
, 1996
"... In recent years, many central banks have placed increased emphasis on price stability. Monetary policyâwhether expressed in terms of interest rates or growth of monetary aggregatesâhas been increasingly geared toward the achievement of low and stable inflation. Central bankers and most other obs ..."
Abstract

Cited by 3577 (23 self)
 Add to MetaCart
observers view price stability as a worthy objective because they think that inflation is costly. Some of these costs involve the average rate of inflation, and others relate to the variability and uncertainty of inflation. But the general idea is that businesses and households are thought to perform poorly
Compressive sensing
 IEEE Signal Processing Mag
, 2007
"... The Shannon/Nyquist sampling theorem tells us that in order to not lose information when uniformly sampling a signal we must sample at least two times faster than its bandwidth. In many applications, including digital image and video cameras, the Nyquist rate can be so high that we end up with too m ..."
Cited by 696 (62 self)
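The sampling-rate requirement the abstract cites can be seen numerically: a tone above half the sampling rate is indistinguishable from a lower-frequency alias. A small check (frequencies chosen for illustration):

```python
import numpy as np

# A 9 Hz tone sampled at 10 Hz (well below the 18 Hz the sampling theorem
# requires) produces exactly the same samples as a negated 1 Hz tone.
fs = 10.0                        # sampling rate, Hz
t = np.arange(20) / fs           # 2 s of sample instants
fast = np.sin(2 * np.pi * 9 * t)
slow = np.sin(2 * np.pi * 1 * t)
print(np.allclose(fast, -slow))  # → True: the 9 Hz tone aliases onto 1 Hz
```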