Results 1 - 10 of 1,678
Towards a modern theory of adaptive networks: expectation and prediction
 Psychol. Rev.
, 1981
"... Many adaptive neural network theories are based on neuronlike adaptive elements that can behave as single unit analogs of associative conditioning. In this article we develop a similar adaptive element, but one which is more closely in accord with the facts of animal learning theory than elements co ..."
Abstract

Cited by 276 (17 self)
Many adaptive neural network theories are based on neuronlike adaptive elements that can behave as single unit analogs of associative conditioning. In this article we develop a similar adaptive element, but one which is more closely in accord with the facts of animal learning theory than elements commonly studied in adaptive network research. We suggest that an essential feature of classical conditioning that has been largely overlooked by adaptive network theorists is its predictive nature. The adaptive element we present learns to increase its response rate in anticipation of increased stimulation, producing a conditioned response before the occurrence of the unconditioned stimulus. The element also is in strong agreement with the behavioral data regarding the effects of stimulus context, since it is a temporally refined extension of the Rescorla-Wagner model. We show by computer simulation that the element becomes sensitive to the most reliable, nonredundant, and earliest predictors of reinforcement. We also point out that the model solves many of the stability and saturation problems encountered in network simulations. Finally, we discuss our model in light of recent advances in the physiology and biochemistry of synaptic mechanisms. One way to bridge the gap between behavioral and neural views of learning is to postulate neural analogs of behavioral modification paradigms. Hebb's suggestion that when a cell A repeatedly and persistently takes part in firing another cell B, then A's efficiency in firing B is increased, is the most familiar of these postulates (Hebb, 1949). This rule for synaptic plasticity is a neural analog of associative conditioning and continues to exert a powerful influence on theoretical and experimental research in learning and memory. Neural network models designed to explore the behavioral possibilities of modifiable structures typically em ...
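The trial-level learning rule this abstract builds on, the Rescorla-Wagner model (which the paper refines temporally), can be sketched in a few lines. The stimulus sets, learning rate, and trial counts below are illustrative assumptions, not the paper's simulations:

```python
# Minimal Rescorla-Wagner sketch: associative strengths V move toward the
# reinforcement level lam, with the prediction error shared across all
# stimuli present on a trial. Parameter values are hypothetical.

def rescorla_wagner(trials, n_stimuli, alpha=0.1, lam=1.0):
    """Each trial is a tuple of indices of the stimuli present."""
    V = [0.0] * n_stimuli
    for present in trials:
        prediction = sum(V[i] for i in present)  # combined prediction
        error = lam - prediction                 # shared prediction error
        for i in present:
            V[i] += alpha * error                # update present stimuli only
    return V

# Blocking: stimulus 0 is trained alone, then 0 and 1 appear in compound.
# Stimulus 1 gains little strength because 0 already predicts reinforcement,
# illustrating sensitivity to nonredundant predictors.
V = rescorla_wagner([(0,)] * 50 + [(0, 1)] * 50, n_stimuli=2)
```

This captures the stimulus-context effects the abstract mentions; the paper's element additionally handles within-trial timing, which this trial-level sketch omits.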
The distribution of realized stock return volatility
, 2001
"... We examine "realized" daily equity return volatilities and correlations obtained from highfrequency intraday transaction prices on individual stocks in the Dow Jones ..."
Abstract

Cited by 245 (21 self)
We examine "realized" daily equity return volatilities and correlations obtained from highfrequency intraday transaction prices on individual stocks in the Dow Jones
Advanced Spectral Methods for Climatic Time Series
, 2001
"... The analysis of uni or multivariate time series provides crucial information to describe, understand, and predict climatic variability. The discovery and implementation of a number of novel methods for extracting useful information from time series has recently revitalized this classical eld of ..."
Abstract

Cited by 220 (35 self)
The analysis of uni- or multivariate time series provides crucial information to describe, understand, and predict climatic variability. The discovery and implementation of a number of novel methods for extracting useful information from time series has recently revitalized this classical field of study. Considerable progress has also been made in interpreting the information so obtained in terms of dynamical systems theory.
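A baseline spectral estimate that advanced methods of this kind refine is the classical periodogram. A minimal NumPy sketch using a synthetic sinusoid (not climatic data):

```python
import numpy as np

def periodogram(x, dt=1.0):
    """Classical periodogram: squared FFT amplitudes of the demeaned
    series at the nonnegative frequencies."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                      # remove the mean (zero frequency)
    n = len(x)
    freqs = np.fft.rfftfreq(n, d=dt)      # cycles per sample
    power = np.abs(np.fft.rfft(x)) ** 2 / n
    return freqs, power

# A sinusoid completing 10 cycles in 200 samples peaks at frequency 0.05.
t = np.arange(200)
freqs, power = periodogram(np.sin(2 * np.pi * 0.05 * t))
```

The advanced methods the abstract surveys exist largely because the raw periodogram is a noisy, leakage-prone estimator on short climatic records.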
Stochastic Volatility for Lévy Processes
, 2001
"... Three processes re°ecting persistence of volatility are initially formulated by evaluating three L¶evy processes at a time change given by the integral of a mean reverting square root process. The model for the mean reverting time change is then generalized to include NonGaussian models that are so ..."
Abstract

Cited by 212 (12 self)
Three processes reflecting persistence of volatility are initially formulated by evaluating three Lévy processes at a time change given by the integral of a mean-reverting square-root process. The model for the mean-reverting time change is then generalized to include non-Gaussian models that are solutions to OU (Ornstein-Uhlenbeck) equations driven by one-sided discontinuous Lévy processes permitting correlation with the stock. Positive stock price processes are obtained by exponentiating and mean correcting these processes, or alternatively by stochastically exponentiating these processes. The characteristic functions for the log price can be used to yield option prices via the fast Fourier transform. In general, mean corrected exponentiation performs better than employing the stochastic exponential. It is observed that the mean corrected exponential model is not a martingale in the filtration in which it is originally defined. This leads us to formulate and investigate the important property of martingale marginals, where we seek martingales in altered filtrations consistent with the one-dimensional marginal distributions of the level of the process at each future date.
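The core construction, a Lévy process run on a random clock given by the integral of a mean-reverting square-root (CIR) process, can be sketched with a simple Euler scheme. For brevity the Lévy process here is Brownian motion, and all parameter values are hypothetical, not the paper's calibrations:

```python
import math, random

def cir_time_change_path(n_steps, dt, kappa, theta, sigma, y0, seed=0):
    """Euler sketch: Brownian motion evaluated at the integral of a CIR
    activity rate y, so local variance rises and falls with y
    (persistent volatility via a stochastic clock)."""
    rng = random.Random(seed)
    y, clock, x = y0, 0.0, 0.0
    path = [x]
    for _ in range(n_steps):
        # CIR rate: dy = kappa*(theta - y) dt + sigma*sqrt(y) dW
        y += kappa * (theta - y) * dt + \
             sigma * math.sqrt(max(y, 0.0)) * rng.gauss(0.0, math.sqrt(dt))
        y = max(y, 0.0)                       # keep the rate nonnegative
        dtau = y * dt                         # increment of integrated clock
        clock += dtau
        x += rng.gauss(0.0, math.sqrt(dtau))  # process increment in new time
        path.append(x)
    return path, clock

path, clock = cir_time_change_path(n_steps=1000, dt=0.01,
                                   kappa=2.0, theta=1.0, sigma=0.3, y0=1.0)
```

Replacing the Gaussian increment with a one-sided or jump increment gives the non-Gaussian variants the abstract describes; the paper works with characteristic functions rather than simulation.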
Skills, Tasks and Technologies: Implications for Employment and Earnings
 Handbook of Labor Economics Volume 4, Orley Ashenfelter and
, 2010
"... A central organizing framework of the voluminous recent literature studying changes in the returns to skills and the evolution of earnings inequality is what we refer to as the canonical model, which elegantly and powerfully operationalizes the supply and demand for skills by assuming two distinct s ..."
Abstract

Cited by 173 (13 self)
A central organizing framework of the voluminous recent literature studying changes in the returns to skills and the evolution of earnings inequality is what we refer to as the canonical model, which elegantly and powerfully operationalizes the supply and demand for skills by assuming two distinct skill groups that perform two different and imperfectly substitutable tasks or produce two imperfectly substitutable goods. Technology is assumed to take a factor-augmenting form, which, by complementing either high or low skill workers, can generate skill biased demand shifts. In this paper, we argue that despite its notable successes, the canonical model is largely silent on a number of central empirical developments of the last three decades, including: (1) significant declines in real wages of low skill workers, particularly low skill males; (2) non-monotone changes in wages at different parts of the earnings distribution during different decades; (3) broad-based increases in employment in high skill and low skill occupations relative to middle skilled occupations (i.e., job ‘polarization’); (4) rapid diffusion of new technologies that directly substitute capital for labor in tasks previously performed by moderately skilled workers; and (5) expanding offshoring opportunities, enabled by technology, which allow foreign labor to substitute for domestic workers in specific tasks. Motivated by these patterns, we argue that it is valuable to consider a richer framework for analyzing how recent changes in the earnings and employment ...
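In its standard CES two-skill-group formulation, the canonical model the abstract describes reduces to a relative-wage equation: skill-biased technical change raises the skill premium while greater relative skill supply lowers it. A small sketch under that standard formulation (parameter values hypothetical):

```python
def skill_premium(AH, AL, H, L, sigma):
    """Canonical two-skill CES model: relative wage of high- to low-skill
    labor, w_H/w_L = (AH/AL)**((sigma-1)/sigma) * (H/L)**(-1/sigma),
    where sigma > 1 is the elasticity of substitution between groups."""
    return (AH / AL) ** ((sigma - 1) / sigma) * (H / L) ** (-1 / sigma)

# Skill-biased technical change (AH up) raises the premium; an expanding
# supply of high-skill workers (H up) compresses it.
biased_tech = skill_premium(AH=2.0, AL=1.0, H=1.0, L=1.0, sigma=1.4)
more_supply = skill_premium(AH=1.0, AL=1.0, H=2.0, L=1.0, sigma=1.4)
```

Because wages here are monotone in a single relative quantity, the model cannot generate the polarization or capital-for-labor substitution patterns the paper emphasizes, which motivates its richer task-based framework.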
The Virtue of Patience: Concurrent Programming with and without Waiting
, 1990
"... We consider the implementation of atomic operations that either write several shared variables, or that both read and write shared variables. We show that, in general, such operations cannot be implemented in a wMtfree manner using atomic registers. ..."
Abstract

Cited by 21 (4 self)
We consider the implementation of atomic operations that either write several shared variables, or that both read and write shared variables. We show that, in general, such operations cannot be implemented in a wait-free manner using atomic registers.
Estimation of Stochastic Volatility Models with Diagnostics
 Journal of Econometrics
, 1995
"... Efficient Method of Moments (EMM) is used to fit the standard stochastic volatility model and various extensions to several daily financial time series. EMM matches to the score of a model determined by data analysis called the score generator. Discrepancies reveal characteristics of data that stoch ..."
Abstract

Cited by 126 (11 self)
Efficient Method of Moments (EMM) is used to fit the standard stochastic volatility model and various extensions to several daily financial time series. EMM matches to the score of a model determined by data analysis called the score generator. Discrepancies reveal characteristics of data that stochastic volatility models cannot approximate. The two score generators employed here are "Semiparametric ARCH" and "Nonlinear Nonparametric". With the first, the standard model is rejected, although some extensions are accepted. With the second, all versions are rejected. The extensions required for an adequate fit are so elaborate that nonparametric specifications are probably more convenient. Corresponding author: George Tauchen, Duke University, Department of Economics, Social Science Building, Box 90097, Durham NC 27708-0097 USA, phone 919-660-1812, FAX 919-684-8974, email get@tauchen.econ.duke.edu.
DNA-binding specificity of GATA family transcription factors
, 1993
"... DNAbinding specificity of GATA family transcription factors. ..."
Abstract

Cited by 93 (6 self)
DNA-binding specificity of GATA family transcription factors.
Optimal Monetary Policy with Durable Consumption Goods
, 2005
"... We document that the durable goods sector is much more interestsensitive than the nondurables sector, and then investigate the implications of these sectoral differences for monetary policy. We formulate a twosector general equilibrium model that is calibrated both to match the sectoral response ..."
Abstract

Cited by 69 (1 self)
We document that the durable goods sector is much more interest-sensitive than the nondurables sector, and then investigate the implications of these sectoral differences for monetary policy. We formulate a two-sector general equilibrium model that is calibrated both to match the sectoral responses to a monetary shock derived from our empirical VAR, and to imply an empirically realistic degree of sectoral output volatility and comovement. While the social welfare function involves sector-specific output gaps and inflation rates, the performance of the optimal policy rule can be closely approximated by a simple rule that targets a weighted average of aggregate wage and price inflation. In contrast, a rule that stabilizes a more narrow measure of final goods price inflation performs poorly in terms of social welfare.