Results 1–10 of 154
Flexible smoothing with B-splines and penalties
 STATISTICAL SCIENCE
, 1996
Abstract

Cited by 395 (6 self)
B-splines are attractive for nonparametric modelling, but choosing the optimal number and positions of knots is a complex task. Equidistant knots can be used, but their small and discrete number allows only limited control over smoothness and fit. We propose to use a relatively large number of knots and a difference penalty on coefficients of adjacent B-splines. We show connections to the familiar spline penalty on the integral of the squared second derivative. A short overview of B-splines, their construction, and penalized likelihood is presented. We discuss properties of penalized B-splines and propose various criteria for the choice of an optimal penalty parameter. Nonparametric logistic regression, density estimation and scatterplot smoothing are used as examples. Some details of the computations are presented.
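The penalized fit described in this abstract reduces to a ridge-type linear system. A minimal numerical sketch, assuming equidistant knots, a second-order difference penalty, and made-up data (none of the values come from the paper):

```python
import numpy as np
from scipy.interpolate import BSpline

# Illustrative P-spline fit: many equidistant knots plus a second-order
# difference penalty on coefficients of adjacent B-splines.
rng = np.random.default_rng(0)
x = np.linspace(0.01, 0.99, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.size)

k = 3                                                    # cubic B-splines
t = np.r_[[0.0] * k, np.linspace(0, 1, 20), [1.0] * k]   # knot vector
B = BSpline.design_matrix(x, t, k).toarray()             # basis matrix

D = np.diff(np.eye(B.shape[1]), n=2, axis=0)   # second-difference matrix
lam = 1.0                                      # penalty parameter
# Penalized least squares: minimize ||y - B a||^2 + lam * ||D a||^2
alpha = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
fit = B @ alpha
```

Varying `lam` trades smoothness against fit, which is what makes a criterion for the penalty parameter (rather than knot placement) the central choice.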
Model Selection and Model Averaging in Phylogenetics: Advantages of Akaike Information Criterion and Bayesian Approaches Over Likelihood Ratio Tests
, 2004
Abstract

Cited by 377 (8 self)
Model selection is a topic of special relevance in molecular phylogenetics that affects many, if not all, stages of phylogenetic inference. Here we discuss some fundamental concepts and techniques of model selection in the context of phylogenetics. We start by reviewing different aspects of the selection of substitution models in phylogenetics from a theoretical, philosophical and practical point of view, and summarize this comparison in table format. We argue that the most commonly implemented model selection approach, the hierarchical likelihood ratio test, is not the optimal strategy for model selection in phylogenetics, and that approaches like the Akaike Information Criterion (AIC) and Bayesian methods offer important advantages. In particular, the latter two methods are able to simultaneously compare multiple nested or non-nested models, assess model selection uncertainty, and allow for the estimation of phylogenies and model parameters using all available models (model-averaged inference or multimodel inference). We also describe how the relative importance of the different parameters included in substitution models can be depicted. To illustrate some of these points, we have applied AIC-based model averaging to 37 mitochondrial DNA sequences from the subgenus Ohomopterus (genus Carabus) ground beetles described by Sota and Vogler (2001).
How to Tell When Simpler, More Unified, or Less Ad Hoc Theories will Provide More Accurate Predictions
, 1994
Abstract

Cited by 117 (32 self)
Traditional analyses of the curve fitting problem maintain that the data do not indicate what form the fitted curve should take. Rather, this issue is said to be settled by prior probabilities, by simplicity, or by a background theory. In this paper, we describe a result due to Akaike [1973], which shows how the data can underwrite an inference concerning the curve’s form based on an estimate of how predictively accurate it will be. We argue that this approach throws light on the theoretical virtues of parsimoniousness, unification, and non ad hocness, on the dispute about Bayesianism, and on empiricism and scientific realism.
Key Concepts in Model Selection: Performance and Generalizability
 Journal of Mathematical Psychology
, 2000
Abstract

Cited by 69 (13 self)
methods of model selection, and how do they work? Which methods perform better than others, and in what circumstances? These questions rest on a number of key concepts in a relatively underdeveloped field. The aim of this essay is to explain some background concepts, highlight some of the results in this special issue, and to add my own. The standard methods of model selection include classical hypothesis testing, maximum likelihood, Bayes method, minimum description length, cross-validation and Akaike’s information criterion. They all provide an implementation of Occam’s razor, in which parsimony or simplicity is balanced against goodness-of-fit. These methods primarily take account of the sampling errors in parameter estimation, although their relative success at this task depends on the circumstances. However, the aim of model selection should also include the ability of a model to generalize to predictions in a different domain. Errors of extrapolation, or generalization, are different from errors of parameter estimation. So, it seems that simplicity and parsimony may be an additional factor in managing these errors, in which case the standard methods of model selection are incomplete implementations of Occam’s razor.
1. WHAT IS MODEL SELECTION? William of Ockham (1285–1347/49) will always be remembered for his famous postulation of Ockham’s razor (also spelled ‘Occam’), which states that entities are not to be multiplied beyond necessity. In a similar vein, Sir Isaac Newton’s first rule of hypothesizing instructs us that we are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances. While they ... This paper is derived from a presentation at the Methods of Model Selection symposium at Indiana University.
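The balance between parsimony and goodness-of-fit that these methods implement can be made concrete with AIC itself. A generic one-line sketch, not tied to any particular example in the essay:

```python
# AIC = -2 log L + 2k: goodness-of-fit (log-likelihood L) is balanced
# against parsimony (parameter count k); lower AIC is preferred.
def aic(log_likelihood, k):
    return -2.0 * log_likelihood + 2.0 * k
```

For instance, a better-fitting model (log-likelihood -8 with 6 parameters, AIC 28) can still lose to a simpler one (log-likelihood -10 with 3 parameters, AIC 26) once the penalty term is paid.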
Construction and use of linear regression models for processor performance analysis
 In Proc. 12th IEEE Symposium on High Performance Computer Architecture
, 2006
Abstract

Cited by 63 (2 self)
Processor architects have a challenging task of evaluating a large design space consisting of several interacting parameters and optimizations. In order to assist architects in making crucial design decisions, we build linear regression models that relate processor performance to microarchitectural parameters, using simulation-based experiments. We obtain good approximate models using an iterative process in which Akaike’s information criterion is used to extract a good linear model from a small set of simulations, and limited further simulation is guided by the model using D-optimal experimental designs. The iterative process is repeated until desired error bounds are achieved. We used this procedure to establish the relationship of the CPI performance response to 26 key microarchitectural parameters using a detailed cycle-by-cycle superscalar processor simulator. The resulting models provide a significance ordering on all microarchitectural parameters and their interactions, and explain the performance variations of microarchitectural techniques.
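The AIC-guided search for a good linear model can be sketched as simple forward selection over regressors. The data below are synthetic stand-ins for the simulator outputs, and the D-optimal design step is omitted:

```python
import numpy as np

def ols_aic(X, y):
    # Gaussian AIC for an ordinary least squares fit:
    # n * log(RSS / n) + 2 * (p + 1), p coefficients plus the noise variance.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    n, p = X.shape
    return n * np.log(rss / n) + 2 * (p + 1)

# Hypothetical "performance response": depends strongly on parameters 0 and 2.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))
y = 2 * X[:, 0] - 3 * X[:, 2] + rng.normal(0, 0.5, 100)

selected, remaining, best = [], list(range(6)), np.inf
while remaining:
    # Greedily add whichever parameter lowers AIC the most; stop when
    # no addition improves the criterion.
    scores = {j: ols_aic(X[:, selected + [j]], y) for j in remaining}
    j, s = min(scores.items(), key=lambda kv: kv[1])
    if s >= best:
        break
    selected.append(j)
    remaining.remove(j)
    best = s
```

The order in which parameters enter gives the kind of significance ordering the abstract describes; the paper's actual procedure additionally chooses which new simulations to run.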
Saccade target selection in frontal eye field of macaque. I. Visual and premovement activation
 The Journal of Neuroscience
, 1995
Abstract

Cited by 50 (14 self)
We investigated how the brain selects the targets for eye movements, a process in which the outcome of visual processing is converted into guided action. Macaque monkeys were trained to make a saccade to fixate a salient target presented either alone or with multiple distracters during visual search. Neural activity was recorded in the frontal eye field, a cortical area at the interface of visual processing and eye movement production. Neurons discharging after stimulus presentation and before saccade initiation were analyzed. The initial visual response of frontal eye field neurons was modulated by the presence of multiple stimuli and by whether a saccade was going to be produced, but the initial visual response did not discriminate the target of the search array from the distracters. In the latent period before saccade initiation, the activity of most
Effects of species’ ecology on the accuracy of distribution models
 Ecography
, 2007
Abstract

Cited by 36 (4 self)
In the face of accelerating biodiversity loss and limited data, species distribution models which statistically capture and predict species’ occurrences based on environmental correlates are increasingly used to inform conservation strategies. Additionally, distribution models and their fit provide insights on the broad-scale environmental niche of species. To investigate whether the performance of such models varies with species’ ecological characteristics, we examined distribution models for 1329 bird species in southern and eastern Africa. The models were constructed at two spatial resolutions with both logistic and autologistic regression. Satellite-derived environmental indices served as predictors, and model accuracy was assessed with three metrics: sensitivity, specificity and the area under the curve (AUC) of receiver operating characteristics plots. We then determined the relationship between each measure of accuracy and ten ecological species characteristics using generalised linear models. Among the ecological traits tested, species’ range size, migratory status, affinity for wetlands and endemism proved most influential on the performance of distribution models. The number of habitat types frequented (habitat tolerance), trophic rank, body mass, preferred habitat structure and association with sub-resolution habitats also showed some effect. In contrast, conservation status made no significant impact. These findings did not differ from one spatial resolution to the next. Our analyses thus provide conservation scientists and resource managers with a rule of thumb that helps distinguish, on the basis of ecological traits, between species whose ...
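Of the three accuracy metrics named here, AUC is the least obvious to compute. A minimal rank-based sketch with hypothetical labels and scores, unrelated to the bird data:

```python
import numpy as np

def auc(y_true, scores):
    # AUC via its rank interpretation: the fraction of (presence, absence)
    # pairs that the model orders correctly, with ties counted as half.
    y_true, scores = np.asarray(y_true), np.asarray(scores)
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    diff = pos[:, None] - neg[None, :]   # all presence-vs-absence pairs
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()
```

A perfect ranking gives 1.0 and an uninformative one gives 0.5, which is why AUC complements threshold-dependent sensitivity and specificity.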
Adaptation to stable and unstable dynamics achieved by combined impedance control and inverse dynamics model
 J. Neurophysiol
, 2003
Abstract

Cited by 30 (7 self)
... and Theodore E. Milner. Adaptation to stable and unstable dynamics achieved by combined impedance control and inverse dynamics model. J Neurophysiol 90: 3270–3282, 2003; 10.1152/jn.01112.2002. This study compared adaptation in novel force fields where trajectories were initially either stable or unstable to elucidate the processes of learning novel skills and adapting to new environments. Subjects learned to move in a null force field (NF), which was unexpectedly changed either to a velocity-dependent force field (VF), which resulted in perturbed but stable hand trajectories, or a position-dependent divergent force field (DF), which resulted in unstable trajectories. With practice, subjects learned to compensate for the perturbations produced by both force fields. Adaptation was characterized by an initial increase in the activation of all muscles followed by a gradual reduction. The time course of the increase in activation was correlated with a reduction in hand-path error for the DF but not for the VF.
Understanding Long-Range Correlations in DNA Sequences
 PHYSICA D
, 1994
Abstract

Cited by 28 (6 self)
In this paper, we review the literature on statistical long-range correlation in DNA sequences. We examine the current evidence for these correlations, and conclude that a mixture of many length scales (including some relatively long ones) in DNA sequences is responsible for the observed 1/f-like spectral component. We note the complexity of the correlation structure in DNA sequences. The observed complexity often makes it hard, or impossible, to decompose the sequence into a few statistically stationary regions. We suggest that, based on the complexity of DNA sequences, a fruitful approach to understand long-range correlation is to model duplication, and other rearrangement processes, in DNA sequences. One model, called "expansion-modification system", contains only point duplication and point mutation. Though simplistic, this model is able to generate sequences with 1/f spectra. We emphasize the importance of DNA duplication in its contribution to the observed long-range ...
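The expansion-modification system is simple enough to simulate directly. The sketch below assumes a binary alphabet and one simple reading of the rules (each symbol either mutates with probability p or is duplicated); the parameter values are illustrative, not taken from the cited model:

```python
import numpy as np

def expansion_modification(seq, p, steps, rng):
    # One simplified reading of the expansion-modification system:
    # every symbol either undergoes point mutation (probability p)
    # or point duplication (probability 1 - p) at each step.
    for _ in range(steps):
        out = []
        for s in seq:
            if rng.random() < p:
                out.append(1 - s)      # point mutation
            else:
                out.extend([s, s])     # point duplication
        seq = out
    return seq

seq = expansion_modification([0], p=0.1, steps=10, rng=np.random.default_rng(0))
```

Duplication makes the sequence grow geometrically while mutation injects variation at every scale, which is the intuition behind such models producing long-range, 1/f-like correlations.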
Loosening the constraints on illusory conjunctions: Assessing the roles of exposure duration and attention.
 Journal of Experimental Psychology: Human Perception and Performance,
, 1995