Results 1–10 of 25
Clever homunculus: Is there an endogenous act of control in the explicit task-cuing procedure?
 Journal of Experimental Psychology: Human Perception and Performance
, 2003
Abstract

Cited by 103 (20 self)
Does the explicit task-cuing procedure require an endogenous act of control? In 5 experiments, cues indicating which task to perform preceded targets by several stimulus onset asynchronies (SOAs). Two models were developed to account for changes in reaction time (RT) with SOA. Model 1 assumed an endogenous act of task switching for cue alternations but not for cue repetitions. Model 2 assumed no such act. In Experiments 1 and 2, the cue was masked or not masked. Masking interacted underadditively with repetition and alternation, consistent with Model 2 but not Model 1. In Experiments 3 and 4, 2 cues were used for each task. RT was slower for task repetition than for cue repetition and about the same as RT for task alternation, consistent with Model 2 but not Model 1. The results suggest that the explicit task-cuing procedure does not require an endogenous act of control. Clever Hans was a remarkable horse who could add, subtract, multiply, and divide numbers, working with fractions as well as integers. His owner, von Osten, would ask him questions and Hans would tap out the answers with his hoof. An early experimental psychologist, Oskar Pfungst (1907, 1911), investigated Hans’s ability and found that the horse responded to subtle visual cues
How to Fit a Response Time Distribution
Abstract

Cited by 89 (1 self)
Among the most valuable tools in behavioral science is statistically fitting mathematical models of cognition to data, response time distributions in particular. However, techniques for fitting distributions vary widely and little is known about the efficacy of different techniques. In this article, we assessed several fitting techniques by simulating six widely cited models of response time and using the fitting procedures to recover model parameters. The techniques include the maximization of likelihood and least-squares fits of the theoretical distributions to different empirical estimates of the simulated distributions. A running example was used to illustrate the different estimation and fitting procedures. The simulation studies revealed that empirical density estimates are biased even for very large sample sizes. Some fitting techniques yielded more accurate and less variable parameter estimates than others. Methods that involved least-squares fits to density estimates generally yielded very poor parameter estimates. The importance of considering the entire response time (RT) distribution in testing formal models of cognition is now widely appreciated. Fitting a model to mean RT alone can mask important details of the data that examination of the entire distribution would reveal, such as the behavior of fast and slow responses across the conditions of an experiment (e.g., Heathcote, Popiel & Mewhort, 1991), the extent of facilitation between perceptual channels (Miller, 1982), and the effects of practice on RT quantiles (Logan, 1992). Techniques for testing hypotheses based on the RT distribution have been developed (Townsend, 1990). In addition, the RT distribution provides an important meeting ground between theory and da...
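The maximum-likelihood approach discussed in this abstract can be sketched in a few lines. The example below fits an ex-Gaussian (a common descriptive RT distribution, a normal plus an exponential component) to simulated data. The parameter values, and the use of SciPy's `exponnorm` (which parameterizes the ex-Gaussian with shape K = τ/σ, loc = μ, scale = σ), are illustrative assumptions, not the article's own code.

```python
# Hedged sketch: maximum-likelihood fit of an ex-Gaussian to simulated RTs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu, sigma, tau = 0.40, 0.05, 0.15                  # "true" parameters, in seconds
rts = rng.normal(mu, sigma, 2000) + rng.exponential(tau, 2000)

# scipy.stats.exponnorm is the exponentially modified normal;
# .fit performs generic maximum-likelihood estimation.
K, loc, scale = stats.exponnorm.fit(rts)
mu_hat, sigma_hat, tau_hat = loc, scale, K * scale  # map back to (mu, sigma, tau)
print(round(mu_hat, 3), round(sigma_hat, 3), round(tau_hat, 3))
```

With a couple of thousand simulated trials the estimates land close to the generating values, which is the parameter-recovery logic the article's simulations rely on.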
Statistical Mimicking of Reaction Time Data: Single Process Models, Parameter Variability and Mixtures
Abstract

Cited by 29 (3 self)
Statistical mimicking issues involving reaction time measures are introduced and discussed in this article. Often, discussions of mimicking have concerned the question of the serial vs. parallel processing of inputs to the cognitive system. We will demonstrate that there are several alternative structures that mimic various existing models in the literature. In particular, single process models have been neglected in this area. When parameter variability is incorporated into single process models, resulting in discrete or continuous mixtures of reaction time distributions, the observed reaction time distribution alone is no longer as useful in allowing inferences to be made about the architecture of the process that produced it. Many of the issues are raised explicitly in examination of four different case studies of mimicking. Rather than casting a shadow over the use of quantitative methods in testing models of cognitive processes, these examples emphasize the importance of examining reaction time data armed with the tools of quantitative analysis, the importance of collecting data from the context of specific process models, and also the importance of expanding the data base to include other dependent measures.
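The central point, that parameter variability turns a single-process model into a mixture of RT distributions, can be illustrated with a minimal simulation. The gamma finishing-time model and all parameter values below are assumptions chosen for the sketch, not taken from the article.

```python
# Hedged sketch: trial-to-trial parameter variability in a single-process
# model produces a discrete mixture of RT distributions.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
shape = 4.0
scale_fast, scale_slow = 0.05, 0.12          # two latent parameter states
state = rng.random(n) < 0.5                  # 50/50 discrete mixture
scales = np.where(state, scale_fast, scale_slow)
rts = rng.gamma(shape, scales)               # one gamma process, varying scale

# Law of total variance: the mixture's variance exceeds the average
# within-state variance, even though each component is a plain gamma,
# so the observed distribution alone underdetermines the architecture.
within = 0.5 * shape * scale_fast**2 + 0.5 * shape * scale_slow**2
print(round(rts.var(), 4), round(within, 4))
```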
Deriving exact predictions from the cascade model
 Psychological Review
, 1982
Abstract

Cited by 27 (0 self)
McClelland's (1979) cascade model is investigated, and it is shown that the model does not have a well-defined reaction time (RT) distribution function because it always predicts a nonzero probability that a response never occurs. By conditioning on the event that a response does occur, RT density and distribution functions are derived, thus allowing most RT statistics to be computed directly and eliminating the need for computer simulations. Using these results, an investigation of the model revealed that (a) it predicts mean RT additivity in most cases of pure insertion or selective influence; (b) it predicts only a very small increase in standard deviations as mean RT increases; and (c) it does not mimic the distribution of discrete-stage models that have a serial stage with an exponentially distributed duration. Recently, McClelland (1979) proposed a continuous-time linear systems model of simple cognitive processes based on sequential banks of parallel integrators. This model, referred to by
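The conditioning step the abstract describes can be written compactly. If the model assigns probability p < 1 that a response ever occurs, so that its raw ("defective") distribution F*(t) never reaches 1, the usable RT distribution and density are obtained by conditioning on response occurrence (notation mine, a standard construction):

```latex
p = \lim_{t\to\infty} F^{*}(t) < 1,
\qquad
F(t \mid \text{response occurs}) = \frac{F^{*}(t)}{p},
\qquad
f(t \mid \text{response occurs}) = \frac{f^{*}(t)}{p}.
```

Dividing by p renormalizes the defective distribution so that F(t) → 1, which is what lets ordinary RT statistics be computed directly.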
Selective influence and response time cumulative distribution functions in serial–parallel task networks
 Journal of Mathematical Psychology
, 2000
Abstract

Cited by 16 (3 self)
We analyze sets of mental processes, some of which are concurrent and some of which are sequential, under the assumption that the processes are partially ordered, that is, arranged in a directed acyclic network. Information about the process arrangement can be discovered by examining the effects on response time of selectively influencing process durations. Previous work has mainly focused on analyses of mean response times. Here we consider analyses based on cumulative distribution functions, for one of the major classes of directed acyclic networks, serial-parallel networks. When two processes are selectively influenced, patterns in the cumulative distribution functions can be used to test whether the processes are sequential or concurrent and whether the task network has AND gates or OR gates. Cumulative distribution functions are potentially more informative than means, and some previous results for means are shown to follow from our results for cumulative distribution functions. © 2000 Academic Press. Considerable evidence indicates that the mental processes involved in some information processing tasks are neither entirely serial nor entirely parallel. Many alternative forms of processing are conceivable; one of the plausible and useful forms is a directed acyclic network, such as the one in Fig. 1. The network represents a combination of sequential and concurrent processing. The stimulus is presented at o, and processes x and z begin immediately and concurrently. Process y begins as soon as x is finished. The response is made at r as soon as both y and z are finished, if there is an AND gate, or as soon as either y or z is finished, if there is an OR gate. There are many examples of tasks for which good accounts of response time data have been given by directed acyclic networks (although not always by that name). doi:10.1006/jmps.1999.1268
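The AND/OR-gate distinction for the example network (x followed by y, running concurrently with z) can be sketched by Monte Carlo. The exponential durations below are illustrative assumptions; the point is that under independence the two gates order the cumulative distribution functions at every time point, since for an AND gate the path CDFs multiply while for an OR gate the survivor functions multiply.

```python
# Hedged sketch of the AND/OR-gate distinction for the network described
# above: x feeds y in sequence, z runs concurrently with both.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
x = rng.exponential(0.10, n)     # process x, starts at stimulus onset
y = rng.exponential(0.15, n)     # y starts when x finishes
z = rng.exponential(0.20, n)     # z runs concurrently

rt_and = np.maximum(x + y, z)    # AND gate: respond when BOTH paths finish
rt_or  = np.minimum(x + y, z)    # OR gate:  respond when EITHER path finishes

# At any time t, F_AND(t) = F_{x+y}(t) * F_z(t) <= F_OR(t).
t = 0.25
print(round((rt_and <= t).mean(), 3), round((rt_or <= t).mean(), 3))
```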
Parallel processing response times and experimental determination of the stopping rule
 Journal of Mathematical Psychology
, 1997
Abstract

Cited by 13 (4 self)
It was formerly demonstrated that virtually all reasonable exhaustive serial models, and a more constrained set of exhaustive parallel models, cannot predict critical effects associated with self-terminating models. The present investigation greatly generalizes the parallel class of models covered by similar “impossibility” theorems. Specifically, we prove that if an exhaustive parallel model is not super capacity, and if targets are processed at least as fast as nontargets, then it cannot predict such (self-terminating) effects. Such effects are ubiquitous in the experimental literature, offering strong confirmation for self-terminating processing. © 1997 Academic Press
A note on the stop-signal paradigm, or how to observe the unobservable
 Psychological Review
, 1990
Abstract

Cited by 11 (0 self)
A new theoretical analysis of the stop-signal paradigm is proposed. With the concepts of crude and net hazard functions, the non-observable control-latency distribution can be estimated from observable reaction times. This result allows a test of Logan & Cowan's (1984) model without any simplifying assumptions. In studies using response time as the major dependent variable, the stop-signal procedure has been used by several investigators in an attempt to reveal the time course of hypothesized underlying sensory or cognitive processes or both (e.g., Lappin & Eriksen, 1966; Ollman, 1973). In this paradigm, subjects are given a primary reaction time task to perform, and a stop signal that tells subjects to withhold their responses is sometimes presented shortly after the primary stimulus. Logan and Cowan (1984) presented a theory of inhibition of thought and action that focuses on this paradigm. The purpose of this note is to present a new theoretical analysis of the stop-signal paradigm
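Logan and Cowan's (1984) model treats stopping as a race between a go process and a stop process: the response escapes inhibition when the go process finishes before the stop process. A minimal simulation of that race (the distributions and the fixed stop-signal reaction time below are illustrative assumptions) reproduces the characteristic inhibition function, the probability of responding despite the stop signal growing with stop-signal delay.

```python
# Hedged sketch of the race view of the stop-signal paradigm.
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
go = rng.normal(0.45, 0.08, n)            # go-process finishing times (s)
ssrt = 0.20                               # assumed stop-signal reaction time (s)

# A response occurs on a stop trial when the go process beats the stop
# process, i.e. when go RT < stop-signal delay + SSRT.
delays = (0.10, 0.20, 0.30)
probs = [(go < d + ssrt).mean() for d in delays]
print({d: round(p, 3) for d, p in zip(delays, probs)})
```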
Empirical recovery of response time decomposition rules I: Sample-level decomposition tests
 Journal of Mathematical Psychology
, 1996
Abstract

Cited by 9 (7 self)
Psychology, 39, 285–314) developed a mathematical theory for the decomposability of response time (RT) into two component times that are selectively influenced by different factors and are either stochastically independent or perfectly positively stochastically interdependent (in which case they are increasing functions of a common random variable). In this theory, RT is obtained from its component times by means of an associative and commutative operation. For any such operation, there is a decomposition test, a relationship between observable RT distributions that holds if and (under mild constraints) only if the RTs are decomposable by means of this operation. In this paper, we construct a sample-level version of these decomposition tests that serves to determine whether RTs that are represented by finite samples are decomposable by means of a given operation (under a given form of stochastic relationship between component times, independence or perfect positive interdependence). The decision is based on the asymp...
Mental architectures with selectively influenced but stochastically interdependent components
, 2003
Abstract

Cited by 9 (4 self)
The way external factors influence distribution functions for the overall time required to perform a mental task (such as responding to a stimulus, or solving a problem) may be informative as to the underlying mental architecture, the hypothetical network of interconnected processes some of which are selectively influenced by some of the external factors. Under the assumption that all processes contributing to the overall performance time are stochastically independent, several basic results have been previously established. These results relate patterns of response time distribution functions produced by manipulating external factors to such questions as whether the hypothetical constituent processes in the mental architecture enter AND gates or OR gates, and whether pairs of processes are sequential or concurrent. The present study shows that all these results are also valid for stochastically interdependent component times, provided the selective dependence of these components upon external factors is understood within the framework of a recently proposed theory of selective influence. According to this theory each component is representable as a function of three arguments: the factor set selectively influencing it, a component-specific source of randomness, and a source of randomness shared by all the components.
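The three-argument representation in the final sentence can be sketched directly. The particular functional form below is an illustrative assumption; the point is only that a shared source of randomness makes selectively influenced components stochastically interdependent, while each factor still influences only its own component.

```python
# Hedged sketch: each component time is a function of (its own factor,
# a component-specific source of randomness, a shared source).
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
C  = rng.exponential(0.10, n)     # source of randomness shared by all components
S1 = rng.exponential(0.05, n)     # component-specific sources
S2 = rng.exponential(0.05, n)

def component(factor, S, C):
    # T = f(factor, S, C); the linear form is an assumption for the sketch
    return 0.1 * factor + S + C

T1 = component(2.0, S1, C)        # its factor influences only T1
T2 = component(3.0, S2, C)        # its factor influences only T2

# The shared source C makes the component times correlated:
r = np.corrcoef(T1, T2)[0, 1]
print(round(r, 3))
```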
On the whats and hows of retrieval in the acquisition of a simple skill
 Journal of Experimental Psychology: Learning, Memory and Cognition
, 1999
Abstract

Cited by 5 (2 self)
Two general views on the role of memory in cognitive skills—an instance-based theory and an associative perspective—were compared with respect to their general assumptions about the information involved and the processes that operate on that information. Characteristics of memory information were examined in terms of predictions for transfer to various stimulus forms as a function of 2 types of learning conditions. Characteristics of memory processes were examined using a set of general process models. Results of 4 experiments indicate that (a) neither theoretical perspective was capable of accounting for all the observed transfer effects, indicating needed refinements to informational assumptions, and that (b) 1 class of process assumptions was consistently supported, whereas other classes were consistently contradicted, indicating a general set of process characteristics that can be used in further model development. The assumption that memory—retained information and the processes that operate on it—plays an important role in the acquisition and expression of cognitive skills (such as mathematics, chess, music composition, etc.) is a common one. Yet, despite the agreement that memory is critical, the nature of the memory information and the processes that operate on it have tended to be considered separately (e.g., see the general discussions in Massaro, 1998; Thomas, 1996) and in model-specific rather than contrastive terms