Results 1-10 of 24
The potential of using quantum theory to build models of cognition.
Topics in Cognitive Science, 2013
Abstract

Cited by 13 (6 self)
Abstract Quantum cognition research applies abstract, mathematical principles of quantum theory to inquiries in cognitive science. It differs fundamentally from alternative speculations about quantum brain processes. This topic presents new developments within this research program. In the introduction to this topic, we try to answer three questions: Why apply quantum concepts to human cognition? How is quantum cognitive modeling different from traditional cognitive modeling? What cognitive processes have been modeled using a quantum account? In addition, a brief introduction to quantum probability theory and a concrete example is provided to illustrate how a quantum cognitive model can be developed to explain paradoxical empirical findings in the psychological literature.

Keywords: Quantum probability; Classical probability; Cognitive process; Compatibility; Entanglement; Non-Boolean logic

With astonishing counterintuitive ramifications, quantum theory is the best empirically confirmed scientific theory in human history. It is "essential to every natural science," and its practical applications, such as the laser and the transistor, form the direct basis of at least one-third of our current economy (Rosenblum & Kuttner, 2006, p. 81). However, applying quantum theory to human cognition is not merely a simple extension of a most successful scientific theory. Rather, this endeavor is driven by a myriad of puzzling findings and stubborn challenges in the psychological literature, by deep resonances between basic notions of quantum theory and psychological conceptions and intuitions, and by the exhibited potential of the theory to provide coherent and mathematically principled explanations for the puzzles and challenges in human cognitive research.

Correspondence should be sent to Zheng Wang, School of Communication, The Ohio State University, 3045 Derby Hall, 154 N Oval Mall, Columbus, OH 43210-1339. Wang, Z., Busemeyer, J. R., Atmanspacher, H., & Pothos, E. M. (2013). The potential of using quantum theory to build models of cognition. Topics in Cognitive Science, 5, 672-688. doi: 10.1111/tops.12043

Given the still nascent status of quantum cognition research, it is important to note that it differs from the approaches which treat (parts of) the brain literally as material quantum systems or a quantum computer (e.g., …).

This topic presents new developments within the quantum cognition modeling research program. In the introduction to this special issue, we try to answer three questions: Why apply quantum concepts to human cognition? How is quantum cognitive modeling different from traditional cognitive modeling? What cognitive processes have been modeled using a quantum account? In addition, a brief introduction to quantum probability theory can be found in Appendix A.

Why use quantum theory to build models of human cognition?

Quantum physics was created to explain puzzling findings that were impossible to understand using traditional classical physical theories. In the process of creating quantum mechanics, physicists also had to accept basically new ways of thinking that ultimately included a novel understanding of probabilities. Currently, we see a similar development happening in areas of cognitive science. Previously, almost all cognitive modeling research has relied on principles derived from classical probability theory and mathematical principles of classical mechanics. Although tremendous progress has been achieved in understanding cognition in this way, empirical findings have accumulated which seem puzzling within the framework of classical probability theory. For example, in decision research, to explain all kinds of decision fallacies and biases, various toolboxes of heuristics had to be proposed to bypass or patch the classical theoretical framework (e.g., …).

First, the challenge of formalizing psychological concepts of conflict, ambiguity, and uncertainty.
Traditional cognitive models assume that at each moment, a person is in a definite (technically, nondispersive) state with respect to certain judgment and cognition; however, the person's true state is unknown to the modeler at each moment, and thus the model can only assign a probability to a cognitive response with some value at each moment.

Third, the challenge of formalizing order effects of cognitive measurements. The change in cognitive states that results from answering one question can cause a person to respond differently to subsequent questions.

Fifth, the challenge of understanding nondecomposability of cognition. Conventionally in cognitive science, concepts and processes are assumed to be "decomposable" so that the whole can be understood in terms of its constituent components. This decomposability is reflected in the common assumption that there exists a complete joint probability distribution across all measurements on the cognitive system, from which one can reconstruct a pairwise joint distribution for any pair of measurements. However, the psychological literature has supplied ample findings indicating otherwise. For example, facial cognition is hallmarked by strong reliance on holistic properties of faces (e.g., …).

How do quantum models differ from traditional models?

Four groups of key concepts and principles of quantum theory best illustrate how quantum cognitive models differ from traditional cognitive models in the study of cognitive processes. These four groups closely relate to each other but emphasize different features of quantum theory. An illustrative example using some of the key concepts and principles can be found in Appendix A.

First, dispersive states and quantum probabilities. When Heisenberg and Schrödinger presented their different versions of the new quantum mechanics in terms of "matrix mechanics" and "wave mechanics" in the mid-1920s, their work enabled enormous, rapid progress in understanding the behavior of the microworld.
In three articles during 1927-1929 (reprinted in 1962), von Neumann rigorously proved the equivalence of matrix and wave mechanics, and introduced the mathematical framework of Hilbert space and operator calculus to develop a statistical ensemble formalism for quantum theory. It is summarized in his monograph, Mathematical Foundations of Quantum Mechanics (1932), which is still regarded as the standard reference for orthodox quantum theory. Von Neumann's approach is statistical because the predictions of the theory refer to ensembles of experimental results, so that only probabilistic predictions are possible for individual results. It is crucial to note that the rules for quantum probabilities differ from those of classical probabilities. Not only mixed ensemble states but also pure quantum superposition states are dispersive: They cannot be represented pointwise, as we are used to doing in classical point mechanics. Born's rule gives the probabilities of measurement outcomes in such states.

States of individual quantum systems are called pure states, which are represented as vectors in a vector space (the Hilbert space). Pure states are dispersive, and every superposition of pure states is another pure state. Another class of dispersive states is mixed states, which are states of statistical ensembles of pure states with different probabilities. They are also called statistical states. While dispersive pure states are considered the source of a "genuine" (ontic) randomness in quantum theory, the dispersion of mixed states is an ensemble property, often interpreted as expressing (epistemic) incomplete knowledge about the exact state of the system.

The usual way of formulating Bell inequalities refers to measurements of pairs of spatially separated variables. However, it is also possible to derive temporal Bell inequalities.

Third, noncommuting observables and complementarity. Another distinction between classical and quantum behavior is whether observables (i.e., measurements) are commuting or noncommuting.
While systems studied in classical physics are restricted to commuting observables, quantum systems can have both commuting (e.g., mass and charge) and noncommuting observables (e.g., position and momentum). As discussed earlier, the noncommutativity of observables A and B can be characterized as AB ≠ BA. This expresses an order effect unknown for classical observables, which obey the law of commutativity, AB = BA. Any observable can be decomposed into its eigenstates. If observables A and B do not commute, they are incompatible and have no complete orthonormal system of common eigenstates; that is, they may share some but not all eigenstates.

As observables in quantum theory play a double role as codifications of properties of a system and operations on the state of a system, they can also be considered "actions." In this sense, noncommutativity means that the action sequence of B and then A on an initial state leads to a result different from what is produced by reversing the sequence of the actions. The uncertainty relations introduced by Heisenberg (1927) are a direct consequence of this novel feature of noncommuting observables in quantum theory. As mentioned above, maximally incompatible observables are said to be complementary.

Fourth, non-Boolean and partial Boolean logic, and complementarity. In the mid-1930s, von Neumann realized a number of severe limitations of his original framework of quantum theory and proposed two generalizations. One was an algebraic approach, which expressed quantum theory in terms of generally noncommutative operator algebras. Quantum logic entails a violation of the Boolean law of the excluded middle, or tertium non datur, as shown in an early lattice-theoretical formulation of quantum logic.

The above characterization, condensed and abstract, is intended to underline how general quantum principles can be formulated.
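As a hedged numerical sketch (added here for illustration, not part of the original article), two of the features above, dispersive superposition states with Born-rule probabilities and noncommuting observables with order effects, can be demonstrated with projectors in a two-dimensional real Hilbert space. The state vector and the question angles below are arbitrary choices.

```python
import numpy as np

def projector(theta):
    """Projector onto the ray at angle theta in a 2-D real Hilbert space."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

# A pure superposition state as a unit vector (angle chosen arbitrarily).
psi = np.array([np.cos(1.0), np.sin(1.0)])

# Two yes/no "questions" modeled as projectors onto non-orthogonal rays.
A = projector(0.0)
B = projector(np.pi / 4)

# Born's rule: the probability of a "yes" answer to question A is the
# squared length of the state's projection onto A's "yes" subspace.
p_a = np.linalg.norm(A @ psi) ** 2

# Noncommutativity: the two measurement "actions" do not commute.
print(np.allclose(A @ B, B @ A))         # False

# Order effect: answering A then B differs from answering B then A.
p_ab = np.linalg.norm(B @ A @ psi) ** 2  # "yes" to A, then "yes" to B
p_ba = np.linalg.norm(A @ B @ psi) ** 2  # "yes" to B, then "yes" to A
print(p_ab != p_ba)                      # True: the order matters
```

In toy form, this is the structure behind the question-order effects discussed later: the sequence of two incompatible measurements changes the final observed probabilities.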
Appendix A provides a concrete example which illustrates how a quantum cognitive model can be developed to explain paradoxical empirical findings in the psychological literature.

Current quantum cognitive models

The tightly connected key features of quantum theory, as sketched in the preceding sections, illustrate the basic procedures for using quantum theory to understand human cognition. As mentioned, some of these key conceptions, most notably complementarity, were proposed by psychologists before they proved to be essential for quantum physics. However, in quantum physics, these conceptions were rigorously formalized to enable precise empirical predictions and testing. The basic notions of quantum theory represent essential features not only of how nature is organized but also of how our fields of knowledge are organized, which is a key focus of cognitive science. Some of the founding fathers of quantum theory, most ardently Niels Bohr, were convinced that the theory's central notions would prove meaningful outside of physics, in areas such as psychology and philosophy. The first steps to utilize Bohr's proposal were made in the 1990s by Aerts and colleagues (e.g., …).

Ambiguous perception. A good example is bistable perception, which concerns alternating views of ambiguous figures, such as the Necker cube.

Semantic networks. The difficult issue of meaning in natural languages is often explored in terms of semantic networks. Gabora and Aerts (2002) described the contextual manner in which concepts are evoked, used, and combined to generate meaning. Their ideas about concept association in an evolutionary context were further developed in recent work.

Order effects of cognitive measurements. Question order effects in surveys have been recognized for a long time (Schwarz & Sudman, 1992) but are still insufficiently understood today.
In this introduction, we have discussed how quantum theory was developed in physics to address the puzzling finding that the order of measurements can affect the final observed probabilities. Thus, it seems natural to apply quantum probability theory to understanding the effects of the order of measurements in human cognitive behavior.

Memory. Findings that are puzzling from a classical probability perspective exist in more basic human cognitive processes as well, such as basic memory processes. One paradoxical phenomenon is called the episodic overdistribution effect, which can be explained using a simple quantum model involving the superposition of memory traces.

These active research areas illustrate the potential of using the set of coherent concepts of quantum theory to formalize a large variety of cognitive phenomena. These phenomena often seem puzzling from the classical probability framework. Perhaps in contrast to the common impression of being mysterious, quantum theory is inherently consistent with deeply rooted psychological conceptions and intuitions. It offers a fresh conceptual framework for explaining empirical puzzles of cognition and provides a rich new source of alternative formal tools for cognitive modeling.
J.V.: A qualified Kolmogorovian account of probabilistic contextuality
 Lecture Notes in Computer Science
A quantum probability explanation in Fock space for borderline contradictions
Journal of Mathematical Psychology, 2014
Probabilistic Contextuality in EPR/Bohm-type Systems with Signaling Allowed. arXiv:1406.0243, 2014
Quantum cognition: A new theoretical approach to psychology. Trends in Cognitive Sciences, 2015
Abstract

Cited by 2 (2 self)
What type of probability theory best describes the way humans make judgments under uncertainty and decisions under conflict? Although rational models of cognition have become prominent and have achieved much success, they adhere to the laws of classical probability theory despite the fact that human reasoning does not always conform to these laws. For this reason we have seen the recent emergence of models based on an alternative probabilistic framework drawn from quantum theory. These quantum models show promise in addressing cognitive phenomena that have proven recalcitrant to modeling by means of classical probability theory. This review compares and contrasts probabilistic models based on Bayesian or classical versus quantum principles, and highlights the advantages and disadvantages of each approach.

A new approach to cognitive science

Cognitive scientists have long struggled to form a comprehensive understanding of how humans make judgments and decisions under conflict and uncertainty. Decades of research have seen two approaches crystallize to the surface: 'heuristic' and 'rational.' The heuristic approach is firmly rooted in Herbert Simon's notion of bounded rationality.

To what probability theory does quantum cognition subscribe?

This question may come as a surprise to some readers because, by and large, cognitive scientists have been exposed to a single probability theory: what we will call classical (more technically, Kolmogorov) probability theory, on which Bayesian models rest. There are, however, several viable probability theories upon which to build probabilistic models of cognition.

From quantum physics to quantum cognition

Less than 20 years ago a group of scientists pioneered the bold idea of applying the abstract principles of quantum theory outside physics to the field of human cognition.
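As a hedged sketch of the kind of divergence this review describes (an illustration added here, not drawn from the review itself): in classical probability, P(A and B) can never exceed P(B), yet the sequential quantum probability of "yes to A, then yes to B" can exceed the probability of "yes to B" alone. The state and question angles below are arbitrary choices.

```python
import numpy as np

def projector(theta):
    """Projector onto the ray at angle theta in a 2-D real Hilbert space."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

# Initial belief state orthogonal to question B's "yes" ray, with
# question A's "yes" ray lying between the state and B.
psi = np.array([0.0, 1.0])
A = projector(np.pi / 4)   # "yes" subspace of question A
B = projector(0.0)         # "yes" subspace of question B

# Direct "yes" to B: the state has no component along B's ray.
p_b = np.linalg.norm(B @ psi) ** 2             # 0.0

# Sequential "yes" to A, then "yes" to B: answering A first moves the
# state, leaving a nonzero component along B's ray.
p_a_then_b = np.linalg.norm(B @ A @ psi) ** 2  # 0.25

print(p_a_then_b > p_b)   # True: violates classical monotonicity
```

In toy form, this is how quantum models accommodate conjunction-type judgment errors: the intermediate measurement changes the state, so the sequence can be judged more likely than one of its parts.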
Meaning-Focused and Quantum-Inspired Information Retrieval
In Quantum Interaction, 2014
Abstract

Cited by 1 (1 self)
In recent years, quantum-based methods have promisingly integrated the traditional procedures in information retrieval (IR) and natural language processing (NLP). Inspired by our research on the identification and application of quantum structures in cognition, more specifically our work on the representation of concepts and their combinations, we put forward a 'quantum meaning based' framework for structured query retrieval in text corpora and standardized testing corpora. This scheme for IR rests on considering as basic notions (i) 'entities of meaning', e.g., concepts and their combinations, and (ii) traces of such entities of meaning, which is how documents are considered in this approach. The meaning content of these 'entities of meaning' is reconstructed by solving an 'inverse problem' in the quantum formalism, consisting of reconstructing the full states of the entities of meaning from their collapsed states identified as traces in relevant documents. The advantages with respect to traditional approaches, such as Latent Semantic Analysis (LSA), are discussed by means of concrete examples.
A new fundamental evidence of non-classical structure in the combination of natural concepts, 2015
Abstract

Cited by 1 (1 self)
We recently performed cognitive experiments on conjunctions and negations of two concepts with the aim of investigating the combination problem of concepts. Our experiments confirmed the deviations (conceptual vagueness, underextension, overextension, etc.) from the rules of classical (fuzzy) logic and probability theory observed by several scholars in concept theory, while our data were successfully modeled in a quantum-theoretic framework developed by ourselves. In this paper, we isolate a new, very stable and systematic pattern of violation of classicality that occurs in concept combinations. The strength and regularity of this non-classical effect leads us to believe that it occurs at a more fundamental level than the deviations observed up to now. It is our opinion that we have identified a deep non-classical mechanism determining not only how concepts are combined but, rather, how they are formed. We show that this effect can be faithfully modeled in a two-sector Fock space structure, and that it can be exactly explained by assuming that human thought is the superposition of two processes, a 'logical reasoning' guided by 'logic' and a 'conceptual reasoning' guided by 'emergence', and that the latter generally prevails over the former. All these findings provide new fundamental support to our quantum-theoretic approach to human cognition.