Results 11 - 20 of 121,010
Blind Signal Separation: Statistical Principles
, 2003
"... Blind signal separation (BSS) and independent component analysis (ICA) are emerging techniques of array processing and data analysis, aiming at recovering unobserved signals or `sources' from observed mixtures (typically, the output of an array of sensors), exploiting only the assumption of mut ..."
Abstract

Cited by 522 (4 self)
 Add to MetaCart
Blind signal separation (BSS) and independent component analysis (ICA) are emerging techniques of array processing and data analysis, aiming at recovering unobserved signals or 'sources' from observed mixtures (typically, the output of an array of sensors), exploiting only the assumption of mutual independence between the signals. The weakness of the assumptions makes this a powerful approach, but it requires venturing beyond familiar second-order statistics. The objective of this paper is to review some of the approaches that have been recently developed to address this exciting problem, to show how they stem from basic principles, and how they relate to each other.
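As a concrete illustration of the BSS setting the paper surveys, the sketch below mixes two synthetic sources and recovers them with FastICA from scikit-learn; the signals, the mixing matrix, and the library choice are illustrative assumptions, not taken from the paper.

```python
# A minimal BSS sketch: observe linear mixtures of independent
# sources, recover them using only the independence assumption.
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * t)                       # source 1: sinusoid
s2 = np.sign(np.cos(3 * t))              # source 2: square wave
S = np.c_[s1, s2]                        # unobserved sources

A = np.array([[1.0, 0.5], [0.4, 1.0]])   # unknown mixing matrix
X = S @ A.T                              # observed sensor mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)             # recovered sources, up to
                                         # permutation and scaling
```

Note that, as the abstract says, ICA recovers the sources only up to order and scale; second-order (covariance) information alone could not distinguish the true sources from any rotated pair.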
Computational Lambda-Calculus and Monads
, 1988
"... The calculus is considered an useful mathematical tool in the study of programming languages, since programs can be identified with terms. However, if one goes further and uses fijconversion to prove equivalence of programs, then a gross simplification 1 is introduced, that may jeopardise the ..."
Abstract

Cited by 505 (7 self)
 Add to MetaCart
The λ-calculus is considered a useful mathematical tool in the study of programming languages, since programs can be identified with λ-terms. However, if one goes further and uses βη-conversion to prove equivalence of programs, then a gross simplification is introduced that may jeopardise the applicability of theoretical results to real situations. In this paper we introduce a new calculus based on a categorical semantics for computations. This calculus provides a correct basis for proving equivalence of programs, independent from any specific computational model. The paper is about logics for reasoning about programs, in particular for proving equivalence of programs. Following a consolidated tradition in theoretical computer science, we identify programs with the closed λ-terms, possibly containing extra constants corresponding to some features of the programming language under consideration. There are three approaches to proving equivalence of programs: ...
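The paper works categorically, but the monadic structure it axiomatizes can be hinted at in code. Below is a minimal Python sketch of a "Maybe"-style notion of computation, with `unit` and `bind` playing the roles of the categorical operations; the encoding and names are illustrative assumptions, not the paper's notation.

```python
# A "Maybe" monad sketch: computations that may fail.
# unit injects a value into a computation; bind sequences
# computations, short-circuiting on failure.
def unit(x):
    return ("just", x)

def bind(m, f):
    return f(m[1]) if m[0] == "just" else m

NOTHING = ("nothing", None)

def safe_div(x, y):
    return NOTHING if y == 0 else unit(x / y)

# Sequencing two possibly-failing computations:
result = bind(unit(10),
              lambda a: bind(safe_div(a, 2),
                             lambda b: safe_div(b, 0)))
# result == NOTHING; the failure propagates without an exception.
```

Here `bind` threads an effect (possible failure) through the program, which is exactly the kind of computation the categorical semantics is meant to model; proving two programs equivalent then means reasoning up to the monad laws rather than plain βη-conversion.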
Regression quantiles
 Econometrica
, 1978
"... Your use of the JSTOR archive indicates your acceptance of JSTOR's Terms and Conditions of Use, available at ..."
Abstract

Cited by 870 (19 self)
 Add to MetaCart
Your use of the JSTOR archive indicates your acceptance of JSTOR's Terms and Conditions of Use, available at
A Framework for Defining Logics
 Journal of the Association for Computing Machinery
, 1993
"... The Edinburgh Logical Framework (LF) provides a means to define (or present) logics. It is based on a general treatment of syntax, rules, and proofs by means of a typed calculus with dependent types. Syntax is treated in a style similar to, but more general than, MartinLof's system of ariti ..."
Abstract

Cited by 807 (45 self)
 Add to MetaCart
The Edinburgh Logical Framework (LF) provides a means to define (or present) logics. It is based on a general treatment of syntax, rules, and proofs by means of a typed λ-calculus with dependent types. Syntax is treated in a style similar to, but more general than, Martin-Löf's system of arities. The treatment of rules and proofs focuses on his notion of a judgement. Logics are represented in LF via a new principle, the judgements-as-types principle, whereby each judgement is identified with the type of its proofs. This allows for a smooth treatment of discharge and variable occurrence conditions and leads to a uniform treatment of rules and proofs whereby rules are viewed as proofs of higher-order judgements and proof checking is reduced to type checking. The practical benefit of our treatment of formal systems is that logic-independent tools such as proof editors and proof checkers can be constructed.
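To make the judgements-as-types principle concrete, here is a small sketch in Lean (used as a stand-in; the paper's framework is LF itself) in which the judgement "n is even" becomes a type, its inference rules become constructors, and proof checking is literally type checking:

```lean
-- Hypothetical illustration: the judgement "n is even" as a type.
-- Each inference rule is a constructor; a derivation is a term.
inductive Even : Nat → Prop where
  | zero : Even 0                              -- axiom: 0 is even
  | step : (n : Nat) → Even n → Even (n + 2)   -- rule: closed under +2

-- Proof checking reduces to type checking: the checker accepts
-- this term exactly because it has type `Even 4`.
example : Even 4 := Even.step 2 (Even.step 0 Even.zero)
```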
Assessing agreement on classification tasks: the kappa statistic
 Computational Linguistics
, 1996
"... Currently, computational linguists and cognitive scientists working in the area of discourse and dialogue argue that their subjective judgments are reliable using several different statistics, none of which are easily interpretable or comparable to each other. Meanwhile, researchers in content analy ..."
Abstract

Cited by 829 (9 self)
 Add to MetaCart
Currently, computational linguists and cognitive scientists working in the area of discourse and dialogue argue that their subjective judgments are reliable using several different statistics, none of which are easily interpretable or comparable to each other. Meanwhile, researchers in content analysis have already experienced the same difficulties and come up with a solution in the kappa statistic. We discuss what is wrong with reliability measures as they are currently used for discourse and dialogue work in computational linguistics and cognitive science, and argue that we would be better off as a field adopting techniques from content analysis.
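As a worked example of the statistic the paper advocates: kappa = (P(A) - P(E)) / (1 - P(E)), where P(A) is the observed agreement and P(E) is the agreement expected by chance from the coders' marginal label frequencies. The confusion matrix below is hypothetical.

```python
# Compute the kappa statistic from a two-coder confusion matrix.
import numpy as np

# Hypothetical counts for two coders labeling 100 items;
# rows = coder 1's labels, columns = coder 2's labels.
m = np.array([[40, 10],
              [ 5, 45]])
n = m.sum()

p_a = np.trace(m) / n                  # observed agreement: 0.85
p_e = (m.sum(0) / n) @ (m.sum(1) / n)  # chance agreement: 0.50
kappa = (p_a - p_e) / (1 - p_e)
print(round(kappa, 3))                 # 0.7
```

Unlike raw percent agreement (here 85%), kappa discounts the agreement the coders would reach by guessing from their own label distributions, which is what makes it comparable across tasks.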
Segmentation of brain MR images through a hidden Markov random field model and the expectation-maximization algorithm
 IEEE Transactions on Medical Imaging
, 2001
"... The finite mixture (FM) model is the most commonly used model for statistical segmentation of brain magnetic resonance (MR) images because of its simple mathematical form and the piecewise constant nature of ideal brain MR images. However, being a histogrambased model, the FM has an intrinsic limi ..."
Abstract

Cited by 619 (14 self)
 Add to MetaCart
The finite mixture (FM) model is the most commonly used model for statistical segmentation of brain magnetic resonance (MR) images because of its simple mathematical form and the piecewise constant nature of ideal brain MR images. However, being a histogram-based model, the FM has an intrinsic limitation: no spatial information is taken into account. This causes the FM model to work only on well-defined images with low levels of noise; unfortunately, this is often not the case due to artifacts such as partial volume effect and bias field distortion. Under these conditions, FM model-based methods produce unreliable results. In this paper, we propose a novel hidden Markov random field (HMRF) model, which is a stochastic process generated by an MRF whose state sequence cannot be observed directly but can be indirectly estimated through observations. Mathematically, it can be shown that the FM model is a degenerate version of the HMRF model. The advantage of the HMRF model derives from the way in which the spatial information is encoded through the mutual influences of neighboring sites. Although MRF modeling has been employed in MR image segmentation by other researchers, most reported methods are limited to using MRF as a general prior in an FM model-based approach. To fit the HMRF model, an EM algorithm is used. We show that by incorporating both the HMRF model and the EM algorithm into an HMRF-EM framework, an accurate and robust segmentation can be achieved. More importantly, the HMRF-EM framework can easily be combined with other techniques. As an example, we show how the bias field correction algorithm of Guillemaud and Brady (1997) can be incorporated into this framework to achieve a three-dimensional fully automated approach for brain MR image segmentation.
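A heavily simplified sketch of the HMRF-EM idea (not the paper's exact formulation): alternate estimating Gaussian class parameters with relabeling each pixel by a likelihood term plus an ICM-style Potts penalty over its 4-neighborhood. The two-class restriction, the thresholding initialization, and the hard-label E-step are simplifying assumptions.

```python
# Sketch: EM-style segmentation of a 2-D image with a spatial
# (MRF) penalty, so each pixel's label depends on its neighbors
# as well as its intensity.
import numpy as np

def hmrf_em(img, n_iter=10, beta=1.0):
    labels = (img > img.mean()).astype(int)   # crude initialization
    for _ in range(n_iter):
        # M-step: class means and variances from current labels
        mu = np.array([img[labels == k].mean() for k in (0, 1)])
        var = np.array([img[labels == k].var() + 1e-6 for k in (0, 1)])
        # Labeling step: Gaussian negative log-likelihood plus a
        # Potts penalty for disagreeing with each 4-neighbor
        pad = np.pad(labels, 1, mode="edge")
        neigh = np.stack([pad[:-2, 1:-1], pad[2:, 1:-1],
                          pad[1:-1, :-2], pad[1:-1, 2:]])
        energy = []
        for k in (0, 1):
            ll = 0.5 * np.log(var[k]) + (img - mu[k]) ** 2 / (2 * var[k])
            prior = beta * (neigh != k).sum(axis=0)
            energy.append(ll + prior)
        labels = np.argmin(np.stack(energy), axis=0)
    return labels
```

Setting `beta = 0` removes the spatial term and recovers a plain FM-style classifier, which mirrors the abstract's remark that the FM model is a degenerate case of the HMRF model.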
Temporal and modal logic
 Handbook of Theoretical Computer Science
, 1995
"... We give a comprehensive and unifying survey of the theoretical aspects of Temporal and modal logic. ..."
Abstract

Cited by 1300 (17 self)
 Add to MetaCart
We give a comprehensive and unifying survey of the theoretical aspects of temporal and modal logic.
The Valuation of Options for Alternative Stochastic Processes
 Journal of Financial Economics
, 1976
"... This paper examines the structure of option valuation problems and develops a new technique for their solution. It also introduces several jump and diffusion processes which have nol been used in previous models. The technique is applied lo these processes to find explicit option valuation formulas, ..."
Abstract

Cited by 661 (4 self)
 Add to MetaCart
This paper examines the structure of option valuation problems and develops a new technique for their solution. It also introduces several jump and diffusion processes which have not been used in previous models. The technique is applied to these processes to find explicit option valuation formulas, and solutions to some previously unsolved problems involving the pricing of securities with payouts and potential bankruptcy.
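The paper's contribution is valuation under alternative jump and diffusion processes; as a hedged baseline sketch only, the snippet below values a European call under the standard lognormal diffusion by risk-neutral Monte Carlo and checks it against the Black-Scholes closed form (parameter values are arbitrary assumptions).

```python
# Risk-neutral valuation of a European call: simulate terminal
# prices under geometric Brownian motion, discount the expected
# payoff, and compare with the Black-Scholes formula.
import numpy as np
from scipy.stats import norm

S0, K, r, sigma, T = 100.0, 105.0, 0.05, 0.2, 1.0

# Monte Carlo under the risk-neutral measure
rng = np.random.default_rng(0)
z = rng.standard_normal(200_000)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
mc_price = np.exp(-r * T) * np.maximum(ST - K, 0).mean()

# Black-Scholes closed form for comparison
d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
bs_price = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)
# mc_price and bs_price agree to within Monte Carlo error (~8.02)
```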