Results 1–10 of 7,646
Well-Defined Directional Distance Functions and Luenberger Productivity Indicators: Diagnosis of Infeasibilities and Its Remedies
2005
"... A recent total factor productivity indicator presented in the literature is the Luenberger indicator based upon directional distance functions. This productivity indicator turns out to be impossible to compute under certain weak conditions. The same problems can also occur under less general produc ..."
Segmentation of brain MR images through a hidden Markov random field model and the expectation-maximization algorithm
IEEE TRANSACTIONS ON MEDICAL IMAGING, 2001
"... The finite mixture (FM) model is the most commonly used model for statistical segmentation of brain magnetic resonance (MR) images because of its simple mathematical form and the piecewise constant nature of ideal brain MR images. However, being a histogram-based model, the FM has an intrinsic limitation: no spatial information is taken into account. This causes the FM model to work only on well-defined images with low levels of noise; unfortunately, this is often not the case due to artifacts such as partial volume effect and bias field distortion. Under these conditions, FM model ..."
Cited by 639 (15 self)
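To make the spatial-information point concrete, here is a toy numpy sketch: plain mixture-model EM on pixel intensities, with each pixel's class prior biased toward its 4-neighbours as a crude mean-field stand-in for the MRF. The function name, the `beta` smoothing weight, and the initialisation are illustrative assumptions; this is not the paper's HMRF-EM algorithm.

```python
import numpy as np

def em_segment(img, K=3, n_iter=20, beta=1.0):
    """Toy EM segmentation of a 2-D image into K Gaussian classes,
    with an MRF-style spatial prior (mean-field flavour).
    Illustrative sketch only, not the paper's HMRF-EM algorithm."""
    x = img.ravel().astype(float)
    # crude initialisation from intensity quantiles
    mu = np.quantile(x, np.linspace(0.1, 0.9, K))
    var = np.full(K, x.var() / K)
    pi = np.full((x.size, K), 1.0 / K)            # per-pixel class prior
    for _ in range(n_iter):
        # E-step: posterior responsibilities under Gaussian likelihoods
        ll = -0.5 * ((x[:, None] - mu) ** 2 / var + np.log(2 * np.pi * var))
        r = pi * np.exp(ll - ll.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # spatial step: bias each pixel's prior toward its 4-neighbours
        R = r.reshape(*img.shape, K)
        nb = np.zeros_like(R)
        nb[1:, :] += R[:-1, :]; nb[:-1, :] += R[1:, :]
        nb[:, 1:] += R[:, :-1]; nb[:, :-1] += R[:, 1:]
        pi = np.exp(beta * nb).reshape(-1, K)
        pi /= pi.sum(axis=1, keepdims=True)
        # M-step: update Gaussian parameters
        Nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / Nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
    return r.argmax(axis=1).reshape(img.shape)
```

Setting beta=0 makes the prior uniform and recovers the purely histogram-based FM behaviour the snippet criticises.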
Detection and Tracking of Point Features
International Journal of Computer Vision, 1991
"... The factorization method described in this series of reports requires an algorithm to track the motion of features in an image stream. Given the small interframe displacement made possible by the factorization approach, the best tracking method turns out to be the one proposed by Lucas and Kanade in 1981. The method defines the measure of match between fixed-size feature windows in the past and current frame as the sum of squared intensity differences over the windows. The displacement is then defined as the one that minimizes this sum. For small motions, a linearization of the image intensities ..."
Cited by 629 (2 self)
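The linearization the snippet mentions reduces the SSD minimization to a 2x2 linear system per window, the standard Lucas-Kanade step. A minimal numpy sketch under that assumption; window extraction and multi-resolution handling are omitted:

```python
import numpy as np

def lk_displacement(win_prev, win_curr):
    """One Lucas-Kanade step: estimate the displacement d = (dx, dy) that
    minimizes the sum of squared intensity differences between two
    same-size windows, using a first-order linearization of the image.
    Iterate with warping for anything but small motions."""
    Iy, Ix = np.gradient(win_prev.astype(float))   # spatial gradients
    It = win_curr.astype(float) - win_prev         # temporal difference
    # normal equations G d = e, with G the 2x2 gradient moment matrix
    G = np.array([[(Ix * Ix).sum(), (Ix * Iy).sum()],
                  [(Ix * Iy).sum(), (Iy * Iy).sum()]])
    e = -np.array([(Ix * It).sum(), (Iy * It).sum()])
    return np.linalg.solve(G, e)                   # fails if G is singular
```

A singular G corresponds to a window with too little texture to determine the displacement, which is why trackers in this family select feature windows whose G has two large eigenvalues.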
Probabilistic Latent Semantic Indexing
1999
"... Probabilistic Latent Semantic Indexing is a novel approach to automated document indexing which is based on a statistical latent class model for factor analysis of count data. Fitted from a training corpus of text documents by a generalization of the Expectation Maximization algorithm, the utilized model is able to deal with domain-specific synonymy as well as with polysemous words. In contrast to standard Latent Semantic Indexing (LSI) by Singular Value Decomposition, the probabilistic variant has a solid statistical foundation and defines a proper generative data model. Retrieval experiments ..."
Cited by 1225 (10 self)
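The latent class (aspect) model behind PLSI factors the co-occurrence probability as P(d, w) = sum_z P(z) P(d|z) P(w|z). The sketch below runs plain EM on a small documents-by-words count matrix; the function name, and the omission of the paper's tempering and fold-in refinements, are assumptions of this illustration:

```python
import numpy as np

def plsa(N, K=8, n_iter=50, seed=0):
    """Plain EM for the aspect model P(d,w) = sum_z P(z) P(d|z) P(w|z).
    N is a documents-by-words count matrix. Bare-bones sketch; the paper
    fits this model with a tempered generalization of EM."""
    rng = np.random.default_rng(seed)
    D, W = N.shape
    Pz = np.full(K, 1.0 / K)
    Pd_z = rng.dirichlet(np.ones(D), K)   # K x D, rows are P(d|z)
    Pw_z = rng.dirichlet(np.ones(W), K)   # K x W, rows are P(w|z)
    for _ in range(n_iter):
        # E-step: responsibilities P(z|d,w), shape K x D x W
        joint = Pz[:, None, None] * Pd_z[:, :, None] * Pw_z[:, None, :]
        post = joint / joint.sum(axis=0, keepdims=True)
        # M-step: reweight responsibilities by the observed counts n(d,w)
        weighted = post * N[None, :, :]
        Pz = weighted.sum(axis=(1, 2))
        Pd_z = weighted.sum(axis=2) / Pz[:, None]
        Pw_z = weighted.sum(axis=1) / Pz[:, None]
        Pz /= Pz.sum()
    return Pz, Pd_z, Pw_z
```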
Learning Stochastic Logic Programs
2000
"... Stochastic Logic Programs (SLPs) have been shown to be a generalisation of Hidden Markov Models (HMMs), stochastic context-free grammars, and directed Bayes' nets. A stochastic logic program consists of a set of labelled clauses p:C where p is in the interval [0,1] and C is a first-order r ..."
Cited by 1194 (81 self)
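As a concrete reading of the "labelled clauses p:C" format, here is a hypothetical Python sketch that samples derivations from a stochastic context-free grammar written as labelled clauses; SCFGs are one of the formalisms the abstract says SLPs generalise. The clause set and sampler are illustrative assumptions, not from the paper:

```python
import random

# Hypothetical SLP-style clause set: each head maps to (p, body) pairs,
# p in [0, 1] summing to 1 per head; bodies mix nonterminals and terminals.
CLAUSES = {
    "s":  [(0.6, ["np", "vp"]), (0.4, ["vp"])],
    "np": [(0.7, ["the", "n"]), (0.3, ["n"])],
    "vp": [(1.0, ["v", "np"])],
    "n":  [(0.5, ["cat"]), (0.5, ["dog"])],
    "v":  [(1.0, ["chases"])],
}

def sample(symbol):
    """Pick a clause for `symbol` with its label's probability,
    then recursively expand the body."""
    if symbol not in CLAUSES:            # terminal symbol
        return [symbol]
    r, acc = random.random(), 0.0
    for p, body in CLAUSES[symbol]:
        acc += p
        if r <= acc:
            return [w for part in body for w in sample(part)]
    return []  # unreachable when labels sum to 1

print(" ".join(sample("s")))             # e.g. "the cat chases dog"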
Monopolistic competition and optimum product diversity
The American Economic Review, 1977
"... The basic issue concerning production in welfare economics is whether a market solution will yield the socially optimum kinds and quantities of commodities. It is well known that problems can arise for three broad reasons: distributive justice; external effects; and scale economies. This paper is c ..."
Cited by 1911 (5 self)
"... These lead to results involving transport costs or correlations among commodities or securities, and are hard to interpret in general terms. We therefore take a direct route, noting that the convexity of indifference surfaces of a conventional utility function defined over the quantities of all potential ..."
Climate and atmospheric history of the past 420,000 years from the Vostok ice core, Antarctica
Nature, 1999
"... Antarctica has allowed the extension of the ice record of atmospheric composition and climate to the past four glacial-interglacial cycles. The succession of changes through each climate cycle and termination was similar, and atmospheric and climate properties oscillated between stable bounds. Interglacial periods differed in temporal evolution and duration. Atmospheric concentrations of carbon dioxide and methane correlate well with Antarctic air temperature throughout the record. Present-day atmospheric burdens of these two important greenhouse gases seem to have been unprecedented during the past ..."
Cited by 716 (15 self)
Loopy belief propagation for approximate inference: An empirical study
Proceedings of Uncertainty in AI, 1999
"... Recently, researchers have demonstrated that "loopy belief propagation", the use of Pearl's polytree algorithm in a Bayesian network with loops, can perform well in the context of error-correcting codes. The most dramatic instance of this is the near Shannon-limit performance ..."
Cited by 676 (15 self)
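To show what "Pearl's polytree algorithm in a network with loops" amounts to operationally, here is a generic sum-product sketch on a pairwise model; the data structures and synchronous update schedule are assumptions of this illustration, not the paper's experimental setup, and convergence is not guaranteed on loopy graphs:

```python
import numpy as np

def loopy_bp(unary, edges, pair, n_iter=50):
    """Sum-product loopy belief propagation on a pairwise model.
    unary: dict node -> 1-D array of unnormalised unary potentials
    edges: list of (i, j) node pairs
    pair:  dict (i, j) -> 2-D potential matrix, pair[(i, j)][xi, xj]
    Applies the polytree message updates despite the loops."""
    msg = {}                              # msg[(s, t)]: message from s to t
    for i, j in edges:
        msg[(i, j)] = np.ones_like(unary[j], dtype=float)
        msg[(j, i)] = np.ones_like(unary[i], dtype=float)
    nbrs = {v: [] for v in unary}
    for i, j in edges:
        nbrs[i].append(j); nbrs[j].append(i)
    for _ in range(n_iter):
        new = {}
        for i, j in edges:
            for s, t in ((i, j), (j, i)):
                # product of unary potential and messages into s, except from t
                h = unary[s].astype(float).copy()
                for k in nbrs[s]:
                    if k != t:
                        h = h * msg[(k, s)]
                P = pair[(i, j)] if (s, t) == (i, j) else pair[(i, j)].T
                m = h @ P                      # marginalise out x_s
                new[(s, t)] = m / m.sum()      # normalise for stability
        msg = new
    # beliefs: unary potential times all incoming messages, normalised
    beliefs = {}
    for v in unary:
        b = unary[v].astype(float).copy()
        for k in nbrs[v]:
            b = b * msg[(k, v)]
        beliefs[v] = b / b.sum()
    return beliefs
```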
Motivation through the Design of Work: Test of a Theory
Organizational Behavior and Human Performance, 1976
"... A model is proposed that specifies the conditions under which individuals will become internally motivated to perform effectively on their jobs. The model focuses on the interaction among three classes of variables: (a) the psychological states of employees that must be present for internally motiv ..."
Cited by 622 (2 self)
"... of conceptual tools that are directly useful in guiding the implementation and evaluation of work redesign projects. In the paragraphs to follow, we examine several existing theoretical approaches to work redesign, with a special eye toward the measurability of the concepts employed and the action implications ..."
The Paradyn Parallel Performance Measurement Tools
IEEE COMPUTER, 1995
"... Paradyn is a performance measurement tool for parallel and distributed programs. Paradyn uses several novel technologies so that it scales to long-running programs (hours or days) and large (thousand-node) systems, and automates much of the search for performance bottlenecks. It can provide precise ..."
Cited by 447 (39 self)
"... The instrumentation is controlled by the Performance Consultant module, which automatically directs the placement of instrumentation. The Performance Consultant has a well-defined notion of performance bottlenecks and program structure, so that it can associate bottlenecks with specific causes and specific parts of a ..."