Results 1–10 of 153
Dynamic Bayesian Networks: Representation, Inference and Learning
, 2002
Abstract

Cited by 770 (3 self)
Modelling sequential data is important in many areas of science and engineering. Hidden Markov models (HMMs) and Kalman filter models (KFMs) are popular for this because they are simple and flexible. For example, HMMs have been used for speech recognition and biosequence analysis, and KFMs have been used for problems ranging from tracking planes and missiles to predicting the economy. However, HMMs
and KFMs are limited in their “expressive power”. Dynamic Bayesian Networks (DBNs) generalize HMMs by allowing the state space to be represented in factored form, instead of as a single discrete random variable. DBNs generalize KFMs by allowing arbitrary probability distributions, not just (unimodal) linear-Gaussian. In this thesis, I will discuss how to represent many different kinds of models as DBNs, how to perform exact and approximate inference in DBNs, and how to learn DBN models from sequential data.
In particular, the main novel technical contributions of this thesis are as follows: a way of representing
Hierarchical HMMs as DBNs, which enables inference to be done in O(T) time instead of O(T^3), where T is the length of the sequence; an exact smoothing algorithm that takes O(log T) space instead of O(T); a simple way of using the junction tree algorithm for online inference in DBNs; new complexity bounds on exact online inference in DBNs; a new deterministic approximate inference algorithm called factored frontier; an analysis of the relationship between the BK algorithm and loopy belief propagation; a way of
applying Rao-Blackwellised particle filtering to DBNs in general, and the SLAM (simultaneous localization
and mapping) problem in particular; a way of extending the structural EM algorithm to DBNs; and a variety of different applications of DBNs. However, perhaps the main value of the thesis is its catholic presentation of the field of sequential data modelling.
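The O(T) inference the abstract refers to is easiest to see in the HMM special case: the forward algorithm folds each observation into a running belief over the hidden state, one step per time slice. The two-state weather model below is a hypothetical toy, not from the thesis; it is a minimal sketch of that idea.

```python
# Minimal sketch of HMM filtering (the simplest DBN): inference is
# linear in sequence length T because each step only updates a
# fixed-size belief vector. Model parameters here are invented.

def forward(init, trans, emit, observations):
    """Return P(observations) under an HMM.
    init[i]: P(state i at t=0); trans[i][j]: P(j | i);
    emit[i][o]: P(observation o | state i)."""
    alpha = [init[i] * emit[i][observations[0]] for i in range(len(init))]
    for o in observations[1:]:
        alpha = [sum(alpha[i] * trans[i][j] for i in range(len(alpha)))
                 * emit[j][o]
                 for j in range(len(init))]
    return sum(alpha)  # likelihood of the whole sequence

init = [0.6, 0.4]                      # sunny, rainy
trans = [[0.7, 0.3], [0.4, 0.6]]       # state transition matrix
emit = [{'dry': 0.9, 'wet': 0.1},      # observation model per state
        {'dry': 0.2, 'wet': 0.8}]
p = forward(init, trans, emit, ['dry', 'wet', 'wet'])
```

A DBN generalizes this by factoring the single state variable into several, but the slice-by-slice recursion is the same.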
Background to Qualitative Decision Theory
 AI MAGAZINE
, 1999
Abstract

Cited by 95 (4 self)
This paper provides an overview of the field of qualitative decision theory: its motivating tasks and issues, its antecedents, and its prospects. Qualitative decision theory studies qualitative approaches to problems of decision making and their sound and effective reconciliation and integration with quantitative approaches. Though it inherits from a long tradition, the field offers a new focus on a number of important unanswered questions of common concern to artificial intelligence, economics, law, psychology, and management.
Efficient Reasoning in Qualitative Probabilistic Networks
 In Proceedings of the 11th National Conference on Artificial Intelligence (AAAI-93)
, 1993
Abstract

Cited by 68 (9 self)
Qualitative Probabilistic Networks (QPNs) are an abstraction of Bayesian belief networks replacing numerical relations by qualitative influences and synergies [Wellman, 1990b]. To reason in a QPN is to find the effect of new evidence on each node in terms of the sign of the change in belief (increase or decrease). We introduce a polynomial time algorithm for reasoning in QPNs, based on local sign propagation. It extends our previous scheme from singly connected to general multiply connected networks. Unlike existing graph-reduction algorithms, it preserves the network structure and determines the effect of evidence on all nodes in the network. This aids meta-level reasoning about the model and automatic generation of intuitive explanations of probabilistic reasoning.

Introduction

A formal representation should not use more specificity than needed to support the reasoning required of it. The appropriate degree of specificity or numerical precision will vary depending on what kind o...
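The sign algebra underlying QPN reasoning can be sketched in a few lines. The network and node names below are hypothetical, and the propagation shown follows arcs in one direction only, so it is a simplified illustration of local sign propagation rather than the full algorithm for multiply connected networks.

```python
# Sketch of QPN sign propagation: signs {+, -, 0, ?} combine via a
# sign product (chaining influences) and a sign sum (merging parallel
# influences). Network below is invented for illustration.

from collections import deque

def sign_product(a, b):
    if a == '0' or b == '0':
        return '0'
    if a == '?' or b == '?':
        return '?'
    return '+' if a == b else '-'

def sign_sum(a, b):
    if a == '0':
        return b
    if b == '0':
        return a
    return a if a == b else '?'

def propagate(edges, evidence, sign='+'):
    """edges: dict (u, v) -> qualitative influence sign of u on v.
    Returns the sign of the change in belief at every node."""
    signs = {n: '0' for e in edges for n in e}
    signs[evidence] = sign
    queue = deque([evidence])
    while queue:
        u = queue.popleft()
        for (a, b), s in edges.items():
            if a == u:
                msg = sign_product(signs[u], s)
                new = sign_sum(signs[b], msg)
                if new != signs[b]:
                    signs[b] = new
                    queue.append(b)
    return signs

signs = propagate({('rain', 'wet'): '+', ('wet', 'slippery'): '+'}, 'rain')
```

Note how conflicting parallel influences yield `?`, which is exactly the loss of information that motivates hybrid qualitative/quantitative methods.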
Efficient utility functions for ceteris paribus preferences
 In Proceedings of the Eighteenth National Conference on Artificial Intelligence
, 2002
Abstract

Cited by 51 (4 self)
Although ceteris paribus preference statements concisely represent one natural class of preferences over outcomes or goals, many applications of such preferences require numeric utility function representations to achieve computational efficiency. We provide algorithms, complete for finite universes of binary features, for converting a set of qualitative ceteris paribus preferences into quantitative utility functions.
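In the simplest case, a set of independent ceteris paribus statements over binary features compiles into an additive utility function. The features and weights below are invented; real algorithms in this setting must also handle interactions between statements, which this sketch ignores.

```python
# Toy sketch: compile "prefer feature f on, other things equal"
# statements into an additive utility over binary-feature outcomes.
# Feature names and weights are hypothetical.

def compile_utility(statements):
    """statements: dict feature -> weight for preferring it True."""
    def utility(outcome):
        # outcome: dict feature -> bool
        return sum(w for f, w in statements.items() if outcome.get(f))
    return utility

u = compile_utility({'fast': 2.0, 'cheap': 1.0})
# 'fast and cheap' beats 'cheap only', ceteris paribus:
better = u({'fast': True, 'cheap': True}) > u({'fast': False, 'cheap': True})
```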
A Review of Explanation Methods for Bayesian Networks
 Knowledge Engineering Review
, 2000
Abstract

Cited by 48 (5 self)
One of the key factors for the acceptance of expert systems in real world domains is the capability to explain their reasoning. This paper describes the basic properties that characterize explanation methods and reviews the methods developed to date for explanation in Bayesian networks.
Conditional Objects as Nonmonotonic Consequence Relationships.
 IEEE Trans. Syst. Man Cybern.
, 1994
Path Planning under Time-Dependent Uncertainty
 In Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence
, 1995
Abstract

Cited by 40 (4 self)
Standard algorithms for finding the shortest path in a graph require that the cost of a path be additive in edge costs, and typically assume that costs are deterministic. We consider the problem of uncertain edge costs, with potential probabilistic dependencies among the costs. Although these dependencies violate the standard dynamic-programming decomposition, we identify a weaker stochastic consistency condition that justifies a generalized dynamic-programming approach based on stochastic dominance. We present a revised path-planning algorithm and prove that it produces optimal paths under time-dependent uncertain costs. We illustrate the algorithm by applying it to a model of stochastic bus networks, and present sample performance results comparing it to some alternatives. For the case where all or some of the uncertainty is resolved during path traversal, we extend the algorithm to produce optimal policies. This report is based on a paper presented at the Eleventh Conference on Unc...
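Two primitives underlie this kind of generalized dynamic programming: extending a path cost by convolving independent edge-cost distributions, and pruning candidates that are first-order stochastically dominated. The sketch below (with invented distributions, and assuming independent edge costs rather than the paper's dependent case) shows only those two building blocks, not the full algorithm.

```python
# Sketch of the primitives for stochastic shortest paths: discrete
# cost distributions are dicts {cost: probability}. For costs, d1
# dominates d2 when d1's CDF is everywhere at least d2's, i.e. d1 is
# at least as likely to stay under every cost threshold.

from itertools import product

def convolve(d1, d2):
    """Distribution of the sum of two independent discrete costs."""
    out = {}
    for (c1, p1), (c2, p2) in product(d1.items(), d2.items()):
        out[c1 + c2] = out.get(c1 + c2, 0.0) + p1 * p2
    return out

def dominates(d1, d2):
    """First-order stochastic dominance of d1 over d2 (lower cost better)."""
    cdf1 = cdf2 = 0.0
    for t in sorted(set(d1) | set(d2)):
        cdf1 += d1.get(t, 0.0)
        cdf2 += d2.get(t, 0.0)
        if cdf1 < cdf2:
            return False
    return True

path_cost = convolve({2: 0.5, 4: 0.5}, {1: 1.0})  # two edges in series
```

A search keeps, per node, only the non-dominated cost distributions; `dominates` is the pruning test.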
Inference in Bayesian Networks
, 1999
Abstract

Cited by 39 (1 self)
A Bayesian network is a compact, expressive representation of uncertain relationships among parameters in a domain. In this article, I introduce basic methods for computing with Bayesian networks, starting with the simple idea of summing the probabilities of events of interest. The article introduces major current methods for exact computation, briefly surveys approximation methods, and closes with a brief discussion of open issues.
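The "simple idea of summing the probabilities of events of interest" can be made concrete with inference by enumeration over a tiny network. The Rain/Sprinkler/WetGrass model and its numbers below are a standard-style toy, invented here for illustration; this brute-force method is exponential in the number of variables, which is what the exact methods the article surveys improve on.

```python
# Inference by enumeration in a toy network Rain -> WetGrass <- Sprinkler:
# a conditional probability is a ratio of two sums over the joint
# distribution. All parameters are hypothetical.

from itertools import product

P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
P_wet = {(True, True): 0.99, (True, False): 0.9,   # P(wet | rain, sprinkler)
         (False, True): 0.8, (False, False): 0.0}

def joint(r, s, w):
    pw = P_wet[(r, s)]
    return P_rain[r] * P_sprinkler[s] * (pw if w else 1.0 - pw)

def query(event, evidence):
    """P(event | evidence); both are predicates over (r, s, w)."""
    num = den = 0.0
    for r, s, w in product([True, False], repeat=3):
        p = joint(r, s, w)
        if evidence(r, s, w):
            den += p
            if event(r, s, w):
                num += p
    return num / den

p_rain_given_wet = query(lambda r, s, w: r, lambda r, s, w: w)
```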
Elicitation of Probabilities for Belief Networks: Combining Qualitative and . . .
 In Uncertainty in Artificial Intelligence (UAI-95): Proceedings of the 11th Conference, Los Altos, CA
, 1995
Abstract

Cited by 36 (3 self)
Although the usefulness of belief networks for reasoning under uncertainty is widely accepted, obtaining the numerical probabilities that they require is still perceived as a major obstacle. Often not enough statistical data is available to allow for reliable probability estimation. Available information may not be directly amenable to encoding in the network. Finally, domain experts may be reluctant to provide numerical probabilities. In this paper, we propose a method for elicitation of probabilities from a domain expert that is noninvasive and accommodates whatever probabilistic information the expert is willing to state. We express all available information, whether qualitative or quantitative in nature, in a canonical form consisting of (in)equalities expressing constraints on the hyperspace of possible joint probability distributions. We then use this canonical form to derive second-order probability distributions over the desired probabilities.
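One crude way to see how inequality constraints induce a second-order distribution is rejection sampling: draw candidate values for a probability, keep those satisfying the expert's stated (in)equalities, and treat the accepted samples as the second-order distribution. The constraints below are invented, and the paper's actual derivation need not be sampling-based; this is only a sketch of the idea.

```python
# Sketch: qualitative statements as inequality constraints on a single
# probability p, with a uniform prior over [0, 1]. The accepted samples
# approximate a second-order distribution over p. Constraints invented.

import random

random.seed(0)

# "p is unlikely" -> p < 0.2; "but not negligible" -> p > 0.05
constraints = [lambda p: p < 0.2, lambda p: p > 0.05]

accepted = []
while len(accepted) < 1000:
    p = random.random()               # candidate from the uniform prior
    if all(c(p) for c in constraints):
        accepted.append(p)

mean = sum(accepted) / len(accepted)  # point estimate from the samples
```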
Qualitative Verbal Explanations in Bayesian Belief Networks
, 1996
Abstract

Cited by 35 (5 self)
Application of Bayesian belief networks in systems that interact directly with human users, such as decision support systems, requires effective user interfaces. The principal task of such interfaces is bridging the gap between probabilistic models and human intuitive approaches to modeling uncertainty. We describe several methods for automatic generation of qualitative verbal explanations in systems based on Bayesian belief networks. We show simple techniques for explaining the structure of a belief network model and the interactions among its variables. We also present a technique for generating qualitative explanations of reasoning.

Keywords: Explanation, Bayesian belief networks, qualitative probabilistic networks

1 Introduction

"The purpose of computing is insight, not numbers." (Richard Wesley Hamming)

As the increasing number of successful applications in such domains as diagnosis, planning, learning, vision, and natural language processing demonstrates, Bayesian belief ne...