Results 1 – 5 of 5
Tractable Bayesian Learning of Tree Augmented Naive Bayes Classifiers
In Proceedings of the Twentieth International Conference on Machine Learning, 2003
Cited by 4 (1 self)

Abstract:
Bayesian classifiers such as Naive Bayes or Tree Augmented Naive Bayes (TAN) have shown excellent performance given their simplicity and heavy underlying independence assumptions. In this paper we introduce a classifier that takes the TAN models as its basis and accounts for uncertainty in model selection. To do this we introduce decomposable distributions over TANs and show that the expression resulting from the Bayesian model averaging of TAN models can be integrated into closed form if we assume the prior probability distribution to be a decomposable distribution. This result allows for the construction of a classifier with a shorter learning time and a longer classification time than TAN. Empirical results show that the classifier is, in most cases, more accurate than TAN and better approximates the class probabilities.
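The TAN models these papers build on are learned classically (Friedman et al., 1997) by weighting each attribute pair with conditional mutual information given the class and keeping a maximum spanning tree over those weights; the papers above then average over such trees rather than picking one. A minimal sketch of the classical step, with all function names hypothetical:

```python
# Hypothetical sketch of classical TAN structure learning: score each
# attribute pair by I(Xi; Xj | C) and keep a maximum spanning tree.
import math
from collections import Counter
from itertools import combinations

def cond_mutual_info(data, i, j, c_idx=-1):
    """I(Xi; Xj | C) estimated from counts in `data` (list of tuples)."""
    n = len(data)
    joint = Counter((r[i], r[j], r[c_idx]) for r in data)
    pi = Counter((r[i], r[c_idx]) for r in data)
    pj = Counter((r[j], r[c_idx]) for r in data)
    pc = Counter(r[c_idx] for r in data)
    mi = 0.0
    for (xi, xj, c), n_ijc in joint.items():
        # Each cell contributes p(xi,xj,c) * log of the dependence ratio.
        mi += (n_ijc / n) * math.log(
            (n_ijc * pc[c]) / (pi[(xi, c)] * pj[(xj, c)]))
    return mi

def tan_tree(data, n_attrs):
    """Maximum spanning tree (Kruskal) over CMI weights; returns edges."""
    edges = sorted(
        ((cond_mutual_info(data, i, j), i, j)
         for i, j in combinations(range(n_attrs), 2)),
        reverse=True)
    parent = list(range(n_attrs))  # union-find forest
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:           # keep the edge only if it joins components
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

With three attributes where X1 copies X0, the strongest edge (0, 1) ends up in the tree, which always has exactly n_attrs - 1 edges.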
Context-Aware Activity Recognition using TAN Classifiers
Massachusetts Institute of Technology, 2002
Cited by 3 (0 self)

Abstract:
This thesis reviews the components necessary for designing and implementing a real-time activity recognition system for mobile computing devices. In particular, a system utilizing GPS location data and tree augmented naive Bayes (TAN) classifiers is described and evaluated. The system can successfully recognize activities such as shopping, going to work, returning home, and going to a restaurant. Several different sets of features are tested using both the TAN algorithm and a test bed of other competitive classifiers. Experimental results show that the system can recognize about 85% of activities correctly using a multinet version of the TAN algorithm. Although efforts were made to design a general-purpose system, findings indicate that the nature of the position data and many relevant features are person-specific. The results from this research provide a foundation upon which future activity-aware applications can be built.
TAN classifiers based on decomposable distributions
Machine Learning, 2005
Cited by 3 (0 self)

Abstract:
In this paper we present several Bayesian algorithms for learning Tree Augmented Naive Bayes (TAN) models. We extend the results in Meila &amp; Jaakkola (2000a) to TANs by proving that, assuming a decomposable prior distribution over TANs, we can compute the exact Bayesian model averaging over TAN structures and parameters in polynomial time. Furthermore, we prove that the k maximum a posteriori (MAP) TAN structures can also be computed in polynomial time. We use these results to correct minor errors in Meila &amp; Jaakkola (2000a) and to construct several TAN-based classifiers. We show that these classifiers provide consistently better predictions over Irvine datasets and artificially generated data than TAN-based classifiers proposed in the literature.
Applying General Bayesian Techniques to Improve TAN Induction
Abstract:
Tree Augmented Naive Bayes (TAN) has been shown to be competitive with state-of-the-art machine learning algorithms [3]. However, the TAN induction algorithm that appears in [3] can be improved in several ways. In this paper we identify three shortcomings in it and introduce two ideas to overcome those problems: the multinomial sampling approach to learning Bayesian networks and local Bayesian model averaging. These approaches are generic and can thus be reused to improve other learning algorithms. We empirically test the new algorithms, and conclude that in most cases they lead to an improvement in classification accuracy and in the quality of the probabilities given as predictions.
Maximum a Posteriori Tree Augmented Naive Bayes Classifiers
2003
Abstract:
Bayesian classifiers such as Naive Bayes or Tree Augmented Naive Bayes (TAN) have shown excellent performance given their simplicity and heavy underlying independence assumptions. In this paper we prove that under suitable conditions it is possible to efficiently calculate the maximum a posteriori TAN model. Furthermore, we prove that it is also possible to calculate a weighted set with the k maximum a posteriori TAN models. This allows efficient TAN ensemble learning and accounting for model uncertainty. These results can be used to construct two classifiers. Both classifiers have the advantage of allowing the introduction of prior knowledge about structure or parameters into the learning process. Empirical results show that both classifiers lead to an improvement in error rate and in the accuracy of the predicted class probabilities over established TAN-based classifiers of equivalent complexity.
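Once a weighted set of k MAP models is available, classification mixes their class-probability estimates by normalized weight. A minimal sketch of that mixing step, assuming each model is a callable returning a class-to-probability dict (all names hypothetical):

```python
# Hypothetical sketch of prediction with a weighted set of k models:
# mix each model's class-probability dict by its normalized weight.
def ensemble_predict(models, weights, x):
    """Return the weight-averaged class-probability dict for instance x."""
    total = sum(weights)
    mixed = {}
    for model, w in zip(models, weights):
        for cls, p in model(x).items():
            # Accumulate (w / total) * P_model(cls | x) per class.
            mixed[cls] = mixed.get(cls, 0.0) + (w / total) * p
    return mixed
```

With weights 3 and 1, a model predicting P(a) = 0.9 and one predicting P(a) = 0.5 mix to P(a) = 0.75 * 0.9 + 0.25 * 0.5 = 0.8.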