Results 1 - 10 of 21
On the Implementation of the Probabilistic Logic Programming Language ProbLog
Under consideration for publication in Theory and Practice of Logic Programming, 2003
"... The past few years have seen a surge of interest in the field of probabilistic logic learning and statistical relational learning. In this endeavor, many probabilistic logics have been developed. ProbLog is a recent probabilistic extension of Prolog motivated by the mining of large biological networ ..."
Cited by 41 (9 self)
Abstract:
The past few years have seen a surge of interest in the field of probabilistic logic learning and statistical relational learning. In this endeavor, many probabilistic logics have been developed. ProbLog is a recent probabilistic extension of Prolog motivated by the mining of large biological networks. In ProbLog, facts can be labeled with probabilities. These facts are treated as mutually independent random variables that indicate whether these facts belong to a randomly sampled program. Different kinds of queries can be posed to ProbLog programs. We introduce algorithms that allow the efficient execution of these queries, discuss their implementation on top of the YAP-Prolog system, and evaluate their performance in the context of large networks of biological entities.
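As a concrete illustration of these semantics (a toy program with made-up probabilities, not an example from the paper): a ProbLog program attaches probabilities to facts and leaves the rules deterministic, and the probability of a query is the probability that it is provable in a program obtained by independently including each probabilistic fact. Using the syntax of the current ProbLog system:

    % probabilistic facts: each is an independent random variable
    0.9::edge(a,b).
    0.8::edge(b,c).
    0.6::edge(a,c).

    % ordinary definite clauses
    path(X,Y) :- edge(X,Y).
    path(X,Y) :- edge(X,Z), path(Z,Y).

    % the query whose success probability we want
    query(path(a,c)).

Here the two proofs of path(a,c) use disjoint facts, so P(path(a,c)) = 1 - (1 - 0.6)(1 - 0.9 * 0.8) = 0.888; in general proofs share facts and the system must account for the overlap, which is where BDD-based inference comes in.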
Expectation Maximization over Binary Decision Diagrams for Probabilistic Logic Programs
"... Recently much work in Machine Learning has concentrated on using expressive representation languages that combine aspects of logic and probability. A whole field has emerged, called Statistical Relational Learning, rich of successful applications in a variety of domains. In this paper we present a M ..."
Cited by 20 (13 self)
Abstract:
Recently, much work in Machine Learning has concentrated on using expressive representation languages that combine aspects of logic and probability. A whole field has emerged, called Statistical Relational Learning, rich in successful applications in a variety of domains. In this paper we present a Machine Learning technique targeted at Probabilistic Logic Programs, a family of formalisms where uncertainty is represented using Logic Programming tools. Among various proposals for Probabilistic Logic Programming, the one based on the distribution semantics is gaining popularity and is the basis for languages such as ICL, PRISM, ProbLog and Logic Programs with Annotated Disjunctions. This paper proposes a technique for learning the parameters of these languages. Since their equivalent Bayesian networks contain hidden variables, an Expectation Maximization (EM) algorithm is adopted. In order to speed up the computation, expectations are computed directly on the Binary Decision Diagrams that are built for inference. The resulting system, called EMBLEM for “EM over BDDs for probabilistic Logic programs Efficient Mining”, has been applied to a number of datasets and showed good performance in terms of both speed and memory usage. In particular, its speed allows a high number of restarts, resulting in good-quality solutions.
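As a rough sketch of the underlying EM update (my own summary of the usual distribution-semantics setup, not a quote from the paper): for a parameter \pi_i attached to a probabilistic fact or disjunct, the M-step re-estimates

    \pi_i^{new} = \frac{\sum_q \mathrm{E}[c_{i1} \mid q]}{\sum_q \left( \mathrm{E}[c_{i0} \mid q] + \mathrm{E}[c_{i1} \mid q] \right)}

where c_{i1} and c_{i0} count the groundings of fact i whose hidden Boolean variable takes value true or false, and the expectations are taken given each query/example q. EMBLEM's contribution is that these expectations are obtained by forward and backward passes over the BDDs built for inference, rather than over an explicit Bayesian network.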
Experimentation of an expectation maximization algorithm for probabilistic logic programs
Intelligenza Artificiale, 2012
"... Statistical Relational Learning and Probabilistic Inductive Logic Programming are two emerging fields that use representation languages able to combine logic and probability. In the field of Logic Programming, the distribution semantics is one of the prominent approaches for representing uncertainty ..."
Cited by 11 (5 self)
Abstract:
Statistical Relational Learning and Probabilistic Inductive Logic Programming are two emerging fields that use representation languages that combine logic and probability. In the field of Logic Programming, the distribution semantics is one of the prominent approaches for representing uncertainty and underlies many languages such as ICL, PRISM, ProbLog and LPADs. Learning the parameters of such languages requires an Expectation Maximization algorithm, since their equivalent Bayesian networks contain hidden variables. EMBLEM (EM over BDDs for probabilistic Logic programs Efficient Mining) is an EM algorithm for languages following the distribution semantics that computes expectations directly on the Binary Decision Diagrams that are built for inference. In this paper we present experiments comparing EMBLEM with LeProbLog, Alchemy, CEM, RIB and LFI-ProbLog on six real-world datasets. The results show that EMBLEM is able to solve problems on which the other systems fail, and it often achieves significantly higher areas under the Precision-Recall and ROC curves in similar time.
Learning the structure of probabilistic logic programs
In ILP 2011, LNCS, 2012
"... Abstract. There is a growing interest in the field of Probabilistic Inductive Logic Programming, which uses languages that integrate logic programming and probability. Many of these languages are based on the distribution semantics and recently various authors have proposed systems for learning the ..."
Cited by 9 (5 self)
Abstract:
There is a growing interest in the field of Probabilistic Inductive Logic Programming, which uses languages that integrate logic programming and probability. Many of these languages are based on the distribution semantics, and recently various authors have proposed systems for learning the parameters (PRISM, LeProbLog, LFI-ProbLog and EMBLEM) or both the structure and the parameters (SEM-CP-logic) of these languages. EMBLEM, for example, uses an Expectation Maximization approach in which the expectations are computed on Binary Decision Diagrams. In this paper we present the algorithm SLIPCASE for “Structure LearnIng of ProbabilistiC logic progrAmS with Em over bdds”. It performs a beam search in the space of the language of Logic Programs with Annotated Disjunctions (LPAD) using the log likelihood of the data as the guiding heuristic. To estimate the log likelihood of theory refinements, it performs a limited number of Expectation Maximization iterations of EMBLEM. SLIPCASE has been tested on three real-world datasets and compared with SEM-CP-logic and Learning using Structural Motifs, an algorithm for Markov Logic Networks. The results show that SLIPCASE achieves higher areas under the precision-recall and ROC curves and is more scalable.
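To make the search space concrete (an invented clause with made-up predicates and probabilities, not from the paper): an LPAD clause has a disjunction of probability-annotated atoms in the head, and SLIPCASE's beam search adds, removes, and refines such clauses while EMBLEM fits their parameters. In the usual LPAD notation:

    heads(Coin):0.6 ; tails(Coin):0.4 :- toss(Coin), \+ biased(Coin).

When the body is true for some grounding, exactly one head atom is selected with the given probability (the annotations in a head sum to at most one); the values 0.6 and 0.4 are the parameters EMBLEM tunes, while SLIPCASE also decides which clauses to include at all.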
An algebraic Prolog for reasoning about possible worlds
In Proceedings of the 25th AAAI Conference on Artificial Intelligence (AAAI), 2011
"... We introduce aProbLog, a generalization of the probabilis-tic logic programming language ProbLog. An aProbLog pro-gram consists of a set of definite clauses and a set of algebraic facts; each such fact is labeled with an element of a semiring. A wide variety of labels is possible, ranging from proba ..."
Cited by 4 (2 self)
Abstract:
We introduce aProbLog, a generalization of the probabilistic logic programming language ProbLog. An aProbLog program consists of a set of definite clauses and a set of algebraic facts; each such fact is labeled with an element of a semiring. A wide variety of labels is possible, ranging from probability values to reals (representing costs or utilities), polynomials, Boolean functions or data structures. The semiring is then used to calculate labels of possible worlds and of queries. We formally define the semantics of aProbLog and study the aProbLog inference problem, which is concerned with computing the label of a query. Two conditions are introduced that allow one to simplify the inference problem, resulting in four different algorithms and settings. Representative basic problems for each of these four settings are: is there a possible world where a query is true (SAT), how many such possible worlds are there (#SAT), what is the probability of a query being true (PROB), and what is the most likely world where the query is true (MPE). We further illustrate these settings with a number of tasks requiring more complex semirings.
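Stated compactly (my paraphrase of the standard semiring-labelled setup, not a quote from the paper): each algebraic fact f carries labels \alpha(f) and \alpha(\neg f) from a commutative semiring (A, \oplus, \otimes, e^{\oplus}, e^{\otimes}); the label of a possible world I is the \otimes-product of the labels of the literals true in I, and the label of a query q accumulates the worlds that entail it:

    A(I) = \bigotimes_{l \in I} \alpha(l), \qquad A(q) = \bigoplus_{I \models q} A(I)

Choosing the probability semiring ([0,1], +, \times, 0, 1) recovers PROB, the Boolean semiring gives SAT, and the counting semiring (\mathbb{N}, +, \times, 0, 1) gives #SAT, which is how the four settings mentioned in the abstract arise.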
Structure learning of probabilistic logic programs by searching the clause space
CoRR, arXiv:1309.2080, 2013
"... ar ..."
Inducing Probabilistic Relational Rules from Probabilistic Examples
"... We study the problem of inducing logic programs in a probabilistic setting, in which both the example descriptions and their classification can be proba-bilistic. The setting is incorporated in the proba-bilistic rule learner ProbFOIL+, which combines principles of the rule learner FOIL with ProbLog ..."
Cited by 2 (2 self)
Abstract:
We study the problem of inducing logic programs in a probabilistic setting, in which both the example descriptions and their classification can be probabilistic. The setting is incorporated in the probabilistic rule learner ProbFOIL+, which combines principles of the rule learner FOIL with ProbLog, a probabilistic Prolog. We illustrate the approach by applying it to the knowledge base of NELL, the Never-Ending Language Learner.
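To illustrate what probabilistic examples mean here (an invented toy instance with made-up predicates and confidence values, not taken from the paper): both the background facts and the target labels can carry probabilities, such as extraction confidences, and the learner induces rules whose predictions should match those labels. In ProbLog-style syntax:

    % background knowledge with confidence labels (illustrative values)
    0.95::athletePlaysSport(ronaldo, soccer).
    0.90::teamPlaysSport(real_madrid, soccer).

    % probabilistic target example: the classification itself is uncertain
    0.85::athletePlaysForTeam(ronaldo, real_madrid).

    % a candidate rule the learner might induce and score
    athletePlaysForTeam(A, T) :- athletePlaysSport(A, S), teamPlaysSport(T, S).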
ProbLog: A Probabilistic Prolog and Its Application in Link Discovery
In IJCAI, 2007
"... ProbLog [1] is a probabilistic programming language that extends Prolog along the lines of Sato’s distribution semantics. Its development focusses especially on machine learning techniques and implementation aspects. The ProbLog implementation is publicly available as part of Yap Prolog at ..."
Cited by 1 (0 self)
Abstract:
ProbLog [1] is a probabilistic programming language that extends Prolog along the lines of Sato’s distribution semantics. Its development focuses especially on machine learning techniques and implementation aspects. The ProbLog implementation is publicly available as part of Yap Prolog at
Parameter learning in PRISM programs with continuous random variables
arXiv preprint arXiv:1203.4287, 2012
"... ar ..."
Acknowledgements
2012
"... to my parents and girlfriend without whom, none of this would have been possible... ..."
Abstract:
To my parents and girlfriend, without whom none of this would have been possible...