Learning Stochastic Logic Programs (2000)

by Stephen Muggleton
Citations: 1194 (81 self)

BibTeX

@MISC{Muggleton00learningstochastic,
    author = {Stephen Muggleton},
    title = {Learning Stochastic Logic Programs},
    year = {2000}
}


Abstract

Stochastic Logic Programs (SLPs) have been shown to be a generalisation of Hidden Markov Models (HMMs), stochastic context-free grammars, and directed Bayes' nets. A stochastic logic program consists of a set of labelled clauses p:C where p is in the interval [0,1] and C is a first-order range-restricted definite clause. This paper summarises the syntax, distributional semantics and proof techniques for SLPs and then discusses how a standard Inductive Logic Programming (ILP) system, Progol, has been modified to support learning of SLPs. The resulting system 1) finds an SLP with uniform probability labels on each definition and near-maximal Bayes posterior probability and then 2) alters the probability labels to further increase the posterior probability. Stage 1) is implemented within CProgol4.5, which differs from previous versions of Progol by allowing user-defined evaluation functions written in Prolog. It is shown that maximising the Bayesian posterior function involves finding SLPs with short derivations of the examples. Search pruning with the Bayesian evaluation function is carried out in the same way as in previous versions of CProgol. The system is demonstrated with worked examples involving the learning of probability distributions over sequences as well as the learning of simple forms of uncertain knowledge.
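To make the definition concrete, the following is a minimal sketch (in Python, since the paper itself gives no code here) of an SLP over sequences of a/b symbols. The clause labels and the predicate `s` are purely illustrative, not taken from the paper; the example mirrors a stochastic context-free grammar, one of the formalisms the abstract says SLPs generalise. Sampling chooses a clause for each goal with probability equal to its label, and the probability of a derivation is the product of the labels of the clauses it uses.

```python
import random

# Toy SLP (illustrative only): each predicate maps to labelled clauses p:C,
# with the labels for one predicate summing to 1. In Prolog notation:
#   0.4 : s([a|T]) :- s(T).
#   0.3 : s([b|T]) :- s(T).
#   0.3 : s([]).
SLP = {
    "s": [
        (0.4, ["a", "s"]),
        (0.3, ["b", "s"]),
        (0.3, []),
    ],
}

def sample(pred="s", rng=random):
    """Sample one sequence: repeatedly resolve the leftmost goal by
    choosing a clause with probability equal to its label."""
    out, goals = [], [pred]
    while goals:
        g = goals.pop(0)
        if g not in SLP:            # terminal symbol, emit it
            out.append(g)
            continue
        r, acc = rng.random(), 0.0
        for p, body in SLP[g]:      # roulette-wheel clause choice
            acc += p
            if r < acc:
                goals = body + goals
                break
    return out

def derivation_prob(seq):
    """Probability of the (unique, since this toy program is unambiguous)
    derivation of `seq`: the product of the clause labels used."""
    prob = 1.0
    for ch in seq:
        prob *= next(p for p, body in SLP["s"] if body and body[0] == ch)
    return prob * next(p for p, body in SLP["s"] if not body)

print(derivation_prob(["a", "b"]))  # 0.4 * 0.3 * 0.3 ≈ 0.036
```

The "short derivations" favoured by the paper's Bayesian evaluation function correspond here to sequences built from few clause applications, which have higher derivation probability.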

Keyphrases

stochastic logic program, previous version, labelled clause, worked example, stochastic context-free grammar, uncertain knowledge, simple form, near-maximal Bayes posterior probability, short derivation, posterior probability, distributional semantics, proof technique, Bayesian posterior function, first-order range-restricted definite clause, Bayesian evaluation function, probability label, probability distribution, standard inductive logic programming, hidden Markov model, uniform probability label, user-defined evaluation function
