Markov Logic Networks (2006)

by Matthew Richardson, Pedro Domingos
Venue: Machine Learning
Citations: 816 (39 self)

BibTeX

@ARTICLE{Richardson06markovlogic,
    author = {Matthew Richardson and Pedro Domingos},
    title = {Markov Logic Networks},
    journal = {Machine Learning},
    year = {2006}
}


Abstract

We propose a simple approach to combining first-order logic and probabilistic graphical models in a single representation. A Markov logic network (MLN) is a first-order knowledge base (KB) with a weight attached to each formula (or clause). Together with a set of constants representing objects in the domain, it specifies a ground Markov network containing one feature for each possible grounding of a first-order formula in the KB, with the corresponding weight. Inference in MLNs is performed by MCMC over the minimal subset of the ground network required for answering the query. Weights are efficiently learned from relational databases by iteratively optimizing a pseudo-likelihood measure. Optionally, additional clauses are learned using inductive logic programming techniques. Experiments with a real-world database and knowledge base in a university domain illustrate the promise of this approach.
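The grounding semantics described in the abstract can be sketched concretely: each weighted first-order formula, grounded over a finite set of constants, contributes one feature per grounding, and a possible world x has probability P(x) = exp(Σᵢ wᵢ nᵢ(x)) / Z, where nᵢ(x) counts the true groundings of formula i in x. Below is a minimal toy sketch of this definition (not the paper's implementation); the predicates Smokes/Cancer, the constants, and the weight 1.5 are all illustrative assumptions.

```python
# Toy ground Markov logic network: one weighted formula, two constants.
# P(world) = exp(sum_i w_i * n_i(world)) / Z, where n_i counts the true
# groundings of formula i in the world. Brute-force Z for illustration only.
import itertools
import math

constants = ["Anna", "Bob"]  # hypothetical domain objects

# One weighted clause: Smokes(p) => Cancer(p), with an illustrative weight 1.5.
def implication(world, person):
    return (not world[("Smokes", person)]) or world[("Cancer", person)]

formulas = [(1.5, implication)]

# Ground atoms: every predicate applied to every constant.
atoms = [(pred, c) for pred in ("Smokes", "Cancer") for c in constants]

def worlds():
    # Enumerate all truth assignments to the ground atoms.
    for bits in itertools.product([False, True], repeat=len(atoms)):
        yield dict(zip(atoms, bits))

def unnorm(world):
    # exp of the weighted count of true groundings of each formula.
    total = 0.0
    for w, f in formulas:
        total += w * sum(f(world, c) for c in constants)
    return math.exp(total)

Z = sum(unnorm(w) for w in worlds())  # partition function (brute force)

# Marginal probability that Anna smokes but does not have cancer,
# i.e. the one grounding of the clause involving Anna is violated.
p = sum(unnorm(w) for w in worlds()
        if w[("Smokes", "Anna")] and not w[("Cancer", "Anna")]) / Z
print(round(p, 4))
```

Enumerating all worlds is exponential in the number of ground atoms, which is why the paper performs inference by MCMC over only the minimal subset of the ground network needed to answer the query, rather than over the full joint.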

Keyphrases

markov logic network, first-order formula, corresponding weight, additional clause, real-world database, probabilistic graphical model, ground network, simple approach, minimal subset, university domain, single representation, relational database, ground markov network, possible grounding, first-order knowledge base, pseudo-likelihood measure, knowledge base, first-order logic, inductive logic programming technique
