Conditional random fields: Probabilistic models for segmenting and labeling sequence data (2001)

Download Links

  • [www.seas.upenn.edu]
  • [l2r.cs.uiuc.edu]
  • [www.cis.upenn.edu]
  • [www.wisdom.weizmann.ac.il]
  • [www2.denizyuret.com]
  • [damas.ift.ulaval.ca]
  • [www.damas.ift.ulaval.ca]
  • [www.cs.utah.edu]
  • [nlp.cs.nyu.edu]
  • [www.aladdin.cs.cmu.edu]
  • [www.cs.cmu.edu]
  • [www.cs.toronto.edu]
  • [www.cs.columbia.edu]
  • [www.menem.com]
  • [repository.upenn.edu]
  • [www.facweb.iitkgp.ernet.in]

  • Other Repositories/Bibliography

  • CiteULike
  • DBLP
by John Lafferty, Andrew McCallum, and Fernando C. N. Pereira
Citations: 3482 (85 self)

BibTeX

@INPROCEEDINGS{Lafferty01conditionalrandom,
    author = {John Lafferty and Andrew McCallum and Fernando C. N. Pereira},
    title = {Conditional random fields: Probabilistic models for segmenting and labeling sequence data},
    booktitle = {Proceedings of the Eighteenth International Conference on Machine Learning (ICML 2001)},
    year = {2001},
    pages = {282--289},
    publisher = {Morgan Kaufmann}
}

Abstract

We present conditional random fields, a framework for building probabilistic models to segment and label sequence data. Conditional random fields offer several advantages over hidden Markov models and stochastic grammars for such tasks, including the ability to relax strong independence assumptions made in those models. Conditional random fields also avoid a fundamental limitation of maximum entropy Markov models (MEMMs) and other discriminative Markov models based on directed graphical models, which can be biased towards states with few successor states. We present iterative parameter estimation algorithms for conditional random fields and compare the performance of the resulting models to HMMs and MEMMs on synthetic and natural-language data.
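
The abstract describes the model only at a high level. As a concrete point of reference, the sketch below scores one labeling of an observation sequence under a standard linear-chain conditional random field and computes the normalizer Z(x) with the forward algorithm, so that exponentiated scores form a proper conditional distribution over label sequences. This is a minimal illustrative sketch, not the paper's implementation: the emissions and transitions arrays and the example numbers are hypothetical stand-ins for weighted sums of feature functions.

import numpy as np

# Minimal linear-chain CRF sketch (illustrative, not the paper's code).
# emissions[t, j]  : per-position score for label j at position t
# transitions[i, j]: score for moving from label i to label j
# Both arrays stand in for weighted sums of feature functions.

def sequence_score(emissions, transitions, labels):
    """Unnormalized log-score of one label sequence."""
    score = emissions[0, labels[0]]
    for t in range(1, len(labels)):
        score += transitions[labels[t - 1], labels[t]] + emissions[t, labels[t]]
    return score

def log_partition(emissions, transitions):
    """log Z(x) via the forward algorithm (sum over all label sequences)."""
    alpha = emissions[0].copy()                # log-alpha at position 0
    for t in range(1, emissions.shape[0]):
        # For each current label j: logsumexp over the previous label i.
        scores = alpha[:, None] + transitions + emissions[t][None, :]
        alpha = np.logaddexp.reduce(scores, axis=0)
    return np.logaddexp.reduce(alpha)

# Usage: conditional probability of one labeling of a 3-step, 2-label sequence.
emissions = np.array([[1.0, 0.2], [0.3, 0.8], [0.5, 0.5]])
transitions = np.array([[0.4, -0.1], [-0.2, 0.3]])
labels = [0, 1, 1]
log_prob = sequence_score(emissions, transitions, labels) - log_partition(emissions, transitions)
print(np.exp(log_prob))   # a value in (0, 1); probabilities over all labelings sum to 1

Because normalization here is over whole label sequences rather than per state, the model avoids the per-state normalization that the abstract identifies as the source of bias towards states with few successor states in MEMMs.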

Keyphrases

conditional random field, probabilistic model, sequence data, label sequence data, stochastic grammar, directed graphical model, maximum entropy Markov model, hidden Markov model, discriminative Markov model, natural-language data, successor state, iterative parameter estimation algorithm, several advantages, strong independence assumption, fundamental limitation
