Results 1 - 3 of 3
Hinge-loss Markov random fields and probabilistic soft logic
, 2015
Abstract

Cited by 6 (4 self)
A fundamental challenge in developing high-impact machine learning technologies is balancing the ability to model rich, structured domains with the ability to scale to big data. Many important problem areas are both richly structured and large scale, from social and biological networks, to knowledge graphs and the Web, to images, video, and natural language. In this paper, we introduce two new formalisms for modeling structured data, distinguished from previous approaches by their ability to both capture rich structure and scale to big data. The first, hinge-loss Markov random fields (HL-MRFs), is a new kind of probabilistic graphical model that generalizes different approaches to convex inference. We unite three approaches from the randomized algorithms, probabilistic graphical models, and fuzzy logic communities, showing that all three lead to the same inference objective. We then derive HL-MRFs by generalizing this unified objective. The second new formalism, probabilistic soft logic (PSL), is a probabilistic programming language that makes HL-MRFs easy to define using a syntax based on first-order logic. We next introduce an algorithm for inferring most-probable variable assignments (MAP inference) that is much more scalable than general-purpose convex optimization software, because it uses message passing to take advantage of sparse dependency structures. We then show how to learn the parameters of HL-MRFs. The learned HL-MRFs are as accurate as analogous discrete models, but much more scalable. Together, these algorithms enable HL-MRFs and PSL to model rich, structured data at scales not previously possible.
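The MAP inference task the abstract describes amounts to minimizing a weighted sum of hinge-loss potentials over variables constrained to [0, 1], which is a convex problem. The sketch below illustrates this with a simple projected-subgradient loop over two made-up ground potentials; it is not the ADMM-based message-passing solver the paper proposes, and all weights, coefficients, and the toy example are invented for illustration.

```python
import numpy as np

def hinge_energy(x, potentials):
    """HL-MRF energy: sum of w * max(0, a.x + b) over hinge potentials."""
    return sum(w * max(0.0, float(a @ x) + b) for w, a, b in potentials)

def map_inference(potentials, n, steps=2000, lr=0.05):
    """MAP inference by projected subgradient descent over [0, 1]^n.

    Only meant to show that the objective is convex and box-constrained,
    so simple first-order methods already work; the paper's solver is a
    far more scalable message-passing method.
    """
    x = np.full(n, 0.5)
    for t in range(steps):
        g = np.zeros(n)
        for w, a, b in potentials:
            if float(a @ x) + b > 0:      # potential active: subgradient is w * a
                g += w * a
        x = np.clip(x - lr / np.sqrt(t + 1.0) * g, 0.0, 1.0)
    return x

# Two hypothetical ground potentials over soft truth values (x1, x2):
#   "x1 is about 0.9" (evidence):  2.0 * max(0, 0.9 - x1)
#   "x1 implies x2"   (rule):      1.0 * max(0, x1 - x2)
potentials = [
    (2.0, np.array([-1.0, 0.0]), 0.9),
    (1.0, np.array([1.0, -1.0]), 0.0),
]
x_map = map_inference(potentials, n=2)
```

At the optimum both potentials are nearly inactive: x1 settles around 0.9, x2 is pulled up toward x1, and the energy approaches zero.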
Paired-dual learning for fast training of latent variable hinge-loss MRFs
 In Proceedings of the International Conference on Machine Learning
, 2015
Abstract

Cited by 2 (2 self)
Latent variables allow probabilistic graphical models to capture nuance and structure in important domains such as network science, natural language processing, and computer vision. Naive approaches to learning such complex models can be prohibitively expensive, because they require repeated inferences to update beliefs about latent variables, so lifting this restriction for useful classes of models is an important problem. Hinge-loss Markov random fields (HL-MRFs) are graphical models that allow highly scalable inference and learning in structured domains, in part by representing structured problems with continuous variables. However, this representation leads to challenges when learning with latent variables. We introduce paired-dual learning, a framework that greatly speeds up training by using tractable entropy surrogates and avoiding repeated inferences. Paired-dual learning optimizes an objective with a pair of dual inference problems. This allows fast, joint optimization of parameters and dual variables. We evaluate on social-group detection, trust prediction in social networks, and image reconstruction, finding that paired-dual learning trains models as accurate as those trained by traditional methods in much less time, often before traditional methods make even a single parameter update.
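The speed-up the abstract describes comes from not running inference to convergence before every parameter update. The toy sketch below only mimics that interleaving idea on a one-rule model, taking a few warm-started subgradient inference steps per weight update; it does not implement the actual paired-dual objective with its entropy surrogates and dual variables, and every value, rule, and step size here is invented for illustration.

```python
# One clamped evidence atom, one free atom, one learnable rule weight.
x1 = 0.8        # observed (clamped) soft truth value of the antecedent atom
x2_obs = 0.8    # training label for the target atom

def phi(x2):
    """Ground hinge potential of the rule 'x1 => x2': max(0, x1 - x2)."""
    return max(0.0, x1 - x2)

def inference_steps(x2, w, k=5, lr=0.05):
    """A few warm-started subgradient steps on w * phi, projected to [0, 1].

    Crucially, this is called with a small k and a persistent state, so
    each outer iteration does only a sliver of inference work.
    """
    for _ in range(k):
        g = -w if x1 - x2 > 0 else 0.0
        x2 = min(1.0, max(0.0, x2 - lr * g))
    return x2

w, x2_hat, eta = 0.0, 0.5, 0.5
for _ in range(100):
    x2_hat = inference_steps(x2_hat, w)                   # cheap inner inference
    w = max(0.0, w - eta * (phi(x2_obs) - phi(x2_hat)))   # approx. likelihood gradient
```

Once the warm-started inference state satisfies the rule, the gradient vanishes and the rule weight settles (here around 0.4), without any outer iteration ever having solved inference to convergence.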