Results 1 - 10 of 781
Markov Logic Networks
- MACHINE LEARNING, 2006
Cited by 816 (39 self)
Abstract:
We propose a simple approach to combining first-order logic and probabilistic graphical models in a single representation. A Markov logic network (MLN) is a first-order knowledge base with a weight attached to each formula (or clause). Together with a set of constants representing objects in the domain, it specifies a ground Markov network containing one feature for each possible grounding of a first-order formula in the KB, with the corresponding weight. Inference in MLNs is performed by MCMC over the minimal subset of the ground network required for answering the query. Weights are efficiently learned from relational databases by iteratively optimizing a pseudo-likelihood measure. Optionally, additional clauses are learned using inductive logic programming techniques. Experiments with a real-world database and knowledge base in a university domain illustrate the promise of this approach.
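The weighted-formula semantics described above can be illustrated on a tiny hand-grounded example. The sketch below is illustrative only, not the paper's implementation: it hard-codes a single weighted clause over two constants, counts its true groundings in each possible world, and computes P(world) from the unnormalized weight exp(w * n(world)) by brute-force enumeration of the partition function (the paper uses MCMC over the relevant ground network instead). The predicates, constants, and weight are made up for the example.

```python
import math
from itertools import product

# Toy MLN over two constants {Anna, Bob} with one weighted clause:
#   w = 1.5 :  Smokes(x) => Cancer(x)
# A "world" assigns a truth value to every ground atom.
constants = ["Anna", "Bob"]
w_clause = 1.5  # hypothetical weight

def n_true_groundings(world):
    """Count groundings of Smokes(x) => Cancer(x) that are true in `world`."""
    return sum(
        (not world[("Smokes", c)]) or world[("Cancer", c)]
        for c in constants
    )

def unnormalized_prob(world):
    # P(world) is proportional to exp(sum over formulas of w_i * n_i(world)).
    return math.exp(w_clause * n_true_groundings(world))

# Enumerate all worlds to get the partition function Z (feasible only for toy
# domains; real MLNs avoid this by MCMC over the minimal ground subnetwork).
atoms = [(p, c) for p in ("Smokes", "Cancer") for c in constants]
worlds = [dict(zip(atoms, vals)) for vals in product([False, True], repeat=len(atoms))]
Z = sum(unnormalized_prob(x) for x in worlds)

world = {("Smokes", "Anna"): True, ("Cancer", "Anna"): False,
         ("Smokes", "Bob"): False, ("Cancer", "Bob"): False}
print("P(world) =", unnormalized_prob(world) / Z)
```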
SNOPT: An SQP Algorithm For Large-Scale Constrained Optimization
2002
Cited by 597 (24 self)
Abstract:
Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available, and that the constraint gradients are sparse. We discuss
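For readers who want to see an SQP-type method applied to the kind of problem described here, the following sketch solves a small smooth problem with linear and nonlinear inequality constraints, supplying first derivatives. It uses SciPy's SLSQP solver as a readily available stand-in for an SQP code; it is not SNOPT and does not exploit constraint-gradient sparsity the way SNOPT does. The objective and constraints are made up for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Minimize a smooth objective subject to general inequality constraints,
# with first derivatives supplied for both objective and constraints.
def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

def grad_f(x):
    return np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.5)])

constraints = [
    # Linear inequality: x0 - 2*x1 + 2 >= 0
    {"type": "ineq", "fun": lambda x: x[0] - 2.0 * x[1] + 2.0,
     "jac": lambda x: np.array([1.0, -2.0])},
    # Nonlinear inequality: 1 - x0**2 - x1 >= 0
    {"type": "ineq", "fun": lambda x: 1.0 - x[0] ** 2 - x[1],
     "jac": lambda x: np.array([-2.0 * x[0], -1.0])},
]

res = minimize(f, x0=np.array([2.0, 0.0]), jac=grad_f,
               method="SLSQP", constraints=constraints)
print(res.x, res.fun)
```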
A Limited Memory Algorithm for Bound Constrained Optimization
- SIAM JOURNAL ON SCIENTIFIC COMPUTING, 1994
Cited by 572 (9 self)
Abstract:
An algorithm for solving large nonlinear optimization problems with simple bounds is described. It is based
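A bound-constrained problem of the kind described here can be handed to the L-BFGS-B implementation that ships with SciPy. The sketch below is a minimal usage example on a small Rosenbrock instance, not the authors' code; the bounds and starting point are arbitrary choices for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Nonlinear optimization with simple bounds l <= x <= u; here a small Rosenbrock instance.
def rosen(x):
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def rosen_grad(x):
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g

n = 10
bounds = [(0.0, 2.0)] * n          # simple bounds on every variable
x0 = np.full(n, 1.5)
res = minimize(rosen, x0, jac=rosen_grad, method="L-BFGS-B", bounds=bounds)
print(res.x, res.fun)
```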
CUTE: Constrained and unconstrained testing environment
1993
Cited by 188 (3 self)
Abstract:
The purpose of this paper is to discuss the scope and functionality of a versatile environment for testing small and large-scale nonlinear optimization algorithms. Although many of these facilities were originally produced by the authors in conjunction with the software package LANCELOT, we believe that they will be useful in their own right and should be available to researchers for their development of optimization software. The tools are available by anonymous ftp from a number of sources and may, in many cases, be installed automatically. The scope of a major collection of test problems written in the standard input format (SIF) used by the LANCELOT software package is described. Recognising that most software was not written with the SIF in mind, we provide tools to assist in building an interface between this input format and other optimization packages. These tools already provide a link between the SIF and a number of existing packages, including MINOS and OSL. In ad...
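The core idea of a uniform testing environment can be illustrated without the SIF machinery. The sketch below is a hypothetical miniature harness, not CUTE itself: two hand-written problems expose the same interface (objective, gradient, starting point), which is the role the SIF decoder plays in CUTE, and a solver is run over the whole collection through that interface.

```python
import numpy as np
from scipy.optimize import minimize

# Miniature stand-in for a test-problem collection: every entry exposes the
# same interface, so any solver can be driven over the whole set uniformly.
PROBLEMS = {
    "QUADRATIC": {
        "f": lambda x: 0.5 * np.dot(x, x),
        "g": lambda x: x,
        "x0": np.ones(5),
    },
    "ROSENBROCK2": {
        "f": lambda x: 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2,
        "g": lambda x: np.array([
            -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
            200.0 * (x[1] - x[0] ** 2),
        ]),
        "x0": np.array([-1.2, 1.0]),
    },
}

def run_solver(method):
    """Run one unconstrained solver across the collection and tabulate results."""
    for name, p in PROBLEMS.items():
        res = minimize(p["f"], p["x0"], jac=p["g"], method=method)
        print(f"{method:6s} {name:12s} f*={res.fun:.3e} nit={res.nit} success={res.success}")

run_solver("BFGS")
run_solver("CG")
```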
Predicting clicks: Estimating the click-through rate for new ads
- In Proceedings of the 16th International World Wide Web Conference (WWW-07), 2007
Cited by 166 (1 self)
Abstract:
Search engine advertising has become a significant element of the Web browsing experience. Choosing the right ads for the query and the order in which they are displayed greatly affects the probability that a user will see and click on each ad. This ranking has a strong impact on the revenue the search engine receives from the ads. Further, showing the user an ad that they prefer to click on improves user satisfaction. For these reasons, it is important to be able to accurately estimate the click-through rate of ads in the system. For ads that have been displayed repeatedly, this is empirically measurable, but for new ads, other means must be used. We show that we can use features of ads, terms, and advertisers to learn a model that accurately predicts the click-through rate for new ads. We also show that using our model improves the convergence and performance of an advertising system. As a result, our model increases both revenue and user satisfaction.
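As a toy illustration of feature-based CTR estimation for new ads, the sketch below fits a logistic regression on synthetic ad features and predicts a click probability for an unseen ad. Every feature, coefficient, and label here is simulated for the example; the paper's feature set and model are far richer.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic training set: each row describes an ad with a few simple features
# (title length, query terms matched, advertiser-level historical CTR);
# the click labels are simulated from an assumed propensity, purely for illustration.
n = 5000
X = np.column_stack([
    rng.integers(1, 12, n),          # number of words in the ad title
    rng.integers(0, 4, n),           # query terms appearing in the ad
    rng.uniform(0.0, 0.2, n),        # advertiser-level historical CTR
])
logit = -3.0 + 0.6 * X[:, 1] + 8.0 * X[:, 2]        # assumed click propensity
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))    # simulated clicks

model = LogisticRegression(max_iter=1000).fit(X, y)

# Estimate the click-through rate of a *new* ad from its features alone.
new_ad = np.array([[8, 2, 0.05]])
print("predicted CTR:", model.predict_proba(new_ad)[0, 1])
```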
Representations Of Quasi-Newton Matrices And Their Use In Limited Memory Methods
1996
Cited by 160 (11 self)
Abstract:
We derive compact representations of BFGS and symmetric rank-one matrices for optimization. These representations allow us to efficiently implement limited memory methods for large constrained optimization problems. In particular, we discuss how to compute projections of limited memory matrices onto subspaces. We also present a compact representation of the matrices generated by Broyden's update for solving systems of nonlinear equations.
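The compact BFGS representation the abstract refers to can be checked numerically. The sketch below is an illustrative verification, not the authors' implementation: it builds B_k once by applying k dense BFGS updates and once from the compact formula B_k = B_0 - [B_0 S, Y] M^{-1} [B_0 S, Y]^T, with M = [[S^T B_0 S, L], [L^T, -D]], D the diagonal of the s_i^T y_i products and L their strictly lower triangular part, and confirms the two agree to rounding error.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 8, 4
sigma = 2.0
B0 = sigma * np.eye(n)

# Generate update pairs (s_i, y_i) with s_i^T y_i > 0, as BFGS requires.
S, Y = [], []
while len(S) < k:
    s, y = rng.standard_normal(n), rng.standard_normal(n)
    if s @ y > 1e-3:
        S.append(s)
        Y.append(y)
S = np.column_stack(S)   # n x k
Y = np.column_stack(Y)   # n x k

# Reference: apply the BFGS update k times explicitly.
B = B0.copy()
for i in range(k):
    s, y = S[:, i], Y[:, i]
    Bs = B @ s
    B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Compact representation: B_k = B_0 - [B_0 S  Y] M^{-1} [S^T B_0; Y^T],
# with M = [[S^T B_0 S, L], [L^T, -D]], D = diag(s_i^T y_i), and
# L strictly lower triangular with L_ij = s_i^T y_j for i > j.
SY = S.T @ Y                      # k x k matrix of inner products s_i^T y_j
D = np.diag(np.diag(SY))
L = np.tril(SY, -1)
M = np.block([[S.T @ B0 @ S, L], [L.T, -D]])
W = np.hstack([B0 @ S, Y])        # n x 2k
B_compact = B0 - W @ np.linalg.solve(M, W.T)

print("max difference:", np.abs(B - B_compact).max())  # ~1e-12: the two agree
```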
Supervised Random Walks: Predicting and Recommending Links in Social Networks
"... Predicting the occurrence of links is a fundamental problem in networks. In the link prediction problem we are given a snapshot of a network and would like to infer which interactions among existing members are likely to occur in the near future or which existing interactions are we missing. Althoug ..."
Abstract
-
Cited by 147 (3 self)
- Add to MetaCart
(Show Context)
Predicting the occurrence of links is a fundamental problem in networks. In the link prediction problem we are given a snapshot of a network and would like to infer which interactions among existing members are likely to occur in the near future or which existing interactions we are missing. Although this problem has been extensively studied, the challenge of how to effectively combine the information from the network structure with rich node and edge attribute data remains largely open. We develop an algorithm based on Supervised Random Walks that naturally combines the information from the network structure with node and edge level attributes. We achieve this by using these attributes to guide a random walk on the graph. We formulate a supervised learning task where the goal is to learn a function that assigns strengths to edges in the network such that a random walker is more likely to visit the nodes to which new links will be created in the future. We develop an efficient training algorithm to directly learn the edge strength estimation function. Our experiments on the Facebook social graph and large collaboration networks show that our approach outperforms state-of-the-art unsupervised approaches as well as approaches that are based on feature extraction.
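The forward computation described above, edge strengths derived from attributes steering a random walk with restart, can be sketched compactly. The example below uses a made-up four-node graph, made-up edge features, and a fixed (untrained) parameter vector w; learning w against a ranking loss, which is the paper's actual contribution, is omitted.

```python
import numpy as np

# Tiny illustrative graph: adjacency lists with a feature vector per edge
# (all numbers invented). In the paper, w is learned so that a restart walk
# from the source ranks future-link nodes highest; here w is simply fixed.
edges = {
    0: [(1, [1.0, 0.2]), (2, [0.1, 0.9])],
    1: [(0, [1.0, 0.2]), (3, [0.7, 0.5])],
    2: [(0, [0.1, 0.9]), (3, [0.3, 0.3])],
    3: [(1, [0.7, 0.5]), (2, [0.3, 0.3])],
}
n = len(edges)
w = np.array([1.5, -0.5])        # hypothetical edge-strength parameters
alpha = 0.15                     # restart probability
source = 0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Edge strengths a_uv = f_w(psi_uv); each row of Q is the walk's transition
# distribution out of u, proportional to the strengths of u's outgoing edges.
Q = np.zeros((n, n))
for u, nbrs in edges.items():
    strengths = np.array([sigmoid(w @ np.array(psi)) for _, psi in nbrs])
    for (v, _), a in zip(nbrs, strengths / strengths.sum()):
        Q[u, v] = a

# Random walk with restart at `source` (personalized PageRank), by power iteration.
p = np.full(n, 1.0 / n)
restart = np.zeros(n)
restart[source] = 1.0
for _ in range(100):
    p = alpha * restart + (1.0 - alpha) * (p @ Q)

# Rank non-neighbors of the source by their visiting probability.
ranking = sorted((v for v in range(n) if v != source and Q[source, v] == 0.0),
                 key=lambda v: -p[v])
print("link recommendations for node", source, ":", ranking)
```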
cdec: A decoder, alignment, and learning framework for finite-state and context-free translation models
- In Proceedings of ACL System Demonstrations, 2010
Cited by 134 (53 self)
Abstract:
We present cdec, an open source framework for decoding, aligning with, and training a number of statistical machine translation models, including word-based models, phrase-based models, and models based on synchronous context-free grammars. Using a single unified internal representation for translation forests, the decoder strictly separates model-specific translation logic from general rescoring, pruning, and inference algorithms. From this unified representation, the decoder can extract not only the 1- or k-best translations, but also alignments to a reference, or the quantities necessary to drive discriminative training using gradient-based or gradient-free optimization techniques. Its efficient C++ implementation means that memory use and runtime performance are significantly better than comparable decoders.
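To make the "translation forest" idea concrete, the sketch below extracts the 1-best derivation from a toy acyclic hypergraph under a linear model. It is not cdec's data structures or API; the grammar, feature vectors, and weights are invented, and real decoders handle far larger forests with pruning.

```python
import numpy as np

weights = np.array([1.0, -0.5])           # hypothetical model weights

# A toy "translation forest": node -> list of (tail_nodes, output_template, features),
# with nodes listed in topological order so tails are scored before their heads.
forest = {
    "X1": [((), "the house", np.array([0.2, 1.0])),
           ((), "house",     np.array([0.5, 2.0]))],
    "X2": [((), "is small",  np.array([0.3, 1.0]))],
    "S":  [(("X1", "X2"), "{0} {1}", np.array([0.1, 0.5]))],
}

best = {}   # node -> (score, translation string)
for node, hyperedges in forest.items():
    candidates = []
    for tails, template, feats in hyperedges:
        # Viterbi score: edge score under the linear model plus best tail scores.
        score = float(weights @ feats) + sum(best[t][0] for t in tails)
        text = template.format(*(best[t][1] for t in tails)) if tails else template
        candidates.append((score, text))
    best[node] = max(candidates)

print("1-best derivation:", best["S"])
```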
Line Search Algorithms With Guaranteed Sufficient Decrease
- ACM Trans. Math. Software, 1992
Cited by 121 (0 self)
Abstract:
The problem of finding a point that satisfies the sufficient decrease and curvature conditions is formulated in terms of finding a point in a set $T(\mu)$. We describe a search algorithm for this problem that produces a sequence of iterates that converges to a point in $T(\mu)$ and that, except for pathological cases, terminates in a finite number of steps. Numerical results for an implementation of the search algorithm on a set of test functions show that the algorithm terminates within a small number of iterations. Jorge J. Moré and David J. Thuente. 1 Introduction. Given a continuously differentiable function $\phi : \mathbb{R} \to \mathbb{R}$ defined on $[0, \infty)$ with $\phi'(0) < 0$, and constants $\mu$ and $\eta$ in $(0, 1)$, we are interested in finding an $\alpha > 0$ such that
$$\phi(\alpha) \le \phi(0) + \mu\,\phi'(0)\,\alpha \tag{1.1}$$
and
$$|\phi'(\alpha)| \le \eta\,|\phi'(0)|. \tag{1.2}$$
The development of a search procedure that satisfies these conditions is a crucial ingredient in a line search meth...
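A direct way to understand conditions (1.1) and (1.2) is to test them numerically. The sketch below is not the Moré-Thuente search procedure itself; it only checks the sufficient decrease and curvature conditions for a few trial step lengths on a simple one-dimensional $\phi$, with the commonly used parameter choices $\mu = 10^{-4}$ and $\eta = 0.9$ taken as assumptions.

```python
def satisfies_conditions(phi, dphi, alpha, mu=1e-4, eta=0.9):
    """Check sufficient decrease (1.1) and curvature (1.2) at step length alpha."""
    sufficient_decrease = phi(alpha) <= phi(0.0) + mu * dphi(0.0) * alpha
    curvature = abs(dphi(alpha)) <= eta * abs(dphi(0.0))
    return sufficient_decrease, curvature

# Example: phi(a) = f(x + a*d) for f(x) = x^2 at x = 1 along the descent
# direction d = -1, so phi(a) = (1 - a)^2 and phi'(0) = -2 < 0.
phi = lambda a: (1.0 - a) ** 2
dphi = lambda a: -2.0 * (1.0 - a)

for alpha in (0.1, 0.5, 1.0, 2.5):
    print(alpha, satisfies_conditions(phi, dphi, alpha))
```

Step lengths that are too long (here alpha = 2.5) fail the sufficient decrease test, which is exactly the situation a line search procedure must detect and correct.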