Results 1 – 7 of 7
Offense-Defense Approach to Ranking Team Sports
Abstract

Cited by 4 (2 self)
The rank of an object is its relative importance to the other objects in the set. Often a rank is an integer assigned from the set {1, 2,..., n}. Ideally an assignment of available ranks ({1, 2,..., n}) to n objects is one-to-one. However, in certain circumstances it is possible that more than one object is assigned the same rank. A ranking model is ...
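The offense-defense approach named in the title is usually described as an alternating update of offense and defense vectors computed from a score matrix. A minimal sketch under that assumption (the function name, the convention A[i, j] = points team j scored against team i, and the stopping rule are all illustrative, not the paper's code):

```python
import numpy as np

def offense_defense(A, iters=100, tol=1e-8):
    """Hypothetical offense-defense rating iteration.

    Assumes A[i, j] = points team j scored against team i (this
    convention is an assumption; the paper's may differ).  Offense o
    (high is good) and defense d (low is good) are updated alternately
    until the offense vector stabilizes.
    """
    n = A.shape[0]
    o = np.ones(n)
    d = np.ones(n)
    for _ in range(iters):
        d = A @ (1.0 / o)          # defense rating: weighted by opponents' offense
        o_new = A.T @ (1.0 / d)    # offense rating: weighted by opponents' defense
        if np.max(np.abs(o_new - o)) < tol:
            o = o_new
            break
        o = o_new
    return o, d
```

Teams could then be ranked, for example, by the ratio o / d in descending order; the paper's actual ranking rule may differ.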
Multiple Kernel Learning Using Nearest Neighbor Classifiers
Abstract
We study the problem of multiple kernel learning (MKL) in a classification setting. We first examine the kernel alignment metric and show that maximizing the alignment of a kernel with the target kernel YY^T corresponds to a constrained minimization of the margin loss of a weighted Nearest-Neighbor (NN) classifier. Current MKL methods (both single- and two-stage) use the Support Vector Machine classifier and the hinge loss. We expand the framework to include the NN classifier and the margin loss, in addition to the hinge loss, for classification. This results in multiple combinations of classifier and loss functions for multiple kernel learning. We make a thorough empirical study of the combinations. The NN classifier is particularly suitable for performing MKL on large datasets, with a training speedup of O(n^2) over MKL algorithms that use SVMs.
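The kernel alignment metric the abstract refers to is commonly defined as the normalized Frobenius inner product between a kernel matrix K and the target kernel YY^T. A small sketch under that standard definition (the function name and label convention are illustrative, not taken from the paper):

```python
import numpy as np

def kernel_alignment(K, y):
    """Empirical alignment between kernel K and the target kernel yy^T.

    Assumes y is a label vector in {-1, +1}, so the target kernel is
    the outer product y y^T.  Returns a value in [-1, 1]; 1 means K is
    a positive multiple of the target.
    """
    T = np.outer(y, y)
    num = np.sum(K * T)                                       # <K, yy^T>_F
    den = np.linalg.norm(K, 'fro') * np.linalg.norm(T, 'fro')
    return num / den
```

A kernel proportional to the target kernel attains alignment exactly 1, which is the quantity the two-stage MKL formulations described above try to maximize over kernel combinations.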
Genome analysis: HiCorrector: a fast, scalable and memory-efficient package for normalizing ...
Recommended Citation, 2015
Abstract
This Doctoral Dissertation is brought to you for free and open access by the Electrical and Computer Engineering at UKnowledge. It has been accepted ...
A PARALLEL PREPROCESSING FOR THE OPTIMAL ASSIGNMENT PROBLEM BASED ON DIAGONAL SCALING
Motion Deblurring With Graph Laplacian Regularization
Abstract
In this paper, we develop a regularization framework for image deblurring based on a new definition of the normalized graph Laplacian. We apply a fast scaling algorithm to the kernel similarity matrix to derive the symmetric, doubly stochastic filtering matrix from which the normalized Laplacian matrix is built. We use this new definition of the Laplacian to construct a cost function consisting of data fidelity and regularization terms to solve the ill-posed motion deblurring problem. The final estimate is obtained by minimizing the resulting cost function in an iterative manner. Furthermore, the spectral properties of the Laplacian matrix equip us with the required tools for spectral analysis of the proposed method. We verify the effectiveness of our iterative algorithm via synthetic and real examples.
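The pipeline the abstract describes (scale the kernel similarity matrix into a symmetric, doubly stochastic filtering matrix W, then build the normalized Laplacian from it) can be sketched with Sinkhorn-style alternating normalization; that choice of scaling algorithm is an assumption here, as the paper's fast scaling procedure may differ:

```python
import numpy as np

def doubly_stochastic_laplacian(K, iters=200):
    """Sketch: derive a doubly stochastic filter W from a positive
    kernel similarity matrix K via alternating row/column normalization
    (Sinkhorn-style scaling, assumed here), then form L = I - W.
    """
    W = K.astype(float).copy()
    for _ in range(iters):
        W = W / W.sum(axis=1, keepdims=True)   # normalize rows
        W = W / W.sum(axis=0, keepdims=True)   # normalize columns
    W = 0.5 * (W + W.T)                        # restore exact symmetry
    L = np.eye(K.shape[0]) - W                 # normalized graph Laplacian
    return W, L
```

Because W is symmetric and (to numerical tolerance) doubly stochastic, L is symmetric with rows summing to zero, which is what makes the spectral analysis mentioned in the abstract available.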