CiteSeerX

Results 1 - 10 of 9,487

Support vector machine learning for interdependent and structured output spaces

by Ioannis Tsochantaridis, Thomas Hofmann, Thorsten Joachims, Yasemin Altun - in ICML, 2004
"... Learning general functional dependencies is one of the main goals in machine learning. Recent progress in kernel-based methods has focused on designing flexible and powerful input representations. This paper addresses the complementary issue of problems involving complex outputs suchas multiple depe ..."
Abstract - Cited by 450 (20 self)
dependent output variables and structured output spaces. We propose to generalize multiclass Support Vector Machine learning in a formulation that involves features extracted jointly from inputs and outputs. The resulting optimization problem is solved efficiently by a cutting plane algorithm that exploits
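The cutting-plane algorithm described in this snippet repeatedly solves a "loss-augmented inference" problem to find the most violated margin constraint. A minimal sketch of that step for the multiclass special case, with an illustrative block-wise joint feature map (names and shapes are assumptions, not the authors' implementation):

```python
import numpy as np

def joint_feature(x, y, n_classes):
    """Joint feature map psi(x, y): copies x into the block for class y."""
    psi = np.zeros(n_classes * x.shape[0])
    psi[y * x.shape[0]:(y + 1) * x.shape[0]] = x
    return psi

def most_violated_label(w, x, y_true, n_classes):
    """Loss-augmented inference: argmax_y [Delta(y_true, y) + w . psi(x, y)],
    using 0/1 loss as Delta. The returned label defines the cutting plane
    (margin constraint) added to the working set."""
    scores = [
        (0.0 if y == y_true else 1.0) + w @ joint_feature(x, y, n_classes)
        for y in range(n_classes)
    ]
    return int(np.argmax(scores))
```

If the most violated label equals the true label, no constraint is violated and the example is margin-consistent under the current `w`.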

Large margin methods for structured and interdependent output variables

by Ioannis Tsochantaridis, Thorsten Joachims, Thomas Hofmann, Yasemin Altun - Journal of Machine Learning Research, 2005
"... Learning general functional dependencies between arbitrary input and output spaces is one of the key challenges in computational intelligence. While recent progress in machine learning has mainly focused on designing flexible and powerful input representations, this paper addresses the complementary ..."
Abstract - Cited by 624 (12 self)
Learning general functional dependencies between arbitrary input and output spaces is one of the key challenges in computational intelligence. While recent progress in machine learning has mainly focused on designing flexible and powerful input representations, this paper addresses

Output Space Sampling for Graph Patterns

by Mohammad Al Hasan, Mohammed J. Zaki, 2009
"... Recent interest in graph pattern mining has shifted from finding all frequent subgraphs to obtaining a small subset of frequent subgraphs that are representative, discriminative or significant. The main motivation behind that is to cope with the scalability problem that the graph mining algorithms s ..."
Abstract - Cited by 22 (5 self)
suffer when mining databases of large graphs. Another motivation is to obtain a succinct output set that is informative and useful. In the same spirit, researchers also proposed sampling based algorithms that sample the output space of the frequent patterns to obtain representative subgraphs
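Sampling the output space of frequent patterns is typically done with a random walk rather than full enumeration. A generic sketch of a Metropolis-Hastings walk over a neighbor relation, with the standard degree-ratio acceptance rule that targets a (near-)uniform stationary distribution; this is an illustrative sketch, not the paper's exact sampler:

```python
import random

def mh_uniform_sample(start, neighbors, steps=1000, seed=0):
    """Metropolis-Hastings random walk over an output space defined only
    by a neighbor relation. Proposing a uniform neighbor and accepting
    with min(1, deg(current)/deg(proposal)) makes the walk's stationary
    distribution uniform over the reachable states."""
    rng = random.Random(seed)
    cur = start
    for _ in range(steps):
        nb = neighbors(cur)
        if not nb:
            break
        prop = rng.choice(nb)
        if rng.random() < min(1.0, len(nb) / len(neighbors(prop))):
            cur = prop
    return cur
```

The appeal for graph mining is that `neighbors` only needs local pattern extensions/contractions, so no global enumeration of the frequent-subgraph space is ever materialized.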

Multi-Task Output Space Regularization

by Sergey Feldman, Bela A. Frigyik, Maya R. Gupta, Luca Cazzanti, Peter Sadowski
"... We investigate multi-task learning from an output space regularization perspective. Most multi-task approaches tie together related tasks by constraining them to share input spaces and function classes. In contrast to this, we propose a multi-task paradigm which we call output space regularization, ..."
Abstract
We investigate multi-task learning from an output space regularization perspective. Most multi-task approaches tie together related tasks by constraining them to share input spaces and function classes. In contrast to this, we propose a multi-task paradigm which we call output space regularization

Gradient Boosting for Kernelized Output Spaces

by Louis Wehenkel
"... A general framework is proposed for gradient boosting in supervised learning problems where the loss function is defined using a kernel over the output space. It extends boosting in a principled way to complex output spaces (images, text, graphs etc.) and can be applied to a general class of base le ..."
Abstract - Cited by 4 (1 self)
A general framework is proposed for gradient boosting in supervised learning problems where the loss function is defined using a kernel over the output space. It extends boosting in a principled way to complex output spaces (images, text, graphs etc.) and can be applied to a general class of base
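For the simplest output kernel (the linear kernel, which recovers squared loss on output vectors), boosting in a kernelized output space reduces to fitting each base learner to the current residuals in output space. A minimal sketch of that special case; the generic-base-learner interface here is an assumption, not the paper's framework:

```python
import numpy as np

def boost_vector_outputs(X, Y, fit_base, n_rounds=10, lr=0.1):
    """Gradient boosting with vector-valued outputs under squared loss
    (the linear-output-kernel case): each round fits a base learner to
    the residuals, i.e. the negative gradient of the loss in output space."""
    pred = np.zeros_like(Y, dtype=float)
    models = []
    for _ in range(n_rounds):
        residual = Y - pred          # negative gradient of 0.5*||Y - pred||^2
        h = fit_base(X, residual)    # base learner returns a predict function
        pred = pred + lr * h(X)      # shrunken additive update
        models.append(h)
    return models, pred
```

With a nonlinear output kernel, the residuals live in the kernel-induced feature space instead, and a pre-image step is needed to map predictions back to actual outputs.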

Output space search for structured prediction.

by Janardhan Rao Doppa, Alan Fern, Prasad Tadepalli - in ICML, 2012
"... Abstract We consider a framework for structured prediction based on search in the space of complete structured outputs. Given a structured input, an output is produced by running a time-bounded search procedure guided by a learned cost function, and then returning the least cost output uncovered du ..."
Abstract - Cited by 10 (4 self)
We consider a framework for structured prediction based on search in the space of complete structured outputs. Given a structured input, an output is produced by running a time-bounded search procedure guided by a learned cost function, and then returning the least cost output uncovered
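The loop described in this snippet — move through complete outputs under a learned cost function, return the least-cost output uncovered within the budget — can be sketched generically. This is an illustrative greedy variant under assumed `cost` and `successors` interfaces, not the paper's specific search procedure:

```python
def greedy_output_search(x, cost, initial, successors, max_steps=50):
    """Search the space of complete structured outputs: repeatedly move to
    the lowest-cost successor of the current output, stopping at a local
    minimum or the step budget, and return the best output uncovered."""
    current = initial
    best, best_cost = current, cost(x, current)
    for _ in range(max_steps):
        cands = successors(current)
        if not cands:
            break
        nxt = min(cands, key=lambda y: cost(x, y))
        if cost(x, nxt) >= cost(x, current):
            break                      # local minimum of the cost function
        current = nxt
        if cost(x, current) < best_cost:
            best, best_cost = current, cost(x, current)
    return best
```

The key property is that every state visited is already a complete output, so the procedure is anytime: it can be stopped at the time bound and still return a valid prediction.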

Gradient boosting for kernelized output spaces

by Pierre Geurts, Louis Wehenkel - ICML, 2007
"... HAL is a multi-disciplinary open access archive for the deposit and dissemination of sci-entific research documents, whether they are pub-lished or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers. L’archive ouverte p ..."
Abstract - Cited by 3 (0 self)
HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.

ADIOS: Architectures Deep In Output Space

by Moustapha Cissé, Maruan Al-Shedivat, Samy Bengio - Google Brain
"... Abstract Multi-label classification is a generalization of binary classification where the task consists in predicting sets of labels. With the availability of ever larger datasets, the multi-label setting has become a natural one in many applications, and the interest in solving multi-label proble ..."
Abstract
Multi-label classification is a generalization of binary classification where the task consists in predicting sets of labels. With the availability of ever larger datasets, the multi-label setting has become a natural one in many applications, and the interest in solving multi-label problems has grown significantly. As expected, deep learning approaches are now yielding state-of-the-art performance for this class of problems. Unfortunately, they usually do not take into account the often unknown but nevertheless rich relationships between labels. In this paper, we propose to make use of this underlying structure by learning to partition the labels into a Markov Blanket Chain and then applying a novel deep architecture that exploits the partition. Experiments on several popular and large multi-label datasets demonstrate that our approach not only yields significant improvements, but also helps to overcome trade-offs specific to the multi-label classification setting.

Spatio-temporal energy models for the Perception of Motion

by Edward H. Adelson, James R. Bergen - J. Opt. Soc. Am. A, 1985
"... A motion sequence may be represented as a single pattern in x-y-t space; a velocity of motion corresponds to a three-dimensional orientation in this space. Motion sinformation can be extracted by a system that responds to the oriented spatiotemporal energy. We discuss a class of models for human mot ..."
Abstract - Cited by 904 (9 self)
motion mechanisms in which the first stage consists of linear filters that are oriented in space-time and tuned in spatial frequency. The outputs of quadrature pairs of such filters are squared and summed to give a measure of motion energy. These responses are then fed into an opponent stage. Energy
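The quadrature-pair stage in this snippet — square and sum the outputs of two filters 90° out of phase — is what makes the energy response phase-invariant. A one-dimensional sketch with even/odd Gabor filters (parameter names are illustrative; the paper's filters are oriented in space-time):

```python
import numpy as np

def quadrature_energy(signal, freq, sigma, t):
    """Energy of a quadrature pair: responses of an even (cosine-phase) and
    odd (sine-phase) Gabor filter are squared and summed, yielding a
    response that depends on local amplitude but not on phase."""
    env = np.exp(-0.5 * (t / sigma) ** 2)          # Gaussian envelope
    even = env * np.cos(2 * np.pi * freq * t)      # cosine-phase filter
    odd = env * np.sin(2 * np.pi * freq * t)       # sine-phase filter
    return (np.convolve(signal, even, mode='same') ** 2
            + np.convolve(signal, odd, mode='same') ** 2)
```

In the full model the analogous spatiotemporal energies from filters tuned to opposite directions are subtracted at the subsequent opponent stage.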

Growing a Hypercubical Output Space in a Self-Organizing Feature Map

by H.-U. Bauer, Th. Villmann - IEEE Transactions on Neural Networks, 1995
"... Neural maps project data given in a (possibly high-dimensional) input space onto a neuron position in a (usually low-dimensional) output space grid. An important property of this projection is the preservation of neighborhoods; neighboring neurons in output space respond to neighboring data points i ..."
Abstract - Cited by 58 (11 self)
Neural maps project data given in a (possibly high-dimensional) input space onto a neuron position in a (usually low-dimensional) output space grid. An important property of this projection is the preservation of neighborhoods; neighboring neurons in output space respond to neighboring data points
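The neighborhood preservation described here comes from the self-organizing map's update rule: the winning neuron is found in input space, but the strength of each neuron's update is set by its distance to the winner in the output-space grid. A minimal one-step sketch (a standard SOM update, not the paper's growing-grid algorithm):

```python
import numpy as np

def som_step(weights, grid, x, lr=0.5, sigma=1.0):
    """One self-organizing map update: find the best-matching unit (BMU)
    for input x, then pull every neuron toward x with a strength given by
    a Gaussian of its grid distance to the BMU in output space."""
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
    d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)   # output-space distances
    h = np.exp(-d2 / (2 * sigma ** 2))             # neighborhood function
    return weights + lr * h[:, None] * (x - weights)
```

Because `h` depends only on grid positions, neurons that are neighbors in output space receive correlated updates, which is exactly what drives them to respond to neighboring data points in input space.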

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University