Results 1 - 10 of 12,756

Evolution Strategies for Vector Optimization

by Frank Kursawe - Parallel Problem Solving from Nature. 1st Workshop, PPSN I, volume 496 of Lecture Notes in Computer Science, 1992
"... Evolution strategies --- a stochastic optimization method originally designed for single criterion problems --- have been modified in such a way that they can also tackle multiple criteria problems. Instead of computing only one efficient solution interactively, a decision maker can collect as many ..."
Cited by 121 (2 self)

Super efficiency in vector optimization

by J. M. Borwein, D. Zhuang - Trans. Amer. Math. Soc., 1993
"... Abstract. We introduce a new concept of efficiency in vector optimization. This concept, super efficiency, is shown to have many desirable properties. In particular, we show that in reasonable settings the super efficient points of a set are norm-dense in the efficient frontier. We also provide a Ch ..."
Cited by 13 (0 self)

in Vector Optimization. The linear case.

by Elisa Pagani, Letizia Pellegrini, 2009
"... Abstract In this paper we consider a vector optimization problem; we present some scalarization techniques for finding all the vector optimal points of this problem and we discuss the relationships between these methods. Moreover, in the linear case, the study of dual variables is carried on by mean ..."
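The scalarization idea described in the snippet above can be illustrated with the simplest such technique: collapsing the objective vector into a weighted sum and sweeping the weights. This is a generic sketch, not the paper's method; the two objective functions and the grid search are illustrative assumptions.

```python
import numpy as np

# Two toy objectives to minimize over x in [0, 1]; both are illustrative assumptions.
f1 = lambda x: x ** 2
f2 = lambda x: (x - 1.0) ** 2

def weighted_sum_minimizer(w1, w2, grid):
    """Scalarize (f1, f2) into w1*f1 + w2*f2 and minimize over a grid of candidates."""
    scores = [w1 * f1(x) + w2 * f2(x) for x in grid]
    return grid[int(np.argmin(scores))]

grid = np.linspace(0.0, 1.0, 1001)
# Sweeping the weight vector traces out (part of) the set of efficient points.
front = [weighted_sum_minimizer(w, 1.0 - w, grid) for w in np.linspace(0.1, 0.9, 9)]
```

For these convex objectives the minimizer of the weighted sum is x* = w2/(w1 + w2), so the sweep recovers efficient points spread along [0.1, 0.9]; for nonconvex problems a plain weighted sum can miss parts of the efficient frontier, which is one reason such papers compare several scalarization techniques.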

Training Support Vector Machines: an Application to Face Detection

by Edgar Osuna, Robert Freund, Federico Girosi, 1997
"... We investigate the application of Support Vector Machines (SVMs) in computer vision. SVM is a learning technique developed by V. Vapnik and his team (AT&T Bell Labs.) that can be seen as a new method for training polynomial, neural network, or Radial Basis Functions classifiers. The decision sur ..."
Cited by 727 (1 self)

Duality in Vector Optimization

by Laura Carosi, Johannes Jahn - Math. Programming, 1983
"... the connections between semidefinite ..."
Cited by 58 (3 self)

A Comparison of Methods for Multiclass Support Vector Machines

by Chih-Wei Hsu, Chih-Jen Lin - IEEE Trans. Neural Networks, 2002
"... Support vector machines (SVMs) were originally designed for binary classification. How to effectively extend it for multiclass classification is still an ongoing research issue. Several methods have been proposed where typically we construct a multiclass classifier by combining several binary class ..."
Cited by 952 (22 self)
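The binary-machine combination schemes such surveys compare can be sketched generically. The two voting rules below, one-against-rest ("largest decision value wins") and one-against-one ("max-wins" voting), take precomputed decision values as input; the data layout is an assumption for illustration, not Hsu and Lin's code.

```python
import numpy as np

def one_vs_rest_predict(scores):
    """scores: (n_samples, n_classes) decision values from K binary
    'class k vs rest' machines; each sample gets the class with the
    largest decision value."""
    return np.argmax(scores, axis=1)

def one_vs_one_predict(pairwise, n_classes):
    """pairwise[(i, j)] is a boolean array over samples: True where the
    binary machine trained on classes i vs j votes for i. Each sample is
    assigned the class collecting the most pairwise votes (max-wins)."""
    n = next(iter(pairwise.values())).shape[0]
    votes = np.zeros((n, n_classes), dtype=int)
    for (i, j), wins_i in pairwise.items():
        # Credit class i where it won the pairwise contest, class j elsewhere.
        votes[np.arange(n), np.where(wins_i, i, j)] += 1
    return np.argmax(votes, axis=1)
```

One-against-rest trains K machines and one-against-one trains K(K-1)/2 smaller ones; the trade-off between the two is exactly the kind of question the paper studies empirically.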

On the algorithmic implementation of multi-class kernel-based vector machines

by Koby Crammer, Yoram Singer, Nello Cristianini, John Shawe-Taylor, Bob Williamson - Journal of Machine Learning Research
"... In this paper we describe the algorithmic implementation of multiclass kernel-based vector machines. Our starting point is a generalized notion of the margin to multiclass problems. Using this notion we cast multiclass categorization problems as a constrained optimization problem with a quadratic ob ..."
Cited by 559 (13 self)

Making Large-Scale Support Vector Machine Learning Practical

by Thorsten Joachims , 1998
"... Training a support vector machine (SVM) leads to a quadratic optimization problem with bound constraints and one linear equality constraint. Despite the fact that this type of problem is well understood, there are many issues to be considered in designing an SVM learner. In particular, for large lea ..."
Cited by 628 (1 self)

Sequential minimal optimization: A fast algorithm for training support vector machines

by John C. Platt - Advances in Kernel Methods - Support Vector Learning, 1999
"... This paper proposes a new algorithm for training support vector machines: Sequential Minimal Optimization, or SMO. Training a support vector machine requires the solution of a very large quadratic programming (QP) optimization problem. SMO breaks this large QP problem into a series of smallest possi ..."
Cited by 461 (3 self)
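The heart of SMO, solving the smallest possible QP subproblem (two multipliers) analytically, can be sketched as a single pair update. This is a hedged reading of Platt's description with the working-set heuristics and bias update omitted; variable names and the test data are illustrative.

```python
import numpy as np

def smo_pair_step(alpha, y, K, b, i, j, C=1.0):
    """One SMO step: re-solve the SVM dual QP analytically for the pair
    (alpha_i, alpha_j), holding all other multipliers fixed. The equality
    constraint sum_k alpha_k * y_k = 0 confines the pair to a line segment."""
    if i == j:
        return alpha
    a = alpha.copy()
    # Decision-function errors E_k = f(x_k) - y_k under the current multipliers.
    f = (a * y) @ K + b
    Ei, Ej = f[i] - y[i], f[j] - y[j]
    # Clipping box [L, H] for alpha_j implied by 0 <= alpha <= C
    # together with the equality constraint.
    if y[i] != y[j]:
        L, H = max(0.0, a[j] - a[i]), min(C, C + a[j] - a[i])
    else:
        L, H = max(0.0, a[i] + a[j] - C), min(C, a[i] + a[j])
    eta = K[i, i] + K[j, j] - 2.0 * K[i, j]  # curvature along the segment
    if eta <= 0.0 or L >= H:
        return a  # degenerate pair: leave unchanged
    new_j = float(np.clip(a[j] + y[j] * (Ei - Ej) / eta, L, H))
    a[i] += y[i] * y[j] * (a[j] - new_j)  # keeps sum alpha*y constant
    a[j] = new_j
    return a
```

Because each subproblem is solved in closed form, no inner QP solver is needed; the full algorithm wraps this step in heuristics for choosing which pair (i, j) to update next.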

Multiobjective Optimization Using Nondominated Sorting in Genetic Algorithms

by N. Srinivas, Kalyanmoy Deb - Evolutionary Computation, 1994
"... In trying to solve multiobjective optimization problems, many traditional methods scalarize the objective vector into a single objective. In those cases, the obtained solution is highly sensitive to the weight vector used in the scalarization process and demands the user to have knowledge about t ..."
Cited by 539 (5 self)
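The nondominated sorting that gives the approach its name can be sketched naively: repeatedly peel off the set of points that no other remaining point dominates. This is an illustrative quadratic-per-front version, not Srinivas and Deb's implementation, and it omits the sharing/fitness-assignment machinery of the full algorithm.

```python
import numpy as np

def dominates(p, q):
    """p dominates q (minimization): no worse in every objective, strictly
    better in at least one."""
    return bool(np.all(p <= q) and np.any(p < q))

def nondominated_sort(F):
    """Partition objective vectors F (n_points x n_objectives) into Pareto
    fronts. Returns a list of index lists; front 0 is the nondominated set."""
    F = np.asarray(F, dtype=float)
    remaining = set(range(len(F)))
    fronts = []
    while remaining:
        # A point stays in the current front if nothing remaining dominates it.
        front = [i for i in remaining
                 if not any(dominates(F[j], F[i]) for j in remaining if j != i)]
        fronts.append(sorted(front))
        remaining -= set(front)
    return fronts
```

Ranking the population by front index, rather than by a weighted scalar objective, is what avoids the sensitivity to weight vectors that the snippet above criticizes.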

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University