CiteSeerX

Results 1 - 10 of 54,248

A Metrics Suite for Object Oriented Design

by Shyam R. Chidamber, Chris F. Kemerer, 1994
"... Given the central role that software development plays in the delivery and application of information technology, managers are increasingly focusing on process improvement in the software development area. This demand has spurred the provision of a number of new and/or improved approaches to softwa ..."
Cited by 1108 (3 self)
to software development, with perhaps the most prominent being object-orientation (OO). In addition, the focus on process improvement has increased the demand for software measures, or metrics with which to manage the process. The need for such metrics is particularly acute when an organization is adopting a

Distance metric learning, with application to clustering with side-information

by Eric P. Xing, Andrew Y. Ng, Michael I. Jordan, Stuart Russell - in Advances in Neural Information Processing Systems 15, 2002
"... Abstract Many algorithms rely critically on being given a good metric over their inputs. For instance, data can often be clustered in many "plausible" ways, and if a clustering algorithm such as K-means initially fails to find one that is meaningful to a user, the only recourse may be for ..."
Cited by 818 (13 self)
us to give efficient, local-optima-free algorithms. We also demonstrate empirically that the learned metrics can be used to significantly improve clustering performance.
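
For a concrete picture of the side-information setting described in this abstract, the following is a minimal numpy sketch: it learns a diagonal Mahalanobis metric by pulling "similar" pairs together while penalizing "dissimilar" pairs that fall within a unit distance. The projected-gradient loop and the penalty form are illustrative simplifications, not the convex program of the paper.

```python
# Illustrative sketch of metric learning from pairwise side-information
# (Xing et al. pose this as a convex program; here we just take projected
# gradient steps on a diagonal Mahalanobis metric).
import numpy as np

def learn_diag_metric(X, similar, dissimilar, steps=200, lr=0.01):
    """X: (n, d) data; similar/dissimilar: lists of index pairs (i, j)."""
    d = X.shape[1]
    w = np.ones(d)                              # diagonal of the metric matrix
    for _ in range(steps):
        grad = np.zeros(d)
        for i, j in similar:                    # pull similar pairs together
            grad += (X[i] - X[j]) ** 2
        for i, j in dissimilar:                 # push dissimilar pairs apart
            diff2 = (X[i] - X[j]) ** 2
            if np.sqrt(np.dot(w, diff2)) < 1.0:  # hinge-style penalty inside unit ball
                grad -= diff2
        w = np.maximum(w - lr * grad, 0.0)      # keep diagonal metric PSD
    return w

def metric_dist(w, a, b):
    return np.sqrt(np.dot(w, (a - b) ** 2))
```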

Distance metric learning for large margin nearest neighbor classification

by Kilian Q. Weinberger, John Blitzer, Lawrence K. Saul - In NIPS, 2006
"... We show how to learn a Mahanalobis distance metric for k-nearest neighbor (kNN) classification by semidefinite programming. The metric is trained with the goal that the k-nearest neighbors always belong to the same class while examples from different classes are separated by a large margin. On seven ..."
Abstract - Cited by 695 (14 self) - Add to MetaCart
On seven data sets of varying size and difficulty, we find that metrics trained in this way lead to significant improvements in kNN classification—for example, achieving a test error rate of 1.3% on the MNIST handwritten digits. As in support vector machines (SVMs), the learning problem reduces to a
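
As a rough illustration of the large-margin objective summarized above (same-class target neighbors pulled close, differently labelled "impostors" pushed past a unit margin), here is a toy numpy sketch that takes gradient steps on a full matrix M and projects back onto the PSD cone. The paper itself casts this as a semidefinite program; the step sizes and names below are illustrative only.

```python
# Toy gradient version of the large-margin nearest-neighbor idea
# (the paper solves a semidefinite program; this sketch just takes
# gradient steps on M and projects back onto the PSD cone).
import numpy as np

def lmnn_sketch(X, y, target_pairs, steps=100, lr=1e-3, mu=1.0):
    """target_pairs: (i, j) with y[i] == y[j], j a designated target neighbor."""
    d = X.shape[1]
    M = np.eye(d)
    for _ in range(steps):
        G = np.zeros((d, d))
        for i, j in target_pairs:
            dij = np.outer(X[i] - X[j], X[i] - X[j])
            G += dij                                   # pull target neighbors close
            for l in range(len(X)):                    # push impostors past the margin
                if y[l] != y[i]:
                    dil = np.outer(X[i] - X[l], X[i] - X[l])
                    if np.trace(M @ dij) + 1.0 > np.trace(M @ dil):
                        G += mu * (dij - dil)          # hinge-loss subgradient
        M -= lr * G
        vals, vecs = np.linalg.eigh(M)                 # project onto PSD matrices
        M = vecs @ np.diag(np.clip(vals, 0.0, None)) @ vecs.T
    return M   # Mahalanobis distance: (a - b) @ M @ (a - b)
```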

Improved Boosting Algorithms Using Confidence-rated Predictions

by Robert E. Schapire, Yoram Singer - MACHINE LEARNING, 1999
"... We describe several improvements to Freund and Schapire’s AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions. We give a simplified analysis of AdaBoost in this setting, and we show how this analysis can be used to find impr ..."
Cited by 940 (26 self)
improved parameter settings as well as a refined criterion for training weak hypotheses. We give a specific method for assigning confidences to the predictions of decision trees, a method closely related to one used by Quinlan. This method also suggests a technique for growing decision trees which turns
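
The snippet refers to weak hypotheses that attach real-valued confidences to their predictions. A compact sketch of the corresponding weight update is given below, assuming a hypothetical `train_weak` callback that returns such a confidence-rated weak hypothesis; the exponential re-weighting by exp(-y_i h_t(x_i)) is the standard confidence-rated form, while the surrounding scaffolding is illustrative.

```python
# Sketch of confidence-rated boosting: weak hypotheses return real-valued
# (signed) confidences, and example weights are re-weighted multiplicatively
# by exp(-y_i * h_t(x_i)).  `train_weak` is a placeholder for the actual
# confidence-rated weak learner (e.g. a small decision tree).
import numpy as np

def boost(X, y, train_weak, rounds=50):
    """y in {-1, +1}; train_weak(X, y, D) -> callable h with real-valued outputs."""
    n = len(X)
    D = np.full(n, 1.0 / n)              # distribution over training examples
    ensemble = []
    for _ in range(rounds):
        h = train_weak(X, y, D)
        margins = y * h(X)               # signed confidences on each example
        D *= np.exp(-margins)            # confidence-rated weight update
        D /= D.sum()                     # renormalize to a distribution
        ensemble.append(h)
    def H(Xnew):                         # final rule: sign of summed confidences
        return np.sign(sum(h(Xnew) for h in ensemble))
    return H
```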

A comparison of mechanisms for improving TCP performance over wireless links

by Hari Balakrishnan, Venkata N. Padmanabhan, Srinivasan Seshan, Randy H. Katz - IEEE/ACM TRANSACTIONS ON NETWORKING, 1997
"... Reliable transport protocols such as TCP are tuned to perform well in traditional networks where packet losses occur mostly because of congestion. However, networks with wireless and other lossy links also suffer from significant losses due to bit errors and handoffs. TCP responds to all losses by i ..."
Cited by 927 (11 self)
by invoking congestion control and avoidance algorithms, resulting in degraded end-to-end performance in wireless and lossy systems. In this paper, we compare several schemes designed to improve the performance of TCP in such networks. We classify these schemes into three broad categories: end
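
To see why treating every loss as congestion hurts on lossy links, the toy AIMD simulation below halves a TCP-like congestion window on each random (non-congestion) loss. The simplified window dynamics and parameters are illustrative only and are not taken from the paper.

```python
# Toy AIMD simulation of the effect described above: a TCP-like sender halves
# its congestion window on every loss, so losses caused by wireless bit errors
# (not congestion) still cut throughput.  Parameters are illustrative.
import random

def aimd_throughput(rtts=10_000, loss_rate=0.0, cwnd_cap=100.0):
    cwnd, sent = 1.0, 0.0
    for _ in range(rtts):
        sent += cwnd
        if random.random() < loss_rate:
            cwnd = max(cwnd / 2.0, 1.0)       # multiplicative decrease on loss
        else:
            cwnd = min(cwnd + 1.0, cwnd_cap)  # additive increase per RTT
    return sent / rtts                        # average segments per RTT

print(aimd_throughput(loss_rate=0.0))    # lossless link
print(aimd_throughput(loss_rate=0.02))   # 2% random, non-congestion loss
```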

Choosing multiple parameters for support vector machines

by Olivier Chapelle, Vladimir Vapnik, Olivier Bousquet, Sayan Mukherjee - MACHINE LEARNING, 2002
"... The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters. Usual methods for choosing para ..."
Cited by 470 (17 self)
parameters, based on exhaustive search, become intractable as soon as the number of parameters exceeds two. Some experimental results assess the feasibility of our approach for a large number of parameters (more than 100) and demonstrate an improvement of generalization performance.
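
Since the snippet does not show the particular error estimate being minimized, the sketch below only illustrates the overall recipe: gradient descent in the log of many kernel and regularization parameters against a user-supplied estimate of generalization error, differentiated here by finite differences rather than the analytic gradients of the bounds used in the paper. The `val_error` callback and step sizes are assumptions.

```python
# Sketch of gradient-based tuning of many SVM parameters: descend in
# log-parameter space on a user-supplied generalization-error estimate,
# using finite-difference gradients.  (The paper differentiates analytic
# error bounds; this objective and the step sizes are illustrative.)
import numpy as np

def tune(log_params, val_error, steps=50, lr=0.1, eps=1e-3):
    """val_error(params) -> scalar estimate of the generalization error."""
    theta = np.asarray(log_params, dtype=float)
    for _ in range(steps):
        grad = np.zeros_like(theta)
        for k in range(len(theta)):           # finite-difference gradient
            bump = np.zeros_like(theta)
            bump[k] = eps
            grad[k] = (val_error(np.exp(theta + bump)) -
                       val_error(np.exp(theta - bump))) / (2 * eps)
        theta -= lr * grad                    # gradient step on the estimate
    return np.exp(theta)
```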

Learning Bayesian networks: The combination of knowledge and statistical data

by David Heckerman, David M. Chickering - Machine Learning, 1995
"... We describe scoring metrics for learning Bayesian networks from a combination of user knowledge and statistical data. We identify two important properties of metrics, which we call event equivalence and parameter modularity. These properties have been mostly ignored, but when combined, greatly simpl ..."
Cited by 1158 (35 self)
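
As one concrete example of a decomposable scoring metric of the kind discussed above, the sketch below computes a BDeu-style log marginal likelihood for a single node given its parents from count data; this is a standard member of that family and is not claimed to be the exact metric derived in the paper.

```python
# Sketch of a decomposable Bayesian-network scoring metric: the BDeu-style
# log marginal likelihood of one node given its parents, computed from counts.
# (A standard score of this family; not necessarily the paper's exact metric.)
from math import lgamma
from itertools import product
import numpy as np

def family_score(data, child, parents, arities, ess=1.0):
    """data: (n, vars) int array; arities: number of states per variable."""
    r = arities[child]
    q = int(np.prod([arities[p] for p in parents])) if parents else 1
    a_ij, a_ijk = ess / q, ess / (q * r)      # uniform Dirichlet priors
    score = 0.0
    parent_states = list(product(*[range(arities[p]) for p in parents])) or [()]
    for ps in parent_states:
        rows = data
        for p, s in zip(parents, ps):          # select rows matching this parent config
            rows = rows[rows[:, p] == s]
        n_ij = len(rows)
        score += lgamma(a_ij) - lgamma(a_ij + n_ij)
        for k in range(r):
            n_ijk = int(np.sum(rows[:, child] == k))
            score += lgamma(a_ijk + n_ijk) - lgamma(a_ijk)
    return score   # sum over all families to score a whole network
```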

Locally weighted learning

by Christopher G. Atkeson, Andrew W. Moore, Stefan Schaal - ARTIFICIAL INTELLIGENCE REVIEW, 1997
"... This paper surveys locally weighted learning, a form of lazy learning and memorybased learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, ass ..."
Cited by 599 (51 self)
assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, interference between old and new data, implementing locally weighted learning efficiently, and applications of locally weighted learning. A companion paper surveys how locally weighted
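
The survey's central technique, locally weighted linear regression, can be summarized in a few lines: fit a weighted least-squares model around each query point, with weights that decay with distance. The Gaussian kernel, fixed bandwidth, and ridge term below are one common choice among the distance and weighting functions the survey compares.

```python
# Sketch of locally weighted linear regression: fit a weighted least-squares
# model around each query point, with weights from a Gaussian kernel over
# distance.  Kernel, bandwidth, and ridge term are one common choice.
import numpy as np

def lwlr_predict(query, X, y, bandwidth=1.0, ridge=1e-8):
    """X: (n, d) training inputs, y: (n,) targets, query: (d,) point."""
    Xa = np.hstack([X, np.ones((len(X), 1))])            # affine features
    qa = np.append(query, 1.0)
    dists = np.linalg.norm(X - query, axis=1)
    w = np.exp(-(dists / bandwidth) ** 2)                # Gaussian weighting
    A = Xa.T @ (w[:, None] * Xa) + ridge * np.eye(Xa.shape[1])
    b = Xa.T @ (w * y)
    beta = np.linalg.solve(A, b)                         # local weighted least squares
    return qa @ beta
```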

Analysis of TCP Performance over Mobile Ad Hoc Networks Part I: Problem Discussion and Analysis of Results

by Gavin Holland, Nitin Vaidya, 1999
"... Mobile ad hoc networks have gained a lot of attention lately as a means of providing continuous network connectivity to mobile computing devices regardless of physical location. Recently, a large amount of research has focused on the routing protocols needed in such an environment. In this two-part ..."
Cited by 521 (5 self)
examples, such as a situation where throughput is zero for a particular connection. We introduce a new metric, expected throughput, for the comparison of throughput in multi-hop networks, and then use this metric to show how the use of explicit link failure notification (ELFN) techniques can significantly

Text Categorization with Support Vector Machines: Learning with Many Relevant Features

by Thorsten Joachims, 1998
"... This paper explores the use of Support Vector Machines (SVMs) for learning text classifiers from examples. It analyzes the particular properties of learning with text data and identifies, why SVMs are appropriate for this task. Empirical results support the theoretical findings. SVMs achieve substan ..."
Cited by 2303 (9 self)
substantial improvements over the currently best performing methods and they behave robustly over a variety of different learning tasks. Furthermore, they are fully automatic, eliminating the need for manual parameter tuning.
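
In the spirit of this abstract (high-dimensional bag-of-words features fed to an SVM with no manual parameter tuning), a short scikit-learn pipeline might look like the sketch below; the toy corpus, labels, and default parameters are placeholders rather than the paper's experimental setup.

```python
# Short pipeline in the spirit of the abstract: TF-IDF bag-of-words features
# (high-dimensional, many relevant features) fed to a linear SVM.
# The toy corpus and labels are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

docs = ["wheat prices rose sharply", "the team won the final match",
        "corn and wheat futures fell", "a late goal decided the match"]
labels = ["grain", "sport", "grain", "sport"]

clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(docs, labels)
print(clf.predict(["wheat futures climbed again"]))   # expected: ['grain']
```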