Results 1–10 of 3,579,345
The Nature of Statistical Learning Theory, 1999
"... Statistical learning theory was introduced in the late 1960’s. Until the 1990’s it was a purely theoretical analysis of the problem of function estimation from a given collection of data. In the middle of the 1990’s new types of learning algorithms (called support vector machines) based on the deve ..."
Cited by 12976 (32 self)
Prediction of genotoxicity of chemical compounds by statistical learning methods, Chem. Res. Toxicol., 2005
"... Various toxicological profiles, such as genotoxic potential, need to be studied in drug discovery processes and submitted to the drug regulatory authorities for drug safety evaluation. As part of the effort for developing low-cost and efficient adverse drug reaction testing tools, several statistical learning methods have been used for developing genotoxicity prediction systems with an accuracy of up to 73.8% for genotoxic (GT+) and 92.8% for nongenotoxic (GT-) agents. These systems have been developed and tested by using fewer than 400 known GT+ and GT- agents, which is significantly less ..."
Cited by 6 (0 self)
Learning probabilistic relational models, in IJCAI, 1999
"... A large portion of real-world data is stored in commercial relational database systems. In contrast, most statistical learning methods work only with "flat" data representations. Thus, to apply these methods, we are forced to convert our data into a flat form, thereby losing much ..."
Cited by 619 (31 self)
Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, 2010
Ensemble Methods in Machine Learning, Multiple Classifier Systems, LNCS 1857, 2000
"... Ensemble methods are learning algorithms that construct a set of classifiers and then classify new data points by taking a (weighted) vote of their predictions. The original ensemble method is Bayesian averaging, but more recent algorithms include error-correcting output coding, Bagging, and boosting ..."
Cited by 607 (3 self)
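The snippet above describes the core ensemble mechanism: combine base classifiers through a (weighted) vote. A minimal sketch of that idea follows; it is not code from the cited paper, and the toy decision stumps and weights are invented for illustration.

```python
# Weighted-vote ensemble sketch. The base classifiers and their weights
# are made-up examples, not from the cited paper.
from collections import defaultdict

def weighted_vote(classifiers, weights, x):
    """Return the label with the largest total weight among the
    predictions of the base classifiers."""
    scores = defaultdict(float)
    for clf, w in zip(classifiers, weights):
        scores[clf(x)] += w
    return max(scores, key=scores.get)

# Three toy decision stumps over a single numeric feature.
stump_a = lambda x: "pos" if x > 0.5 else "neg"
stump_b = lambda x: "pos" if x > 0.2 else "neg"
stump_c = lambda x: "pos" if x > 0.8 else "neg"

ensemble = [stump_a, stump_b, stump_c]
weights = [0.5, 0.3, 0.2]  # e.g. derived from validation accuracy

print(weighted_vote(ensemble, weights, 0.6))  # "pos": stumps a and b outvote c
```

Methods such as Bagging and boosting mentioned in the abstract differ mainly in how the base classifiers and weights are produced; the combination step is this vote.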
A Statistical Learning Method for Logic Programs with Distribution Semantics, in Proceedings of the 12th International Conference on Logic Programming (ICLP’95), 1995
"... When a joint distribution P_F is given to a set F of facts in a logic program DB = F ∪ R, where R is a set of rules, we can further extend it to a joint distribution P_DB over the set of possible least models of DB. We then define the semantics of DB with the associated distribution P_F as P_DB, and call it distribution semantics. While the ..."
Cited by 127 (24 self)
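The abstract above describes extending a distribution P_F over facts to a distribution P_DB over least models. One operational reading is Monte Carlo: sample a truth assignment to the facts from P_F, take the least model of the sampled facts plus the rules, and estimate the probability that an atom holds. The sketch below illustrates this under assumed independent fact probabilities; the example program (burglary/storm/alarm) is invented, not from the paper.

```python
# Monte Carlo reading of the distribution semantics (illustrative only).
# Facts are independent with the probabilities in fact_probs (P_F);
# rules are definite clauses head :- body.
import random

random.seed(0)

fact_probs = {"burglary": 0.1, "storm": 0.2}             # P_F over facts
rules = [("alarm", ["burglary"]), ("alarm", ["storm"])]  # (head, body)

def least_model(true_facts):
    """Forward chaining: the smallest set of atoms containing the sampled
    facts and closed under the rules."""
    model = set(true_facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            if head not in model and all(b in model for b in body):
                model.add(head)
                changed = True
    return model

def prob(atom, n=10000):
    """Estimate P_DB(atom) by sampling fact sets from P_F."""
    hits = 0
    for _ in range(n):
        facts = {f for f, p in fact_probs.items() if random.random() < p}
        if atom in least_model(facts):
            hits += 1
    return hits / n

print(round(prob("alarm"), 2))  # close to 1 - 0.9 * 0.8 = 0.28
```

Exact inference methods exist for such programs; sampling is used here only because it makes the semantics, a distribution over least models, directly visible.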
A comparison of statistical learning methods on the GUSTO database, Statist. Med., 1998
"... We apply a battery of modern, adaptive nonlinear learning methods to a large real database of cardiac patient data. We use each method to predict 30 day mortality from a large number of potential risk factors, and we compare their performances. We find that none of the methods could outperform a re ..."
Cited by 7 (0 self)
Computer Prediction of Cardiovascular and Hematological Agents by Statistical Learning Methods
"... Computational methods have been explored for predicting agents that produce therapeutic or adverse effects in cardiovascular and hematological systems. The quantitative structure-activity relationship (QSAR) method is the first statistical learning method successfully used for predicting ..."
Active Learning with Statistical Models, 1995
"... For many types of learners one can compute the statistically "optimal" way to select data. We review how these techniques have been used with feedforward neural networks [MacKay, 1992; Cohn, 1994]. We then show how the same principles may be used to select data for two alternative, statistically-based learning architectures: mixtures of Gaussians and locally weighted regression. While the techniques for neural networks are expensive and approximate, the techniques for mixtures of Gaussians and locally weighted regression are both efficient and accurate. ..."
Cited by 677 (12 self)
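The abstract above concerns choosing which data point to label next. The paper derives closed-form predictive variances for mixtures of Gaussians and locally weighted regression; the sketch below is not that derivation but a cruder stand-in for the same idea: estimate predictive variance with a small bootstrap committee of locally weighted (nearest-neighbour) regressors and query the pool point where the committee disagrees most. All data, function names, and parameters here are invented for illustration.

```python
# Pool-based active learning sketch: query the point of highest estimated
# predictive variance. Bootstrap committee stands in for the paper's
# closed-form variance (an assumption of this sketch, not the paper's method).
import random
from statistics import mean, pvariance

random.seed(1)

def lwr_predict(train, x, k=3):
    """Locally weighted prediction: average y of the k nearest labelled points."""
    nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    return mean(y for _, y in nearest)

def query(train, pool, committees=20):
    """Pick the pool point with the highest bootstrap predictive variance."""
    def variance_at(x):
        preds = []
        for _ in range(committees):
            boot = [random.choice(train) for _ in train]  # bootstrap resample
            preds.append(lwr_predict(boot, x))
        return pvariance(preds)
    return max(pool, key=variance_at)

# Labelled data are dense near 0 and sparse near 10, so the learner
# should ask about the poorly covered region.
train = [(x / 10, (x / 10) ** 2) for x in range(10)] + [(10.0, 100.0)]
pool = [0.5, 5.0, 9.5]
print(query(train, pool))  # 9.5: the sparse region has the most disagreement
```

The trade-off the abstract names shows up even here: committee-based variance estimates are generic but expensive, whereas the paper's point is that for mixtures of Gaussians and locally weighted regression the variance can be computed efficiently and exactly.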