Results

### Covariate Shift Adaptation, Class-Balance Change Adaptation, and Change Detection

Abstract

In standard supervised learning algorithms, training and test data are assumed to follow the same probability distribution. However, because of a sample selection bias or non-stationarity of the environment, this important assumption is often violated in practice, which causes a significant estimation bias. In this article, we review semi-supervised adaptation techniques for coping with such distribution changes. We focus on two scenarios of such distribution change: the covariate shift (input distributions change but the input-output dependency does not change) and the class-balance change in classification (class-prior probabilities change but class-wise input distributions remain unchanged). We also review methods for detecting changes in probability distributions.
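The change-detection scenario mentioned in this abstract can be illustrated with a simple two-sample permutation test on the difference of means. This is only a minimal sketch of the general idea, not one of the methods reviewed in the article; the sample sizes, distributions, and test statistic here are illustrative assumptions.

```python
import numpy as np

# Sketch of change detection between two sample batches: test whether
# the "before" and "after" samples plausibly come from the same
# distribution, using a permutation test on the mean difference.
rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, size=300)   # samples before the suspected change
b = rng.normal(0.8, 1.0, size=300)   # samples after (the mean has shifted)

observed = abs(a.mean() - b.mean())
pooled = np.concatenate([a, b])

# Under the no-change null hypothesis, the batch labels are exchangeable,
# so we shuffle the pooled data and recompute the statistic.
n_perm = 1000
count = 0
for _ in range(n_perm):
    rng.shuffle(pooled)
    diff = abs(pooled[:300].mean() - pooled[300:].mean())
    if diff >= observed:
        count += 1
p_value = count / n_perm
print("p-value:", p_value)  # a small p-value suggests a distribution change
```

A mean-difference statistic only detects shifts in the mean; tests on other statistics (or kernel-based two-sample tests) would be needed to catch more general distribution changes.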

### WIREs Computational Statistics, 2013. Learning under Non-Stationarity: Covariate Shift and Class-Balance Change

Abstract

One of the fundamental assumptions behind many supervised machine learning algorithms is that training and test data follow the same probability distribution. However, this important assumption is often violated in practice, for example, because of an unavoidable sample selection bias or non-stationarity of the environment. When the assumption is violated, standard machine learning methods suffer a significant estimation bias. In this article, we consider two scenarios of such distribution change, the covariate shift (where input distributions differ) and the class-balance change (where class-prior probabilities vary in classification), and review semi-supervised adaptation techniques based on importance weighting.
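The importance-weighting idea under class-balance change can be sketched as follows. This is a minimal illustration with made-up priors, not the article's estimators; in practice the test class priors are unknown and must be estimated, e.g. from unlabeled test data.

```python
import numpy as np

# Class-balance change: the class-conditional input distributions p(x|y)
# stay fixed, only the class priors p(y) move between training and test.
# Each training sample of class y is then weighted by p_test(y) / p_train(y).
p_train = np.array([0.5, 0.5])   # training class priors (illustrative)
p_test  = np.array([0.9, 0.1])   # test class priors (assumed estimated)

y_tr = np.array([0, 0, 1, 1, 1])     # training labels
w = (p_test / p_train)[y_tr]         # per-sample importance weights
print(w)                             # [1.8 1.8 0.2 0.2 0.2]
```

The weights then enter any learner that accepts per-sample weights, e.g. by replacing an average training loss with the weighted average `(w * loss).sum() / w.sum()`, so that training mimics the test-time class balance.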
