Results 11 – 20 of 727
Calculation of the Kappa Statistic for Interrater Reliability: The Case Where Raters Can Select Multiple Responses from a Large Number of Categories
"... A common measure of rater agreement where outcomes are nominal is the kappa statistic (a chancecorrected measure of agreement). You can use PROC FREQ to calculate the kappa statistic, but only if the given frequency table is square (that is, raters used the same categories). In most rater analyses ..."
Abstract
 Add to MetaCart
A common measure of rater agreement where outcomes are nominal is the kappa statistic (a chancecorrected measure of agreement). You can use PROC FREQ to calculate the kappa statistic, but only if the given frequency table is square (that is, raters used the same categories). In most rater analyses
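The chance-corrected agreement the snippet describes can be computed directly from a square rater-by-rater frequency table. A minimal Python sketch, not from the paper; the function name and the example table are illustrative:

```python
def cohens_kappa(table):
    """Chance-corrected agreement (Cohen's kappa) for a square
    frequency table: rows = rater 1 categories, cols = rater 2."""
    g = len(table)
    n = sum(sum(row) for row in table)
    p_o = sum(table[i][i] for i in range(g)) / n             # observed agreement
    row = [sum(table[i]) / n for i in range(g)]              # rater 1 marginals
    col = [sum(table[i][j] for i in range(g)) / n for j in range(g)]
    p_e = sum(row[i] * col[i] for i in range(g))             # agreement expected by chance
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 3-category table for two raters
t = [[20, 5, 0],
     [3, 15, 2],
     [0, 4, 11]]
print(round(cohens_kappa(t), 3))  # 0.643
```

When the two raters use different category sets (the non-square case the paper addresses), the table must first be padded to a common set of categories before this formula applies.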
Research Article Integrative Decomposition Procedure and Kappa Statistics for the Distinguished Single Molecular Network Construction and Analysis
"... Our method concentrates on and constructs the distinguished single gene network. An integrated method was proposed based on linear programming and a decomposition procedure with integrated analysis of the significant function cluster using Kappa statistics and fuzzy heuristic clustering. We tested t ..."
Abstract
 Add to MetaCart
Our method concentrates on and constructs the distinguished single gene network. An integrated method was proposed based on linear programming and a decomposition procedure with integrated analysis of the significant function cluster using Kappa statistics and fuzzy heuristic clustering. We tested
SAS Global Forum 2012 Posters Paper 2012012 Calculating Multi-Rater Observation Agreement in Health Care Research Using the SAS® Kappa Statistic
"... This paper describes procedures for calculating multirater observation agreement using the SAS ® Kappa statistic in a health care research study of the medication administration process (MAP). In the study, Registered Nurses (RNs) modeled the oral MAP 27 times in a simulated laboratory environment. ..."
Abstract
 Add to MetaCart
This paper describes procedures for calculating multirater observation agreement using the SAS ® Kappa statistic in a health care research study of the medication administration process (MAP). In the study, Registered Nurses (RNs) modeled the oral MAP 27 times in a simulated laboratory environment
Online Supplement for: DIAGNOSIS OF SLEEP APNEA BY AUTOMATIC ANALYSIS OF NASAL PRESSURE AND FORCED OSCILLATION IMPEDANCE, Appendix A: Weighted Kappa Statistic
"... this set of weights specifically disfavors episodes classified as central apneas by one scorer and as obstructive apneas by the other (weight 0.25). The weighted observed proportional agreement between the two scorers is obtained as i g j ij ij w o n w N ) ( . 1 Abbreviating the row and column t ..."
Abstract
 Add to MetaCart
totals of the table of frequencies for the ith category by j ij i n r and i ij j n c , the weighted proportional agreement expected just by chance is estimated by i g j j i ij w e c r w N 2 ) ( . Then, weighted kappa, which may be interpreted as the chancecorrected weighted proportional
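The weighted observed and chance-expected agreement described in this snippet translate directly into code. A minimal sketch, assuming a g × g frequency table and a symmetric weight matrix with 1 on the diagonal; the example values are illustrative, not the paper's data:

```python
def weighted_kappa(n, w):
    """Weighted kappa for a g x g frequency table n with agreement
    weights w (w[i][j] = 1 on the diagonal, <= 1 off it)."""
    g = len(n)
    N = sum(sum(row) for row in n)
    r = [sum(n[i]) for i in range(g)]                        # row totals
    c = [sum(n[i][j] for i in range(g)) for j in range(g)]   # column totals
    p_o = sum(w[i][j] * n[i][j] for i in range(g) for j in range(g)) / N
    p_e = sum(w[i][j] * r[i] * c[j] for i in range(g) for j in range(g)) / N ** 2
    return (p_o - p_e) / (1 - p_e)

# With identity weights this reduces to unweighted Cohen's kappa.
n = [[20, 5, 0],
     [3, 15, 2],
     [0, 4, 11]]
w = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]
print(round(weighted_kappa(n, w), 3))  # 0.643
```

Partial-credit weights such as the 0.25 the supplement mentions for central-vs-obstructive disagreements would simply replace the off-diagonal zeros.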
Abstract
... Concordance is a term used to mean agreement in classification or measurement. The kappa statistic thus ranges between $-P_e/(1 - P_e)$ and 1 ...
DOI: 10.1007/s11336-010-9155-7 A KRAEMER-TYPE RESCALING THAT TRANSFORMS THE ODDS RATIO INTO THE WEIGHTED KAPPA COEFFICIENT
, 2010
"... This paper presents a simple rescaling of the odds ratio that transforms the association measure into the weighted kappa statistic for a 2 × 2 table. Key words: Cohen’s kappa, 2 × 2 association measure. ..."
Abstract
 Add to MetaCart
This paper presents a simple rescaling of the odds ratio that transforms the association measure into the weighted kappa statistic for a 2 × 2 table. Key words: Cohen’s kappa, 2 × 2 association measure.
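This entry relates two 2 × 2 association measures. Without reproducing the paper's rescaling formula, a short sketch computing the two quantities it connects, kappa and the odds ratio, on the same table; the cell counts are illustrative:

```python
def kappa_2x2(a, b, c, d):
    """Unweighted Cohen's kappa for the 2 x 2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    p_o = (a + d) / n
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (p_o - p_e) / (1 - p_e)

def odds_ratio(a, b, c, d):
    """Cross-product (odds) ratio of the same 2 x 2 table."""
    return (a * d) / (b * c)

a, b, c, d = 40, 10, 5, 45          # illustrative cell counts
print(kappa_2x2(a, b, c, d))        # 0.7
print(odds_ratio(a, b, c, d))       # 36.0
```

Both statistics measure association in a 2 × 2 table, but the odds ratio is unbounded above while kappa lies in (−1, 1], which is why a rescaling between them is non-trivial.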
Statistical Methodology Equivalences of weighted kappas for multiple raters
, 2012
"... a b s t r a c t Cohen's unweighted kappa and weighted kappa are popular descriptive statistics for measuring agreement between two raters on a categorical scale. With m ≥ 3 raters, there are several views in the literature on how to define agreement. We consider a family of weighted kappas for ..."
Abstract
 Add to MetaCart
a b s t r a c t Cohen's unweighted kappa and weighted kappa are popular descriptive statistics for measuring agreement between two raters on a categorical scale. With m ≥ 3 raters, there are several views in the literature on how to define agreement. We consider a family of weighted kappas
Fuzzy kappa for the agreement measure of fuzzy classifications
"... In this paper, we propose an assessment method of agreement between fuzzy sets, called fuzzy Kappa which is deduced from the concept of Cohen’s Kappa statistic. In fuzzy case, the Cohen’s Kappa coefficient can be calculated generally by transforming the fuzzy sets into some crisp acut subsets. Whil ..."
Abstract
 Add to MetaCart
In this paper, we propose an assessment method of agreement between fuzzy sets, called fuzzy Kappa which is deduced from the concept of Cohen’s Kappa statistic. In fuzzy case, the Cohen’s Kappa coefficient can be calculated generally by transforming the fuzzy sets into some crisp acut subsets
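The α-cut route this abstract mentions can be sketched as: threshold each fuzzy membership at α to obtain crisp labels, then apply the ordinary kappa formula to the crisp classifications. A minimal sketch, not the paper's fuzzy-kappa construction itself; the membership values and the threshold 0.5 are illustrative:

```python
def alpha_cut(memberships, alpha):
    """Crisp 0/1 labels from fuzzy memberships at threshold alpha."""
    return [1 if m >= alpha else 0 for m in memberships]

def kappa_binary(x, y):
    """Cohen's kappa for two crisp binary label sequences."""
    n = len(x)
    a = sum(1 for u, v in zip(x, y) if u == 1 and v == 1)
    b = sum(1 for u, v in zip(x, y) if u == 1 and v == 0)
    c = sum(1 for u, v in zip(x, y) if u == 0 and v == 1)
    d = n - a - b - c
    p_o = (a + d) / n
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical memberships of six items in one category, per classifier
f1 = [0.9, 0.7, 0.4, 0.2, 0.8, 0.1]
f2 = [0.8, 0.6, 0.5, 0.3, 0.2, 0.1]
x, y = alpha_cut(f1, 0.5), alpha_cut(f2, 0.5)
print(round(kappa_binary(x, y), 3))  # 0.333
```

The crisp result depends on the chosen α, which is the limitation the paper's fuzzy kappa is designed to avoid.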
Magnetic resonance classification of lumbar intervertebral disc degeneration
 Spine
, 2001
"... Study Design. A reliability study was conducted. Objectives. To develop a classification system for lumbar disc degeneration based on routine magnetic resonance imaging, to investigate the applicability of a simple algorithm, and to assess the reliability of this classification system. Summary of ..."
Abstract

Cited by 95 (0 self)
 Add to MetaCart
pendently by three observers. Intra and interobserver reliabilities were assessed by calculating kappa statistics.