2.3.4. Kappa Coefficient

The Kappa coefficient is an index for consistency testing proposed by Cohen in 1960 [21]. For image classification, consistency means the degree to which the predicted results of the model agree with the actual classification results.

$$Kappa = \frac{P_o - P_c}{1 - P_c}\,\text{,}\tag{6}$$

In the formula, $P_o$ is the overall accuracy of the classification, which represents the probability that the classification result is consistent with the actual feature type for each random sample, and $P_c$ represents the probability that the classification result agrees with the actual feature type purely by chance. According to the evaluation criteria proposed by Cohen, a Kappa value higher than 0.8 can be regarded as indicating the best grade of consistency. In this study, the Kappa coefficient was calculated from the confusion matrix report produced by ENVI 5.3.
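As a minimal sketch of Equation (6), the following Python function computes the Kappa coefficient from a confusion matrix, where $P_o$ is the trace divided by the total count and $P_c$ is obtained from the row and column marginals. The 3-class confusion matrix shown is a hypothetical example, not data from this study.

```python
import numpy as np

def kappa_coefficient(cm):
    """Cohen's Kappa from a confusion matrix (rows: predicted, cols: actual)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    # P_o: overall accuracy (diagonal agreement)
    p_o = np.trace(cm) / n
    # P_c: chance agreement from the product of row and column marginals
    p_c = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
    return (p_o - p_c) / (1 - p_c)

# Hypothetical 3-class confusion matrix for illustration
cm = [[50,  2,  3],
      [ 4, 40,  1],
      [ 2,  3, 45]]
print(round(kappa_coefficient(cm), 4))
```

Here the overall accuracy is 0.9, but the Kappa value is lower because chance agreement is discounted, which is why Kappa is reported alongside overall accuracy.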
