In remote sensing, kappa is mainly used for accuracy assessment. For example, we compute the standard kappa value to better check how correct a classification result is.
The Kappa Index of Agreement (K): this is an important index output by the cross-classification. It measures the association between the two input images and helps to evaluate the output image. After adjustment for chance agreement, its values range from -1 to +1. If the two input images are in perfect agreement (no change has occurred), K equals 1. If the two images are completely different, K takes a value of -1. If the change between the two dates occurred by chance, then Kappa equals 0. Kappa is an index of agreement between the two input images as a whole. However, it also evaluates per-category agreement by indicating the degree to which a particular category agrees between the two dates. The per-category K can be calculated using the following formula (Rosenfield and Fitzpatrick-Lins, 1986):
K = (Pii - Pi. * P.i) / (Pi. - Pi. * P.i)
where:
Pii = Proportion of entire image in which category i agrees for both dates
Pi. = Proportion of the entire image in class i in the reference image
P.i = Proportion of the entire image in class i in the non-reference image
As a per-category agreement index, it indicates how much a category has changed between the two dates. In the evaluation, either of the two images can be used as the reference and the other as the non-reference.
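The formula above can be sketched in code. The snippet below is a minimal NumPy example (the function name `kappa_indices` and the row/column conventions are my own assumptions, not from any particular software package): it builds a cross-classification matrix from two classified images, then computes the overall Kappa and the per-category Kappa as defined above.

```python
import numpy as np

def kappa_indices(img1, img2, categories):
    """Sketch: overall and per-category Kappa between two classified images.

    img1 is treated as the reference image (rows of the cross-classification
    matrix), img2 as the non-reference image (columns). Assumes every class
    in `categories` appears in the reference image, so Pi. > 0.
    """
    a = np.asarray(img1).ravel()
    b = np.asarray(img2).ravel()
    n = a.size
    idx = {c: i for i, c in enumerate(categories)}
    k = len(categories)

    # Cross-classification (confusion) matrix of pixel counts.
    cm = np.zeros((k, k))
    for x, y in zip(a, b):
        cm[idx[x], idx[y]] += 1

    p = cm / n                       # proportions of the entire image
    pii = np.diag(p)                 # Pii: per-category agreement
    pi_dot = p.sum(axis=1)           # Pi.: class proportions, reference
    p_dot_i = p.sum(axis=0)          # P.i: class proportions, non-reference

    po = pii.sum()                   # observed agreement
    pe = (pi_dot * p_dot_i).sum()    # chance agreement
    overall = (po - pe) / (1 - pe)

    # Per-category K (Rosenfield and Fitzpatrick-Lins, 1986):
    # K_i = (Pii - Pi.*P.i) / (Pi. - Pi.*P.i)
    per_cat = (pii - pi_dot * p_dot_i) / (pi_dot - pi_dot * p_dot_i)
    return overall, per_cat
```

For two identical images this returns 1 overall and 1 for every category; for two binary images that disagree everywhere with balanced classes, it returns -1, matching the limiting cases described above.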
That is all I know about the rest.