Volume 11 (2020), Issue 1

Survey of agreement between raters for nominal data using Krippendorff's alpha

Bizhan Shabankhani, Jamshid Yazdani Charati, Keihan Shabankhani, Saeid Kaviani Cherati
Abstract

Background: Most of the indices used to measure inter-rater reliability (IRR) have limitations concerning the number of raters, the number of categories, the type of variable (nominal, ordinal, interval, ratio), and missing data. Krippendorff's alpha coefficient is the only IRR index that, despite all of these limitations, calculates the agreement among raters with appropriate confidence.

Materials and Methods: The study used articles indexed in Google Scholar, Medline, Scopus, Springer, and ScienceDirect, published in English since 2000, together with data collected from design projects for sound and vibration control systems in industry carried out at the Sari Faculty of Public Health.

Results: In this paper, we introduce the method of calculating the coefficient for binary and nominal data with two or more raters, in both the presence and the absence of missing data.

Conclusion: Most of the coefficients used to measure agreement between raters do not provide satisfactory reliability when such limitations are present. Krippendorff's alpha can replace other statistics as an efficient measure of the extent of agreement between evaluators. It should be noted that the calculation of this index is more complex than that of other indices, but it provides higher reliability.
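To illustrate the kind of calculation the paper surveys, the following is a minimal sketch of Krippendorff's alpha for nominal data, handling any number of raters and missing ratings via the standard coincidence-matrix formulation (alpha = 1 - D_o/D_e). The function name and the data layout (one list of ratings per rater, with None marking a missing rating) are our own choices, not taken from the article.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(data):
    """Krippendorff's alpha for nominal data.

    data: list of per-rater rating lists of equal length;
          None marks a missing rating.
    """
    units = list(zip(*data))  # one tuple of ratings per unit
    o = Counter()             # coincidence matrix o[(c, k)]
    for unit in units:
        vals = [v for v in unit if v is not None]
        m = len(vals)
        if m < 2:
            continue  # units rated by fewer than 2 raters contribute no pairs
        for c, k in permutations(vals, 2):
            o[(c, k)] += 1.0 / (m - 1)
    # marginals n_c and total n of the coincidence matrix
    n_c = Counter()
    for (c, _k), w in o.items():
        n_c[c] += w
    n = sum(n_c.values())
    # observed and expected disagreement (nominal metric: 1 iff c != k)
    D_o = sum(w for (c, k), w in o.items() if c != k)
    D_e = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n - 1)
    if D_e == 0:
        return 1.0  # only one category observed; no disagreement is possible
    return 1.0 - D_o / D_e
```

For example, two raters who agree on three of four units (rater 1: a, a, b, b; rater 2: a, b, b, b) yield alpha = 8/15, and the per-unit weighting of pairs is what lets units with missing ratings simply contribute fewer coincidences.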



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.