Tuesday, October 22, 2019

AgreeStat 360: a cloud app for analyzing the extent of agreement among raters

AgreeStat 360 is a cloud-based app for analyzing the extent of agreement among raters. You may upload your rating data to agreestat360.com as a text file in CSV format or as an Excel worksheet, and AgreeStat 360 will process it instantaneously. You can compute a variety of Chance-corrected Agreement Coefficients (CAC) on nominal ratings, as well as various Intraclass Correlation Coefficients (ICC) on quantitative ratings. The CAC coefficients include the percent agreement, Cohen's kappa, Gwet's AC1 and AC2, Krippendorff's alpha, the Brennan-Prediger coefficient, Conger's kappa, and Fleiss' kappa. The ICC coefficients cover most ANOVA models found in the inter-rater reliability literature.
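To give a concrete sense of what a chance-corrected agreement coefficient measures, here is a minimal sketch in plain Python of two of the coefficients listed above, the percent agreement and Cohen's kappa, for the simple case of two raters assigning nominal categories to the same subjects. The function names and the example ratings are illustrative only; they are not taken from AgreeStat 360, which handles many more designs (multiple raters, missing ratings, weighted agreement) than this toy code does.

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Observed proportion of subjects on which the two raters agree."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_e is the
    chance-agreement probability computed from each rater's
    marginal category frequencies."""
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    categories = set(r1) | set(r2)
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings of 8 subjects by 2 raters
rater1 = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes"]
rater2 = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes"]

print(round(percent_agreement(rater1, rater2), 3))  # 0.75
print(round(cohens_kappa(rater1, rater2), 3))       # 0.467
```

The gap between the two numbers illustrates why chance correction matters: the raters agree on 75% of subjects, but once the agreement expected by chance alone is subtracted, kappa is considerably lower.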
The only tool you will ever need to use AgreeStat 360 is a web browser on a PC or tablet. You may even run sophisticated analyses from your cell phone's browser, although some of the forms may not display well on a small screen.
You will need to register in order to receive a password by email, which lets you log in to a trial version and test all the features of AgreeStat 360. Test it now.

1 comment:

  1. Can this App be used to calculate Cohen's kappa and Gwet's AC1 for data sets where not all raters classify each subject (i.e. no two raters classified the same set of subjects)?