Exact (1)
Inter-observer agreement was assessed using the kappa value (κ).
Similar (59)
In order to account for chance agreement, we used the kappa value [22].
The degree of agreement between observers was assessed using the kappa statistic; a value of >0.61 was taken to indicate satisfactory agreement (Altman, 1991).
Inter-observer agreement was quantified using the kappa statistic, wherein a kappa value of 0.61 to 1.0 indicates substantial agreement.
This procedure made it possible to calculate the rate of agreement between them using the Kappa index, yielding a value of 0.87, which is considered acceptable and indicative of an optimal classification (Seigel et al. 1992).
Repeatability of the scoring was tested using the kappa coefficient, which yielded a value of κ = 0.79, indicating substantial agreement.
The following interpretation of agreement was used for the kappa values: low (0-0.5), moderate (0.51-0.75) and excellent (0.76-1.0), following established guidelines [23].
When Ultra Fine and PolSAR data are used in combination, the kappa value for detection of well pads and processing facilities reached 0.87.
The plotted lines are labeled according to the kappa value used to generate them.
Values were expressed as means ± s.d.; for the quantification of apoptotic rate, the kappa value was used as the interobserver variability value.
The kappa value was used to measure agreement between the prevalence of the 17 CCI comorbidities obtained from medical records and that obtained from claims data.
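
The examples above all use Cohen's kappa to quantify inter-observer agreement. As a minimal sketch, not taken from any of the cited studies (the rating data are invented and the use of scikit-learn's cohen_kappa_score is an assumption for illustration), the statistic can be computed in Python and read against the low/moderate/excellent scale quoted above:

    # Minimal sketch: Cohen's kappa for two raters, interpreted against the
    # low (0-0.5) / moderate (0.51-0.75) / excellent (0.76-1.0) scale quoted above.
    # The rating data below are invented purely for illustration.
    from sklearn.metrics import cohen_kappa_score

    rater_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]  # hypothetical scores from rater A
    rater_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]  # hypothetical scores from rater B

    kappa = cohen_kappa_score(rater_a, rater_b)

    if kappa <= 0.50:
        label = "low"
    elif kappa <= 0.75:
        label = "moderate"
    else:
        label = "excellent"

    print(f"kappa = {kappa:.2f} ({label} agreement)")
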