Thank you very much for your comment regarding the use of Cohen's kappa statistics for assessing agreement. We agree with you that, for ordinally scaled variables, a weighted Cohen's kappa with linear or quadratic weights can be used. However, the choice of weights is arbitrary. Alternatively, an intraclass correlation coefficient could also be used. There are also other measures of agreement, all of which have their advantages and disadvantages (1,2).
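To make the point about the weights concrete: for k ordered categories, the linear scheme penalizes a disagreement between categories i and j by |i − j|/(k − 1), while the quadratic scheme uses ((i − j)/(k − 1))², so the same ratings generally yield different kappa values under the two schemes. The following Python sketch (illustrative only, not part of the published letter; the ratings are hypothetical, and scikit-learn and pingouin are assumed to be installed) computes both weighted kappas and an intraclass correlation coefficient on the same data:

# Illustrative sketch: linear- vs quadratic-weighted Cohen's kappa and an
# ICC on the same hypothetical ordinal ratings from two raters.
import pandas as pd
from sklearn.metrics import cohen_kappa_score
from pingouin import intraclass_corr

# Hypothetical scores on a 4-point ordinal scale.
rater1 = [0, 1, 1, 2, 3, 2, 1, 0, 3, 2]
rater2 = [0, 1, 2, 2, 3, 1, 1, 0, 2, 2]

# Weighted Cohen's kappa: the value depends on the chosen weights.
print("linear:   ", cohen_kappa_score(rater1, rater2, weights="linear"))
print("quadratic:", cohen_kappa_score(rater1, rater2, weights="quadratic"))

# Intraclass correlation coefficient as an alternative agreement measure
# (long-format table: one row per subject-rater pair).
df = pd.DataFrame({
    "subject": list(range(10)) * 2,
    "rater": ["R1"] * 10 + ["R2"] * 10,
    "score": rater1 + rater2,
})
icc = intraclass_corr(data=df, targets="subject", raters="rater", ratings="score")
print(icc[["Type", "ICC"]])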
References
- Handbook of Inter-Rater Reliability: The Definitive Guide to Measuring the Extent of Agreement Among Raters. Advanced Analytics, LLC; 2014.
- A comparison of reliability coefficients for ordinal rating scales. J Classif. 2021;38:519-543. https://doi.org/10.1007/s00357-021-09386-5
- The value of C-arm computed tomography in addition to conventional digital subtraction angiography in the diagnostic work-up of patients with suspected chronic thromboembolic pulmonary hypertension: an update of 300 patients. Acad Radiol. 2022;29:S1-S10.
- Conditional inequalities between Cohen's kappa and weighted kappas. Stat Methodol. 2013;10:14-22. https://doi.org/10.1016/j.stamet.2012.05.004