Abstract Archives of the RSNA, 2014
Behzad Farzin MD, Presenter: Nothing to Disclose
Jean-Christophe Gentric, Abstract Co-Author: Nothing to Disclose
Olivier Naggara MD, Abstract Co-Author: Nothing to Disclose
Jean Raymond MD, Abstract Co-Author: Nothing to Disclose
Guidelines to improve the reporting of reliability and agreement studies in health care were proposed by Kottner et al. in 2011 (GRRAS). We hypothesized that the design and reporting of rater agreement studies in the radiology literature were suboptimal. Our purpose was to assess how agreement studies were designed and reported in our field and to identify areas for improvement.
We conducted a retrospective assessment of all articles published in 4 selected journals (Radiology, AJNR, CARJ, Journal de Radiologie; from January 2011 to December 2012). Editorials, commentaries, review articles and references to published studies were excluded. Four trained adjudicators independently evaluated pertinent articles using a 23-item form that included the 15 GRRAS criteria. One senior author reviewed all report forms.
Of 2229 source articles, 951 titles were identified, and after manual exclusion, 280 articles (12.6% of total) were found that reported agreement or reliability studies. The mean number of subjects per study was 81 ± 99. Justification for the sample size was found in 9 studies (3.2%). The number of raters was ≤ 2 in 226 studies (80.7%) and there was no intra-observer study in 212 articles (75.7%). Observers of various levels of expertise were involved in 35 studies (12.5%). Confidence intervals were provided in 98 (35.0%) and an interpretation of estimates in 147 studies (52.5%). The agreement component of the study was not mentioned in the discussion of 168 articles (60%). An adequate report, defined as at least: i) 1 intra-observer assessment, ii) 3 raters for inter-observer studies, iii) independent assessment, iv) ≥ 20 patients, and v) mention of agreement in the discussion section, was present in 4 studies (1.4%). Radiology articles dedicated to agreement were few in number (20 or 0.9%).
In spite of their importance, agreement studies are few in number, incompletely reported, and commonly offer only a cursory assessment of reliability. There are many potential research opportunities for studies of this type, which should be promoted at all levels.
Demonstration of robust intra- and inter-observer agreement or reliability in well-designed studies is essential before diagnostic technologies are disseminated or diagnostic criteria enter widespread use, in order to prevent improper clinical decisions.
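Not part of the original abstract: as an illustration of the kind of reporting the GRRAS guidelines encourage (an agreement estimate accompanied by a confidence interval and enough raters/cases to support it), below is a minimal Python sketch that computes Cohen's kappa for two raters with a percentile bootstrap 95% confidence interval. The rater data are simulated and hypothetical; the abstract does not specify any particular statistic or software.

    # Illustrative sketch (not from the abstract): Cohen's kappa for two raters
    # with a percentile bootstrap 95% CI, i.e., an agreement estimate reported
    # together with its uncertainty, as GRRAS recommends.
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(0)

    # Hypothetical binary ratings from two observers on 50 cases
    # (~80% raw agreement by construction).
    rater_a = rng.integers(0, 2, size=50)
    rater_b = np.where(rng.random(50) < 0.8, rater_a, 1 - rater_a)

    kappa = cohen_kappa_score(rater_a, rater_b)

    # Percentile bootstrap over cases to obtain a 95% CI for kappa.
    n = len(rater_a)
    boot = []
    for _ in range(2000):
        idx = rng.integers(0, n, size=n)
        boot.append(cohen_kappa_score(rater_a[idx], rater_b[idx]))
    ci_low, ci_high = np.percentile(boot, [2.5, 97.5])

    print(f"kappa = {kappa:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")

Reporting the interval alongside the point estimate, and interpreting both in the discussion, addresses two of the GRRAS items the study found most often missing.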
Farzin B, Gentric J, Naggara O, Raymond J. Agreement Studies in Radiology Research. Radiological Society of North America 2014 Scientific Assembly and Annual Meeting, Chicago, IL.
http://archive.rsna.org/2014/14005889.html