Training improves agreement among doctors using the Neer system for proximal humeral fractures in a systematic review

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

Training improves agreement among doctors using the Neer system for proximal humeral fractures in a systematic review. / Brorson, Stig; Hróbjartsson, Asbjørn.

In: Journal of Clinical Epidemiology, Vol. 61, No. 1, 2008, pp. 7-16.


Harvard

Brorson, S & Hróbjartsson, A 2008, 'Training improves agreement among doctors using the Neer system for proximal humeral fractures in a systematic review', Journal of Clinical Epidemiology, vol. 61, no. 1, pp. 7-16. https://doi.org/10.1016/j.jclinepi.2007.04.014

APA

Brorson, S., & Hróbjartsson, A. (2008). Training improves agreement among doctors using the Neer system for proximal humeral fractures in a systematic review. Journal of Clinical Epidemiology, 61(1), 7-16. https://doi.org/10.1016/j.jclinepi.2007.04.014

Vancouver

Brorson S, Hróbjartsson A. Training improves agreement among doctors using the Neer system for proximal humeral fractures in a systematic review. Journal of Clinical Epidemiology. 2008;61(1):7-16. https://doi.org/10.1016/j.jclinepi.2007.04.014

Author

Brorson, Stig ; Hróbjartsson, Asbjørn. / Training improves agreement among doctors using the Neer system for proximal humeral fractures in a systematic review. In: Journal of Clinical Epidemiology. 2008 ; Vol. 61, No. 1. pp. 7-16.

Bibtex

@article{c268b0a0f68f11ddbf70000ea68e967b,
title = "Training improves agreement among doctors using the Neer system for proximal humeral fractures in a systematic review",
abstract = "Objective: To systematically review studies of observer agreement among doctors classifying proximal humeral fractures according to the Neer system. Study Design and Setting: A systematic review. We searched for observational studies in which doctors classified proximal humeral fractures according to the Neer system, and randomized trials of any intervention aimed at improving agreement. We analyzed potentially eligible studies independently, and data were extracted using pretested forms. Authors were contacted for missing information. Summary statistics for observer agreement were noted, and the methodological quality was assessed. Results: We included 11 observational studies (88 observers and 468 cases). Mean κ-values for interobserver agreement ranged from 0.17 to 0.52. Agreement did not improve through selection of experienced observers, advanced imaging modalities, or simplification of the classification system. Intra-observer agreement was moderately higher than interobserver agreement. One randomized trial (14 observers and 42 cases) reported a clear effect of training (mean κ-value 0.62 after training compared with 0.34 without training). Conclusion: We found a consistently low level of observer agreement. The widely held belief that experts disagree less than nonexperts could not be supported. One randomized trial indicated that training improves agreement among both experts and nonexperts.",
keywords = "Clinical Competence, Education, Medical, Continuing, Humans, Information Storage and Retrieval, Observer Variation, Radiology, Randomized Controlled Trials as Topic, Shoulder Fractures",
author = "Stig Brorson and Asbj{\o}rn Hr{\'o}bjartsson",
year = "2008",
doi = "10.1016/j.jclinepi.2007.04.014",
language = "English",
volume = "61",
pages = "7--16",
journal = "Journal of Clinical Epidemiology",
issn = "0895-4356",
publisher = "Elsevier",
number = "1",
}

RIS

TY - JOUR

T1 - Training improves agreement among doctors using the Neer system for proximal humeral fractures in a systematic review

AU - Brorson, Stig

AU - Hróbjartsson, Asbjørn

PY - 2008

Y1 - 2008

N2 - Objective: To systematically review studies of observer agreement among doctors classifying proximal humeral fractures according to the Neer system. Study Design and Setting: A systematic review. We searched for observational studies in which doctors classified proximal humeral fractures according to the Neer system, and randomized trials of any intervention aimed at improving agreement. We analyzed potentially eligible studies independently, and data were extracted using pretested forms. Authors were contacted for missing information. Summary statistics for observer agreement were noted, and the methodological quality was assessed. Results: We included 11 observational studies (88 observers and 468 cases). Mean κ-values for interobserver agreement ranged from 0.17 to 0.52. Agreement did not improve through selection of experienced observers, advanced imaging modalities, or simplification of the classification system. Intra-observer agreement was moderately higher than interobserver agreement. One randomized trial (14 observers and 42 cases) reported a clear effect of training (mean κ-value 0.62 after training compared with 0.34 without training). Conclusion: We found a consistently low level of observer agreement. The widely held belief that experts disagree less than nonexperts could not be supported. One randomized trial indicated that training improves agreement among both experts and nonexperts.

AB - Objective: To systematically review studies of observer agreement among doctors classifying proximal humeral fractures according to the Neer system. Study Design and Setting: A systematic review. We searched for observational studies in which doctors classified proximal humeral fractures according to the Neer system, and randomized trials of any intervention aimed at improving agreement. We analyzed potentially eligible studies independently, and data were extracted using pretested forms. Authors were contacted for missing information. Summary statistics for observer agreement were noted, and the methodological quality was assessed. Results: We included 11 observational studies (88 observers and 468 cases). Mean κ-values for interobserver agreement ranged from 0.17 to 0.52. Agreement did not improve through selection of experienced observers, advanced imaging modalities, or simplification of the classification system. Intra-observer agreement was moderately higher than interobserver agreement. One randomized trial (14 observers and 42 cases) reported a clear effect of training (mean κ-value 0.62 after training compared with 0.34 without training). Conclusion: We found a consistently low level of observer agreement. The widely held belief that experts disagree less than nonexperts could not be supported. One randomized trial indicated that training improves agreement among both experts and nonexperts.

KW - Clinical Competence

KW - Education, Medical, Continuing

KW - Humans

KW - Information Storage and Retrieval

KW - Observer Variation

KW - Radiology

KW - Randomized Controlled Trials as Topic

KW - Shoulder Fractures

U2 - 10.1016/j.jclinepi.2007.04.014

DO - 10.1016/j.jclinepi.2007.04.014

M3 - Journal article

C2 - 18083458

VL - 61

SP - 7

EP - 16

JO - Journal of Clinical Epidemiology

JF - Journal of Clinical Epidemiology

SN - 0895-4356

IS - 1

ER -

ID: 10209430