Crowdsourced assessment of surgical skills: A systematic review

Publication: Contribution to journal › Review › Research › peer-reviewed

Standard

Crowdsourced assessment of surgical skills: A systematic review. / Olsen, Rikke G.; Genét, Malthe F.; Konge, Lars; Bjerrum, Flemming.

In: American Journal of Surgery, Vol. 224, No. 5, 2022, p. 1229-1237.


Harvard

Olsen, RG, Genét, MF, Konge, L & Bjerrum, F 2022, 'Crowdsourced assessment of surgical skills: A systematic review', American Journal of Surgery, vol. 224, no. 5, pp. 1229-1237. https://doi.org/10.1016/j.amjsurg.2022.07.008

APA

Olsen, R. G., Genét, M. F., Konge, L., & Bjerrum, F. (2022). Crowdsourced assessment of surgical skills: A systematic review. American Journal of Surgery, 224(5), 1229-1237. https://doi.org/10.1016/j.amjsurg.2022.07.008

Vancouver

Olsen RG, Genét MF, Konge L, Bjerrum F. Crowdsourced assessment of surgical skills: A systematic review. American Journal of Surgery. 2022;224(5):1229-1237. https://doi.org/10.1016/j.amjsurg.2022.07.008

Author

Olsen, Rikke G. ; Genét, Malthe F. ; Konge, Lars ; Bjerrum, Flemming. / Crowdsourced assessment of surgical skills: A systematic review. In: American Journal of Surgery. 2022 ; Vol. 224, No. 5. pp. 1229-1237.

Bibtex

@article{3ee797de046d4919a61428460269c6c6,
title = "Crowdsourced assessment of surgical skills: A systematic review",
abstract = "Introduction: Crowdsourced assessment utilizes a large group of untrained individuals from the general population to solve tasks in the medical field. The aim was to examine the correlation between crowd workers and expert surgeons for the use of crowdsourced assessments of surgical skills. Material and methods: A systematic literature review was performed on April 14th, 2021 from inception to the present. Two reviewers screened all articles with eligibility criteria of inclusion and assessed for quality using The Medical Education Research Study Quality Instrument (MERSQI) and Newcastle-Ottawa Scale-Education (NOS-E) (Holst et al., 2015). General information was extracted for each article. Results: 250 potential studies were identified, and 32 articles were included. There appeared to be a generally moderate to very strong correlation between crowd workers and experts (Cronbach's alpha 0.72–0.95, Pearson's r 0.7–0.95, Spearman Rho 0.7–0.89, linear regression 0.45–0.89). Six studies had either questionable or no significant correlation between crowd workers and experts. Conclusion: Crowdsourced assessment can provide accurate, rapid, cost-effective, and objective feedback across different specialties and types of surgeries in dry lab, simulation, and live surgeries.",
keywords = "Crowdsourced assessment, Skill assessment, Surgical education, Surgical skills",
author = "Olsen, {Rikke G.} and Gen{\'e}t, {Malthe F.} and Lars Konge and Flemming Bjerrum",
note = "Publisher Copyright: {\textcopyright} 2022 Elsevier Inc.",
year = "2022",
doi = "10.1016/j.amjsurg.2022.07.008",
language = "English",
volume = "224",
pages = "1229--1237",
journal = "American Journal of Surgery",
issn = "0002-9610",
publisher = "Elsevier",
number = "5",
}

RIS

TY - JOUR

T1 - Crowdsourced assessment of surgical skills

T2 - A systematic review

AU - Olsen, Rikke G.

AU - Genét, Malthe F.

AU - Konge, Lars

AU - Bjerrum, Flemming

N1 - Publisher Copyright: © 2022 Elsevier Inc.

PY - 2022

Y1 - 2022

N2 - Introduction: Crowdsourced assessment utilizes a large group of untrained individuals from the general population to solve tasks in the medical field. The aim was to examine the correlation between crowd workers and expert surgeons for the use of crowdsourced assessments of surgical skills. Material and methods: A systematic literature review was performed on April 14th, 2021 from inception to the present. Two reviewers screened all articles with eligibility criteria of inclusion and assessed for quality using The Medical Education Research Study Quality Instrument (MERSQI) and Newcastle-Ottawa Scale-Education (NOS-E) (Holst et al., 2015). General information was extracted for each article. Results: 250 potential studies were identified, and 32 articles were included. There appeared to be a generally moderate to very strong correlation between crowd workers and experts (Cronbach's alpha 0.72–0.95, Pearson's r 0.7–0.95, Spearman Rho 0.7–0.89, linear regression 0.45–0.89). Six studies had either questionable or no significant correlation between crowd workers and experts. Conclusion: Crowdsourced assessment can provide accurate, rapid, cost-effective, and objective feedback across different specialties and types of surgeries in dry lab, simulation, and live surgeries.

AB - Introduction: Crowdsourced assessment utilizes a large group of untrained individuals from the general population to solve tasks in the medical field. The aim was to examine the correlation between crowd workers and expert surgeons for the use of crowdsourced assessments of surgical skills. Material and methods: A systematic literature review was performed on April 14th, 2021 from inception to the present. Two reviewers screened all articles with eligibility criteria of inclusion and assessed for quality using The Medical Education Research Study Quality Instrument (MERSQI) and Newcastle-Ottawa Scale-Education (NOS-E) (Holst et al., 2015). General information was extracted for each article. Results: 250 potential studies were identified, and 32 articles were included. There appeared to be a generally moderate to very strong correlation between crowd workers and experts (Cronbach's alpha 0.72–0.95, Pearson's r 0.7–0.95, Spearman Rho 0.7–0.89, linear regression 0.45–0.89). Six studies had either questionable or no significant correlation between crowd workers and experts. Conclusion: Crowdsourced assessment can provide accurate, rapid, cost-effective, and objective feedback across different specialties and types of surgeries in dry lab, simulation, and live surgeries.

KW - Crowdsourced assessment

KW - Skill assessment

KW - Surgical education

KW - Surgical skills

U2 - 10.1016/j.amjsurg.2022.07.008

DO - 10.1016/j.amjsurg.2022.07.008

M3 - Review

C2 - 35961877

AN - SCOPUS:85141833923

VL - 224

SP - 1229

EP - 1237

JO - American Journal of Surgery

JF - American Journal of Surgery

SN - 0002-9610

IS - 5

ER -

ID: 335097401