Crowdsourced assessment of surgical skills: A systematic review
Publication: Contribution to journal › Review › Research › peer-reviewed
Crowdsourced assessment of surgical skills: A systematic review. / Olsen, Rikke G.; Genét, Malthe F.; Konge, Lars; Bjerrum, Flemming.
In: American Journal of Surgery, Vol. 224, No. 5, 2022, pp. 1229-1237.
RIS
TY - JOUR
T1 - Crowdsourced assessment of surgical skills
T2 - A systematic review
AU - Olsen, Rikke G.
AU - Genét, Malthe F.
AU - Konge, Lars
AU - Bjerrum, Flemming
N1 - Publisher Copyright: © 2022 Elsevier Inc.
PY - 2022
Y1 - 2022
N2 - Introduction: Crowdsourced assessment utilizes a large group of untrained individuals from the general population to solve tasks in the medical field. The aim was to examine the correlation between crowd workers and expert surgeons in crowdsourced assessments of surgical skills. Material and methods: A systematic literature review was performed on April 14th, 2021, covering the period from inception to the present. Two reviewers screened all articles against the eligibility criteria for inclusion and assessed quality using the Medical Education Research Study Quality Instrument (MERSQI) and the Newcastle-Ottawa Scale-Education (NOS-E). General information was extracted for each article. Results: 250 potential studies were identified, and 32 articles were included. There appeared to be a generally moderate to very strong correlation between crowd workers and experts (Cronbach's alpha 0.72–0.95, Pearson's r 0.7–0.95, Spearman's rho 0.7–0.89, linear regression 0.45–0.89). Six studies found either a questionable or no significant correlation between crowd workers and experts. Conclusion: Crowdsourced assessment can provide accurate, rapid, cost-effective, and objective feedback across different specialties and types of surgery in dry-lab, simulated, and live settings.
KW - Crowdsourced assessment
KW - Skill assessment
KW - Surgical education
KW - Surgical skills
U2 - 10.1016/j.amjsurg.2022.07.008
DO - 10.1016/j.amjsurg.2022.07.008
M3 - Review
C2 - 35961877
AN - SCOPUS:85141833923
VL - 224
SP - 1229
EP - 1237
JO - American Journal of Surgery
JF - American Journal of Surgery
SN - 0002-9610
IS - 5
ER -
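The review above reports agreement between crowd workers and experts as Pearson's r and Spearman's rho. As a minimal illustrative sketch (not code from the article), the following pure-Python snippet shows how these two coefficients would be computed for a pair of rating series; the score values are hypothetical placeholders, not data from any included study.

```python
import math

def pearson_r(x, y):
    """Pearson's product-moment correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman_rho(x, y):
    """Spearman's rho: Pearson's r computed on the ranks of the scores."""
    def ranks(v):
        # 1-based ranks, with tied values assigned their average rank
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(v):
            j = i
            while j + 1 < len(v) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    return pearson_r(ranks(x), ranks(y))

# Hypothetical mean skill scores per video: crowd raters vs. expert raters
crowd = [18.2, 21.5, 15.0, 23.1, 19.8, 17.4]
expert = [17.5, 22.0, 14.2, 24.0, 20.5, 18.0]
r = pearson_r(crowd, expert)
rho = spearman_rho(crowd, expert)
```

In practice a library implementation (e.g. `scipy.stats.pearsonr` / `spearmanr`) would be used; the manual version above only makes explicit that rho is Pearson's r applied to ranks, which is why the two coefficients can diverge when the relationship is monotonic but not linear.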