High accuracy in classifying endoscopic severity in ulcerative colitis using convolutional neural network
Publication: Contribution to journal › Journal article › peer-reviewed
INTRODUCTION: Evaluation of endoscopic disease severity is a crucial component in managing patients with ulcerative colitis (UC). However, endoscopic assessment suffers from substantial intra- and inter-observer variation, limiting the reliability of individual assessments. Therefore, we aimed to develop a deep learning (DL) model capable of distinguishing active from healed mucosa and differentiating between degrees of endoscopic disease severity.
METHODS: A total of 1,484 unique endoscopic images from 467 patients were extracted for classification. Two experts classified all images independently of one another according to the Mayo endoscopic subscore (MES). In cases of disagreement, a third expert classified the images. Different convolutional neural networks were considered for automatically classifying UC severity. Five-fold cross-validation was used to develop and select the final model. Afterwards, unseen test datasets were used for model evaluation.
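The model-selection procedure described above can be sketched as plain five-fold cross-validation: partition the images into five folds, train each candidate network on four folds, validate on the held-out fold, and keep the candidate with the best mean validation score. The sketch below is a minimal, framework-free illustration of that loop; the candidate names, scores, and `evaluate_fold` interface are hypothetical, not the authors' actual training code.

```python
# Minimal sketch of five-fold cross-validation for model selection.
# All model names and fold scores here are illustrative assumptions.
import random

def k_fold_indices(n_samples, k=5, seed=0):
    """Shuffle sample indices and partition them into k disjoint folds."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_validate(evaluate_fold, n_samples, k=5):
    """Mean validation score of one candidate over k train/val splits.

    evaluate_fold(train_idx, val_idx) would, in practice, train a CNN on
    train_idx and return its validation accuracy on val_idx.
    """
    folds = k_fold_indices(n_samples, k)
    scores = []
    for i in range(k):
        val = folds[i]
        train = [j for f, fold in enumerate(folds) if f != i for j in fold]
        scores.append(evaluate_fold(train, val))
    return sum(scores) / k

# Toy usage: each "candidate" is a stub returning a fixed fold score.
candidates = {
    "cnn_a": lambda train, val: 0.80,
    "cnn_b": lambda train, val: 0.84,
}
best = max(candidates, key=lambda name: cross_validate(candidates[name], 1484))
# best is the candidate with the highest mean validation score ("cnn_b")
```

In the real pipeline the stub would be replaced by an actual training run per fold; only the split-and-average structure is shown here.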
RESULTS: In the most challenging task - distinguishing between all categories of MES - our final model achieved a test accuracy of 0.84. When evaluating this model on the binary tasks of distinguishing MES 0 vs. 1-3 and MES 0-1 vs. 2-3, it achieved accuracies of 0.94 and 0.93, and areas under the receiver operating characteristic curves (AUCs) of 0.997 and 0.998, respectively.
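The binary metrics above come from collapsing the four MES grades into two groups and scoring the resulting yes/no decision. A minimal sketch of that evaluation, with made-up labels and scores purely for illustration (the AUC here is the standard rank-based formulation, equivalent to the probability that a positive case outranks a negative one):

```python
# Hypothetical sketch of the binary MES evaluations: regroup grades 0-3
# into two classes, then compute accuracy and ROC AUC. All data is invented.

def binarize(mes_grades, active_from):
    """Map MES grades (0-3) to 0/1: grades >= active_from count as positive.

    active_from=1 gives the 0 vs. 1-3 task; active_from=2 gives 0-1 vs. 2-3.
    """
    return [int(g >= active_from) for g in mes_grades]

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def roc_auc(y_true, scores):
    """AUC as P(score of a positive > score of a negative), ties count 0.5."""
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative ground truth, hard predictions, and "active" probabilities.
true_mes = [0, 1, 2, 3, 0, 2]
pred_mes = [0, 1, 2, 3, 1, 2]
act_score = [0.1, 0.6, 0.9, 0.95, 0.2, 0.8]

acc_0_vs_123 = accuracy(binarize(true_mes, 1), binarize(pred_mes, 1))
auc_0_vs_123 = roc_auc(binarize(true_mes, 1), act_score)
```

With the toy data above, the single MES 0 image mispredicted as MES 1 costs the binary accuracy (5/6) but not the AUC, illustrating why the two metrics can diverge.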
DISCUSSION: We have developed a new, highly accurate, automated method for evaluating endoscopic images from UC patients. We have demonstrated that our DL model is capable of distinguishing between all four MES levels of activity. This new automated approach may optimize and standardize the evaluation of disease severity measured by the MES across centres, regardless of the level of medical expertise.
Journal: The American Journal of Gastroenterology
Status: Published - 2022
Copyright © 2022 by The American College of Gastroenterology.