HaN-Seg: The head and neck organ-at-risk CT and MR segmentation challenge

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

HaN-Seg : The head and neck organ-at-risk CT and MR segmentation challenge. / Podobnik, Gašper; Ibragimov, Bulat; Tappeiner, Elias; Lee, Chanwoong; Kim, Jin Sung; Mesbah, Zacharia; Modzelewski, Romain; Ma, Yihao; Yang, Fan; Rudecki, Mikołaj; Wodziński, Marek; Peterlin, Primož; Strojan, Primož; Vrtovec, Tomaž.

In: Radiotherapy and Oncology, Vol. 198, 110410, 09.2024.


Harvard

Podobnik, G, Ibragimov, B, Tappeiner, E, Lee, C, Kim, JS, Mesbah, Z, Modzelewski, R, Ma, Y, Yang, F, Rudecki, M, Wodziński, M, Peterlin, P, Strojan, P & Vrtovec, T 2024, 'HaN-Seg: The head and neck organ-at-risk CT and MR segmentation challenge', Radiotherapy and Oncology, vol. 198, 110410. https://doi.org/10.1016/j.radonc.2024.110410

APA

Podobnik, G., Ibragimov, B., Tappeiner, E., Lee, C., Kim, J. S., Mesbah, Z., Modzelewski, R., Ma, Y., Yang, F., Rudecki, M., Wodziński, M., Peterlin, P., Strojan, P., & Vrtovec, T. (2024). HaN-Seg: The head and neck organ-at-risk CT and MR segmentation challenge. Radiotherapy and Oncology, 198, [110410]. https://doi.org/10.1016/j.radonc.2024.110410

Vancouver

Podobnik G, Ibragimov B, Tappeiner E, Lee C, Kim JS, Mesbah Z et al. HaN-Seg: The head and neck organ-at-risk CT and MR segmentation challenge. Radiotherapy and Oncology. 2024 Sep;198:110410. https://doi.org/10.1016/j.radonc.2024.110410

Author

Podobnik, Gašper ; Ibragimov, Bulat ; Tappeiner, Elias ; Lee, Chanwoong ; Kim, Jin Sung ; Mesbah, Zacharia ; Modzelewski, Romain ; Ma, Yihao ; Yang, Fan ; Rudecki, Mikołaj ; Wodziński, Marek ; Peterlin, Primož ; Strojan, Primož ; Vrtovec, Tomaž. / HaN-Seg : The head and neck organ-at-risk CT and MR segmentation challenge. In: Radiotherapy and Oncology. 2024 ; Vol. 198.

Bibtex

@article{fc9915bdeb854d9da51be6f90bfcf236,
title = "HaN-Seg: The head and neck organ-at-risk CT and MR segmentation challenge",
abstract = "Background and purpose: To promote the development of auto-segmentation methods for head and neck (HaN) radiation treatment (RT) planning that exploit the information of computed tomography (CT) and magnetic resonance (MR) imaging modalities, we organized HaN-Seg: The Head and Neck Organ-at-Risk CT and MR Segmentation Challenge. Materials and methods: The challenge task was to automatically segment 30 organs-at-risk (OARs) of the HaN region in 14 withheld test cases given the availability of 42 publicly available training cases. Each case consisted of one contrast-enhanced CT and one T1-weighted MR image of the HaN region of the same patient, with up to 30 corresponding reference OAR delineation masks. The performance was evaluated in terms of the Dice similarity coefficient (DSC) and 95-percentile Hausdorff distance (HD95), and statistical ranking was applied for each metric by pairwise comparison of the submitted methods using the Wilcoxon signed-rank test. Results: While 23 teams registered for the challenge, only seven submitted their methods for the final phase. The top-performing team achieved a DSC of 76.9 % and a HD95 of 3.5 mm. All participating teams utilized architectures based on U-Net, with the winning team leveraging rigid MR to CT registration combined with network entry-level concatenation of both modalities. Conclusion: This challenge simulated a real-world clinical scenario by providing non-registered MR and CT images with varying fields-of-view and voxel sizes. Remarkably, the top-performing teams achieved segmentation performance surpassing the inter-observer agreement on the same dataset. These results set a benchmark for future research on this publicly available dataset and on paired multi-modal image segmentation in general.",
keywords = "Computational challenge, Computed tomography, Deep learning, Head and neck cancer, Magnetic resonance, Organs-at-risk, Radiotherapy, Segmentation",
author = "Ga{\v s}per Podobnik and Bulat Ibragimov and Elias Tappeiner and Chanwoong Lee and Kim, {Jin Sung} and Zacharia Mesbah and Romain Modzelewski and Yihao Ma and Fan Yang and Miko{\l}aj Rudecki and Marek Wodzi{\'n}ski and Primo{\v z} Peterlin and Primo{\v z} Strojan and Toma{\v z} Vrtovec",
note = "Publisher Copyright: {\textcopyright} 2024 The Authors",
year = "2024",
month = sep,
doi = "10.1016/j.radonc.2024.110410",
language = "English",
volume = "198",
pages = "110410",
journal = "Radiotherapy and Oncology",
issn = "0167-8140",
publisher = "Elsevier Ireland Ltd",

}
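The abstract above ranks submissions by the Dice similarity coefficient (DSC) and the 95th-percentile Hausdorff distance (HD95). As an illustration only (this is not the challenge's official evaluation code, and a simplified voxel-based HD95 is used here rather than the surface-based variant typically reported), these metrics can be sketched for two binary masks as:

```python
import numpy as np
from scipy.spatial import cKDTree

def dice(pred, ref):
    """Dice similarity coefficient, 2|A∩B| / (|A| + |B|), between boolean masks."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    inter = np.logical_and(pred, ref).sum()
    denom = pred.sum() + ref.sum()
    return 2.0 * inter / denom if denom else 1.0  # both empty -> perfect by convention

def hd95(pred, ref, spacing=(1.0, 1.0, 1.0)):
    """Symmetric 95th-percentile Hausdorff distance between mask voxel centers (mm)."""
    p = np.argwhere(pred) * np.asarray(spacing)  # physical coordinates of pred voxels
    r = np.argwhere(ref) * np.asarray(spacing)
    d_pr, _ = cKDTree(r).query(p)  # each pred voxel to its nearest ref voxel
    d_rp, _ = cKDTree(p).query(r)  # and vice versa
    return np.percentile(np.hstack([d_pr, d_rp]), 95)
```

A DSC of 1.0 and an HD95 of 0 mm indicate identical masks; the winning team's reported averages were 76.9 % DSC and 3.5 mm HD95 over the 30 OARs.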

RIS

TY - JOUR

T1 - HaN-Seg

T2 - The head and neck organ-at-risk CT and MR segmentation challenge

AU - Podobnik, Gašper

AU - Ibragimov, Bulat

AU - Tappeiner, Elias

AU - Lee, Chanwoong

AU - Kim, Jin Sung

AU - Mesbah, Zacharia

AU - Modzelewski, Romain

AU - Ma, Yihao

AU - Yang, Fan

AU - Rudecki, Mikołaj

AU - Wodziński, Marek

AU - Peterlin, Primož

AU - Strojan, Primož

AU - Vrtovec, Tomaž

N1 - Publisher Copyright: © 2024 The Authors

PY - 2024/9

Y1 - 2024/9

N2 - Background and purpose: To promote the development of auto-segmentation methods for head and neck (HaN) radiation treatment (RT) planning that exploit the information of computed tomography (CT) and magnetic resonance (MR) imaging modalities, we organized HaN-Seg: The Head and Neck Organ-at-Risk CT and MR Segmentation Challenge. Materials and methods: The challenge task was to automatically segment 30 organs-at-risk (OARs) of the HaN region in 14 withheld test cases given the availability of 42 publicly available training cases. Each case consisted of one contrast-enhanced CT and one T1-weighted MR image of the HaN region of the same patient, with up to 30 corresponding reference OAR delineation masks. The performance was evaluated in terms of the Dice similarity coefficient (DSC) and 95-percentile Hausdorff distance (HD95), and statistical ranking was applied for each metric by pairwise comparison of the submitted methods using the Wilcoxon signed-rank test. Results: While 23 teams registered for the challenge, only seven submitted their methods for the final phase. The top-performing team achieved a DSC of 76.9 % and a HD95 of 3.5 mm. All participating teams utilized architectures based on U-Net, with the winning team leveraging rigid MR to CT registration combined with network entry-level concatenation of both modalities. Conclusion: This challenge simulated a real-world clinical scenario by providing non-registered MR and CT images with varying fields-of-view and voxel sizes. Remarkably, the top-performing teams achieved segmentation performance surpassing the inter-observer agreement on the same dataset. These results set a benchmark for future research on this publicly available dataset and on paired multi-modal image segmentation in general.

AB - Background and purpose: To promote the development of auto-segmentation methods for head and neck (HaN) radiation treatment (RT) planning that exploit the information of computed tomography (CT) and magnetic resonance (MR) imaging modalities, we organized HaN-Seg: The Head and Neck Organ-at-Risk CT and MR Segmentation Challenge. Materials and methods: The challenge task was to automatically segment 30 organs-at-risk (OARs) of the HaN region in 14 withheld test cases given the availability of 42 publicly available training cases. Each case consisted of one contrast-enhanced CT and one T1-weighted MR image of the HaN region of the same patient, with up to 30 corresponding reference OAR delineation masks. The performance was evaluated in terms of the Dice similarity coefficient (DSC) and 95-percentile Hausdorff distance (HD95), and statistical ranking was applied for each metric by pairwise comparison of the submitted methods using the Wilcoxon signed-rank test. Results: While 23 teams registered for the challenge, only seven submitted their methods for the final phase. The top-performing team achieved a DSC of 76.9 % and a HD95 of 3.5 mm. All participating teams utilized architectures based on U-Net, with the winning team leveraging rigid MR to CT registration combined with network entry-level concatenation of both modalities. Conclusion: This challenge simulated a real-world clinical scenario by providing non-registered MR and CT images with varying fields-of-view and voxel sizes. Remarkably, the top-performing teams achieved segmentation performance surpassing the inter-observer agreement on the same dataset. These results set a benchmark for future research on this publicly available dataset and on paired multi-modal image segmentation in general.

KW - Computational challenge

KW - Computed tomography

KW - Deep learning

KW - Head and neck cancer

KW - Magnetic resonance

KW - Organs-at-risk

KW - Radiotherapy

KW - Segmentation

U2 - 10.1016/j.radonc.2024.110410

DO - 10.1016/j.radonc.2024.110410

M3 - Journal article

C2 - 38917883

AN - SCOPUS:85198522504

VL - 198

JO - Radiotherapy and Oncology

JF - Radiotherapy and Oncology

SN - 0167-8140

M1 - 110410

ER -

ID: 399170664