RootPainter: deep learning segmentation of biological images with corrective annotation
Publication: Contribution to journal › Journal article › Research › peer-reviewed
Publisher's published version, 4.28 MB, PDF document
Convolutional neural networks (CNNs) are a powerful tool for plant image analysis, but challenges remain in making them more accessible to researchers without a machine-learning background. We present RootPainter, an open-source software tool with a graphical user interface for the rapid training of deep neural networks for use in biological image analysis. We evaluate RootPainter by training models for root length extraction from chicory (Cichorium intybus L.) roots in soil, biopore counting, and root nodule counting. We also compare dense annotations with corrective ones, which are added during the training process based on the weaknesses of the current model. In five out of six cases, models trained using RootPainter with corrective annotations created within 2 h produced measurements strongly correlating with manual measurements. Model accuracy had a significant correlation with annotation duration, indicating that further improvements could be obtained with extended annotation. Our results show that a deep-learning model can be trained to a high accuracy for the three respective datasets of varying target objects, background, and image quality with < 2 h of annotation time. They indicate that, when using RootPainter, for many datasets it is possible to annotate, train, and complete data processing within 1 d.
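The corrective-annotation workflow described above can be sketched as a simple interactive loop: rather than densely labeling every image up front, the annotator only corrects the current model's mistakes, and those sparse corrections are folded into the training set while training continues. This is a minimal illustrative sketch, not RootPainter's actual implementation; all function names (`predict`, `correct`, `train_step`) are hypothetical placeholders.

```python
def corrective_annotation_loop(images, predict, correct, train_step):
    """Hypothetical sketch of one pass of corrective-annotation training.

    predict(image)       -> the current model's predicted segmentation
    correct(image, pred) -> sparse corrections, only where pred is wrong
                            (empty if the prediction needs no fixing)
    train_step(dataset)  -> updates the model on accumulated corrections
    """
    dataset = []
    for image in images:
        pred = predict(image)              # current model's best guess
        corrections = correct(image, pred)
        if corrections:                    # keep only images that needed fixing
            dataset.append((image, corrections))
            train_step(dataset)            # model improves as annotation proceeds
    return dataset
```

The key property is that annotation effort concentrates on the model's current weaknesses, which is what allows useful models to emerge within a couple of hours of annotation.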
Published - 2022
We thank the Villum Foundation (DeepFrontier project, grant no. VKR023338) for financially supporting this study. Eusun Han is a Marie Curie Global Fellow working on the project SenseFuture (no. 884364), funded by the European Union's Horizon 2020 Research and Innovation Programme. The biopore dataset was provided by a study supported by the German Research Foundation (Deutsche Forschungsgemeinschaft, DFG) within the framework of the research unit DFG FOR 1320.
© 2022 The Authors. New Phytologist © 2022 New Phytologist Foundation.