TY - GEN
T1 - Style Enhanced Domain Adaptation Neural Network for Cross-Modality Cervical Tumor Segmentation
AU - Zheng, Boyun
AU - He, Jiahui
AU - Zhu, Jiuhe
AU - Xie, Yaoqin
AU - Zaki, Nazar
AU - Qin, Wenjian
N1 - Publisher Copyright:
© 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2023
Y1 - 2023
N2 - Cervical tumor segmentation is an essential step in cervical cancer diagnosis and treatment. Considering that multi-modality data contain more information and are widely available in clinical routine, multi-modality medical image analysis has emerged as a significant field of study. However, annotating tumors for each modality is expensive and time-consuming. Consequently, unsupervised domain adaptation (UDA) has attracted a lot of attention for its ability to achieve excellent performance on unlabeled cross-domain data. Most current UDA methods adapt image translation networks to achieve domain adaptation; however, the generation process may create visual inconsistency and incorrect generation styles due to the instability of generative adversarial networks. Therefore, we propose a novel and efficient method without image translation networks by introducing a style enhancement method into a Domain Adversarial Neural Network (DANN)-based model to improve the generalization performance of the shared segmentation network. Experimental results show that our method achieves the best performance on the cross-modality cervical tumor segmentation task compared to current state-of-the-art UDA methods.
AB - Cervical tumor segmentation is an essential step in cervical cancer diagnosis and treatment. Considering that multi-modality data contain more information and are widely available in clinical routine, multi-modality medical image analysis has emerged as a significant field of study. However, annotating tumors for each modality is expensive and time-consuming. Consequently, unsupervised domain adaptation (UDA) has attracted a lot of attention for its ability to achieve excellent performance on unlabeled cross-domain data. Most current UDA methods adapt image translation networks to achieve domain adaptation; however, the generation process may create visual inconsistency and incorrect generation styles due to the instability of generative adversarial networks. Therefore, we propose a novel and efficient method without image translation networks by introducing a style enhancement method into a Domain Adversarial Neural Network (DANN)-based model to improve the generalization performance of the shared segmentation network. Experimental results show that our method achieves the best performance on the cross-modality cervical tumor segmentation task compared to current state-of-the-art UDA methods.
KW - Cervical tumor segmentation
KW - Shuffle Remap
KW - Unsupervised domain adaptation
UR - http://www.scopus.com/inward/record.url?scp=85174737148&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85174737148&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-45087-7_15
DO - 10.1007/978-3-031-45087-7_15
M3 - Conference contribution
TY - CONF
AN - SCOPUS:85174737148
SN - 9783031450860
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 140
EP - 149
BT - Computational Mathematics Modeling in Cancer Analysis - 2nd International Workshop, CMMCA 2023, Held in Conjunction with MICCAI 2023, Proceedings
A2 - Qin, Wenjian
A2 - Zaki, Nazar
A2 - Zhang, Fa
A2 - Wu, Jia
A2 - Yang, Fan
A2 - Li, Chao
PB - Springer Science and Business Media Deutschland GmbH
T2 - 2nd Workshop on Computational Mathematics Modeling in Cancer Analysis, CMMCA 2023
Y2 - 8 October 2023 through 8 October 2023
ER -