Mutual-Prototype Adaptation for Cross-Domain Polyp Segmentation
Publication: Contribution to journal › Journal article › Research › peer-reviewed
Standard
Mutual-Prototype Adaptation for Cross-Domain Polyp Segmentation. / Yang, Chen; Guo, Xiaoqing; Zhu, Meilu; Ibragimov, Bulat; Yuan, Yixuan.
In: IEEE Journal of Biomedical and Health Informatics, Vol. 25, No. 10, 2021, pp. 3886-3897.
RIS
TY - JOUR
T1 - Mutual-Prototype Adaptation for Cross-Domain Polyp Segmentation
AU - Yang, Chen
AU - Guo, Xiaoqing
AU - Zhu, Meilu
AU - Ibragimov, Bulat
AU - Yuan, Yixuan
N1 - Publisher Copyright: © 2013 IEEE.
PY - 2021
Y1 - 2021
N2 - Accurate segmentation of polyps from colonoscopy images provides useful information for the diagnosis and treatment of colorectal cancer. Although deep learning methods have advanced automatic polyp segmentation, their performance often degrades when applied to new data acquired from different scanners or sequences (target domain). As manual annotation is tedious and labor-intensive for a new target domain, leveraging knowledge learned from the labeled source domain to promote performance in the unlabeled target domain is highly desirable. In this work, we propose a mutual-prototype adaptation network to eliminate domain shifts in multi-center and multi-device colonoscopy images. We first devise a mutual-prototype alignment (MPA) module with a prototype relation function to refine features through self-domain and cross-domain information in a coarse-to-fine process. Then, two auxiliary modules, progressive self-training (PST) and disentangled reconstruction (DR), are proposed to improve segmentation performance. The PST module selects reliable pseudo labels through a novel uncertainty-guided self-training loss to obtain accurate prototypes in the target domain. The DR module reconstructs the original images by jointly utilizing prediction results and private prototypes to maintain semantic consistency and provide complementary supervision information. We extensively evaluate the polyp segmentation performance of the proposed model on three conventional colonoscopy datasets: CVC-DB, Kvasir-SEG, and ETIS-Larib. Comprehensive experimental results demonstrate that the proposed model outperforms state-of-the-art methods.
AB - Accurate segmentation of polyps from colonoscopy images provides useful information for the diagnosis and treatment of colorectal cancer. Although deep learning methods have advanced automatic polyp segmentation, their performance often degrades when applied to new data acquired from different scanners or sequences (target domain). As manual annotation is tedious and labor-intensive for a new target domain, leveraging knowledge learned from the labeled source domain to promote performance in the unlabeled target domain is highly desirable. In this work, we propose a mutual-prototype adaptation network to eliminate domain shifts in multi-center and multi-device colonoscopy images. We first devise a mutual-prototype alignment (MPA) module with a prototype relation function to refine features through self-domain and cross-domain information in a coarse-to-fine process. Then, two auxiliary modules, progressive self-training (PST) and disentangled reconstruction (DR), are proposed to improve segmentation performance. The PST module selects reliable pseudo labels through a novel uncertainty-guided self-training loss to obtain accurate prototypes in the target domain. The DR module reconstructs the original images by jointly utilizing prediction results and private prototypes to maintain semantic consistency and provide complementary supervision information. We extensively evaluate the polyp segmentation performance of the proposed model on three conventional colonoscopy datasets: CVC-DB, Kvasir-SEG, and ETIS-Larib. Comprehensive experimental results demonstrate that the proposed model outperforms state-of-the-art methods.
KW - domain adaptation
KW - Polyp segmentation
KW - prototype
KW - reconstruction
KW - self-training
UR - http://www.scopus.com/inward/record.url?scp=85105869408&partnerID=8YFLogxK
U2 - 10.1109/JBHI.2021.3077271
DO - 10.1109/JBHI.2021.3077271
M3 - Journal article
C2 - 33945490
AN - SCOPUS:85105869408
VL - 25
SP - 3886
EP - 3897
JO - IEEE Journal of Biomedical and Health Informatics
JF - IEEE Journal of Biomedical and Health Informatics
SN - 2168-2194
IS - 10
ER -
ID: 284635834