Densely Connected Neural Network with Unbalanced Discriminant and Category Sensitive Constraints for Polyp Recognition

Publication: Contribution to journal › Journal article › peer-reviewed

Standard

Densely Connected Neural Network with Unbalanced Discriminant and Category Sensitive Constraints for Polyp Recognition. / Yuan, Yixuan; Qin, Wenjian; Ibragimov, Bulat; Zhang, Guanglei; Han, Bin; Meng, Max Q.H.; Xing, Lei.

In: IEEE Transactions on Automation Science and Engineering, Vol. 17, No. 2, 8842597, 2020, p. 574-583.

Publication: Contribution to journal › Journal article › peer-reviewed

Harvard

Yuan, Y, Qin, W, Ibragimov, B, Zhang, G, Han, B, Meng, MQH & Xing, L 2020, 'Densely Connected Neural Network with Unbalanced Discriminant and Category Sensitive Constraints for Polyp Recognition', IEEE Transactions on Automation Science and Engineering, vol. 17, no. 2, 8842597, pp. 574-583. https://doi.org/10.1109/TASE.2019.2936645

APA

Yuan, Y., Qin, W., Ibragimov, B., Zhang, G., Han, B., Meng, M. Q. H., & Xing, L. (2020). Densely Connected Neural Network with Unbalanced Discriminant and Category Sensitive Constraints for Polyp Recognition. IEEE Transactions on Automation Science and Engineering, 17(2), 574-583. [8842597]. https://doi.org/10.1109/TASE.2019.2936645

Vancouver

Yuan Y, Qin W, Ibragimov B, Zhang G, Han B, Meng MQH et al. Densely Connected Neural Network with Unbalanced Discriminant and Category Sensitive Constraints for Polyp Recognition. IEEE Transactions on Automation Science and Engineering. 2020;17(2):574-583. 8842597. https://doi.org/10.1109/TASE.2019.2936645

Author

Yuan, Yixuan ; Qin, Wenjian ; Ibragimov, Bulat ; Zhang, Guanglei ; Han, Bin ; Meng, Max Q.H. ; Xing, Lei. / Densely Connected Neural Network with Unbalanced Discriminant and Category Sensitive Constraints for Polyp Recognition. In: IEEE Transactions on Automation Science and Engineering. 2020 ; Vol. 17, No. 2. pp. 574-583.

Bibtex

@article{252f359a206a44e48d5c0d9726635b86,
title = "Densely Connected Neural Network with Unbalanced Discriminant and Category Sensitive Constraints for Polyp Recognition",
abstract = "Automatic polyp recognition in endoscopic images is challenging because of the low contrast between polyps and the surrounding area, the fuzzy and irregular polyp borders, and varying imaging light conditions. In this article, we propose a novel densely connected convolutional network with 'unbalanced discriminant (UD)' loss and 'category sensitive (CS)' loss (DenseNet-UDCS) for the task. We first utilize densely connected convolutional network (DenseNet) as the basic framework to conduct end-to-end polyp recognition task. Then, the proposed dual constraints, UD loss and CS loss, are simultaneously incorporated into the DenseNet model to calculate discriminative and suitable image features. The UD loss in our network effectively captures classification errors from both majority and minority categories to deal with the strong data imbalance of polyp images and normal ones. The CS loss imposes the ratio of intraclass and interclass variations in the deep feature learning process to enable features with large interclass variation and small intraclass compactness. With the joint supervision of UD loss and CS loss, a robust DenseNet-UDCS model is trained to recognize polyps from endoscopic images. The experimental results achieved polyp recognition accuracy of 93.19%, showing that the proposed DenseNet-UDCS can accurately characterize the endoscopic images and recognize polyps from the images. In addition, our DenseNet-UDCS model is superior in detection accuracy in comparison with state-of-the-art polyp recognition methods. Note to Practitioners - Wireless capsule endoscopy (WCE) is a crucial diagnostic tool for polyp detection and therapeutic monitoring, thanks to its noninvasive, user-friendly, and nonpainful properties. A challenge in harnessing the enormous potential of the WCE to benefit the gastrointestinal (GI) patients is that it requires clinicians to analyze a huge number of images (about 50 000 images for each patient). We propose a novel automatic polyp recognition scheme, namely, DenseNet-UDCS model, by addressing practical image unbalanced problem and small interclass variances and large intraclass differences in the data set. The comprehensive experimental results demonstrate superior reliability and robustness of the proposed model compared to the other polyp recognition approaches. Our DenseNet-UDCS model can be further applied in the clinical practice to provide valuable diagnosis information for GI disease recognition and precision medicine.",
keywords = "Category sensitive (CS) loss, densely connected convolutional network (DenseNet), polyp image classification, unbalanced discriminant (UD) loss",
author = "Yixuan Yuan and Wenjian Qin and Bulat Ibragimov and Guanglei Zhang and Bin Han and Meng, {Max Q.H.} and Lei Xing",
year = "2020",
doi = "10.1109/TASE.2019.2936645",
language = "English",
volume = "17",
pages = "574--583",
journal = "IEEE Transactions on Automation Science and Engineering",
issn = "1545-5955",
publisher = "Institute of Electrical and Electronics Engineers",
number = "2",

}
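
The abstract in the record above describes a DenseNet backbone trained under the joint supervision of an "unbalanced discriminant (UD)" loss, which weights classification errors from the majority and minority categories, and a "category sensitive (CS)" loss, which constrains the ratio of intraclass to interclass variation of the deep features. The following is a minimal, hypothetical PyTorch sketch of that kind of joint objective, not the paper's exact formulation: a class-weighted cross-entropy stands in for the UD loss, a batch-wise intraclass/interclass scatter ratio stands in for the CS loss, and the class weights, lambda_cs, and DenseNetUDCS wrapper are illustrative assumptions.

# Hypothetical sketch of a UD+CS-style joint objective on a DenseNet backbone.
# The UD loss is approximated by class-weighted cross-entropy; the CS loss is
# approximated by an intraclass/interclass scatter ratio computed per mini-batch.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import densenet121


class DenseNetUDCS(nn.Module):
    """DenseNet-121 backbone exposing pooled deep features and class logits."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        backbone = densenet121()                      # randomly initialized backbone
        self.features = backbone.features             # dense blocks + transitions
        self.classifier = nn.Linear(backbone.classifier.in_features, num_classes)

    def forward(self, x):
        f = F.relu(self.features(x), inplace=True)
        f = F.adaptive_avg_pool2d(f, 1).flatten(1)    # pooled deep features
        return f, self.classifier(f)


def cs_loss(features, labels, eps=1e-6):
    """Ratio of intraclass scatter to interclass scatter within a mini-batch."""
    classes = labels.unique()
    centers = torch.stack([features[labels == c].mean(dim=0) for c in classes])
    intra = torch.stack([
        ((features[labels == c] - centers[i]) ** 2).sum(dim=1).mean()
        for i, c in enumerate(classes)
    ]).mean()
    if len(classes) < 2:                              # batch contains one class only
        return intra
    inter = F.pdist(centers).pow(2).mean()
    return intra / (inter + eps)


# Illustrative training step: UD-like weighted cross-entropy + CS-like ratio term.
model = DenseNetUDCS(num_classes=2)
class_weights = torch.tensor([1.0, 5.0])              # assumed normal/polyp imbalance
ud_criterion = nn.CrossEntropyLoss(weight=class_weights)
lambda_cs = 0.1                                       # assumed trade-off weight

images = torch.randn(8, 3, 224, 224)                  # dummy endoscopic image batch
labels = torch.randint(0, 2, (8,))
features, logits = model(images)
loss = ud_criterion(logits, labels) + lambda_cs * cs_loss(features, labels)
loss.backward()

In a real setting the class weights would be derived from the actual polyp/normal frequencies in the training set, and lambda_cs would be tuned on a validation split; both values here are placeholders.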

RIS

TY - JOUR

T1 - Densely Connected Neural Network with Unbalanced Discriminant and Category Sensitive Constraints for Polyp Recognition

AU - Yuan, Yixuan

AU - Qin, Wenjian

AU - Ibragimov, Bulat

AU - Zhang, Guanglei

AU - Han, Bin

AU - Meng, Max Q.H.

AU - Xing, Lei

PY - 2020

Y1 - 2020

N2 - Automatic polyp recognition in endoscopic images is challenging because of the low contrast between polyps and the surrounding area, the fuzzy and irregular polyp borders, and varying imaging light conditions. In this article, we propose a novel densely connected convolutional network with 'unbalanced discriminant (UD)' loss and 'category sensitive (CS)' loss (DenseNet-UDCS) for the task. We first utilize densely connected convolutional network (DenseNet) as the basic framework to conduct end-to-end polyp recognition task. Then, the proposed dual constraints, UD loss and CS loss, are simultaneously incorporated into the DenseNet model to calculate discriminative and suitable image features. The UD loss in our network effectively captures classification errors from both majority and minority categories to deal with the strong data imbalance of polyp images and normal ones. The CS loss imposes the ratio of intraclass and interclass variations in the deep feature learning process to enable features with large interclass variation and small intraclass compactness. With the joint supervision of UD loss and CS loss, a robust DenseNet-UDCS model is trained to recognize polyps from endoscopic images. The experimental results achieved polyp recognition accuracy of 93.19%, showing that the proposed DenseNet-UDCS can accurately characterize the endoscopic images and recognize polyps from the images. In addition, our DenseNet-UDCS model is superior in detection accuracy in comparison with state-of-the-art polyp recognition methods. Note to Practitioners - Wireless capsule endoscopy (WCE) is a crucial diagnostic tool for polyp detection and therapeutic monitoring, thanks to its noninvasive, user-friendly, and nonpainful properties. A challenge in harnessing the enormous potential of the WCE to benefit the gastrointestinal (GI) patients is that it requires clinicians to analyze a huge number of images (about 50 000 images for each patient). We propose a novel automatic polyp recognition scheme, namely, DenseNet-UDCS model, by addressing practical image unbalanced problem and small interclass variances and large intraclass differences in the data set. The comprehensive experimental results demonstrate superior reliability and robustness of the proposed model compared to the other polyp recognition approaches. Our DenseNet-UDCS model can be further applied in the clinical practice to provide valuable diagnosis information for GI disease recognition and precision medicine.

AB - Automatic polyp recognition in endoscopic images is challenging because of the low contrast between polyps and the surrounding area, the fuzzy and irregular polyp borders, and varying imaging light conditions. In this article, we propose a novel densely connected convolutional network with 'unbalanced discriminant (UD)' loss and 'category sensitive (CS)' loss (DenseNet-UDCS) for the task. We first utilize densely connected convolutional network (DenseNet) as the basic framework to conduct end-to-end polyp recognition task. Then, the proposed dual constraints, UD loss and CS loss, are simultaneously incorporated into the DenseNet model to calculate discriminative and suitable image features. The UD loss in our network effectively captures classification errors from both majority and minority categories to deal with the strong data imbalance of polyp images and normal ones. The CS loss imposes the ratio of intraclass and interclass variations in the deep feature learning process to enable features with large interclass variation and small intraclass compactness. With the joint supervision of UD loss and CS loss, a robust DenseNet-UDCS model is trained to recognize polyps from endoscopic images. The experimental results achieved polyp recognition accuracy of 93.19%, showing that the proposed DenseNet-UDCS can accurately characterize the endoscopic images and recognize polyps from the images. In addition, our DenseNet-UDCS model is superior in detection accuracy in comparison with state-of-the-art polyp recognition methods. Note to Practitioners - Wireless capsule endoscopy (WCE) is a crucial diagnostic tool for polyp detection and therapeutic monitoring, thanks to its noninvasive, user-friendly, and nonpainful properties. A challenge in harnessing the enormous potential of the WCE to benefit the gastrointestinal (GI) patients is that it requires clinicians to analyze a huge number of images (about 50 000 images for each patient). We propose a novel automatic polyp recognition scheme, namely, DenseNet-UDCS model, by addressing practical image unbalanced problem and small interclass variances and large intraclass differences in the data set. The comprehensive experimental results demonstrate superior reliability and robustness of the proposed model compared to the other polyp recognition approaches. Our DenseNet-UDCS model can be further applied in the clinical practice to provide valuable diagnosis information for GI disease recognition and precision medicine.

KW - Category sensitive (CS) loss

KW - densely connected convolutional network (DenseNet)

KW - polyp image classification

KW - unbalanced discriminant (UD) loss

UR - http://www.scopus.com/inward/record.url?scp=85083252877&partnerID=8YFLogxK

U2 - 10.1109/TASE.2019.2936645

DO - 10.1109/TASE.2019.2936645

M3 - Journal article

AN - SCOPUS:85083252877

VL - 17

SP - 574

EP - 583

JO - IEEE Transactions on Automation Science and Engineering

JF - IEEE Transactions on Automation Science and Engineering

SN - 1545-5955

IS - 2

M1 - 8842597

ER -
