Exploiting Residual Edge Information in Deep Fully Convolutional Neural Networks For Retinal Vessel Segmentation
- Authors
- Khan, Tariq M.; Naqvi, Syed S.; Arsalan, Muhammad; Khan, Muhammad Aurangzeb; Khan, Haroon A.; Haider, Adnan
- Issue Date
- Jul-2020
- Publisher
- IEEE
- Keywords
- Retinal vessel segmentation; Deep fully convolutional neural network; Semantic segmentation; Low-level semantic information; Residual edge information
- Citation
- 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)
- Indexed
- SCOPUS
- Journal Title
- 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)
- URI
- https://scholarworks.dongguk.edu/handle/sw.dongguk/7177
- DOI
- 10.1109/ijcnn48605.2020.9207411
- ISSN
- 2161-4393; 2161-4407
- Abstract
- Accurate automatic segmentation of the retinal vessels is crucial for the early detection and diagnosis of vision-threatening retinal diseases. A new supervised method based on a variant of the fully convolutional neural network is proposed, offering fewer hyperparameters, reduced computational and memory requirements, and robust performance in capturing tiny-vessel information. Fully convolutional architectures previously employed for vessel segmentation have many tunable hyperparameters and are difficult to train end-to-end because of their decoder structure. We resolve this problem by sharing information from the encoder for upsampling at the decoder stage, resulting in significantly fewer tunable parameters and low computational overhead at both the training and testing stages. Moreover, the need for pre- and post-processing steps is eliminated. Consequently, detection accuracy is significantly improved, with scores of 0.9620, 0.9623, and 0.9620 on the DRIVE, STARE, and CHASE DB1 datasets, respectively.
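The abstract's central idea, reusing information computed in the encoder to drive parameter-free upsampling in the decoder, resembles index-based max-unpooling as popularized by SegNet-style networks. The NumPy sketch below illustrates that general mechanism only; it is not the authors' implementation, and the function names `max_pool_with_indices` and `max_unpool` are hypothetical.

```python
import numpy as np

def max_pool_with_indices(x, size=2):
    """2x2 max pooling that also records where each maximum came from.

    The recorded flat indices are the encoder-side information that a
    SegNet-style decoder reuses for upsampling (illustrative sketch,
    not the paper's code).
    """
    h, w = x.shape
    ph, pw = h // size, w // size
    pooled = np.zeros((ph, pw), dtype=x.dtype)
    indices = np.zeros((ph, pw), dtype=np.int64)
    for i in range(ph):
        for j in range(pw):
            patch = x[i * size:(i + 1) * size, j * size:(j + 1) * size]
            flat = int(np.argmax(patch))          # argmax within the patch
            pooled[i, j] = patch.flat[flat]
            di, dj = divmod(flat, size)           # local -> global position
            indices[i, j] = (i * size + di) * w + (j * size + dj)
    return pooled, indices

def max_unpool(pooled, indices, out_shape):
    """Parameter-free upsampling: scatter each pooled value back to its
    recorded argmax location instead of learning transposed-convolution
    weights, so the decoder adds no tunable parameters."""
    out = np.zeros(out_shape, dtype=pooled.dtype)
    out.flat[indices.ravel()] = pooled.ravel()
    return out

# Tiny demo on a 4x4 feature map.
x = np.arange(16, dtype=float).reshape(4, 4)
pooled, idx = max_pool_with_indices(x)   # pooled == [[5, 7], [13, 15]]
restored = max_unpool(pooled, idx, x.shape)
```

Because the decoder's upsampling consumes indices computed for free during encoding, it needs no learned upsampling weights, which is one way the abstract's claim of fewer tunable parameters can be realized.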
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- College of Engineering > Department of Electronics and Electrical Engineering > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.