Automated fundus image quality assessment and segmentation of optic disc using convolutional neural networks

Bhargav Bhatkalkar, Abhishek Joshi, Srikanth Prabhu, Sulatha Bhandary

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

Automated fundus image analysis is used as a tool for the diagnosis of common retinal diseases. A good-quality fundus image results in a better diagnosis, so discarding degraded fundus images at screening time provides an opportunity to retake adequate fundus photographs, which saves both time and resources. In this paper, we propose a novel fundus image quality assessment (IQA) model using a convolutional neural network (CNN), based on the quality of optic disc (OD) visibility. We localize the OD by transfer learning with the Inception-v3 model. Precise segmentation of the OD is done using the GrabCut algorithm. Contour operations are applied to the segmented OD to approximate it to the nearest circle and find its center and diameter. For training the model, we use publicly available fundus databases and a private hospital database. We attained excellent classification accuracy for fundus IQA on the DRIVE, CHASE-DB, and HRF databases. For OD segmentation, we evaluated our method on the DRINS-DB, DRISHTI-GS, and RIM-ONE v.3 databases and compared the results with existing state-of-the-art methods. Our proposed method outperforms existing methods for OD segmentation on the Jaccard index and F-score metrics.
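The circle-approximation step described above — reducing a segmented OD region to a center and diameter — can be illustrated with a minimal sketch. The paper applies contour operations to the GrabCut output; the function below is a simplified, hypothetical stand-in that takes a binary OD mask and returns the foreground centroid together with the diameter of an equal-area circle (the function name and the equal-area criterion are illustrative assumptions, not the authors' exact procedure):

```python
import math

def approximate_disc_circle(mask):
    """Approximate a binary optic-disc mask by a circle (illustrative sketch).

    Returns (cx, cy, diameter): the centroid of the foreground pixels
    and the diameter of a circle having the same area as the mask.
    """
    xs, ys, area = 0.0, 0.0, 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                xs += x
                ys += y
                area += 1
    if area == 0:
        raise ValueError("empty mask: no optic disc pixels found")
    cx, cy = xs / area, ys / area
    # Equal-area circle: area = pi * (d/2)^2  =>  d = 2 * sqrt(area / pi)
    diameter = 2.0 * math.sqrt(area / math.pi)
    return cx, cy, diameter

# Example: a 3x3 foreground block centered at (2, 2) in a 5x5 mask
mask = [[0] * 5 for _ in range(5)]
for y in range(1, 4):
    for x in range(1, 4):
        mask[y][x] = 1
cx, cy, d = approximate_disc_circle(mask)
```

In practice a library routine such as OpenCV's `cv2.minEnclosingCircle` on the extracted contour would typically be used instead of this equal-area heuristic.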

Original language: English
Pages (from-to): 816-827
Number of pages: 12
Journal: International Journal of Electrical and Computer Engineering
Volume: 10
Issue number: 1
DOIs
Publication status: Published - 01-01-2020

All Science Journal Classification (ASJC) codes

  • Computer Science(all)
  • Electrical and Electronic Engineering

