Detection of COVID-19 Using Transfer Learning and Grad-CAM Visualization on Indigenously Collected X-ray Dataset

Umair, Muhammad; Khan, Muhammad Shahbaz; Ahmed, Fawad; Baothman, Fatmah; Alqahtani, Fehaid; Alian, Muhammad; Ahmad, Jawad

Authors

Muhammad Umair

Muhammad Shahbaz Khan

Fawad Ahmed

Fatmah Baothman

Fehaid Alqahtani

Muhammad Alian

Jawad Ahmad

Abstract

The COVID-19 outbreak began in December 2019 and has severely affected lives worldwide since then; more than three million lives have been lost to this newest member of the coronavirus family. With the continual emergence of mutating variants, early and reliable diagnosis of the virus remains indispensable. Although the PCR test is the primary diagnostic technique, non-contact methods based on chest radiographs and CT scans are often preferred, and artificial intelligence plays an essential role in the early and accurate detection of COVID-19 from such pulmonary images. In this research, transfer learning with fine-tuning was used for the detection and classification of COVID-19. Four pre-trained models were employed: VGG16, DenseNet-121, ResNet-50, and MobileNet. These networks were trained on a publicly available Kaggle dataset of 7232 chest X-ray images (COVID-19 and normal). An indigenous dataset of 450 chest X-ray images of Pakistani patients was collected and used for testing and prediction. Recall, specificity, F1-score, precision, loss curves, and confusion matrices were computed to validate the models. The achieved accuracies of VGG16, ResNet-50, DenseNet-121, and MobileNet were 83.27%, 92.48%, 96.49%, and 96.48%, respectively. Intermediate activations were visualized to display the feature maps produced as an input image is decomposed by successive convolutional filters. Finally, the Grad-CAM technique was applied to generate class-specific heatmaps that highlight the regions of the X-ray images contributing to the predictions. Several optimizers were used for loss minimization. DenseNet-121 outperformed the other three models in both accuracy and prediction performance.
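
To make the pipeline described in the abstract concrete, the following is a minimal sketch, not the authors' published code, of the two core steps: fine-tuning an ImageNet-pretrained DenseNet-121 for binary COVID-19 versus normal classification, and producing a Grad-CAM heatmap. It assumes TensorFlow/Keras, a 224x224 input size, the Adam optimizer, a dropout head, and the layer name "relu" for the final DenseNet-121 feature map; all of these are illustrative assumptions rather than details taken from the paper.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import DenseNet121

# Transfer learning with fine-tuning: reuse the ImageNet-pretrained backbone and
# attach a small binary classification head (COVID-19 vs. normal).
base = DenseNet121(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = True  # fine-tune the convolutional backbone together with the head

x = layers.GlobalAveragePooling2D()(base.output)
x = layers.Dropout(0.3)(x)                          # assumed regularisation, not from the paper
outputs = layers.Dense(1, activation="sigmoid")(x)
model = models.Model(base.input, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),   # assumed optimizer and learning rate
              loss="binary_crossentropy",
              metrics=["accuracy",
                       tf.keras.metrics.Precision(name="precision"),
                       tf.keras.metrics.Recall(name="recall")])
# model.fit(train_ds, validation_data=val_ds, epochs=...)  # X-ray datasets not shown here

def grad_cam(model, img_array, last_conv_layer_name="relu"):
    # Standard Grad-CAM recipe: gradients of the class score with respect to the last
    # convolutional feature maps, global-average-pooled into per-channel weights.
    grad_model = models.Model(model.input,
                              [model.get_layer(last_conv_layer_name).output, model.output])
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(img_array)
        class_score = preds[:, 0]                        # sigmoid score for the COVID-19 class
    grads = tape.gradient(class_score, conv_out)
    weights = tf.reduce_mean(grads, axis=(0, 1, 2))      # one weight per feature-map channel
    cam = tf.reduce_sum(conv_out[0] * weights, axis=-1)  # weighted sum of feature maps
    cam = tf.nn.relu(cam)                                # keep only positively contributing regions
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()   # normalised heatmap in [0, 1]

# Example usage: heatmap = grad_cam(model, np.expand_dims(preprocessed_xray, 0))
# Upsample the low-resolution heatmap to 224x224 and overlay it on the original X-ray.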

Citation

Umair, M., Khan, M. S., Ahmed, F., Baothman, F., Alqahtani, F., Alian, M., & Ahmad, J. (2021). Detection of COVID-19 Using Transfer Learning and Grad-CAM Visualization on Indigenously Collected X-ray Dataset. Sensors, 21(17), Article 5813. https://doi.org/10.3390/s21175813

Journal Article Type: Article
Acceptance Date: Aug 16, 2021
Online Publication Date: Aug 29, 2021
Publication Date: 2021-09
Deposit Date: Sep 23, 2021
Publicly Available Date: Sep 23, 2021
Journal: Sensors
Electronic ISSN: 1424-8220
Publisher: MDPI
Peer Reviewed: Peer Reviewed
Volume: 21
Issue: 17
Article Number: 5813
DOI: https://doi.org/10.3390/s21175813
Keywords: COVID-19; artificial intelligence; transfer learning; CNN; X-ray images
Public URL: http://researchrepository.napier.ac.uk/Output/2804703

Files

Detection Of COVID-19 Using Transfer Learning And Grad-CAM Visualization On Indigenously Collected X-ray Dataset (13 MB)
PDF

Publisher Licence URL
http://creativecommons.org/licenses/by/4.0/

Copyright Statement
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
