CheReS: A Deep Learning-based Multi-faceted System for Similarity Search of Chest X-rays

Authors
Ashery Mbilinyi and Heiko Schuldt
Type
In Proceedings
Date
2022/4
Appears in
Proceedings of the 37th ACM/SIGAPP Symposium on Applied Computing
Location
Brno, Czech Republic (held virtually)
Abstract

One of the fundamental tasks of radiologists is the interpretation of X-ray images. To do this, it is essential to search for similar cases that can support the decision-making process, especially when the image to interpret is ambiguous. Traditionally, Content-Based Medical Image Retrieval (CBMIR) has been applied to this task. However, CBMIR systems sometimes retrieve images that are not clinically relevant and thus do not support the comparative analysis radiologists need to back their decisions. To address these limitations, this paper introduces CheReS, a novel multi-faceted approach that retrieves clinically similar cases by taking into account patients' demographics (such as age and gender), disease predictions on ambiguous images, and the images' visual content. CheReS accomplishes this with two deep learning models: an unsupervised autoencoder trained to learn low-dimensional visual feature representations of X-rays, and a supervised Convolutional Neural Network trained to predict common chest diseases on ambiguous images submitted by a radiologist. On top of these, CheReS employs an algorithmic procedure that decides how to combine the information delivered by both models with patients' demographics to identify the similar X-ray cases that best help a radiologist diagnose the case at hand. We have evaluated CheReS on two publicly available chest X-ray datasets, CheXpert and ChestX-ray14. Our results show that CheReS achieves a significant gain in identifying and retrieving similar X-ray cases over a traditional CBMIR approach, thereby providing a better solution for augmenting radiologists in their workflow.
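
The abstract does not spell out how the three facets are fused, so the Python sketch below is only an illustration of one plausible combination under simple assumptions: the function names, the facet weights, the 128-dimensional embedding, and the 14-label prediction vector are all hypothetical, not CheReS internals.

    import numpy as np

    def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
        # Cosine similarity between two feature vectors.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def demographic_sim(query: dict, cand: dict, age_scale: float = 20.0) -> float:
        # Toy demographic agreement: gender match plus an age-proximity term.
        gender = 1.0 if query["gender"] == cand["gender"] else 0.0
        age = float(np.exp(-abs(query["age"] - cand["age"]) / age_scale))
        return 0.5 * gender + 0.5 * age

    def multifaceted_score(query: dict, cand: dict, w=(0.5, 0.3, 0.2)) -> float:
        # Weighted late fusion of the three facets. The weights and the
        # weighted-sum rule are illustrative assumptions, not the paper's
        # actual algorithmic procedure.
        visual = cosine_sim(query["embedding"], cand["embedding"])        # autoencoder features
        disease = cosine_sim(query["predictions"], cand["predictions"])   # CNN disease probabilities
        demo = demographic_sim(query, cand)                               # age/gender agreement
        return w[0] * visual + w[1] * disease + w[2] * demo

    # Usage: rank a small synthetic candidate pool by the fused score.
    rng = np.random.default_rng(0)
    def fake_case(age, gender):
        return {"embedding": rng.normal(size=128),     # hypothetical embedding size
                "predictions": rng.uniform(size=14),   # e.g. 14 ChestX-ray14 labels
                "age": age, "gender": gender}

    query = fake_case(63, "M")
    pool = [fake_case(a, g) for a, g in [(60, "M"), (30, "F"), (65, "M")]]
    ranked = sorted(pool, key=lambda c: multifaceted_score(query, c), reverse=True)

A weighted late fusion like this is only one option; the algorithmic procedure described in the paper could equally filter candidates by demographics or predicted diseases first and then rank the survivors by visual similarity.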