Multi-scale Triplet Hashing for Medical Image Retrieval.
Published: 2023 Feb 08
Authors:
Yaxiong Chen, Yibo Tang, Jinghao Huang, Shengwu Xiong
Source:
COMPUTERS IN BIOLOGY AND MEDICINE
Abstract:
For the medical image retrieval task, deep hashing algorithms are widely applied to large-scale datasets for auxiliary diagnosis because of the retrieval efficiency of hash codes. Most of these algorithms focus on feature learning while neglecting the discriminative regions of medical images and the hierarchical similarity of deep features and hash codes. In this paper, we tackle these dilemmas with a new Multi-scale Triplet Hashing (MTH) algorithm, which simultaneously leverages multi-scale information, convolutional self-attention and hierarchical similarity to learn effective hash codes. The MTH algorithm first designs a multi-scale DenseBlock module to learn multi-scale information of medical images. Meanwhile, a convolutional self-attention mechanism is developed to perform information interaction in the channel domain, which can effectively capture the discriminative regions of medical images. On top of the two paths, a novel loss function is proposed that not only conserves the category-level information of deep features and the semantic information of hash codes during learning, but also captures the hierarchical similarity of deep features and hash codes. Extensive experiments on the Curated X-ray Dataset, the Skin Cancer MNIST Dataset and the COVID-19 Radiography Dataset show that the MTH algorithm further enhances medical retrieval performance compared with other state-of-the-art medical image retrieval algorithms. Copyright © 2023 Elsevier Ltd. All rights reserved.
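The abstract does not give the exact form of the MTH loss. As a purely illustrative sketch of the general deep triplet-hashing idea it builds on, the function below (all names hypothetical, not the authors' implementation) combines a triplet margin term on continuous code vectors with a quantization penalty that pushes each bit toward {-1, +1}, so that thresholding by sign yields binary hash codes:

```python
import numpy as np

def triplet_hash_loss(anchor, positive, negative, margin=2.0, quant_weight=0.1):
    """Generic deep-hashing objective (illustrative only, not the MTH loss):
    - triplet margin term: anchor should be closer to positive than to negative
    - quantization term: continuous codes should approach binary values {-1, +1}
    """
    d_pos = np.sum((anchor - positive) ** 2)          # squared distance to positive
    d_neg = np.sum((anchor - negative) ** 2)          # squared distance to negative
    triplet = max(0.0, d_pos - d_neg + margin)        # hinge on the margin
    quant = sum(np.sum((np.abs(c) - 1.0) ** 2)        # penalty for non-binary bits
                for c in (anchor, positive, negative))
    return triplet + quant_weight * quant

# At retrieval time, binary hash codes come from the sign of the network output.
continuous_code = np.array([0.9, -0.7, 0.2])
binary_code = np.sign(continuous_code)  # → [ 1., -1.,  1.]
```

When the codes are already binary and the negative lies far from the anchor, both terms vanish and the loss is zero, which is the behavior a hashing loss of this family is designed to reach.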