Breast cancer diagnosis from contrast-enhanced mammography using multi-feature fusion neural network.
Published: 2023 Aug 23
Authors:
Nini Qian, Wei Jiang, Yu Guo, Jian Zhu, Jianfeng Qiu, Hui Yu, Xian Huang
Source:
European Radiology
Abstract:
To develop an end-to-end deep neural network for the classification of contrast-enhanced mammography (CEM) images to facilitate breast cancer diagnosis in the clinic.

In this retrospective, single-center study, patients who underwent CEM examinations from January 2019 to August 2021 were enrolled. A multi-feature fusion network combining low-energy (LE) and dual-energy subtracted (DES) images, dual views, and bilateral information was trained and tested for breast lesion classification on a large CEM dataset with a diversity of breast tumors. Its generalization performance was further evaluated on two external datasets. Results are reported using AUC, accuracy, sensitivity, and specificity.

A total of 2496 patients (mean age, 53 years ± 12 (standard deviation)) were included and divided into a training set (n = 1718), a validation set (n = 255), and a testing set (n = 523). The proposed CEM-based multi-feature fusion network achieved the best diagnostic performance, with an AUC of 0.96 (95% confidence interval (CI): 0.95, 0.97), compared with the no-fusion model, the left-right fusion model, and the multi-feature fusion network with only LE image inputs. Our model reached an AUC of 0.90 (95% CI: 0.85, 0.94) on a full-field digital mammography (FFDM) external dataset (86 patients) and an AUC of 0.92 (95% CI: 0.89, 0.95) on a CEM external dataset (193 patients).

The developed multi-feature fusion neural network achieved high performance in CEM image classification and can facilitate CEM-based breast cancer diagnosis. Compared with low-energy images, CEM images have greater sensitivity and similar specificity in malignant breast lesion detection. The multi-feature fusion neural network is a promising computer-aided diagnostic tool for the clinical diagnosis of breast cancer.

• Deep convolutional neural networks have the potential to facilitate contrast-enhanced mammography-based breast cancer diagnosis.
• The multi-feature fusion neural network reaches high accuracy in the classification of contrast-enhanced mammography images.
• The developed model is a promising diagnostic tool to facilitate clinical breast cancer diagnosis.

© 2023. The Author(s), under exclusive licence to European Society of Radiology.
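The abstract describes a late-fusion design (combining features from LE and DES images across views) and reports AUC, sensitivity, and specificity. The paper does not give implementation details, so the following is only a minimal, self-contained sketch of the general idea: each of the four inputs (LE/DES × two views) is reduced to a feature vector, the vectors are concatenated, a linear head produces a malignancy score, and the standard metrics are computed. The feature extractor, weights, and the toy cohort are all hypothetical stand-ins, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(image, n_features=8):
    """Stand-in feature extractor. In the paper this role is played by a
    convolutional backbone; here we summarize each image with a fixed-length
    percentile profile so the sketch runs without a deep-learning stack."""
    return np.percentile(image.ravel(), np.linspace(0, 100, n_features))

def fuse_and_score(le_cc, le_mlo, des_cc, des_mlo, w, b):
    """Late fusion: concatenate per-input feature vectors (low-energy and
    dual-energy-subtracted images, both views) and apply a linear head."""
    feats = np.concatenate(
        [extract_features(x) for x in (le_cc, le_mlo, des_cc, des_mlo)])
    logit = feats @ w + b
    return 1.0 / (1.0 + np.exp(-logit))  # sigmoid -> malignancy score in (0, 1)

def auc(scores, labels):
    """AUC via the rank-sum (Mann-Whitney U) formulation."""
    order = np.argsort(scores)
    ranks = np.empty_like(order, dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Toy cohort: 20 benign and 20 malignant "patients"; malignant lesions are
# simulated with higher intensity, loosely mimicking contrast uptake on DES.
labels = np.array([0] * 20 + [1] * 20)
images = [[rng.normal(0.3 if y == 0 else 0.7, 0.1, (32, 32)) for _ in range(4)]
          for y in labels]

w = np.full(32, 1 / 32)  # 4 inputs x 8 features; uniform weights for the demo
b = -0.5
scores = np.array([fuse_and_score(*imgs, w, b) for imgs in images])

auc_val = auc(scores, labels)
pred = scores >= 0.5
sensitivity = (pred & (labels == 1)).sum() / (labels == 1).sum()
specificity = (~pred & (labels == 0)).sum() / (labels == 0).sum()
```

On this cleanly separated synthetic cohort the fused score distinguishes the two classes; in the study, the analogous fusion of LE/DES, dual-view, and bilateral features is what lifts the AUC above the single-input baselines.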