IE-CycleGAN: improved cycle consistent adversarial network for unpaired PET image enhancement.
Published: 2024 Jul 23
Authors:
Jianan Cui, Yi Luo, Donghe Chen, Kuangyu Shi, Xinhui Su, Huafeng Liu
Source:
Eur J Nucl Med Mol Imaging
Abstract:
Technological advances in instrumentation have greatly promoted the development of positron emission tomography (PET) scanners. State-of-the-art PET scanners such as uEXPLORER can collect PET images of significantly higher quality. However, these scanners are not currently available in most local hospitals due to the high cost of manufacturing and maintenance. Our study aims to convert low-quality PET images acquired by common PET scanners into images of comparable quality to those obtained by state-of-the-art scanners, without the need for paired low- and high-quality PET images.

In this paper, we propose an improved CycleGAN (IE-CycleGAN) model for unpaired PET image enhancement. The proposed method is based on CycleGAN, with a correlation coefficient loss and a patient-specific prior loss added to constrain the structure of the generated images. Furthermore, we define a normalX-to-advanced training strategy to enhance the generalization ability of the network. The proposed method was validated on unpaired uEXPLORER datasets and Biograph Vision local hospital datasets.

For the uEXPLORER dataset, the proposed method achieved better results than non-local means filtering (NLM), block-matching and 3D filtering (BM3D), and deep image prior (DIP), with results comparable to Unet (supervised) and CycleGAN (supervised). For the Biograph Vision local hospital datasets, the proposed method achieved higher contrast-to-noise ratios (CNR) and tumor-to-background SUVmax ratios (TBR) than NLM, BM3D, and DIP. In addition, the proposed method showed higher contrast, SUVmax, and TBR than Unet (supervised) and CycleGAN (supervised) when applied to images from different scanners.

The proposed unpaired PET image enhancement method outperforms NLM, BM3D, and DIP. Moreover, it performs better than Unet (supervised) and CycleGAN (supervised) when implemented on local hospital datasets, which demonstrates its excellent generalization ability.

© 2024. The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.
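
The abstract mentions a correlation coefficient loss used to constrain the structure of the generated images, and reports CNR and TBR as evaluation metrics. Below is a minimal PyTorch sketch, assuming the correlation coefficient loss is a Pearson-correlation penalty between the network input and the generated image; the function names and the CNR/TBR helpers are illustrative assumptions, not code from the paper.

    # Hypothetical sketch (not the authors' released code): a Pearson correlation
    # coefficient loss that could be added to a CycleGAN objective to encourage
    # structural agreement between the low-quality input and the enhanced output,
    # plus simple CNR / TBR helpers of the kind used to evaluate PET image quality.
    import torch

    def correlation_coefficient_loss(x: torch.Tensor, y: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
        """1 - Pearson correlation between two image batches (higher correlation -> lower loss)."""
        x = x.flatten(start_dim=1)
        y = y.flatten(start_dim=1)
        x = x - x.mean(dim=1, keepdim=True)
        y = y - y.mean(dim=1, keepdim=True)
        r = (x * y).sum(dim=1) / (x.norm(dim=1) * y.norm(dim=1) + eps)
        return (1.0 - r).mean()

    def cnr(roi_mean: float, bg_mean: float, bg_std: float) -> float:
        """Contrast-to-noise ratio of a lesion ROI against a background region."""
        return (roi_mean - bg_mean) / bg_std

    def tbr(tumor_suvmax: float, bg_suvmax: float) -> float:
        """Tumor-to-background SUVmax ratio."""
        return tumor_suvmax / bg_suvmax

    if __name__ == "__main__":
        low = torch.rand(2, 1, 64, 64)                  # stand-in for a low-quality PET slice batch
        enhanced = low + 0.05 * torch.randn_like(low)   # stand-in for generator output
        print(correlation_coefficient_loss(low, enhanced).item())

In this sketch the structural term would simply be weighted and added to the standard CycleGAN adversarial and cycle-consistency losses; the paper's patient-specific prior loss and the exact weighting are not reproduced here.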