Research Updates
The articles below are published ahead of final publication in an issue. Please cite them in the following format: authors (year), title, journal, DOI.

Brain tumor classification for combining the advantages of multilayer dense net-based feature extraction and hyper-parameters tuned attentive dual residual generative adversarial network classifier using wild horse optimization.

Published: 2024 Aug 28
Authors: Shenbagarajan Anantharajan, Shenbagalakshmi Gunasekaran, J Angela Jennifa Sujana
Source: NMR IN BIOMEDICINE

Abstract:

In this manuscript, an attentive dual residual generative adversarial network optimized using the wild horse optimization algorithm for brain tumor detection (ADRGAN-WHOA-BTD) is proposed. The input images are gathered from the BraTS, RemBRANDT, and Figshare datasets. Initially, the images are preprocessed with the dual-tree complex wavelet transform (DTCWT) to improve image quality and remove unwanted noise. Image features such as geodesic data, along with texture features such as contrast, energy, correlation, homogeneity, and entropy, are extracted using a multilayer dense net. The extracted features are then fed to the attentive dual residual generative adversarial network (ADRGAN) classifier to classify the brain images, with the ADRGAN weight parameters tuned by the wild horse optimization algorithm (WHOA). The proposed method is implemented in MATLAB. On the BraTS dataset, the ADRGAN-WHOA-BTD method achieved accuracy, sensitivity, specificity, F-measure, precision, and error rate of 99.85%, 99.82%, 98.92%, 99.76%, 99.45%, and 0.15%, respectively, and demonstrated a runtime of 13 s, significantly outperforming existing methods. © 2024 John Wiley & Sons Ltd.
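The texture features named in the abstract (contrast, energy, correlation, homogeneity, entropy) are the classical gray-level co-occurrence matrix (GLCM) statistics. The sketch below shows how such features can be computed with plain NumPy for one pixel offset; it is an illustration of the feature definitions only, not the paper's MATLAB pipeline or its multilayer dense net, and the quantization level and offset are illustrative choices.

```python
import numpy as np

def glcm_features(image, levels=8, dx=1, dy=0):
    """Compute a gray-level co-occurrence matrix (GLCM) for one
    (dy, dx) neighbor offset and derive the texture features named
    in the abstract: contrast, energy, correlation, homogeneity,
    and entropy."""
    # Quantize the image into `levels` gray levels.
    img = np.asarray(image, dtype=float)
    q = np.floor((img - img.min()) / (np.ptp(img) + 1e-12) * levels)
    q = np.clip(q.astype(int), 0, levels - 1)

    # Accumulate co-occurrence counts of each pixel with its neighbor.
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    a = q[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
    b = q[max(0, dy):h - max(0, -dy), max(0, dx):w - max(0, -dx)]
    np.add.at(glcm, (a.ravel(), b.ravel()), 1)
    glcm /= glcm.sum()  # normalize to a joint probability table

    i, j = np.indices(glcm.shape)
    mu_i, mu_j = (i * glcm).sum(), (j * glcm).sum()
    std_i = np.sqrt(((i - mu_i) ** 2 * glcm).sum())
    std_j = np.sqrt(((j - mu_j) ** 2 * glcm).sum())
    nz = glcm[glcm > 0]  # avoid log(0) in the entropy term
    return {
        "contrast": ((i - j) ** 2 * glcm).sum(),
        "energy": (glcm ** 2).sum(),
        "correlation": (((i - mu_i) * (j - mu_j) * glcm).sum()
                        / (std_i * std_j + 1e-12)),
        "homogeneity": (glcm / (1.0 + np.abs(i - j))).sum(),
        "entropy": -(nz * np.log2(nz)).sum(),
    }
```

For a horizontal checkerboard pattern, for example, horizontally adjacent pixels always differ, so contrast is maximal and correlation is -1.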
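The abstract tunes the ADRGAN weight parameters with the wild horse optimization algorithm (WHOA), a population-based metaheuristic. A faithful WHOA implementation (with its grazing, mating, and leader-selection rules) is beyond a short sketch; the following is a generic population-based tuner that conveys the shared idea only: candidates move toward the current best under random perturbation. The function names, update rule, and parameters here are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def tune_hyperparams(objective, bounds, pop=20, iters=50, seed=0):
    """Generic population-based search over a hyperparameter box.
    NOT the wild horse optimization algorithm itself: members are
    pulled halfway toward the current leader and perturbed with
    Gaussian noise; the best point ever seen is returned."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(pop, len(lo)))
    fx = np.array([objective(p) for p in x])
    gbest, gval = x[fx.argmin()].copy(), fx.min()
    for _ in range(iters):
        leader = x[fx.argmin()]
        # Move toward the leader with noise scaled to the box size.
        noise = rng.normal(scale=0.1, size=x.shape) * (hi - lo)
        x = np.clip(x + 0.5 * (leader - x) + noise, lo, hi)
        fx = np.array([objective(p) for p in x])
        if fx.min() < gval:  # track the global best
            gval, gbest = fx.min(), x[fx.argmin()].copy()
    return gbest, gval
```

In the paper's setting, `objective` would evaluate classifier error for a candidate set of ADRGAN weight parameters; here any black-box function over a bounded box works.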