FMRNet: A fused network of multiple tumoral regions for breast tumor classification with ultrasound images. Issue 1 (29th November 2021)
- Record Type:
- Journal Article
- Title:
- FMRNet: A fused network of multiple tumoral regions for breast tumor classification with ultrasound images
- Main Title:
- FMRNet: A fused network of multiple tumoral regions for breast tumor classification with ultrasound images
- Authors:
- Cui, Wenju
Peng, Yunsong
Yuan, Gang
Cao, Weiwei
Cao, Yuzhu
Lu, Zhengda
Ni, Xinye
Yan, Zhuangzhi
Zheng, Jian
- Abstract:
- ABSTRACT: Purpose: Recent studies have illustrated that the peritumoral regions of medical images have value for clinical diagnosis. However, the existing approaches using peritumoral regions mainly focus on the diagnostic capability of a single region and ignore the advantages of effectively fusing the intratumoral and peritumoral regions. In addition, these methods need accurate segmentation masks in the testing stage, which are tedious and inconvenient in clinical applications. To address these issues, we construct a deep convolutional neural network that can adaptively fuse the information of multiple tumoral‐regions (FMRNet) for breast tumor classification using ultrasound (US) images without segmentation masks in the testing stage.
Methods: To sufficiently excavate the potential relationship, we design a fused network and two independent modules to extract and fuse features of multiple regions simultaneously. First, we introduce two enhanced combined‐tumoral (EC) region modules, aiming to enhance the combined‐tumoral features gradually. Then, we further design a three‐branch module for extracting and fusing the features of the intratumoral, peritumoral, and combined‐tumoral regions, denoted as the intratumoral, peritumoral, and combined‐tumoral module. In particular, we design a novel fusion module that introduces a channel attention module to adaptively fuse the features of the three regions. The model is evaluated on two public datasets of breast tumor ultrasound images, UDIAT and BUSI. Two independent groups of experiments are performed on the respective datasets using a fivefold stratified cross‐validation strategy. Finally, we conduct ablation experiments on the two datasets, in which BUSI is used as the training set and UDIAT is used as the testing set.
Results: We conduct detailed ablation experiments on the two proposed modules and comparative experiments with other representative methods. The experimental results show that the proposed method yields state‐of‐the‐art performance on both datasets. Notably, on the UDIAT dataset, the proposed FMRNet achieves an accuracy of 0.945 and a specificity of 0.945. Moreover, the precision (PRE = 0.909) improves by 21.6% on the BUSI dataset compared with the best‐performing existing method.
Conclusion: The proposed FMRNet shows good performance in breast tumor classification with US images and proves its capability of exploiting and fusing the information of multiple tumoral‐regions. Furthermore, FMRNet has potential value for classifying other types of cancers using multiple tumoral‐regions of other kinds of medical images.
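The abstract's fusion module combines channel attention with the features of the three tumoral regions. As a rough illustration only (not the paper's implementation), a squeeze-and-excitation-style channel attention over three concatenated branch feature maps can be sketched in NumPy; the shapes, reduction ratio, and random weights below are all assumptions chosen for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention_fuse(feats, w1, w2):
    """Illustrative channel-attention fusion of branch feature maps.

    feats: list of arrays shaped (C, H, W), one per region branch.
    w1, w2: bottleneck MLP weights (hypothetical, randomly initialized here).
    """
    # Concatenate the branch feature maps along the channel axis.
    x = np.concatenate(feats, axis=0)                 # (3C, H, W)
    # "Squeeze": global average pooling gives one descriptor per channel.
    squeeze = x.mean(axis=(1, 2))                     # (3C,)
    # "Excitation": bottleneck MLP + sigmoid yields per-channel weights in (0, 1).
    weights = sigmoid(w2 @ np.maximum(w1 @ squeeze, 0.0))
    # Reweight each channel so the more informative regions dominate the fused map.
    return x * weights[:, None, None]

C, H, W = 8, 16, 16
# Stand-ins for the intratumoral, peritumoral, and combined-tumoral branch outputs.
branches = [rng.standard_normal((C, H, W)) for _ in range(3)]
reduction = 4                                         # assumed reduction ratio
w1 = 0.1 * rng.standard_normal((3 * C // reduction, 3 * C))
w2 = 0.1 * rng.standard_normal((3 * C, 3 * C // reduction))
fused = channel_attention_fuse(branches, w1, w2)
print(fused.shape)  # (24, 16, 16)
```

Because the sigmoid keeps every channel weight strictly between 0 and 1, the fusion attenuates rather than amplifies channels; in the actual FMRNet the attention weights are learned end-to-end with the classifier.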
- Is Part Of:
- Medical physics. Volume 49, Issue 1 (2022)
- Journal:
- Medical physics
- Issue:
- Volume 49, Issue 1 (2022)
- Issue Display:
- Volume 49, Issue 1 (2022)
- Year:
- 2022
- Volume:
- 49
- Issue:
- 1
- Issue Sort Value:
- 2022-0049-0001-0000
- Page Start:
- 144
- Page End:
- 157
- Publication Date:
- 2021-11-29
- Subjects:
- a fused network -- breast cancers -- convolutional neural networks -- multiple tumoral‐regions -- ultrasound images
Medical physics -- Periodicals
Medical physics
Medicine (Geneeskunde)
Physics (Natuurkunde)
Applications (Toepassingen)
Biophysics
Periodicals
Electronic journals
610.153
- Journal URLs:
- http://scitation.aip.org/content/aapm/journal/medphys
https://aapm.onlinelibrary.wiley.com/journal/24734209
http://www.aip.org/
- DOI:
- 10.1002/mp.15341
- Languages:
- English
- ISSNs:
- 0094-2405
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - 5531.130000
British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 25817.xml