A 3-D ranklet-based texture analysis approach to classify breast tumors in DCE-MRI volumes

Scalco E. (second author); 2025

Abstract

Dynamic Contrast-Enhanced Magnetic Resonance Imaging (DCE-MRI) is commonly used to detect breast cancer in high-risk patients; it is more sensitive than mammography and breast ultrasound, but less specific. Computerized texture analysis is widely used on DCE-MRI to differentiate benign from malignant breast tumors, but it can be affected by changes in image intensity. To address this, we propose a 3-D ranklet transform method to extract texture features that are invariant to gray-scale image transformations. The ranklet transform performs a non-parametric, multi-resolution, and orientation-selective analysis originally proposed for 2-D images, which we extend to the 3-D case. Texture features based on the 3-D gray-level co-occurrence matrix (GLCM) are then calculated from the ranklet images, giving a full 3-D description of tumor volumes. The proposed method is validated on the public BreastDM dataset, which includes 232 patients with 147 malignant and 85 benign cases. The experimental results show that the proposed method achieves the best classification performance with a ranklet resolution of 4 and a multilayer perceptron classifier: accuracy 0.89, precision 0.93, sensitivity 0.90, specificity 0.88, and AUC 0.89. Compared with 10 other classification methods involving deep learning models, our approach performs better on DCE-MRI scans with pre-contrast, post-contrast, and subtraction sequences, with improvements of 4.5%–10% in AUC, 9%–32% in accuracy, 12%–33% in precision, and 16%–100% in specificity. Specificity shows the largest gain and the highest absolute value, which could reduce unnecessary biopsies of benign tumors in clinical practice.
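To make the pipeline described in the abstract concrete, the sketch below illustrates its two core ideas in Python (the paper does not state an implementation language): replacing voxel intensities by their ranks, which is what makes ranklet-based features invariant to monotonic gray-scale transformations, and computing Haralick-style statistics from a 3-D GLCM. This is a minimal illustrative sketch, not the authors' implementation: the full ranklet transform is additionally multi-resolution and orientation-selective (Haar-like splits of local neighborhoods along the three axes), and all names and parameters here (rank_transform, glcm_3d, levels=16, the chosen offsets) are hypothetical. It assumes only NumPy and SciPy.

import numpy as np
from scipy.stats import rankdata

def rank_transform(volume):
    # Replace voxel intensities with their ranks; the result is invariant
    # to any monotonic gray-scale transformation of the input.
    return rankdata(volume, method="average").reshape(volume.shape)

def glcm_3d(volume, offset, levels=16):
    # Symmetric, normalized 3-D gray-level co-occurrence matrix for one
    # displacement vector (dz, dy, dx).
    lo, hi = float(volume.min()), float(volume.max())
    q = np.floor((volume - lo) / (hi - lo + 1e-12) * (levels - 1)).astype(int)
    dz, dy, dx = offset
    src = q[max(0, -dz): q.shape[0] - max(0, dz),
            max(0, -dy): q.shape[1] - max(0, dy),
            max(0, -dx): q.shape[2] - max(0, dx)].ravel()
    dst = q[max(0, dz): q.shape[0] - max(0, -dz),
            max(0, dy): q.shape[1] - max(0, -dy),
            max(0, dx): q.shape[2] - max(0, -dx)].ravel()
    glcm = np.zeros((levels, levels), dtype=np.float64)
    np.add.at(glcm, (src, dst), 1.0)   # count voxel-pair co-occurrences
    glcm += glcm.T                     # make the matrix symmetric
    return glcm / glcm.sum()

def glcm_features(glcm):
    # A few Haralick-style statistics commonly derived from a GLCM.
    i, j = np.indices(glcm.shape)
    return {"contrast": float(np.sum((i - j) ** 2 * glcm)),
            "homogeneity": float(np.sum(glcm / (1.0 + (i - j) ** 2))),
            "energy": float(np.sum(glcm ** 2))}

# Toy usage: rank-transform a synthetic volume and extract features along
# three axis-aligned offsets (a full analysis would use all 13 3-D directions).
vol = rank_transform(np.random.rand(32, 32, 16))
for off in [(0, 0, 1), (0, 1, 0), (1, 0, 0)]:
    print(off, glcm_features(glcm_3d(vol, off)))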
Istituto di Tecnologie Biomediche - ITB
Breast cancer, DCE-MRI, Ranklet transform, Texture analysis, Tumor classification
Files in this product:
File: Gomez-Flores, 2025, BSPC.pdf
Access: open access
Type: Publisher's version (PDF)
License: Creative Commons
Size: 2.18 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/547389
Citations
  • Scopus: 0