MAMILNet: advancing precision oncology with multi-scale attentional multi-instance learning for whole slide image analysis
- PMID: 38746682
- PMCID: PMC11092915
- DOI: 10.3389/fonc.2024.1275769
Abstract
Background: Whole Slide Image (WSI) analysis, driven by deep learning algorithms, has the potential to revolutionize tumor detection, classification, and treatment response prediction. However, challenges persist, such as limited model generalizability across various cancer types, the labor-intensive nature of patch-level annotation, and the necessity of integrating multi-magnification information to attain a comprehensive understanding of pathological patterns.
Methods: In response to these challenges, we introduce MAMILNet, an innovative multi-scale attentional multi-instance learning framework for WSI analysis. The incorporation of attention mechanisms into MAMILNet contributes to its exceptional generalizability across diverse cancer types and prediction tasks. This model considers whole slides as "bags" and individual patches as "instances." By adopting this approach, MAMILNet effectively eliminates the requirement for intricate patch-level labeling, significantly reducing the manual workload for pathologists. To enhance prediction accuracy, the model employs a multi-scale "consultation" strategy, facilitating the aggregation of test outcomes from various magnifications.
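The attention-based MIL pooling described here can be sketched as a toy example: patch embeddings form the "instances" of a slide-level "bag," an attention network scores each patch, and slide-level predictions from several magnifications are fused. This is an illustrative NumPy sketch, not the authors' implementation; the network shapes, the sigmoid classifier, and the mean-based multi-scale fusion are all assumptions for demonstration.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(instances, V, w):
    """Attention-based MIL pooling.
    instances: (N, d) patch embeddings; V: (h, d), w: (h,) attention params.
    Returns the bag embedding (d,) and per-patch attention weights (N,)."""
    scores = np.tanh(instances @ V.T) @ w   # one scalar score per patch
    a = softmax(scores)                     # normalize to attention weights
    return a @ instances, a                 # weighted sum = bag embedding

rng = np.random.default_rng(0)
d, h = 16, 8                                # embedding / hidden dims (assumed)
V = rng.normal(size=(h, d))
w = rng.normal(size=h)
clf = rng.normal(size=d)                    # toy linear slide-level classifier

# One bag per magnification; fuse slide-level scores across scales.
scale_scores = []
for n_patches in (50, 200, 800):            # e.g. low / mid / high magnification
    patches = rng.normal(size=(n_patches, d))
    bag, att = attention_pool(patches, V, w)
    scale_scores.append(1.0 / (1.0 + np.exp(-(bag @ clf))))  # sigmoid score

slide_prob = float(np.mean(scale_scores))   # simple multi-scale "consultation"
```

Because the bag embedding is a convex combination of patch embeddings, no patch-level labels are needed: only the slide label supervises training, and the learned attention weights indicate which patches drove the prediction.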
Results: We evaluated MAMILNet on 1171 cases spanning a wide range of cancer types, demonstrating its effectiveness on complex prediction tasks. For breast cancer tumor detection, the Area Under the Curve (AUC) was 0.8872 with an accuracy of 0.8760. For lung cancer subtyping, MAMILNet achieved an AUC of 0.9551 and an accuracy of 0.9095. For predicting drug therapy response in ovarian cancer, it achieved an AUC of 0.7358 and an accuracy of 0.7341.
Conclusion: The outcomes of this study underscore the potential of MAMILNet in driving the advancement of precision medicine and individualized treatment planning within the field of oncology. By effectively addressing challenges related to model generalization, annotation workload, and multi-magnification integration, MAMILNet shows promise in enhancing healthcare outcomes for cancer patients. The framework's success in accurately detecting breast tumors, diagnosing lung cancer types, and predicting ovarian cancer therapy responses highlights its significant contribution to the field and paves the way for improved patient care.
Keywords: cancer diagnosis; deep learning; multi-scale attention; multiple instance learning; whole slide image analysis.
Copyright © 2024 Wang, Bi, Qu, Deng, Wang, Zheng, Li, Meng and Miao.
Conflict of interest statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Figures
![Figure 1](https://www.ncbi.nlm.nih.gov/pmc/articles/instance/11092915/bin/fonc-14-1275769-g001.gif)
![Figure 2](https://www.ncbi.nlm.nih.gov/pmc/articles/instance/11092915/bin/fonc-14-1275769-g002.gif)
![Figure 3](https://www.ncbi.nlm.nih.gov/pmc/articles/instance/11092915/bin/fonc-14-1275769-g003.gif)