2024-11-17-keshvarikhojasteh24a.md
---
title: Multi-head Attention-based Deep Multiple Instance Learning
booktitle: Proceedings of the MICCAI Workshop on Computational Pathology
abstract: 'This paper introduces MAD-MIL, a Multi-head Attention-based Deep Multiple Instance Learning model, designed for weakly supervised Whole Slide Image (WSI) classification in digital pathology. Inspired by the multi-head attention mechanism of the Transformer, MAD-MIL simplifies model complexity while achieving competitive results against advanced models like CLAM and DS-MIL. Evaluated on MNIST-BAGS and public datasets, including TUPAC16, TCGA BRCA, TCGA LUNG, and TCGA KIDNEY, MAD-MIL consistently outperforms ABMIL. This demonstrates enhanced information diversity, interpretability, and efficiency in slide representation. The model’s effectiveness, coupled with fewer trainable parameters and lower computational complexity, makes it a promising solution for automated pathology workflows. Our code is available at https://github.com/tueimage/MAD-MIL.'
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: keshvarikhojasteh24a
month: 0
tex_title: Multi-head Attention-based Deep Multiple Instance Learning
firstpage: 1
lastpage: 12
page: 1-12
order: 1
cycles: false
bibtex_author: Keshvarikhojasteh, Hassan and Pluim, Josien P. W. and Veta, Mitko
author:
- given: Hassan
  family: Keshvarikhojasteh
- given: Josien P. W.
  family: Pluim
- given: Mitko
  family: Veta
date: 2024-11-17
address:
container-title: Proceedings of the MICCAI Workshop on Computational Pathology
volume: 254
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 11
  - 17
pdf:
extras:
---
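The front matter above is bibliographic metadata only; the paper's code lives at the repository linked in the abstract. For orientation, here is a minimal, hypothetical PyTorch sketch of multi-head attention-based MIL pooling in the spirit the abstract describes: ABMIL-style attention applied per head over a slide's patch embeddings, with the per-head summaries concatenated into a slide representation. It is not the authors' MAD-MIL implementation, and the feature dimension, head count, and layer sizes are illustrative assumptions; see https://github.com/tueimage/MAD-MIL for the official code.

```python
# Hypothetical sketch of multi-head attention-based MIL pooling (not the
# authors' MAD-MIL code; see https://github.com/tueimage/MAD-MIL).
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadAttentionMIL(nn.Module):
    def __init__(self, in_dim=512, n_heads=4, attn_dim=128, n_classes=2):
        super().__init__()
        assert in_dim % n_heads == 0, "in_dim must be divisible by n_heads"
        self.n_heads = n_heads
        head_dim = in_dim // n_heads
        # One ABMIL-style (tanh) attention branch per head, acting on its slice
        # of the instance embedding.
        self.attn = nn.ModuleList(
            nn.Sequential(
                nn.Linear(head_dim, attn_dim),
                nn.Tanh(),
                nn.Linear(attn_dim, 1),
            )
            for _ in range(n_heads)
        )
        self.classifier = nn.Linear(in_dim, n_classes)

    def forward(self, x):
        # x: (n_instances, in_dim) patch embeddings of one whole-slide image.
        chunks = x.chunk(self.n_heads, dim=1)      # n_heads tensors of (n, head_dim)
        pooled = []
        for head, feats in zip(self.attn, chunks):
            a = F.softmax(head(feats), dim=0)      # (n, 1) attention over instances
            pooled.append((a * feats).sum(dim=0))  # (head_dim,) per-head slide summary
        slide = torch.cat(pooled, dim=0)           # (in_dim,) slide representation
        return self.classifier(slide)              # (n_classes,) slide-level logits


# Toy usage: a "bag" of 1000 patch embeddings from one slide.
bag = torch.randn(1000, 512)
logits = MultiHeadAttentionMIL()(bag)
print(logits.shape)  # torch.Size([2])
```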