| Field | Value |
|---|---|
| title | Prediction of KRAS mutation status from H&E foundation model embeddings in non-small cell lung cancer |
| booktitle | Proceedings of the MICCAI Workshop on Computational Pathology |
| abstract | We predicted KRAS mutation status on non-small cell lung cancer (NSCLC) H&E images from foundation model embeddings. We evaluated a variety of attention-based multiple instance learning (MIL) models and aggregation strategies for a tilewise linear classifier. MIL with self-attention performed best (AUC=0.822), followed by the minimum over tiles classified with the linear model (AUC=0.810). Self-attention was necessary for MIL to surpass tilewise linear classification when a wide range of aggregation techniques was considered. |
| layout | inproceedings |
| series | Proceedings of Machine Learning Research |
| publisher | PMLR |
| issn | 2640-3498 |
| id | robbins24a |
| month | 0 |
| tex_title | Prediction of {KRAS} mutation status from H&E foundation model embeddings in non-small cell lung cancer |
| firstpage | 170 |
| lastpage | 179 |
| page | 170-179 |
| order | 170 |
| cycles | false |
| bibtex_author | Robbins, Marc and Loo, Jessica and Vyawahare, Saurabh and Wang, Yang Von and Mcneil, Carson and Steiner, Dave and Rao, Sudha and Wong, Pok Fai and Rivlin, Ehud and Weaver, Shamira and Goldenberg, Roman |
| author | |
| date | 2024-11-17 |
| address | |
| container-title | Proceedings of the MICCAI Workshop on Computational Pathology |
| volume | 254 |
| genre | inproceedings |
| issued | |
| extras | |