Proformer

Proformer is a transformer-based model for process traces, presented in "Structural Positional Encoding for Knowledge Integration in Transformer-based Medical Process Monitoring". This repository provides the source code applied to the BPI 2012 challenge dataset.

Architecture

(Figure: Proformer architecture diagram)

Performance

BPI2012    No SPE    SPE
Acc@1      0.8140    0.8601
Acc@3      0.9742    0.9800
Acc@5      0.9925    0.9939

Installation

To install the required libraries using pip:

git clone https://github.com/christopher-irw/proformer_ce.git
cd proformer_ce
pip install -r requirements.txt

Usage

To run training with the best parameters, use:

# run on the BPI2012 dataset using SPE
python run_proformer.py --use_taxonomy

# run on the BPI2012 dataset without SPE
python run_proformer.py

Examples

The notebooks directory contains the dataset preprocessing steps. In general, it is sufficient to have a CSV file with a case_id column that uniquely identifies each case and an activity column that records the performed actions. A minimal sketch of such an event log is shown below.
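As a rough illustration (the case_id and activity column names come from this README; the file name, activity labels, and pandas usage are only illustrative assumptions, not part of the repository's code):

import pandas as pd

# One row per event: case_id identifies the case, activity names the action.
events = pd.DataFrame({
    "case_id": [1, 1, 1, 2, 2],
    "activity": ["A_SUBMITTED", "A_PARTLYSUBMITTED", "A_ACCEPTED",
                 "A_SUBMITTED", "A_DECLINED"],
})
events.to_csv("my_event_log.csv", index=False)

# Each case can then be read back as an ordered trace of activities.
traces = events.groupby("case_id")["activity"].apply(list)
print(traces)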

Citation

@article{irwin2024structural,
  title={Structural Positional Encoding for knowledge integration in transformer-based medical process monitoring},
  author={Irwin, Christopher and Dossena, Marco and Leonardi, Giorgio and Montani, Stefania},
  journal={arXiv preprint arXiv:2403.08836},
  year={2024}
}

Contacts
