
ProteinShake Models

We build a graph neural network ("Graph"), a PointNet++ ("Point"), and a 3D convolutional network ("Voxel") as baseline methods and evaluate them on the ProteinShake tasks. See the paper for more information on the model architectures.

Results

Table 1: Results of the baseline models/representations (columns) on the ProteinShake tasks (rows). Best values are marked in bold; values are given as mean and standard deviation over 4 random seeds. The optimal choice of representation depends on the task. Results were obtained on the random split; see the paper supplement for the other splits.

| Task | Graph | Point | Voxel |
| --- | --- | --- | --- |
| Binding Site | **0.721 $\pm$ 0.010** | 0.609 $\pm$ 0.006 | - |
| Enzyme Class | **0.790 $\pm$ 0.007** | 0.712 $\pm$ 0.016 | 0.643 $\pm$ 0.026 |
| Gene Ontology | **0.704 $\pm$ 0.001** | 0.580 $\pm$ 0.002 | 0.602 $\pm$ 0.018 |
| Ligand Affinity | 0.670 $\pm$ 0.019 | 0.683 $\pm$ 0.003 | **0.689 $\pm$ 0.013** |
| Protein Family | **0.728 $\pm$ 0.004** | 0.609 $\pm$ 0.004 | 0.668 $\pm$ 0.005 |
| Protein-Protein Interface | 0.883 $\pm$ 0.050 | **0.974 $\pm$ 0.003** | - |
| Structural Class | **0.495 $\pm$ 0.012** | 0.293 $\pm$ 0.013 | 0.337 $\pm$ 0.011 |
| Structure Similarity | 0.598 $\pm$ 0.018 | 0.627 $\pm$ 0.006 | **0.645 $\pm$ 0.020** |

Figure 2: Comparison of random, sequence, and structure splits across tasks and representations. Models generalize less well to sequence and structure splits, respectively.

Figure 3: Relative improvement due to pre-training across tasks and representations. Performance is substantially improved by pre-training with AlphaFoldDB. Tasks are abbreviated with their initials. Values are relative to the metric values obtained from the supervised model without pre-training.

Installation

You can use conda, mamba, or pip to install the required packages. The main dependencies are:

proteinshake
pytorch
pyg
pytorch-lightning
hydra

An example of installing ProteinShake_eval with mamba (similar to conda, but faster):

mamba create -n proteinshake
mamba activate proteinshake
mamba install pytorch pytorch-cuda=11.8 -c pytorch -c nvidia
mamba install pyg -c pyg
mamba install lightning
pip install hydra-core --upgrade
pip install proteinshake
pip install -e .
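
As an optional sanity check (not part of the official instructions), the following imports should succeed once the environment is set up:

python -c "import proteinshake, torch, torch_geometric, hydra"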

Model weights

The weights for the pretrained models are available in the repository (see the pretrained/ directory).

Training

Supervised training/Finetuning

Train a graph neural network from scratch for the Enzyme Class prediction task:

python experiments/train.py task=enzyme_class representation=graph

Finetune a PointNet++ for the Ligand Affinity prediction task:

python experiments/train.py task=ligand_affinity representation=point_cloud pretrained=true

Run python experiments/train.py --help to see more details; the available tasks and representations are listed in config/.

Pretraining with masked residue prediction

python experiments/pretrain_mask_residues.py representation=graph
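
For intuition, the objective masks the residue type at a random subset of positions and trains the encoder to recover it from structural context. Below is a minimal, illustrative sketch of that idea for the graph representation; the actual implementation lives in experiments/pretrain_mask_residues.py, and the mask rate, mask token index, and the assumption that the encoder maps a PyG batch to per-residue embeddings are illustrative only.

import torch
import torch.nn.functional as F

NUM_RESIDUE_TYPES = 20          # standard amino acids (illustrative)
MASK_TOKEN = NUM_RESIDUE_TYPES  # extra index reserved for masked positions (illustrative)
MASK_RATE = 0.15                # fraction of residues to mask (illustrative)

def masked_residue_loss(encoder, head, data):
    # data: a torch_geometric batch whose data.x holds integer residue types (assumption).
    # encoder: residue-level encoder, e.g. GNN_encoder(..., global_pool=None), assumed to
    #          return per-residue embeddings of shape (num_residues, embed_dim).
    # head: torch.nn.Linear(embed_dim, NUM_RESIDUE_TYPES) prediction head.
    targets = data.x.view(-1).long()
    mask = torch.rand(targets.size(0)) < MASK_RATE

    # corrupt the input by replacing masked residue types with the mask token
    corrupted = data.clone()
    corrupted.x = targets.clone()
    corrupted.x[mask] = MASK_TOKEN

    node_embeddings = encoder(corrupted)      # per-residue embeddings
    logits = head(node_embeddings[mask])      # predict only the masked residues
    return F.cross_entropy(logits, targets[mask])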

Using pretrained models with custom prediction heads

You can use the provided models and weights to add your own prediction heads on top of the pretrained encoders. The code example below shows how to prepare the data and load the model weights for the graph model. See proteinshake_eval/models/ for the models of the other representations, and config/ for the correct parameters.

The pretrained models return either an embedding of the entire protein (global_pool='mean') or per-residue embeddings (global_pool=None), which you can use in your own model. Since the encoder is a torch module, you can use it like any other layer.

from proteinshake.tasks import EnzymeClassTask
from proteinshake_eval.models.graph import GNN_encoder
from proteinshake_eval.transforms.graph import GraphTrainTransform
from torch_geometric.loader import DataLoader

# prepare the data
task = EnzymeClassTask()
data_transform = GraphTrainTransform(task)
task = task.to_graph(eps=8).pyg(transform=data_transform)

# load the model and pretrained weights
model = GNN_encoder(embed_dim=256, num_layers=5, global_pool='mean')
model.from_pretrained('pretrained/graph/model.pt')

# run some inference
loader = DataLoader(task.dataset)
print(model.forward(next(iter(loader))))
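
As an example of a custom prediction head, the following minimal sketch wraps the loaded encoder with a freshly initialized linear classifier. The embedding size of 256 matches the encoder above; the number of classes (7, for the EC top-level classes) and the assumption that the encoder returns a tensor of shape (batch_size, embed_dim) when global_pool='mean' are illustrative only, see config/ for the exact values used in the paper.

import torch.nn as nn

class ProteinClassifier(nn.Module):
    """Pretrained encoder plus a freshly initialized linear prediction head."""
    def __init__(self, encoder, embed_dim=256, num_classes=7):
        super().__init__()
        self.encoder = encoder                       # e.g. the GNN_encoder loaded above
        self.head = nn.Linear(embed_dim, num_classes)

    def forward(self, data):
        embedding = self.encoder(data)               # (batch_size, embed_dim) with global_pool='mean'
        return self.head(embedding)

clf = ProteinClassifier(model, embed_dim=256, num_classes=7)
print(clf(next(iter(loader))).shape)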

License

Code in this repository is licensed under BSD-3; the model weights are licensed under CC-BY-4.0.
