Neural-Network-Models-for-Chemistry

A collection of Neural Network Models for chemistry

Quantum Chemistry Method

  • DeePKS, DeePHF
    DeePKS-kit is a program to generate accurate energy functionals for quantum chemistry systems, for both the perturbative scheme (DeePHF) and the self-consistent scheme (DeePKS).

  • NeuralXC
    Implementation of a machine-learned density functional.

  • MOB-ML
    Machine Learning for Molecular Orbital Theory.

  • DM21
    Pushing the Frontiers of Density Functionals by Solving the Fractional Electron Problem.

  • NN-GGA, NN-NRA, NN-meta-GGA, NN-LSDA
    Completing density functional theory by machine-learning hidden messages from molecules.

  • FermiNet
    FermiNet is a neural network for learning highly accurate ground state wavefunctions of atoms and molecules using a variational Monte Carlo approach.

  • DeepQMC
    DeepQMC implements variational quantum Monte Carlo for electrons in molecules, using deep neural networks written in PyTorch as trial wave functions.

  • PauliNet
    PauliNet builds upon HF or CASSCF orbitals as a physically meaningful baseline and takes a neural network approach to the SJB wavefunction in order to correct this baseline towards a high-accuracy solution.

  • DeepErwin
    DeepErwin is a Python package that implements and optimizes wave function models for numerical solutions to the multi-electron Schrödinger equation.

  • JAX-DFT
    JAX-DFT implements one-dimensional density functional theory (DFT) in JAX. It uses powerful JAX primitives to enable JIT compilation, automatic differentiation, and high-performance computation on GPUs.

  • sns-mp2
    Improving the accuracy of Møller-Plesset perturbation theory with neural networks.

  • DeepH-pack
    Deep neural networks for density functional theory Hamiltonian.

  • DeepH-E3
    General framework for E(3)-equivariant neural network representation of density functional theory Hamiltonian.

  • kdft
    The Kernel Density Functional (KDF) code allows generating ML-based DFT functionals.

  • ML-DFT
    ML-DFT: machine learning for density functional approximations. This repository contains the implementation of the kernel ridge regression based density functional approximation method described in the paper "Quantum chemical accuracy from density functional approximations via machine learning".

  • D4FT
    This work proposes a deep-learning approach to KS-DFT: instead of running the conventional SCF loop, the total energy is minimized directly by reparameterizing the orthogonality constraint as a feed-forward computation. The authors prove that this approach has the same expressivity as the SCF method while reducing the computational complexity from O(N^4) to O(N^3); a minimal sketch of the reparameterization idea appears at the end of this list.

  • SchNOrb
    Unifying machine learning and quantum chemistry with a deep neural network for molecular wavefunctions

  • CiderPress
    Tools for training and evaluating CIDER functionals for use in Density Functional Theory calculations.
  • ML-RPA
    This work demonstrates how machine learning can extend the applicability of the RPA to larger system sizes, time scales, and chemical spaces.
  • ΔOF-MLFF
    A Δ-machine learning model for obtaining Kohn–Sham accuracy from orbital-free density functional theory (DFT) calculations.
  • PairNet
    A molecular orbital based machine learning model for predicting accurate CCSD(T) correlation energies. The model, named PairNet, shows excellent transferability on several public data sets, using features inspired by pair natural orbitals (PNOs).

  • SPAHM(a,b)
    SPAHM(a,b): encoding the density information from the guess Hamiltonian in quantum machine learning representations.

  • GradDFT
    GradDFT is a JAX-based library enabling the differentiable design and experimentation of exchange-correlation functionals using machine learning techniques.

  • lapnet
    A JAX implementation of the algorithm and calculations described in Forward Laplacian: A New Computational Framework for Neural Network-based Variational Monte Carlo.

  • M-OFDFT
    M-OFDFT is a deep-learning implementation of orbital-free density functional theory that achieves DFT-level accuracy on molecular systems but with lower cost complexity, and can extrapolate to much larger molecules than those seen during training

  • ANN for Schrodinger
    Artificial neural networks (NNs) are universal function approximators and have shown great ability in computing the ground-state energy of the electronic Schrödinger equation, yet they have not established themselves as a practical and accurate approach to solving the vibrational Schrödinger equation of realistic polyatomic molecules to obtain vibrational energies and wave functions for excited states.
  • equivariant_electron_density
    Generate and predict molecular electron densities with Euclidean Neural Networks
  • DeePDFT
    This is the official Implementation of the DeepDFT model for charge density prediction.
  • DFA_recommender
    System-specific density functional recommender
  • EG-XC
    The accuracy of density functional theory hinges on the approximation of nonlocal contributions to the exchange-correlation (XC) functional. To date, machine-learned and human-designed approximations suffer from insufficient accuracy, limited scalability, or dependence on costly reference data. To address these issues, we introduce Equivariant Graph Exchange Correlation (EG-XC), a novel non-local XC functional based on an equivariant graph neural network.
  • scdp
    Machine learning methods are promising for significantly accelerating charge density prediction, yet existing approaches lack either accuracy or scalability. They propose a recipe that achieves both. In particular, they identify three key ingredients: (1) representing the charge density with atomic and virtual orbitals (spherical fields centered at atom/virtual coordinates); (2) using expressive and learnable orbital basis sets (basis functions for the spherical fields); and (3) using a high-capacity equivariant neural network architecture.
  • physics-informed-DFT
    We have developed an approach for physics-informed training of flexible empirical density functionals. In this approach, the "physics knowledge" is transferred from PBE, or any other exact-constraints-based functional, using local exchange-correlation energy density regularization, i.e., by adding its local energies into the training set.
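
The following is a minimal, illustrative sketch of the orthogonality-reparameterization idea mentioned in the D4FT entry above, not code from the D4FT package: a toy quadratic "energy" stands in for the full KS-DFT functional, a QR decomposition maps an unconstrained matrix to orthonormal orbitals, and plain gradient descent replaces the SCF loop.

```python
import jax
import jax.numpy as jnp

def orthonormal_orbitals(W):
    # Map an arbitrary matrix W to a matrix with orthonormal columns,
    # turning the constrained problem into an unconstrained one.
    Q, _ = jnp.linalg.qr(W)
    return Q

def toy_total_energy(W, H_core):
    # Toy objective: trace of a symmetric "core Hamiltonian" in the occupied
    # subspace. A real KS-DFT energy would add Hartree and XC terms.
    C = orthonormal_orbitals(W)
    return jnp.trace(C.T @ H_core @ C)

energy_and_grad = jax.jit(jax.value_and_grad(toy_total_energy))

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
n_basis, n_occ = 10, 3
H = jax.random.normal(k1, (n_basis, n_basis))
H_core = 0.5 * (H + H.T)                     # symmetrize the toy Hamiltonian
W = jax.random.normal(k2, (n_basis, n_occ))  # unconstrained parameters

for step in range(200):                      # plain gradient descent, no SCF loop
    E, g = energy_and_grad(W, H_core)
    W = W - 0.1 * g
```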

Green Function

  • DeepGreen
    The many-body Green's function provides access to electronic properties beyond the density functional theory level in ab initio calculations. It presents proof-of-concept benchmark results for both molecules and simple periodic systems, showing that the method is able to provide accurate estimates of physical observables such as energy and density of states based on the predicted Green's function.

Force Field Method

Kernel Method

  • wigner_kernel
    They propose a novel density-based method which involves computing "Wigner kernels"; a generic kernel-regression sketch follows below.
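
Several of the kernel-based entries in this collection (the Wigner-kernel work above, ML-DFT earlier) build on kernel ridge regression. Below is a generic, self-contained sketch of that technique on toy data; it is not code from any listed package, and the Gaussian kernel and regularization value are arbitrary choices for illustration.

```python
import numpy as np

def fit_krr(K_train, y_train, reg=1e-6):
    # solve (K + reg * I) alpha = y for the regression weights
    return np.linalg.solve(K_train + reg * np.eye(len(y_train)), y_train)

def predict_krr(K_test_train, alpha):
    # predictions are kernel-weighted sums over the training set
    return K_test_train @ alpha

# toy data: a Gaussian kernel over random one-dimensional "descriptors"
rng = np.random.default_rng(0)
X = rng.random((20, 1))
y = np.sin(3.0 * X[:, 0])
K = np.exp(-((X - X.T) ** 2) / 0.1)   # (20, 20) kernel matrix
alpha = fit_krr(K, y)
y_pred = predict_krr(K, alpha)        # in-sample predictions for the toy set
```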

Descriptor Domain

  • DeePMD
    A package designed to minimize the effort required to build deep learning-based models of interatomic potential energy and force fields and to perform molecular dynamics.
  • Torch-ANI
    TorchANI is a PyTorch implementation of the ANI model.
  • mdgrad
    PyTorch differentiable molecular dynamics.
  • PESPIP
    Mathematica programs for choosing the best basis of permutationally invariant polynomials for fitting a potential energy surface.
  • Schrodinger-ANI
    A neural network potential energy function for use in drug discovery, with chemical element support extended from 41% to 94% of druglike molecules based on ChEMBL.
  • NeuralForceField
    The Neural Force Field (NFF) code is an API based on SchNet, DimeNet, PaiNN and DANN. It provides an interface to train and evaluate neural networks for force fields. It can also be used as a property predictor that uses both 3D geometries and 2D graph information.
  • NNPOps
    The goal of this project is to promote the use of neural network potentials (NNPs) by providing highly optimized, open-source implementations of bottleneck operations that appear in popular potentials.
  • RuNNer
    A program package for constructing high-dimensional neural network potentials (HDNNPs), including the 3G and 4G variants.
  • aenet
    The Atomic Energy NETwork (ænet) package is a collection of tools for the construction and application of atomic interaction potentials based on artificial neural networks.
  • sGDML
    Symmetric Gradient Domain Machine Learning
  • GAP
    Gaussian Approximation Potentials; this package is part of QUIP (QUantum mechanics and Interatomic Potentials).
  • QUIP
    The QUIP package is a collection of software tools to carry out molecular dynamics simulations. It implements a variety of interatomic potentials and tight binding quantum mechanics, and is also able to call external packages, and serve as plugins to other software such as LAMMPS, CP2K and also the python framework ASE.
  • NNP-MM
    NNP/MM embeds a Neural Network Potential into a conventional molecular mechanical (MM) model.
  • GAMD
    Data and code for Graph neural network Accelerated Molecular Dynamics.
  • PFP
    Here we report the development of a universal NNP called PreFerred Potential (PFP), which is able to handle any combination of 45 elements. Particular emphasis is placed on the datasets, which include a diverse set of virtual structures used to attain this universality.
  • TeaNet
    A universal neural network interatomic potential inspired by iterative electronic relaxations.
  • n2p2
    This repository provides ready-to-use software for high-dimensional neural network potentials in computational physics and chemistry.
  • AIMNET
    This repository contains the reference AIMNet implementation along with some examples and benchmarks.
  • AIMNet2
    A general-purpose neural network potential for organic and element-organic molecules.
  • aevmod
    This package provides functionality for computing an atomic environment vector (AEV), as well as its Jacobian and Hessian; a toy radial-descriptor sketch appears at the end of this list.
  • charge3net
    Official implementation of ChargE3Net, introduced in Higher-Order Equivariant Neural Networks for Charge Density Prediction in Materials.
  • jax-nb
    This is a JAX implementation of Polarizable Charge Equilibrium (PQEq) and DFT-D3 dispersion correction.
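
As a companion to the descriptor-based entries above (for example the atomic environment vectors used by TorchANI and aevmod), here is a toy sketch of Behler-Parrinello-style radial symmetry functions in NumPy. The cutoff, Gaussian widths, and shifts are arbitrary illustrative values, not parameters from any listed package.

```python
import numpy as np

def cutoff_fn(r, rc):
    # smooth cosine cutoff that goes to zero at rc
    fc = 0.5 * (np.cos(np.pi * r / rc) + 1.0)
    return np.where(r < rc, fc, 0.0)

def radial_descriptor(positions, center_idx, etas, r_shifts, rc=5.0):
    # G_i = sum_j exp(-eta * (r_ij - r_s)^2) * fc(r_ij), one value per (eta, r_s)
    r_ij = np.linalg.norm(positions - positions[center_idx], axis=1)
    r_ij = np.delete(r_ij, center_idx)          # exclude the central atom itself
    fc = cutoff_fn(r_ij, rc)
    return np.array([
        np.sum(np.exp(-eta * (r_ij - rs) ** 2) * fc)
        for eta in etas for rs in r_shifts
    ])

positions = np.random.rand(6, 3) * 4.0          # six atoms in a toy box
G = radial_descriptor(positions, 0, etas=[0.5, 2.0], r_shifts=[1.0, 2.0, 3.0])
print(G)                                        # fixed-length descriptor for atom 0
```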

Graph Domain

  • Nequip
    NequIP is an open-source code for building E(3)-equivariant interatomic potentials.

  • E3NN
    Euclidean neural networks. The aim of this library is to help the development of E(3)-equivariant neural networks; it contains fundamental mathematical operations such as tensor products and spherical harmonics.

  • SchNet
    SchNet is a deep learning architecture that allows for spatially and chemically resolved insights into quantum-mechanical observables of atomistic systems; a toy continuous-filter convolution sketch appears at the end of this list.

  • SchNetPack
    SchNetPack aims to provide accessible atomistic neural networks that can be trained and applied out-of-the-box, while still being extensible to custom atomistic architectures. It contains SchNet, PaiNN, FieldSchNet, and SO3net.

  • XequiNet
    XequiNet is an equivariant graph neural network for predicting the properties of chemical molecules or periodic systems.

  • G-SchNet
    Implementation of G-SchNet - a generative model for 3D molecular structures.

  • PhysNet
    PhysNet: A Neural Network for Predicting Energies, Forces, Dipole Moments and Partial Charges.

  • DimeNet
    Directional Message Passing Neural Network.

  • GemNet
    Universal Directional Graph Neural Networks for Molecules.

  • DeePMoleNet
    DeepMoleNet is a deep learning package for molecular properties prediction.

  • AirNet
    A new GNN-based deep molecular model implemented in MindSpore.

  • TorchMD-Net
    TorchMD-NET provides graph neural network and equivariant transformer neural network potentials for learning molecular potentials.

  • AQML
    AQML is a mixed Python/Fortran/C++ package that aims to simulate quantum chemistry problems through the use of the fundamental building blocks of larger systems.

  • TensorMol
    A package of NN model chemistries, containing Behler-Parrinello with electrostatics, Many-Body Expansion, Bonds-in-Molecules NN, atomwise forces, and inductive charges.

  • charge_transfer_nnp
    Graph neural network potential with charge transfer, based on the NequIP model.

  • AMP
    Amp: a modular approach to machine learning in atomistic simulations (https://github.com/ulissigroup/amptorch).

  • SCFNN
    A self-consistent field neural network (SCFNN) model.

  • jax-md
    JAX MD is a functional and data-driven library. Data is stored in arrays or tuples of arrays, and functions transform data from one state to another.

  • EANN
    The Embedded Atom Neural Network (EANN) is a physically inspired neural network framework. The EANN package is implemented in PyTorch and is used to train interatomic potentials, dipole moments, transition dipole moments, and polarizabilities of various systems.

  • REANN
    Recursively embedded atom neural network (REANN) is a PyTorch-based end-to-end multi-functional Deep Neural Network Package for Molecular, Reactive and Periodic Systems.

  • FIREANN
    Field-induced Recursively embedded atom neural network (FIREANN) is a PyTorch-based end-to-end multi-functional Deep Neural Network Package for Molecular, Reactive and Periodic Systems in the presence of an external field, with rigorous rotational equivariance.

  • MDsim
    Training and simulating MD with ML force fields

  • ForceNet
    We demonstrate that force-centric GNN models without any explicit physical constraints are able to predict atomic forces more accurately than state-of-the-art energy-centric GNN models, while being faster in both training and inference.

  • DIG
    A library for graph deep learning research.

  • scn
    Spherical Channels for Modeling Atomic Interactions

  • spinconv
    Rotation Invariant Graph Neural Networks using Spin Convolutions.

  • HIPPYNN
    A modular library for atomistic machine learning with PyTorch.

  • VisNet
    A scalable and accurate geometric deep learning potential for molecular dynamics simulations.

  • flare
    FLARE is an open-source Python package for creating fast and accurate interatomic potentials.

  • alignn
    The Atomistic Line Graph Neural Network (https://www.nature.com/articles/s41524-021-00650-1) introduces a new graph convolution layer that explicitly models both two- and three-body interactions in atomistic systems.

  • So3krates
    Repository for training, testing and developing machine learned force fields using the So3krates model.

  • spice-model-five-net
    Contains five equivariant transformer models for the SPICE dataset (https://github.com/openmm/spice-dataset/releases/tag/1.1).

  • sake
    Spatial Attention Kinetic Networks with E(n)-Equivariance

  • eqgat
    PyTorch implementation for the manuscript Representation Learning on Biomolecular Structures using Equivariant Graph Attention.

  • phast
    PyTorch implementation for PhAST: Physics-Aware, Scalable and Task-specific GNNs for Accelerated Catalyst Design

  • GNN-LF
    Graph Neural Network With Local Frame for Molecular Potential Energy Surface

  • Cormorant
    We propose Cormorant, a rotationally covariant neural network architecture for learning the behavior and properties of complex many-body physical systems.

  • LieConv
    Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data

  • torchmd-net/ET
    Neural network potentials based on graph neural networks and equivariant transformers

  • torchmd-net/TensorNet+0.1S
    On the Inclusion of Charge and Spin States in Cartesian Tensor Neural Network Potentials

  • equiformer
    Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs

  • VisNet-LSRM
    Inspired by fragmentation-based methods, we propose the Long-Short-Range Message-Passing (LSR-MP) framework as a generalization of the existing equivariant graph neural networks (EGNNs) with the intent to incorporate long-range interactions efficiently and effectively.

  • AP-net
    AP-Net: An atomic-pairwise neural network for smooth and transferable interaction potentials

  • MACE
    MACE provides fast and accurate machine learning interatomic potentials with higher order equivariant message passing.

  • MACE-OFF23
    This repository contains the MACE-OFF23 pre-trained transferable organic force fields.

  • Uni-Mol+
    Uni-Mol+ first generates a raw 3D molecular conformation from inexpensive methods such as RDKit. The raw conformation is then iteratively refined towards the target DFT equilibrium conformation using neural networks, and the learned conformation is used to predict the QC properties.

  • ColfNet
    Inspired by differential geometry and physics, we introduce equivariant local complete frames to graph neural networks, such that tensor information at given orders can be projected onto the frames.

  • AIRS
    AIRS is a collection of open-source software tools, datasets, and benchmarks associated with our paper entitled “Artificial Intelligence for Science in Quantum, Atomistic, and Continuum Systems”.

  • nnp-pre-training
    Synthetic pre-training for neural-network interatomic potentials

  • AlF_dimer
    A global potential for the AlF-AlF dimer.

  • q-AQUA,q-AQUA-pol
    CCSD(T) potential for water, interfaced with TTM3-F

  • LeftNet
    A New Perspective on Building Efficient and Expressive 3D Equivariant Graph Neural Networks

  • mlp-train
    General machine learning potential (MLP) training for molecular systems in the gas phase and in solution.

  • ARROW-NN
    The simulation conda package contains the InterX ARBALEST molecular dynamics simulation software along with all the necessary database files to run ARROW-NN molecular simulations

  • SO3krates with transformer
    We propose a transformer architecture called SO3krates that combines sparse equivariant representations with a self-attention mechanism.

  • AMOEBA+NN
    It presents an integrated non-reactive hybrid model, AMOEBA+NN, which employs the AMOEBA potential for the short- and long-range non-bonded interactions and an NNP to capture the remaining local (covalent) contributions.

  • LEIGNN
    A lightweight equivariant interaction graph neural network (LEIGNN) that enables accurate and efficient interatomic potential and force predictions in crystals. Rather than relying on higher-order representations, LEIGNN employs a scalar-vector dual representation to encode equivariant features.

  • Arrow NN
    A hybrid wide-coverage intermolecular interaction model consisting of an analytically polarizable force field combined with a short-range neural network correction for the total intermolecular interaction energy.

  • PAMNet
    PAMNet (Physics-aware Multiplex Graph Neural Network) is an improved version of MXMNet and outperforms state-of-the-art baselines in both accuracy and efficiency on diverse tasks, including small molecule property prediction, RNA 3D structure prediction, and protein-ligand binding affinity prediction.

  • Multi-fidelity GNNs
    Multi-fidelity GNNs for drug discovery and quantum mechanics

  • GPIP
    GPIP: Geometry-enhanced Pre-training on Interatomic Potentials. They propose a geometric structure learning framework that leverages unlabeled configurations to improve the performance of MLIPs. The framework consists of two stages: first, classical MD simulations generate unlabeled configurations of the target molecular system; second, geometry-enhanced self-supervised learning techniques, including masking, denoising, and contrastive learning, capture structural information.

  • ictp
    Official repository for the paper "Higher Rank Irreducible Cartesian Tensors for Equivariant Message Passing". It is built upon the ALEBREW repository and implements irreducible Cartesian tensors and their products.

  • CHGNet
    A pretrained universal neural network potential for charge-informed atomistic modeling (see publication)

  • GPTFF
    GPTFF: A high-accuracy out-of-the-box universal AI force field for arbitrary inorganic materials

  • rascaline
    Rascaline is a library for efficiently computing representations for atomistic machine learning, also called "descriptors" or "fingerprints". These representations can be used for atomistic machine learning (ML) models, including ML potentials, visualization, and similarity analysis.

  • PairNet-OPs/PairFE-Net
    In PairFE-Net, an atomic structure is encoded using pairwise nuclear repulsion forces.
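
Many of the graph-domain models above are built on message passing over interatomic distances. The block below is a toy, self-contained continuous-filter convolution in the spirit of SchNet (see the SchNet entry above), written in plain PyTorch; the layer sizes, Gaussian radial basis, and cutoff are illustrative assumptions, not values taken from SchNetPack.

```python
import torch
import torch.nn as nn

class ToyCFConv(nn.Module):
    # Toy continuous-filter convolution: neighbor features are aggregated with
    # filters generated from interatomic distances (illustrative only).
    def __init__(self, n_features=32, n_rbf=20, cutoff=5.0):
        super().__init__()
        self.cutoff = cutoff
        self.register_buffer("centers", torch.linspace(0.0, cutoff, n_rbf))
        self.filter_net = nn.Sequential(
            nn.Linear(n_rbf, n_features), nn.SiLU(), nn.Linear(n_features, n_features)
        )

    def forward(self, h, pos):
        # h: (n_atoms, n_features) atom features, pos: (n_atoms, 3) coordinates
        d = torch.cdist(pos, pos)                               # pairwise distances
        rbf = torch.exp(-10.0 * (d.unsqueeze(-1) - self.centers) ** 2)
        W = self.filter_net(rbf)                                # distance-dependent filters
        mask = ((d > 0) & (d < self.cutoff)).float().unsqueeze(-1)
        return h + (W * mask * h.unsqueeze(0)).sum(dim=1)       # aggregate neighbor messages

h = torch.randn(5, 32)       # random features for five atoms
pos = torch.randn(5, 3)
out = ToyCFConv()(h, pos)    # updated atom features, shape (5, 32)
```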

Transformer Domain

  • SpookyNet
    SpookyNet: learning force fields with electronic degrees of freedom and nonlocal effects.
  • trip
    Transformer Interatomic Potential (TrIP): a chemically sound potential based on the SE(3)-Transformer

Empirical force field

  • grappa
    A machine-learned molecular mechanics force field using a deep graph attentional network
  • espaloma
    Extensible Surrogate Potential of Ab initio Learned and Optimized by Message-passing Algorithm.
  • FeNNol
    FeNNol is a library for building, training and running neural network potentials for molecular simulations. It is based on the JAX library and is designed to be fast and flexible.

Semi-Empirical Quantum Mechanical Method

  • OrbNet; OrbNet Denali
    OrbNet Denali: A machine learning potential for biological and organic chemistry with semi-empirical cost and DFT accuracy.
  • OrbNet-Equi
    Informing geometric deep learning with electronic interactions to accelerate quantum chemistry.
  • OrbNet-Spin
    OrbNet-Spin incorporates a spin-polarized treatment into the underlying semiempirical quantum mechanics orbital featurization and adjusts the model architecture accordingly while maintaining the geometrical constraints.

  • AIQM1
    Artificial intelligence-enhanced quantum chemical method with broad applicability.

  • BpopNN
    Incorporating Electronic Information into Machine Learning Potential Energy Surfaces via Approaching the Ground-State Electronic Energy as a Function of Atom-Based Electronic Populations.
  • Delfta
    The DelFTa application is an easy-to-use, open-source toolbox for predicting quantum-mechanical properties of drug-like molecules. Using either Δ-learning (with a GFN2-xTB baseline) or direct-learning (without a baseline), the application accurately approximates DFT reference values (ωB97X-D/def2-SVP); a generic Δ-learning sketch appears at the end of this list.
  • PYSEQM
    PYSEQM is a Semi-Empirical Quantum Mechanics package implemented in PyTorch.
  • DFTBML
    DFTBML provides a systematic way to parameterize the Density Functional-based Tight Binding (DFTB) semiempirical quantum chemical method for different chemical systems by learning the underlying Hamiltonian parameters rather than fitting the potential energy surface directly.
  • mopac-ml
    MOPAC-ML implements the PM6-ML method, a semiempirical quantum-mechanical computational method that augments PM6 with a machine learning (ML) correction. It acts as a wrapper calling a modified version of MOPAC, to which it provides the ML correction.
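
Several of the methods above (AIQM1, DelFTa's Δ-learning mode, and the PM6-ML correction in MOPAC-ML) share the Δ-learning pattern: a cheap semiempirical baseline plus an ML correction trained on the residual to a high-level reference. A minimal, package-agnostic sketch of that pattern is shown below; the numbers are arbitrary placeholders, not real calculations.

```python
import numpy as np

def delta_targets(E_reference, E_baseline):
    # the ML model is trained on the residual, not on the total energy
    return np.asarray(E_reference) - np.asarray(E_baseline)

def corrected_energy(E_baseline, ml_correction):
    # at inference time the learned correction is added back to the cheap baseline
    return E_baseline + ml_correction

E_base = np.array([-40.1, -75.2])             # placeholder baseline energies
E_ref = np.array([-40.5, -76.3])              # placeholder high-level reference energies
residuals = delta_targets(E_ref, E_base)      # training targets for the ML correction
E_pred = corrected_energy(E_base, residuals)  # exact here; an ML model approximates this
```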

Coarse-Grained Method

  • cgnet
    Coarse graining for molecular dynamics
  • SchNet-CG
    We explore the application of SchNet models to obtain a CG potential for liquid benzene, investigating the effect of model architecture and hyperparameters on the thermodynamic, dynamical, and structural properties of the simulated CG systems, and we report and discuss the challenges encountered and future directions envisioned.
  • CG-SchNET
    By combining recent deep learning methods with a large and diverse training set of all-atom protein simulations, we here develop a bottom-up CG force field with chemical transferability, which can be used for extrapolative molecular dynamics on new sequences not used during model parametrization.
  • torchmd-protein-thermodynamics
    This repository contains code, data, and a tutorial for reproducing the paper "Machine Learning Coarse-Grained Potentials of Protein Thermodynamics" (https://arxiv.org/abs/2212.07492).
  • torchmd-exp
    This repository contains a method for training a neural network potential for coarse-grained proteins using unsupervised learning.
  • AICG
    Learning coarse-grained force fields for fibrogenesis modeling (https://doi.org/10.1016/j.cpc.2023.108964).

Enhanced Sampling Method

  • Enhanced Sampling with Machine Learning: A Review
    We highlight successful strategies such as dimensionality reduction, reinforcement learning, and flow-based methods, and discuss open problems at the exciting ML-enhanced MD interface.
  • mlcolvar
    mlcolvar is a Python library aimed at helping design data-driven collective variables (CVs) for enhanced sampling simulations; a toy neural-network CV is sketched below.
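
To make the CV-design idea in the mlcolvar entry concrete, here is a toy sketch (not the mlcolvar API) of a neural-network collective variable: a small MLP maps a vector of molecular descriptors to a single scalar that an enhanced-sampling engine could bias. The layer sizes and descriptor dimension are arbitrary.

```python
import torch
import torch.nn as nn

# toy MLP collective variable: descriptor vector -> scalar CV
cv_model = nn.Sequential(
    nn.Linear(10, 32), nn.Tanh(),
    nn.Linear(32, 1),
)

descriptors = torch.randn(1, 10, requires_grad=True)   # e.g. distances and angles
cv_value = cv_model(descriptors)                       # scalar CV for this frame
# a biasing engine also needs dCV/d(descriptors), available by automatic differentiation
cv_grad, = torch.autograd.grad(cv_value.sum(), descriptors)
```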

QM/MM Model

  • NNP-MM
    NNP/MM embeds a Neural Network Potential into a conventional molecular mechanical (MM) model. We have implemented this using the Custom QM/MM features of NAMD 2.13, which interface NAMD with the TorchANI NNP python library developed by the Roitberg and Isayev groups.
  • DeeP-HP
    Scalable hybrid deep neural networks/polarizable potentials biomolecular simulations including long-range effects
  • PairF-Net
    Here, we further develop the PairF-Net model to intrinsically incorporate energy conservation and couple the model to a molecular mechanical (MM) environment within the OpenMM package; a minimal sketch of gradient-based conservative forces appears at the end of this list.
  • embedding
    This work presents a variant of an electrostatic embedding scheme that allows the embedding of arbitrary machine learned potentials trained on molecular systems in vacuo.
  • field_schnet
    FieldSchNet provides a deep neural network for modeling the interaction of molecules with external environments.
  • MLMM
    This repository contains data and software related to the paper submitted to JCIM, entitled "Assessment of embedding schemes in a hybrid machine learning/classical potentials (ML/MM) approach".
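
The PairF-Net entry above mentions building energy conservation into the model; the standard way to do this with an ML potential (inside or outside a QM/MM setup) is to predict a scalar energy and take forces as its negative gradient. The sketch below shows that pattern with a stand-in energy function; it is not code from PairF-Net or any other listed package.

```python
import torch

def toy_energy(coords):
    # stand-in for an NNP energy head: any differentiable function of coordinates
    d = torch.cdist(coords, coords)
    return torch.sum(torch.triu(d, diagonal=1) ** 2)

coords = torch.randn(4, 3, requires_grad=True)     # four atoms
energy = toy_energy(coords)
forces = -torch.autograd.grad(energy, coords)[0]   # F = -dE/dR, conservative by construction
```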

Charge Model

  • gimlet
    Graph Inference on Molecular Topology. A package for modeling, learning, and inference on molecular topological space, written in Python and TensorFlow.

Post-HF Method