Introduction

OMAT24 (Open Materials 2024) is a collection of machine-learned interatomic potential models released by Meta's FAIR Chemistry team. The models are based on the EquiformerV2 architecture and are available in several sizes and training configurations. They are designed for atomistic simulation tasks in computational chemistry and materials science, such as energy, force, and stress prediction and structure relaxation.

Architecture

The models use the EquiformerV2 architecture, an equivariant transformer described in the paper arXiv:2306.12059. The reference implementation is available in the FAIR-Chem repository.

Training

The models come in three sizes: 31M parameters (S), 86M (M), and 153M (L). Variants are trained on different combinations of the OMat24, MPtrj, and sAlex (subsampled Alexandria) datasets, and some incorporate the Denoising Non-Equilibrium Structures (DeNS) auxiliary training objective.

Guide: Running Locally

  1. Installation: Follow the instructions in the FAIR-Chem documentation to install the necessary software; a typical install is sketched below.
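    One common route is pip; the fairchem-core package provides the OCPCalculator interface used in step 3:

    pip install fairchem-core
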
  2. Download Checkpoints: Obtain the desired model checkpoint from the OMAT24 model page, as sketched below.
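    If the checkpoints are hosted in a Hugging Face repository, the file can be fetched with huggingface_hub; the repository id fairchem/OMAT24 below is an assumption, so check the model page for the exact name:

    from huggingface_hub import hf_hub_download

    # Download one checkpoint; hf_hub_download returns the local file path.
    # repo_id is assumed; the filename matches the checkpoint used in step 3.
    ckpt_path = hf_hub_download(repo_id="fairchem/OMAT24",
                                filename="eqV2_31M_omat_mp_salex.pt")
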
  3. Setup: Use the FAIR-Chem library to attach the model to an ASE Atoms object and run a relaxation, as shown:
    from fairchem.core import OCPCalculator
    from ase.optimize import FIRE
    from ase.filters import FrechetCellFilter
    from ase.io import read
    
    # Load a structure and attach the OMAT24 checkpoint as an ASE calculator.
    atoms = read("atoms.xyz")
    calc = OCPCalculator(checkpoint_path="eqV2_31M_omat_mp_salex.pt")
    atoms.calc = calc
    
    # Relax atomic positions and the unit cell together with the FIRE optimizer.
    dyn = FIRE(FrechetCellFilter(atoms))
    dyn.run(fmax=0.05)  # stop once the maximum force drops below 0.05 eV/Å
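
    After the run converges, results can be read back through the standard ASE calculator interface:

    energy = atoms.get_potential_energy()  # total energy in eV
    forces = atoms.get_forces()            # per-atom forces in eV/Å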
    
  4. Cloud GPUs: Consider using cloud services like AWS, Google Cloud, or Azure for GPU resources to expedite training and inference.
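    Once a GPU is available, inference can be directed to it; a minimal sketch, assuming the cpu argument of OCPCalculator (which defaults to CPU execution):

    # Assumed: cpu=False requests GPU execution when one is available.
    calc = OCPCalculator(checkpoint_path="eqV2_31M_omat_mp_salex.pt", cpu=False)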

License

The models are provided under a permissive license that allows both commercial and non-commercial use. Full license details are available on the OMAT24 model page.
