New materials are critical both for combating climate change and for advancing next-generation computing technologies, yet current computational and experimental approaches face significant limitations when exploring vast chemical spaces. Although artificial intelligence has become an effective tool for materials discovery, the scarcity of accessible data and open pretrained models remains a major bottleneck. Density functional theory (DFT) calculations are essential for assessing material stability and performance, but their high computational cost restricts how widely they can be applied across large search spaces.
Recently, researchers from Meta's Fundamental AI Research (FAIR) released the Open Materials 2024 (OMat24) dataset, which comprises over 110 million DFT calculations, making it one of the largest publicly available datasets in the field. Alongside it, they released pretrained models based on EquiformerV2, a state-of-the-art graph neural network (GNN) architecture, trained on OMat24. These models have achieved leading results on the Matbench Discovery leaderboard, establishing a strong open foundation for materials science research.
The OMat24 dataset comprises a diverse array of over 110 million atomic structures sampled from both equilibrium and non-equilibrium configurations, each annotated with energy, per-atom force, and cell stress labels. The structures were generated using techniques such as Boltzmann sampling, ab initio molecular dynamics (AIMD), and relaxation of perturbed ("rattled") structures, with particular emphasis on non-equilibrium configurations so that models trained on OMat24 remain applicable to dynamic and far-from-equilibrium properties. The dataset's elemental composition covers a large portion of the periodic table, with a focus on inorganic bulk materials.
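To make this data layout concrete, here is a minimal sketch using ASE (the Atomic Simulation Environment, on which much of the open materials-ML tooling builds) to construct a perturbed structure and attach the three label types OMat24 provides. The structure and every number below are placeholders for illustration, not actual OMat24 entries.

```python
import numpy as np
from ase import Atoms
from ase.calculators.singlepoint import SinglePointCalculator

# Hypothetical rock-salt NaCl cell standing in for one OMat24-style entry
atoms = Atoms(
    "Na4Cl4",
    scaled_positions=[
        (0, 0, 0), (0.5, 0.5, 0), (0.5, 0, 0.5), (0, 0.5, 0.5),  # Na
        (0.5, 0, 0), (0, 0.5, 0), (0, 0, 0.5), (0.5, 0.5, 0.5),  # Cl
    ],
    cell=[5.64, 5.64, 5.64],
    pbc=True,
)
atoms.rattle(stdev=0.05, seed=42)  # small random displacements: a non-equilibrium sample

# Attach the three label types: total energy, per-atom forces, and cell stress
atoms.calc = SinglePointCalculator(
    atoms,
    energy=-28.1,                        # total energy in eV (placeholder)
    forces=0.1 * np.random.randn(8, 3),  # per-atom forces in eV/Å (placeholder)
    stress=np.zeros(6),                  # cell stress, Voigt notation, eV/Å³ (placeholder)
)
print(atoms.get_potential_energy(), atoms.get_forces().shape, atoms.get_stress())
```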
EquiformerV2 models trained on OMat24 together with other datasets such as MPtrj and Alexandria have demonstrated exceptional performance, and incorporating denoising objectives into training further improved their predictive accuracy. On the Matbench Discovery benchmark, the EquiformerV2 model trained on OMat24 achieved an F1 score of 0.916 and a mean absolute error (MAE) of 20 meV/atom, setting a new state of the art for material stability prediction. These results significantly outperform comparable models, underscoring the benefit of pretraining on a large, diverse dataset like OMat24.
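For context on what these two metrics measure: Matbench Discovery frames stability prediction as classifying whether a material's energy above the convex hull is at or below zero, scoring that classification with F1 and the underlying energy regression with MAE. The toy sketch below, using made-up numbers rather than benchmark data, shows how both are computed.

```python
import numpy as np
from sklearn.metrics import f1_score, mean_absolute_error

# Toy predicted vs. DFT energies above the convex hull (eV/atom); illustrative only
e_hull_true = np.array([-0.05, 0.02, 0.10, -0.01, 0.30])
e_hull_pred = np.array([-0.04, 0.05, 0.08, -0.03, 0.28])

# A material is treated as "stable" when its energy above the hull is <= 0
stable_true = e_hull_true <= 0
stable_pred = e_hull_pred <= 0
print("F1 :", f1_score(stable_true, stable_pred))
print("MAE:", mean_absolute_error(e_hull_true, e_hull_pred), "eV/atom")
```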
The release of the OMat24 dataset and its accompanying models marks a significant advance in AI-assisted materials science. The models predict critical properties such as formation energy with high accuracy, which is invaluable for accelerating materials discovery. Moreover, because the release is open source, the research community can build on this foundation, further strengthening the role of artificial intelligence in addressing global challenges through the discovery of new materials.
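As a reminder of what a formation-energy prediction involves: the formation energy per atom is the total energy minus the composition-weighted elemental reference energies, divided by the number of atoms. The short sketch below works through that arithmetic with illustrative, non-DFT numbers.

```python
# Formation energy per atom: E_f = (E_total - sum_i n_i * mu_i) / N,
# where mu_i are per-atom reference energies of the pure elements.
mu = {"Na": -1.31, "Cl": -1.84}   # hypothetical elemental references, eV/atom
e_total = -6.52                    # hypothetical total energy of one NaCl formula unit, eV
composition = {"Na": 1, "Cl": 1}

n_atoms = sum(composition.values())
e_form = (e_total - sum(n * mu[el] for el, n in composition.items())) / n_atoms
print(f"E_form = {e_form:.3f} eV/atom")  # negative values indicate a favorable compound
```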
The OMat24 dataset and models, including checkpoints of the pretrained models, are now available on Hugging Face, providing valuable resources for AI researchers in materials science. Meta's FAIR chemistry team has released these resources under a permissive license to encourage broad adoption. The OpenCatalyst team's updates on X offer further background on how these models continue to push the boundaries of material stability prediction.
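For readers who want to fetch a checkpoint programmatically, the huggingface_hub client can download individual files from a model repository. The repo id and filename below are assumptions for illustration only; check the FAIR chemistry pages on Hugging Face for the actual names.

```python
from huggingface_hub import hf_hub_download

# Both identifiers are assumed for illustration; verify them on Hugging Face first.
path = hf_hub_download(
    repo_id="fairchem/OMAT24",             # assumed model repo id
    filename="eqV2_31M_omat_mp_salex.pt",  # assumed checkpoint filename
)
print("Checkpoint saved to:", path)
```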
Taken together, these advances not only represent a breakthrough for materials science but also bolster efforts to address climate change and advance next-generation computing technologies.