Abstract
I will describe the recent invention of a robust universal machine learning interatomic potential that covers much of the periodic table. More than one thousand GPU-years were used to generate the ab initio training data, guided by active learning. Diverse test simulations have shown that this machine learning potential has outstanding performance, with energy errors well below chemical accuracy (43 meV/atom) even for chemically very complex systems. The universal potential runs more than 10,000 times faster than DFT for systems of several thousand atoms, and it allows more than 10,000 atoms in arbitrary combinations of 72 elements to be simulated together. Going from a few hundred atoms in DFT to up to 55,000 atoms in Matlantis, one can study realistic microstructures such as extended defects with curvature and their interactions, realistic phase transformations, plastic deformation and damage evolution, electrochemical interfaces, etc. [J Materiomics 9 (2023) 447-454]
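To make the scale-up concrete, the sketch below shows how a several-thousand-atom molecular dynamics run is typically driven through ASE, the calculator interface that universal ML potentials of this kind commonly plug into. This is an illustrative assumption, not the Matlantis API itself (which the abstract does not describe); the EMT calculator is only a runnable stand-in for the ML-potential calculator one would attach in practice, and the Al supercell, temperature, and step count are arbitrary example choices.

```python
# Minimal sketch (assumptions noted above): large-supercell NVT MD via ASE.
from ase.build import bulk
from ase.calculators.emt import EMT  # stand-in; replace with the ML-potential calculator
from ase.md.langevin import Langevin
from ase.md.velocitydistribution import MaxwellBoltzmannDistribution
from ase import units

# ~4,000-atom Al supercell: routine for an ML potential, costly for direct DFT.
atoms = bulk("Al", "fcc", a=4.05, cubic=True).repeat((10, 10, 10))
atoms.calc = EMT()  # hypothetical placeholder for the universal ML potential

# Initialize velocities and run a short Langevin (NVT) trajectory at 300 K.
MaxwellBoltzmannDistribution(atoms, temperature_K=300)
dyn = Langevin(atoms, timestep=2.0 * units.fs, temperature_K=300, friction=0.01)
dyn.run(50)

print("Potential energy per atom (eV):", atoms.get_potential_energy() / len(atoms))
```

Because the ML potential is exposed as an ordinary ASE calculator in this sketch, the same script scales from a few thousand atoms to the tens of thousands quoted in the abstract simply by enlarging the supercell.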