Algorithm Development in Materials Science and Engineering: Interatomic Potential Developments and Atomistic Modeling II
Sponsored by: TMS Materials Processing and Manufacturing Division, TMS: Integrated Computational Materials Engineering Committee, TMS: Phase Transformations Committee, TMS: Computational Materials Science and Engineering Committee
Program Organizers: Mohsen Asle Zaeem, Colorado School of Mines; Garritt Tucker, Colorado School of Mines; Charudatta Phatak, Argonne National Laboratory; Bryan Wong, University of California, Riverside; Mikhail Mendelev, NASA ARC; Bryce Meredig, Travertine Labs LLC; Ebrahim Asadi, University of Memphis; Francesca Tavazza, National Institute of Standards and Technology

Tuesday 8:30 AM
February 25, 2020
Room: 31C
Location: San Diego Convention Ctr

Session Chair: Mohsen Asle Zaeem, Colorado School of Mines; Mikhail Mendelev, NASA ARC


8:30 AM  Invited
Interatomic Potentials as Physically-informed Artificial Neural Networks: James Hickman1; Ganga P. Purja Pun2; Vesselin Yamakov3; Yuri Mishin2; 1National Institute of Standards and Technology; 2George Mason University; 3National Institute of Aerospace
    We present a new approach to the development of classical interatomic potentials using physically-informed neural networks (PINN) combined with an analytical bond-order atomic interaction model. Due to their strong physical underpinnings, PINN potentials demonstrate much better transferability than existing machine-learning potentials while drastically improving accuracy in comparison with traditional potentials. PINN potentials can be constructed for both metallic and covalent materials in a unified manner. We demonstrate a number of applications of PINN potentials to large-scale molecular dynamics and Monte Carlo simulations and to the calculation of thermal and mechanical properties of diverse materials. Specific materials systems include silicon and aluminum, as well as alloys and compounds. Computational aspects of PINN potentials are discussed, and future developments in this field are outlined.
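The core idea described above, a neural network that outputs local parameters of an analytical bond-order interaction rather than predicting the energy directly, can be sketched in toy form. All functional forms, network sizes, and parameter names below are illustrative assumptions for a schematic, not the published PINN model:

```python
import numpy as np

rng = np.random.default_rng(0)

def descriptors(r_ij):
    """Toy local-environment fingerprint: Gaussian sums over neighbor distances."""
    centers = np.array([2.0, 2.5, 3.0])
    return np.exp(-(r_ij[:, None] - centers) ** 2).sum(axis=0)

# Tiny (untrained) neural network mapping descriptors -> four parameters
# (A, B, lam, mu) of an analytical pairwise bond-order-like energy expression.
W1 = rng.normal(size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 4)); b2 = np.array([2.0, 1.0, 2.0, 1.0])

def local_params(G):
    h = np.tanh(G @ W1 + b1)
    return np.abs(h @ W2 + b2)  # keep parameters positive

def atom_energy(r_ij):
    """Analytical repulsive/attractive form evaluated with NN-predicted parameters."""
    A, B, lam, mu = local_params(descriptors(r_ij))
    return np.sum(A * np.exp(-lam * r_ij) - B * np.exp(-mu * r_ij))

# Energy contribution of one atom with three neighbors at these distances
E = atom_energy(np.array([2.1, 2.4, 2.9]))
```

The physical constraint comes from the analytical form: outside the training domain the energy still behaves like a bond-order potential, which is the intuition behind the transferability claim.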

9:00 AM  
Development and Validation of Interatomic Potential for Tantalum using Physically-informed Artificial Neural Networks: Yi-Shen Lin1; Ganga Purja Pun1; Yuri Mishin1; 1George Mason University
    The recently developed physically-informed neural network (PINN) method permits the construction of interatomic potentials fitted to large DFT databases with nearly-DFT accuracy and significantly improved transferability in comparison with the existing machine-learning potentials. The superior transferability is due to the underlying analytical bond-order atomic interaction model which properly captures the nature of interatomic bonding in both metallic and covalent systems. We present a new PINN potential for tantalum, which shows excellent accuracy for many different environments including a variety of crystal structures, vacancies, self-interstitials, generalized stacking faults, grain boundaries, surfaces, clusters, deformation paths, highly compressed states occurring in shock wave experiments, and liquid. The transferability of the new potential is demonstrated by application to the dislocation core structure, dislocation glide motion, and a number of other cases.

9:20 AM  Invited
Machine-learned Interatomic Potentials for Alloy Modeling and Phase Diagrams: Gus Hart1; Conrad Rosenbrock1; Konstantin Gubaev2; Alexander Shapeev2; Livia Pártay3; Noam Bernstein4; Gábor Csányi5; 1Brigham Young University; 2Skolkovo Institute of Science and Technology; 3University of Reading; 4Naval Research Laboratory; 5Cambridge University
    We demonstrate the power of machine-learned potentials to model multicomponent systems. We compare two different approaches: moment tensor potentials (MTP) and the Gaussian Approximation Potential (GAP) framework (kernel regression combined with the Smooth Overlap of Atomic Positions (SOAP) representation). Both types of potentials give excellent accuracy for a wide range of compositions and rival the accuracy of cluster expansion. While both models perform well, SOAP-GAP excels in transferability, as shown by sensible transformation paths between configurations, while the lower computational cost of MTP allows the calculation of compositional phase diagrams. Because both methods perform as well as cluster expansion while yielding off-lattice models, we expect them to open new avenues in computational materials modeling for alloys. We show compositional phase diagrams, phonon dispersions, and new superalloy phases, all predicted using these two machine-learned interatomic potentials for binary and ternary alloys.

9:50 AM  
Physically-motivated Requirements of Machine Learning Potentials: Jared Stimac1; Jeremy Mason1; 1University of California, Davis
    Machine learning potentials (MLPs) for molecular dynamics simulations have been found capable of approaching ab initio accuracy with the computational efficiency of empirical potentials. The accuracy and performance of an MLP depend strongly, however, on the choice of descriptors used to characterize the local atomic environment, as well as on the training data from which the algorithm predicts energies and forces. This work uses Gaussian process regression to investigate the effect of the choice of descriptors and covariance function on the accuracy of the resulting potential. The descriptors should respect the symmetries of the atomic environment, be differentiable with respect to the atomic coordinates, and contain sufficient information. The covariance function should impose minimal constraints on the potential to reduce the risk of systematic error. An MLP for bulk copper is considered for specificity.
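The ingredients named above can be illustrated with a bare-bones sketch: a hypothetical pair-distance-histogram descriptor (permutation- and rotation-invariant by construction) fed into Gaussian process regression with a squared-exponential covariance. This is a schematic under those assumptions, not the authors' implementation:

```python
import numpy as np

def descriptor(positions):
    """Hypothetical invariant descriptor: smoothed histogram of pair distances
    (invariant to atom permutation, rotation, and translation)."""
    d = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    pd = d[np.triu_indices(len(positions), k=1)]
    centers = np.linspace(1.0, 4.0, 6)
    return np.exp(-4.0 * (pd[:, None] - centers) ** 2).sum(axis=0)

def rbf(X, Y, ell=1.0):
    """Squared-exponential covariance: a smooth, minimally constraining prior."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def gp_predict(X_train, y_train, X_test, noise=1e-6):
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf(X_test, X_train) @ alpha

# Toy training set: random 4-atom configurations with a stand-in "energy" target
rng = np.random.default_rng(1)
configs = [rng.uniform(0, 3, size=(4, 3)) for _ in range(20)]
X = np.array([descriptor(c) for c in configs])
y = X.sum(axis=1)  # placeholder for DFT energies
E_pred = gp_predict(X, y, descriptor(rng.uniform(0, 3, size=(4, 3)))[None, :])
```

For differentiability (needed for forces), a real descriptor would replace the hard distance list with smooth functions of the coordinates, as the abstract requires.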

10:10 AM Break

10:30 AM  Invited
Molecular Simulations You Can Trust and Reproduce: the OpenKIM Framework: Ellad Tadmor1; Ryan Elliott1; 1University of Minnesota
    The quality of classical molecular and multiscale simulations hinges on the suitability of the employed interatomic model (IM) for a given application. Reproducibility of simulations depends on the ability of researchers to retrieve the original IM that was used. These two issues are addressed by the Open Knowledgebase of Interatomic Models project (https://openkim.org). OpenKIM curates IMs with full provenance control, issues them DOIs so that they can be cited in publications, and tests them exhaustively using "KIM Tests" that compute a host of material properties and "Verification Checks" on coding correctness. OpenKIM is integrated into major simulation packages (including ASE, DL_POLY, GULP and LAMMPS) allowing users to easily use OpenKIM IMs and query their predictions. Machine learning based tools for selecting an IM and assessing uncertainty are under development. OpenKIM functionality provides major benefits to researchers and promises to improve the reliability and reproducibility of molecular simulations of materials.

11:00 AM  
Scale Bridging from DFT to MD with Machine Learning: Mitchell Wood1; Mary Alice Cusentino1; Aidan Thompson1; 1Sandia National Laboratories
    The abundance of data generated by electronic structure calculations provides an excellent starting point for machine-learned interatomic potentials (ML-IAPs) used in classical molecular dynamics (MD) simulations. Unlike the empirical potentials used within MD, ML-IAPs offer tunable accuracy with respect to the higher-fidelity training data, though this comes at a greater computational cost. In this talk, recent advances in the atomic environment descriptors, model form, and computational implementation of the Spectral Neighbor Analysis Potential (SNAP) will be presented. Each of these developments has been shown to improve the overall accuracy or performance (or both) on modern supercomputing systems. Applications of SNAP ML-IAPs to materials in extremes of pressure and radiation damage will be discussed as well.

11:20 AM  
An Active Learning Approach for the Generation of Force Fields from DFT Calculations: Nathan Wilson1; Yang Yang2; Raymundo Arroyave1; Xiaofeng Qian1; 1Texas A&M University; 2Xi'an Jiaotong
    Molecular dynamics (MD) is a powerful tool for exploring the transport properties and dynamics of materials, but it is severely limited by the availability of an accurate force field for a given material. Recent advances in machine learning have sped up the development of force fields fitted to first-principles density functional theory (DFT) calculations. However, many current approaches require extensive amounts of data to produce accurate force fields, which entails a large number of DFT calculations. Here we present an active learning approach that adaptively selects the structures most informative to the machine learning model. This allows accurate force fields to be generated from only a small number of DFT calculations, significantly reducing the computational cost. This adaptive-learning force field approach can be used to speed up MD studies of complex structural and functional materials.
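One common way to realize such adaptive selection, not necessarily the authors' exact scheme, is to rank unlabeled candidate structures by the predictive variance of a surrogate model (here a Gaussian process over structure descriptors) and send the most uncertain structure to the next DFT calculation:

```python
import numpy as np

def rbf(X, Y, ell=1.0):
    """Squared-exponential covariance between descriptor sets."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def gp_variance(X_train, X_pool, noise=1e-8):
    """GP predictive variance at candidate structures: k(x,x) - k_s K^-1 k_s^T."""
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf(X_pool, X_train)
    return 1.0 - np.einsum("ij,ij->i", Ks, np.linalg.solve(K, Ks.T).T)

def select_next(X_train, X_pool):
    """Pick the pool structure the surrogate is least certain about."""
    return int(np.argmax(gp_variance(X_train, X_pool)))

rng = np.random.default_rng(2)
X_train = rng.normal(size=(5, 4))   # descriptors of structures already labeled by DFT
X_pool = rng.normal(size=(50, 4))   # descriptors of candidate structures
idx = select_next(X_train, X_pool)  # index of the structure to compute with DFT next
```

Iterating this loop (label the selected structure, retrain, reselect) concentrates the expensive DFT calculations where the model is most uncertain, which is the cost-saving mechanism the abstract describes.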