AI/Data Informatics: Computational Model Development, Validation, and Uncertainty Quantification: Material Design I
Sponsored by: TMS Materials Processing and Manufacturing Division, TMS: Computational Materials Science and Engineering Committee
Program Organizers: Saurabh Puri, Microstructure Engineering; Dennis Dimiduk, BlueQuartz Software LLC; Darren Pagan, Pennsylvania State University; Anthony Rollett, Carnegie Mellon University; Francesca Tavazza, National Institute of Standards and Technology; Christopher Woodward, Air Force Research Laboratory

Monday 2:00 PM
February 28, 2022
Room: 256A
Location: Anaheim Convention Center

Session Chair: Anjana Talapatra, Los Alamos National Laboratory

2:00 PM  Invited
Autonomous Research Systems: Benji Maruyama1; 1US Air Force
     The current materials research process is slow and expensive, taking decades from invention to commercialization. The Air Force Research Laboratory pioneered ARES™, the first autonomous research system for materials development. Researchers are now exploiting advances in artificial intelligence (AI), autonomy, and robotics, along with modeling and simulation, to create research robots capable of performing iterative experimentation orders of magnitude faster than is possible today. We will discuss concepts and advances in autonomous experimentation in general, along with the associated hardware, software, and autonomous methods. We expect autonomous research to revolutionize the research process, and propose a “Moore’s Law for the Speed of Research,” in which the rate of advancement increases exponentially while the cost of research drops exponentially. We also anticipate a renaissance in “Citizen Science,” in which access to online research robots makes science widely available. This presentation will highlight advances in autonomous research and consider the implications of AI-driven experimentation for the materials landscape.

2:30 PM  Invited
AI/ML/DL Approaches for Accelerating Materials Discovery and Design: Ankit Agrawal1; 1Northwestern University
    The increasing availability of data from the first three paradigms of science (experiments, theory, and simulations), along with advances in artificial intelligence and machine learning (AI/ML) techniques, has offered unprecedented opportunities for data-driven science and discovery, the fourth paradigm of science. Within AI/ML, deep learning (DL) has emerged as a game-changing technique in recent years thanks to its ability to work effectively on raw big data, bypassing the otherwise crucial manual feature engineering step traditionally required to build accurate ML models, and thus enabling numerous real-world applications such as autonomous driving. In this talk, I will present ongoing AI research in our group, with illustrative applications in materials science. In particular, we will discuss approaches to gainfully apply AI/ML/DL to both big and small data in the context of materials science. I will also demonstrate some of the materials informatics tools developed in our group.

3:00 PM  
Ensemble of State-of-the-art Property Prediction Machine Learning Algorithms: Sterling Baird1; Hasan Muhammad Sayeed1; Taylor Sparks1; 1University of Utah
    Many models have been developed that continue to improve predictive performance on machine learning tasks such as predicting bulk modulus, band gap, formation energy, and metallicity. As new algorithms continue to appear, it is becoming clear that certain algorithms perform better on certain tasks. For example, CGCNN excels at formation energy and metallicity prediction, while CrabNet has best-in-class performance for experimental and computational band gap predictions. In the last few years, progress has been incremental, with many algorithms reporting results for certain tasks only slightly better than prior work. This raises the question: does each algorithm learn the same information, or does each make its own contribution in terms of what is learned? We explore and compare compound-wise and chemical class-wise (rather than task-wise) performance for machine learning models including CGCNN, CrabNet, Automatminer, MODNet, MEGNet, and DimeNet++, and present an ensemble of these state-of-the-art models.

3:20 PM Break

3:40 PM  
Band Gap Predictions of Novel Double Perovskite Oxides: Anjana Talapatra1; Blas Uberuaga1; Christopher Stanek1; Ghanshyam Pilania1; 1Los Alamos National Laboratory
    The compositional and structural variety inherent to oxide perovskites, together with their fascinating properties, spawns wide-ranging applications from electromechanical devices to opto-electronic materials for radiation detection. The band gap in these materials can be optimally controlled by varying the composition. Here, we use a novel hierarchical screening process in which we build four machine learning (ML) models, designed to be applied sequentially to a very large chemical space, to identify novel double perovskite oxide chemistries that are predicted to be experimentally formable, thermodynamically stable, and insulating with a wide band gap. We identify a tractable set of promising candidates with high confidence and computationally verify their stability and band gaps. Our multi-step hierarchical screening approach, which may be generalized to other classes of materials beyond the oxide perovskites examined here, provides further impetus to the application of physics-based ML models to the discovery of novel functional materials.

4:00 PM  
Closed-loop Discovery of the Composition-structure-properties Relationships of Superconductors: Christopher Stiles1; Nam Le1; Ian McCue1; Alexander New1; Christine Piatko1; Janna Domenico1; Eddie Gienger1; Kyle McElroy1; Ivelisse Cabrera1; Daniel Rose1; Timothy Montalbano1; Michael Pekala1; Christine Chung1; Tyrel McQueen2; Elizabeth Pogue2; Christopher Ratto1; Andrew Lennon1; 1Johns Hopkins University Applied Physics Laboratory; 2Johns Hopkins University
    Materials discovery is currently an expensive and slow process driven by trial and error. Recent research has focused on applying machine learning (ML) to existing datasets to efficiently discover materials with desired properties. However, these studies treat materials databases as fixed snapshots rather than expanding knowledge stores, and they do not intrinsically incorporate physics and domain knowledge into their models. Here, we provide a case study in developing a closed-loop system (ML, synthesis, characterization) to map and explore the application-rich space of superconducting compounds. We focused on superconductivity because its theory is incomplete and its defining property can be measured unambiguously without statistical sampling. By combining materials-related descriptors with a novel ML approach (RooSt), we screened datasets such as the Materials Project to predict new compounds, which were then fabricated and characterized. We carried out several closed-loop iterations, updating our models with experimental results, which enabled the discovery of novel superconducting compounds.

4:20 PM  
Topological Class Detection with Attention-based Neural Network: Hasan Muhammad Sayeed1; Taylor D. Sparks1; 1University of Utah
    Identifying topological materials is an important frontier in condensed matter physics. The availability of databases with thousands of topological materials, thanks to symmetry indicator-based theoretical approaches and ab initio calculations, makes it possible to leverage state-of-the-art machine learning (ML) techniques to predict the topology of a given material. Recent works have shown the capability of ML to predict topology using descriptors derived from a compound’s stoichiometric formula. We use the structure-agnostic Compositionally Restricted Attention-based Network (CrabNet) to overcome the obstacles encountered by previous works. Because topology depends on global quantities beyond coarse-grained chemical composition, we add features such as crystal symmetry. The CrabNet architecture has previously been used successfully for regression tasks; we modify it for classification and predict topological class by learning inter-element interactions within a compound. We use the interpretability of CrabNet to probe the relevance of different properties for predicting topology.