About this Abstract

Meeting: 2026 TMS Annual Meeting & Exhibition
Symposium: AI/ML/Data Informatics for Materials Discovery: Bridging Experiment, Theory, and Modeling
Presentation Title: Large Language Models for Materials Science: From Sparse Data Classification to Tensorial Property Prediction and Inverse Design
Author(s): Tongqi Wen, Siyu Liu, David J. Srolovitz
On-Site Speaker (Planned): Tongqi Wen
Abstract Scope: We present a unified large language model (LLM)-based framework for materials classification, property prediction, and inverse design. Using prompt engineering and deep learning, we transform sparse composition data into textual descriptors for accurate classification, achieving up to a 463% improvement over traditional methods in metallic glass formability prediction (an illustrative sketch of this descriptor-based prompting follows the keyword list below). For quantitative tasks, we introduce ElaTBot, an LLM that predicts full elastic constant tensors and bulk moduli, outperforming conventional models with up to 33.1% lower error. Integration with Retrieval-Augmented Generation (RAG) further enhances predictive accuracy without retraining. Additionally, coupling ElaTBot with general-purpose LLMs enables inverse design of materials with target properties, such as biomedical alloys and corrosion-resistant coatings. This LLM-driven approach offers a powerful, interpretable, and user-friendly platform that bridges data-scarce settings and high-fidelity materials modeling, advancing autonomous materials discovery.
Proceedings Inclusion? Planned:
Keywords: Machine Learning, Mechanical Properties, Modeling and Simulation
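Illustrative sketch (not part of the abstract): a minimal Python example of how a sparse alloy composition might be rendered as a textual descriptor and wrapped in a classification prompt, in the spirit of the descriptor-based approach summarized above. The function names, prompt wording, and the Zr-Cu-Al example composition are hypothetical assumptions for illustration, not the authors' actual pipeline.

# Hypothetical sketch: render a sparse composition (element -> atomic fraction)
# as plain text, then wrap it in a yes/no classification prompt for an LLM.
# Prompt wording, helper names, and the example alloy are illustrative only.

def describe_composition(composition: dict[str, float]) -> str:
    """Render a composition as a short textual descriptor, richest element first."""
    parts = [
        f"{frac:.0%} {element}"
        for element, frac in sorted(composition.items(), key=lambda kv: -kv[1])
    ]
    return "An alloy composed of " + ", ".join(parts) + "."

def build_classification_prompt(composition: dict[str, float]) -> str:
    """Wrap the descriptor in a metallic-glass-formability classification query."""
    return (
        describe_composition(composition)
        + " Based on this composition, answer with exactly one label: "
        "'glass former' or 'non-glass former'."
    )

if __name__ == "__main__":
    # Example sparse composition (fractions are illustrative, not from the abstract).
    print(build_classification_prompt({"Zr": 0.50, "Cu": 0.40, "Al": 0.10}))

The resulting prompt string would then be sent to an LLM classifier; how the model is fine-tuned or combined with deep-learning features is described only at a high level in the abstract and is not reproduced here.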