About this Abstract
Meeting | 2023 TMS Annual Meeting & Exhibition
Symposium | AI/Data Informatics: Computational Model Development, Validation, and Uncertainty Quantification
Presentation Title | How Do You Optimize Your Parameters? Realistically Complex Hyperparameter Optimization of 23 Parameters of a Black Box Function over a Realistically Low Budget of 100 Iterations
Author(s) | Sterling G. Baird, Marianne Liu, Taylor D. Sparks
On-Site Speaker (Planned) | Sterling G. Baird
Abstract Scope | Expensive-to-train deep learning models can benefit from optimization of the hyperparameters that determine the model architecture. We optimize 23 hyperparameters of the Compositionally-Restricted Attention-Based Network (CrabNet) over 100 adaptive design iterations using two models within the Adaptive Experimentation (Ax) Platform. These include a recently developed Bayesian optimization (BO) algorithm, sparse axis-aligned subspaces Bayesian optimization (SAASBO), which has shown strong performance on high-dimensional optimization tasks. Using SAASBO to optimize hyperparameters, we demonstrate a new state of the art on the experimental band gap regression task within the materials informatics benchmarking platform Matbench (~4.5% decrease in mean absolute error (MAE) relative to the incumbent). Characteristics of the adaptive design scheme and feature importances are described for each of the Ax models. SAASBO has great potential both to improve existing surrogate models, as shown in this work, and, in future work, to efficiently discover new, high-performing materials in high-dimensional materials science search spaces. (https://github.com/sparks-baird/crabnet-hyperparameter)
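The budget-constrained adaptive design loop the abstract describes can be sketched in outline. This is a minimal stand-in, not the authors' implementation: random proposals replace the SAASBO surrogate model, the toy two-parameter objective (`toy_mae`) is a hypothetical placeholder for CrabNet's validation MAE, and the real search space has 23 dimensions rather than two.

```python
import random

def run_adaptive_design(objective, bounds, budget=100, seed=0):
    """Budget-constrained sequential optimization loop.

    Each of the `budget` iterations proposes one candidate parameter
    set, evaluates the black-box objective once, and keeps the best
    (lowest) value seen. Random proposals stand in for the SAASBO
    surrogate so the sketch stays self-contained.
    """
    rng = random.Random(seed)
    best_params, best_value = None, float("inf")
    for _ in range(budget):
        # Propose one candidate inside the box-constrained search space.
        candidate = {name: rng.uniform(lo, hi) for name, (lo, hi) in bounds.items()}
        value = objective(candidate)  # one expensive evaluation per iteration
        if value < best_value:
            best_params, best_value = candidate, value
    return best_params, best_value

# Hypothetical toy objective: a smooth bowl with its minimum at
# lr = 0.01, dropout = 0.1 (NOT CrabNet's real loss surface).
def toy_mae(params):
    return (params["lr"] - 0.01) ** 2 + (params["dropout"] - 0.1) ** 2

bounds = {"lr": (1e-4, 0.1), "dropout": (0.0, 0.5)}
params, mae = run_adaptive_design(toy_mae, bounds, budget=100)
```

In the actual study this loop is driven by the Ax Platform, whose generation strategy replaces the random proposals with posterior-guided candidates from the SAASBO model.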
Proceedings Inclusion? | Planned:
Keywords | Computational Materials Science & Engineering, Machine Learning, Other