
Symposium DS06: Integrating Machine Learning with Simulations for Accelerated Materials Modeling

Sergei Manzhos, Tokyo Institute of Technology

Neural Networks with Optimized Neuron Activation Functions and Without Nonlinear Optimization or How to Prevent Overfitting, Cut CPU Cost and Get Physical Insight All at Once

Written by Matthew Nakamura

Sergei Manzhos, a professor at Tokyo Institute of Technology, discussed the challenges of applying neural networks (NNs) in materials science and computational chemistry. While NNs are expressive and general regressors used across diverse applications, that power comes at the cost of CPU-intensive nonlinear parameter optimization and a susceptibility to overfitting. To address both issues, Manzhos proposed defining the hidden-neuron parameters by simple rules, eliminating nonlinear optimization altogether, and pairing them with optimal activation functions tailored to each neuron to preserve the network's expressiveness. Leveraging additive Gaussian process regression to construct these activation functions, he demonstrated an approach that combines the power of NNs with the robustness of linear regression, since only the output weights remain to be fit linearly. Notably, the method resists overfitting even as the number of neurons grows. Manzhos closed by noting the versatility of the approach: with suitably modified parameter rules, it can also yield physical insight in physics and computational chemistry problems.
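
To make the "no nonlinear optimization" idea concrete, here is a minimal sketch of a network in that spirit: hidden-layer parameters are set by a simple rule rather than trained, and only the output weights are fit, by linear least squares. This is an illustration, not Manzhos's actual scheme; in particular, the talk's optimal per-neuron activation functions come from additive Gaussian process regression, whereas the sketch below substitutes a plain tanh, and the parameter rule and all function names are hypothetical.

```python
import numpy as np

def rule_based_hidden_params(n_neurons, n_features, x_min, x_max, rng):
    # Hypothetical rule: draw fixed input weights and spread biases
    # uniformly over the input range, instead of fitting them by
    # nonlinear optimization (the rule itself is illustrative only).
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_neurons))
    b = rng.uniform(x_min, x_max, size=n_neurons)
    return W, b

def fit_output_weights(X, y, W, b):
    # With the hidden parameters fixed, the only remaining fit is a
    # linear least-squares solve for the output weights -- no
    # backpropagation, no overfit-prone nonlinear optimization.
    H = np.tanh(X @ W + b)                    # hidden-layer activations
    c, *_ = np.linalg.lstsq(H, y, rcond=None)
    return c

def predict(X, W, b, c):
    return np.tanh(X @ W + b) @ c

# Toy usage: fit a smooth 1D function from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)

W, b = rule_based_hidden_params(n_neurons=50, n_features=1,
                                x_min=-3.0, x_max=3.0, rng=rng)
c = fit_output_weights(X, y, W, b)
print(predict(np.array([[1.0]]), W, b, c))    # close to sin(1.0)
```

Because the hidden layer is fixed, the model is linear in its trainable parameters, which is what gives the approach its robustness to overfitting as neurons are added.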
