
Many-Body Function Corrected Neural Network with Atomic Attention (MBNN-att) for Molecular Property Prediction.

Zheng-Xin Yang, Xin-Tian Xie, Pei-Lin Kang, Zhen-Xiong Wang, Cheng Shang, Zhi-Pan Liu
Published in: Journal of Chemical Theory and Computation (2024)
Recent years have seen a surge of machine learning (ML) in chemistry for predicting chemical properties, but a low-cost, general-purpose, and high-performance model, desirably accessible on central processing unit (CPU) devices, remains unavailable. For this purpose, here we introduce an atomic attention mechanism into the many-body function corrected neural network (MBNN), yielding the MBNN-att ML model, to predict both the extensive and intensive properties of molecules and materials. MBNN-att uses explicit function descriptors as the inputs to an atom-based feed-forward neural network (NN). The output of the NN is designed to be a vector so that a multihead self-attention mechanism can be applied. This vector is split into two parts: the atomic attention weight part and the many-body-function part. The final property is obtained by summing the products of each atomic attention weight and the corresponding many-body function. We show that MBNN-att performs well on all QM9 properties, with errors below chemical accuracy on every property, and in particular achieves the top performance for the energy-related extensive properties. By systematically comparing with other explicit-function-type descriptor ML models and graph-representation ML models, we demonstrate that the many-body-function framework and the atomic attention mechanism are key ingredients for the high performance and good transferability of MBNN-att in molecular property prediction.
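The readout described in the abstract, an atom-wise NN whose output vector is split into attention weights and many-body-function values, with the property given by a sum of their products over atoms, can be illustrated with a minimal sketch. This is not the authors' code: the layer sizes, the descriptor dimension, the number of many-body terms, and the use of PyTorch's nn.MultiheadAttention are all illustrative assumptions.

```python
# Minimal sketch of the MBNN-att readout idea, assuming hypothetical
# descriptor/hidden dimensions and a generic multihead self-attention layer.
import torch
import torch.nn as nn


class MBNNAttSketch(nn.Module):
    def __init__(self, descriptor_dim=64, hidden_dim=128, n_terms=8, n_heads=4):
        super().__init__()
        self.n_terms = n_terms
        # Atom-wise feed-forward NN on explicit many-body function descriptors.
        self.atom_nn = nn.Sequential(
            nn.Linear(descriptor_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 2 * n_terms),  # vector output, split into two parts
        )
        # Multihead self-attention over atoms, applied to the weight part.
        self.attn = nn.MultiheadAttention(n_terms, n_heads, batch_first=True)

    def forward(self, descriptors):
        # descriptors: (batch, n_atoms, descriptor_dim)
        out = self.atom_nn(descriptors)                  # (batch, n_atoms, 2*n_terms)
        weights, mb_funcs = out.split(self.n_terms, dim=-1)
        # Let each atom's attention weights depend on all other atoms.
        weights, _ = self.attn(weights, weights, weights)
        # Sum over terms and atoms of (attention weight * many-body function).
        per_atom = (weights * mb_funcs).sum(dim=-1)      # (batch, n_atoms)
        return per_atom.sum(dim=-1)                      # extensive property per molecule


if __name__ == "__main__":
    model = MBNNAttSketch()
    x = torch.randn(2, 5, 64)   # 2 molecules, 5 atoms each, 64-dim descriptors
    print(model(x).shape)       # torch.Size([2])
```

For an intensive property, the final sum over atoms would be replaced by an average or another normalization; the sketch shows only the extensive case described in the abstract.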
Keyphrases
  • neural network
  • machine learning
  • low cost