MolGPT: Molecular Generation Using a Transformer-Decoder Model.

Viraj Bagal, Rishal Aggarwal, P K Vinod, U Deva Priyakumar
Published in: Journal of chemical information and modeling (2021)
The application of deep learning techniques to the de novo generation of molecules, termed inverse molecular design, has been gaining enormous traction in drug design. Representing molecules as SMILES strings enables the use of state-of-the-art natural language processing models, such as Transformers, for molecular design. Inspired by generative pre-training (GPT) models that have been shown to generate meaningful text, we train a transformer-decoder on the next-token prediction task using masked self-attention to generate druglike molecules. We show that our model, MolGPT, performs on par with other previously proposed machine learning frameworks for molecular generation in terms of generating valid, unique, and novel molecules. Furthermore, we demonstrate that the model can be trained conditionally to control multiple properties of the generated molecules. We also show that the model can generate molecules with desired scaffolds as well as desired molecular properties by conditioning the generation on scaffold SMILES strings and property values. Using saliency maps, we highlight the interpretability of the model's generative process.
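The masked (causal) self-attention mentioned above is what allows a decoder to be trained on next-token prediction: each position in the SMILES string may attend only to itself and earlier tokens, so the model never sees the token it is asked to predict. A minimal single-head sketch in NumPy (the weight matrices and sequence here are random placeholders, not MolGPT's actual parameters):

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head masked self-attention over a token sequence.

    x: (T, d) token embeddings. Attention logits for future positions
    are set to -inf before the softmax, so each output at position t
    depends only on tokens 0..t.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d = q.shape[-1]
    scores = (q @ k.T) / np.sqrt(d)                 # (T, T) attention logits
    future = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores[future] = -np.inf                        # block attention to future tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v, weights

# Toy example: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, attn = causal_self_attention(x, Wq, Wk, Wv)
```

The strictly upper-triangular part of `attn` is exactly zero, which is the property that makes autoregressive training and generation consistent: at sampling time the model produces one token at a time, conditioned only on the prefix generated so far.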
Keyphrases
  • machine learning
  • deep learning
  • artificial intelligence